Blog Post | Financial Market Development

The Democratization of Investment | Podcast Highlights

Chelsea Follett interviews Jennifer Schulp about how technology and regulation are shaping the future of investment.

Listen to the podcast or read the full transcript here.

Tell me about some hopeful trends or progress we are seeing in the financial industry.

One of the most hopeful trends in the financial industry is broader access to financial investment. Traditionally, investment in the stock market has been limited to the wealthy. Investing in the stock market is really important because, over the past several decades, the S&P 500 has returned approximately 8 percent per year, far more than most non-equity investments.
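
To make that compounding claim concrete, here is a small, purely illustrative Python sketch, assuming a flat 8 percent annual return and a 1 percent savings rate over 30 years (figures chosen for the example, not taken from the interview).

```python
# Illustrative only: compound growth at an assumed 8% equity return
# versus an assumed 1% savings rate, with annual compounding.
def future_value(principal: float, annual_rate: float, years: int) -> float:
    """Future value with annual compounding: P * (1 + r) ** n."""
    return principal * (1 + annual_rate) ** years

principal = 10_000  # hypothetical one-time investment

print(f"After 30 years at 8%: ${future_value(principal, 0.08, 30):,.0f}")  # ~ $100,627
print(f"After 30 years at 1%: ${future_value(principal, 0.01, 30):,.0f}")  # ~ $13,478
```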

Financial access has improved tremendously over the last 50 years. In the mid-70s, to make a stock trade, you had to call your broker on the phone and tell them what you wanted to trade, and they would charge you something like $50. So, you didn’t want to place a trade unless you were placing a large trade because otherwise, the fee would overwhelm the trade. And you didn’t want to trade very often. All of it made it very difficult for regular people to invest in the stock market. Over the course of decades, those fees came down as there was additional competition brought into the brokerage space.
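
A quick back-of-the-envelope calculation (the trade sizes here are hypothetical) shows why a flat $50 commission made small, frequent trades impractical: the fee is a large share of a small trade and a rounding error on a big one.

```python
# Illustrative: a flat $50 commission as a percentage of various trade sizes.
commission = 50.0
for trade_size in (200, 1_000, 10_000, 100_000):
    pct = commission / trade_size * 100
    print(f"${trade_size:>7,} trade -> the fee is {pct:.2f}% of the trade")

# $    200 trade -> the fee is 25.00% of the trade
# $100,000 trade -> the fee is  0.05% of the trade
```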

In the 1990s, we saw the rise of internet trading, which allowed you to place trades on your own. In 2015, Robinhood started offering no-commission trading on a phone app, which allows people to trade regularly without worrying about fees eating into their profits or adding to their losses. People can now take some money from each paycheck and put it in the stock market. That’s been huge. The entire brokerage industry is now moving towards phone access for easy, cheap trading, and that’s made a huge difference in the number and type of people accessing investment in the stock market.

In 2020, during the pandemic, we saw a massive rise in retail trading that many wrote off to people being bored while they were stuck in their homes. However, a lot of those investors have remained in the market, so what might have started as a pandemic-induced interest in the stock market has become part of a long-term trend towards additional retail trading that has brought in more racial minorities, more low-income people, and more young people.

Easy and cheap trading has also allowed people to experiment with the stock market and learn by doing. There was a study that came out not too long ago by FINRA and NORC at the University of Chicago that looked at the investors who opened accounts in 2020. And they found that those who stayed in the market showed an increase in their financial literacy. Having this access helped them allocate their capital better. So, we have more people invested in the larger economy, and they are getting smarter about it. The benefits will compound over time.

What are some of those potential benefits?

Certainly better personal financial outcomes. Of course, some people are going to make poor decisions. You can’t say, “Because you put money in the market, you’ll be better off.” But for people looking for long-term investment options, the stock market is the greatest wealth generator we’ve ever seen.

I think this could also drive economic growth for a couple of reasons. One, investment gives people a stake in society and the economy, and that itself can drive growth. Two, having retail investors put money that might otherwise be under the mattress or in a low-interest savings account into businesses allows those businesses to flourish.

Are there any benefits for those who are trying to start businesses?

That brings up a new set of questions. What we’ve been talking about so far has been retail investment in public equities markets. But the stock market doesn’t generally provide startup capital. You have to be a mature company to want to bring an initial public offering that gets you listed on the stock exchange. Private market investing is where startup investing happens. And in the United States, far more money is raised in private markets than in public markets. Yet the average person is not allowed to partake in private investment in the United States, or in most other economies around the world. In the US, you need to be what’s known as an accredited investor, which essentially means you make more than $200,000 a year or you have a net worth of over a million dollars.
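
As a rough sketch of that threshold, the snippet below encodes only the two tests mentioned here: individual income over $200,000 or net worth over $1 million. The actual SEC definition (Regulation D, Rule 501) has additional conditions, such as joint-income thresholds and excluding a primary residence from net worth, so treat this as an illustration of the idea rather than the rule itself.

```python
# Simplified illustration of the accredited investor test described above.
# The actual SEC definition (Regulation D, Rule 501) has more conditions.
def is_accredited(annual_income: float, net_worth: float) -> bool:
    INCOME_THRESHOLD = 200_000       # individual annual income, in dollars
    NET_WORTH_THRESHOLD = 1_000_000  # net worth, in dollars
    return annual_income > INCOME_THRESHOLD or net_worth > NET_WORTH_THRESHOLD

print(is_accredited(annual_income=85_000, net_worth=300_000))   # False
print(is_accredited(annual_income=250_000, net_worth=300_000))  # True
```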

This is a very arbitrary standard. You could win the lottery tomorrow and suddenly become an accredited investor, and that doesn’t make you any smarter at investing than you were the day before. It doesn’t make you any more of a capable investor than someone who, say, studied startup investing in their MBA program but isn’t yet making enough money to be allowed to invest themselves. And all of this is a problem because it means the government is standing in the investor’s shoes and making decisions for them. Are they smart enough? Are they rich enough? Is this a good idea for them?

Let’s talk about entrepreneurs, as you asked. People trying to start businesses tend to turn to their community. They tend to raise money from the people that they know best. But if you are a minority or live in a rural or low-income area, you likely don’t know many people who meet that accredited investor standard. You’re already at a disadvantage in raising money and getting your business off the ground. That hurts entrepreneurs in less wealthy communities, the economy as a whole, and potential investors who don’t have the opportunity to share in the growth of that business.

The House recently passed three bills looking to reform the accredited investor definition. Two would codify an SEC modification to the rule that allows people who have passed certain securities exams, such as brokers or investment advisers, to qualify as accredited investors even if they aren’t wealthy enough. The third bill is a bit broader: if passed by the Senate and signed by the President, it would extend the testing concept to allow anyone who passes a test to invest as an accredited investor. There will be costs associated with the testing, and it doesn’t get at the underlying paternalism, but it is a step in the right direction.

Could you talk about ESG?

ESG is actually two distinct concepts, and it’s important to identify which one we’re talking about. It can be broken down into a dichotomy that I’ve borrowed, which is value versus values investing.

“Value investing” in the form of ESG just refers to using environmental, social, or governance factors to analyze whether a company faces risks that might affect its financial performance. Where ESG sounds a little bit different is when we think about it as “values investing.” That kind of ESG is about sacrificing financial return to reach a certain outcome with your investment, like lowering carbon emissions. Of course, investors should be free to invest their money as they see fit. If they want to invest in saving the whales, they should have that opportunity. But it gets trickier when a company or asset management firm makes those decisions about what to do with their investors’ money without being upfront with them. That’s a question of disclosure and whether or not the funds are being clear with investors.

Government mandates are the key place to focus on here because, ultimately, the market should decide whether investing in ESG is the right way to go. Europe has decided, writ large, that the way to tackle climate change is to centrally plan how money will flow through the financial system to choke off funds for non-green investment. Supporting that is a host of European directives on sustainable finance that include a lot of disclosure by companies about how they, too, will meet net-zero goals. Europe has what we in the securities industry refer to as a “double materiality standard,” where European companies are not only supposed to disclose information that might impact the company’s financial performance but also how their company impacts society and the environment. All of this comes with pretty heavy costs.

The United States is now considering how far to follow Europe down that line. The Securities and Exchange Commission (SEC) has proposed a sweeping climate risk disclosure framework. It’s different from the European framework in that the SEC at least recognizes that they don’t get double materiality; the SEC is only allowed to require companies to disclose information that investors might find useful in deciding whether to invest in the company. However, the SEC’s climate risk disclosure rule goes well beyond that. It would require all US public companies to disclose an awful lot of information about climate risk, including Scope 1, Scope 2, and, for many companies, Scope 3 greenhouse gas emissions. What’s important here is that this type of disclosure is not a small undertaking. It’s going to be a massive drag on public companies.

You also oppose government rules that would restrict voluntary ESG-related disclosures. Can you tell me about that?

Sure. There’s been some legislation introduced, and some of it passed, by state-level Republican legislatures that prohibits the use of ESG in investment. But this broad prohibition is also not the right answer. In fact, it is itself values-based and seeks to impose an ideology onto investing.

In addition, there are real costs to blanket prohibitions of ESG. One is that ESG as value investing can sometimes yield better returns. Pensions in some states that have introduced legislation to prohibit the consideration of ESG factors have released analyses showing that over the course of 10 years, the pensions might be losing billions of dollars in returns by having their investment pool artificially limited.

Another example is Texas, which prohibits localities from doing business with financial firms that are, quote, “boycotting the fossil fuel industry.” A study done not too long ago showed that the cost of municipal borrowing has gone up in Texas because many firms exited the market, meaning taxpayers in Texas are now paying more for municipal building projects. We shouldn’t forget that narrowing the scope of investment opportunities also narrows the opportunities for growth.

Could you speak about the potential impact of AI on investment and the financial industry?

Many people don’t understand how much AI is already part of the investment industry. For example, AI is already involved with investment research, predicting stock value, and portfolio management. That’s all going on behind the scenes.

I think that there’s real potential with respect to financial advice. AI could make investment advice as accessible as trading on your phone is today. For a long time, we’ve had what are known as robo-advisors, which are essentially chatbots with a narrow tree of advice based on a set of questions. More sophisticated large language models could give individualized investment advice that considers all sorts of circumstances at a very low cost. In the future, you may be able to go on your computer or phone and tell the LLM, here’s what my investments look like; what should I do next? That’s powerful stuff, assuming that the regulators allow something like that to happen.
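
To make the contrast concrete, here is a toy sketch of the kind of questionnaire-driven robo-advisor described above: a narrow decision tree keyed to a couple of answers. The allocations and cutoffs are invented for illustration, not advice; an LLM-based advisor would not be limited to a fixed set of branches like this and could weigh free-form descriptions of a person’s circumstances.

```python
# Toy questionnaire-driven "robo-advisor": a shallow, fixed decision tree.
# Allocations and age cutoffs are made up for illustration, not advice.
def robo_allocation(age: int, risk_tolerance: str) -> dict:
    if risk_tolerance == "low":
        return {"stocks": 0.30, "bonds": 0.60, "cash": 0.10}
    if age < 40:
        return {"stocks": 0.85, "bonds": 0.10, "cash": 0.05}
    if age < 60:
        return {"stocks": 0.65, "bonds": 0.30, "cash": 0.05}
    return {"stocks": 0.45, "bonds": 0.45, "cash": 0.10}

print(robo_allocation(age=32, risk_tolerance="high"))
# {'stocks': 0.85, 'bonds': 0.1, 'cash': 0.05}
```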

E&E News | Energy Production

BLM Approves Geothermal Project, Moves to Ease Permitting

“The Bureau of Land Management issued a decision record approving the Cape Geothermal Power Project in southwest Utah, which would have the capacity if fully built to generate 2,000 megawatts of electricity, which is enough to power about 2 million homes.

The Interior Department also said it is proposing a new categorical exclusion that would streamline the process to evaluate and approve ‘geothermal resource confirmation operations’ of up to 20 acres. These could include drilling wells that would be used to confirm the existence of a geothermal resource, the agency said.

The goal is to ‘accelerate the discovery of new geothermal resources throughout the West,’ and particularly in Nevada, which the agency says is ‘home to some of the largest undeveloped geothermal potential in the country.'”

From E&E News.

Axios | Air Transport

Feds OK Rules for US To Begin Electric Air Taxi Service

“The Federal Aviation Administration on Tuesday issued long-awaited rules that will help pave the way for the commercialization of electric air taxis as soon as next year…

Driving the news: FAA Administrator Mike Whitaker announced the final regulation during a speech at a business aviation convention in Las Vegas.

  • It includes qualifications and training requirements for pilots of these new aircraft, which have characteristics of both airplanes and helicopters.
  • The rule also addresses operational requirements, including minimum safe altitudes and required visibility.
  • The rule is ‘the final piece in the puzzle’ for safely introducing these new aircraft to U.S. airspace, he said.”

From Axios.

Blog Post | Communications

Digital Technology and the Regulatory State | Podcast Highlights

Chelsea Follett interviews Jennifer Huddleston about the benefits of digital technologies as well as how we should think about the risks and problems they pose.

Read the full transcript or listen to the podcast here.

We hear so much about the risks and downsides of technology. What are some areas where you believe digital technologies have improved our lives?

There are so many areas that we’ve seen transformed by technology over the last decade. Think about when we were faced with the COVID-19 pandemic, and so much of our lives shifted to our homes. Now imagine if that same thing had happened in 2010. How different would that have been? How much more limited would the options have been to stay connected to friends and family, entertain yourself at home, and continue your education and job?

Because the US has maintained a light-touch regulatory approach to the technology sector, we empowered entrepreneurs to create products that benefit consumers, sometimes in ways that we never could have imagined. I still remember the days when you had to have atlases in your car. And I remember when MapQuest seemed like such a huge deal. Now, if you’re going somewhere new, you often don’t even look it up in advance.

I’m hearing a lot of calls for more regulation of digital technologies. President Biden is saying we need to clamp down on AI, while Nikki Haley has said we must deanonymize social media. What are some of the dangers of over-regulating these technologies?

I’m going to start by asking you a question. How often do you think you use AI?

When it comes to ChatGPT, every few days. But I’m sure that what you’re hinting at is that AI is incorporated into far more than we’re even aware of.

Exactly. Most of us have been using AI for much longer than we realize. Search engines and navigation apps use AI. If you’ve ever tried to do a return and interacted with a chatbot, some of that is possible because of advances in AI. We’ve also benefited from AI in indirect ways. For example, AI can be used to help predict forest fires and to assist in medical research. Because AI is such a general-purpose technology, a lot of the calls for regulation may lead to fewer of those beneficial applications and could even make it harder to use many of the applications we’re already used to.

Oftentimes, people just don’t think about the consequences of regulation. When we think about an issue like anonymous speech, many people immediately jump to their negative experiences with anonymous trolls online. But we should also think about the costs of deanonymizing speech. Think about dissidents trying to communicate with journalists or people trying to alert each other to social problems in authoritarian regimes. Anonymous speech is incredibly valuable to those people, and we have a long-standing tradition of protecting that kind of speech in the US. When we look at creating backdoors or deanonymizing things, that’s not just going to be used for going after the bad guys. It’s also going to be exploited by a whole range of bad actors.

And this country was arguably founded on a tradition of pseudonymous and anonymous speech; think of the Federalist Papers.

Right.

What do you think is driving this distrust of new technologies?

Disruptive new technologies like social media and artificial intelligence are naturally going to make us uncomfortable. They create new ways of doing things and force societal norms to evolve. This is something that happened in the past, for example, with the camera. We’re now used to having cameras everywhere, but we had to develop norms around when, where, and how we can take pictures. With AI, we’re watching that process happen in real-time.

The good news is that we’re adapting to new technologies faster than ever. When you look at the level of adoption of technologies like ChatGPT and the comfort level that younger people have with them, innovations seem to be becoming socially acceptable at a much quicker pace than in the past.

The current technology panics are also not unique to the present. We’ve seen a lot of concern about young people and social media recently, but before that, it was young people and video games, and before that, it was magazines and comic books. We even have articles from back in the day of people complaining that young people were reading too many novels.

There’s also this fear of tech companies having too much market share. Can you walk us through that concern and provide your take on it?

I’m sure you’re talking about Myspace’s natural monopoly on social media. Or maybe you’re talking about how Yahoo won the search wars. These were very real headlines 20 years ago with a different set of technology giants. So, my first point is that innovation is our best competition policy.

My second point is that before we implement competition policy, we need to figure out why big companies are popular. If a company is popular because it’s serving its consumers well, that’s not a problem; that’s something we should be applauding. When we think about incredibly popular products like Amazon’s Prime program, people choose to engage with it because they find it beneficial.

We should really only want to see antitrust or competition policy used if anti-competitive behavior is harming consumers. We don’t want a competition policy that presumes big is bad. And we certainly don’t want to see competition policy that focuses on competitors rather than consumers. We don’t want a world where the government dictates that the Model T can’t put the horseshoe guys out of business.

People of all stripes want to restrict how private companies moderate content. People on the left are concerned about potential misinformation online, while those on the right worry about political bias in content moderation. What’s your take on this issue?

Online content moderation matters for a lot more than social media. We often think about this in the context of, “Did X take down a certain piece of content or leave up a certain piece of content?” But this is actually much bigger. Think about your favorite review site. If you travel and you’re going to a new place and looking for somewhere to stay or go to dinner, you’re probably going to go to your favorite review site rather than read what some famous travel reporter has said.

The review sites allow you to find reviewers with your same needs. Maybe you’re traveling with young children, or you have someone with dietary restrictions. This is something that only user-generated content can provide. But what about bad or unfair reviews? What happens when someone starts trying to get bad reviews taken down? We want these sites to be able to set rules that keep reviews honest, that keep the tool useful, where they’re not being overrun by spam, and they aren’t afraid of a lawsuit from someone who disagrees with a review.

This is one example of why we should be concerned about these online content moderation policies. When it comes to questions of misinformation, I think it’s important to take a step back and think, “Would I want the person I most disagree with to have the power to dictate what was said on this topic?” Because if we give the government the power to label misinformation and moderate content, the government will have that power whether or not the people you agree with are in charge. So not only do we have First Amendment concerns here in the US from a legal point of view, but we should also have some pretty big first principles concerns regarding some of these proposals.

That’s a good segue into another concern a lot of people have with new technology, which is its effect on young people. What do you make of those concerns?

Youth online safety can mean so many different things. Some people are concerned about how much time their child spends online. Some people are concerned about issues related to online predators. Others are just concerned about particular types of content that they don’t want their children exposed to. The good news is we’ve seen the market respond to a lot of these concerns, and there are a lot of tools and choices available to parents.

The first choice is just when you allow your child to use certain technology. That’s going to vary from family to family. But even once you’ve decided to allow your child to have access to a device, you can set time limits or systems that alert you to how the child is using the device. There, we have seen platforms, device makers, and civil society respond with a wide range of tools and resources for parents. To reduce harm to children, we should look to education rather than regulation. We need to empower people to make the choices that work best for them because this isn’t going to be a one-size-fits-all decision, and policy intervention will result in a one-size-fits-all solution.

Many people are also concerned about privacy. Whenever there is a large gathering of data, that data can be leaked to the government or to bad actors. How should we think about data privacy?

When we talk about privacy, I think it’s important to distinguish between the government and private actors. We need very strong privacy protections against government surveillance, not only for consumers but also for the companies themselves, so that they can protect their consumers and keep the promises they’ve made to consumers regarding data privacy.

When it comes to individual companies, we need to think about the fact that there are a lot of choices when it comes to data privacy, some of which we don’t even think are data privacy choices.

One example is if you go to a website and sign up for a newsletter in order to get a ten percent off coupon, you’re technically exchanging a bit of data, such as your email address, for that 10 percent off coupon. You get a direct benefit in that moment. That’s a privacy choice you make. If we think about privacy as a choice, we start to see that we make these choices every day. Even where we choose to have a conversation is a data privacy choice.

The other element when it comes to data privacy is that an individual’s data, while we deeply care about it, is not actually that valuable. What’s been valuable is how data can be used in the aggregate to improve services. So, when we hear that we should just treat data like any other piece of property, it doesn’t necessarily work, because data doesn’t act like other forms of property in many cases. Not only is the value of the data not tied to a single data point, but the data is also often not tied to a single user. This makes regulating data privacy very complicated. If you and I are in a picture together, whose data is that? Does it belong to the person who took the picture, to the people in the picture, or to the location where it was taken? Can you invoke a right to be forgotten that removes the picture? And if so, what does that do to the speech rights of the person who took it? These are not easy questions, and they’re often better solved on an individual basis than with a one-size-fits-all approach.

The Human Progress Podcast | Ep. 53

Jennifer Huddleston: Digital Technology and the Regulatory State

Jennifer Huddleston, a senior fellow in technology policy at the Cato Institute, joins Chelsea Follett to discuss the benefits of digital technologies as well as how we should think about the risks and problems they pose.