
A Missing Link for Improving Education

This article was originally published in City Journal.

Republican presidential candidate Vivek Ramaswamy has said that “the nuclear family is the best form of governance known to mankind.” That notion has its critics, but it is increasingly shared by many across the political spectrum. Two recent books, for instance—Robert Cherry’s The State of the Black Family: Sixty Years of Tragedies and Failures and New Initiatives Offering Hope and Melissa Kearney’s The Two-Parent Privilege: How Americans Stopped Getting Married and Started Falling Behind—present evidence that family dynamics influence a child’s life chances more than any other factor, including formal education. Unfortunately, state-level educational assessments and the National Assessment of Educational Progress (NAEP) include no student information on family structure (for example, whether a student lives in a two-parent, single-parent, guardian, or foster household), making it harder to pursue data-driven educational interventions.

In our book, The Economics of Equity, we discuss state-level policy interventions to involve parents more effectively in their children’s education, to implement initiatives such as after-school programming targeted toward students most in need, and to help capable students of low socioeconomic status. These recommendations are especially important given the evidence that students of low socioeconomic status spend substantially less time on educational activities outside of school. We cannot keep throwing more money at this problem; we have to address the root issue, which starts with the family. If schools could access data on family dynamics, they could craft more realistic parent-teacher-student-school responsibility agreements and create tiered intervention systems that take family capability and needs into account.

The good news is that this has been done before. For instance, one of us has written about how a school serving students from low-income families achieved Blue Ribbon status through the leadership of its principal. The principal’s key intervention was an afterschool program focused on students with unstructured home environments. The principal was able to make these determinations, however, only through teacher recommendations, not through an official database that tracked these students’ home situations.

Interventions based on students’ gender, race, class, learning disabilities, or English proficiency alone have led to many ineffective initiatives. Each of these characteristics is correlated with achievement gaps but is not their driving factor. In some cases, as with California’s new math requirements, officials are promoting initiatives that not only cost taxpayers dearly but also risk worsening achievement gaps.

Our book also summarizes the empirical evidence on charter and private schools, which have historically been better suited to parental involvement. Since parents must choose these schools, the schools can require a certain level of accountability from them. They can also create and adapt systems that meet parents’ needs without having to pass through the layers of bureaucracy and union battles common in public schools.

Unfortunately, the student data currently available to public school educators don’t help them address problems stemming from family status. Providing schools with data on family structure would give them a vital tool for addressing academic achievement gaps and improving educational outcomes.

Goldy Brown III is an associate professor at Whitworth University’s School of Education, director of the university’s Educational Administration Program, and a former school principal. Christos A. Makridis is a research affiliate at Stanford University, CEO/founder at Dainamic, and holds appointments at other institutions.


The potential of AI systems to improve teacher effectiveness in Spokane public schools

This article was originally published in the Spokesman-Review (with Goldy Brown).

The United States’ K-12 education system has faced challenges for years, but it has confronted even greater headwinds recently following widespread school closures and their effects on student mental health and learning outcomes. Student test scores in math and reading fell to their lowest levels in 2022, according to the National Assessment of Educational Progress. These deteriorating outcomes make effective instruction across classrooms all the more urgent.

This year, Spokane Public Schools announced that it is pioneering a novel approach to evaluating and improving teacher effectiveness using AI systems. While AI is sometimes thought of as displacing jobs, it can also augment our productivity and learning. And as this school district in Spokane is exploring, AI systems can potentially help lower-performing teachers improve their quality of instruction at scale and embed greater consistency into teaching nationwide.

School districts often struggle with limited resources to provide continuous, quality training for their teachers, and bureaucratic impediments make removing ineffective teachers an arduous process. As a result, many students suffer under the instruction of teachers who, despite their best intentions, are ill-equipped to meet their educational needs. A large body of empirical research, led in part by professor Eric Hanushek at the Hoover Institution, has shown that teacher quality is the single most important school-based determinant of learning outcomes.

The recent advances in large language models, such as Bard and ChatGPT, highlight the ways that AI can improve training and assessment of teachers at scale without having to involve principals and other training professionals for each individualized case. In particular, AI-powered platforms can provide a personalized, data-driven approach to teacher training.

By analyzing classroom data and building statistical models that predict learning outcomes as a function of teacher characteristics and inputs, these systems can offer real-time feedback and guidance, addressing teachers’ specific areas of weakness and offering them ways to improve. For example, if a teacher consistently struggles with engaging students or explaining complex topics, the AI could provide tailored strategies and methods to improve in these areas.
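To make this concrete, here is a minimal sketch of the kind of model such a platform might fit, using entirely hypothetical classroom features; a production system would draw on far richer observation and assessment data.

```python
# A minimal sketch of predicting learning outcomes from teacher inputs.
# All feature names and coefficients are hypothetical illustrations.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 500  # simulated teacher-classroom observations

# Hypothetical teacher inputs: student engagement minutes per hour,
# clarity rating of explanations (1-5), and feedback sessions per week.
X = np.column_stack([
    rng.uniform(10, 50, n),   # engagement_minutes
    rng.uniform(1, 5, n),     # clarity_rating
    rng.uniform(0, 10, n),    # feedback_per_week
])
# Simulated learning gains (test-score growth) with noise.
y = 0.4 * X[:, 0] + 3.0 * X[:, 1] + 1.5 * X[:, 2] + rng.normal(0, 5, n)

model = LinearRegression().fit(X, y)

# For a teacher profile with low clarity scores, the fitted coefficients
# point to the input where coaching would raise predicted gains most.
teacher = np.array([[30, 1.5, 2]])
print("predicted gain:", model.predict(teacher)[0])
print("coefficients:", model.coef_)
```

In a real deployment, the system would compare a teacher’s measured inputs against the fitted model to flag the areas, such as engagement or clarity, where targeted coaching is predicted to move outcomes most.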

Moreover, AI-based coaching systems offer scalability and efficiency that traditional teacher training programs cannot match. Such a system can serve numerous teachers simultaneously, providing continuous support and learning opportunities. This continuous feedback loop would allow teachers to refine their skills constantly and adapt their teaching styles to their students’ evolving needs. Furthermore, AI systems would avoid putting further strain on the educational system that has already been stretched thin post-COVID.

While the potential benefits of AI in teacher coaching are vast, successfully implementing such a system requires careful consideration. An essential aspect of managing these AI systems is ensuring they are ethically used and respect teachers’ and students’ privacy. Confidentiality of data is paramount, and AI systems must be designed and regulated to ensure they comply with laws and ethical guidelines pertaining to data protection.

For example, our recent book, “The Economics of Equity in K-12 Education: A Post-Pandemic Policy Handbook for Closing the Opportunity Gap and Using Education to Improve the American Economy,” prominently features recommendations by professor Ryan Baker of the University of Pennsylvania, emphasizing that the use of AI in education will require data sharing between schools and vendors using the latest advances in cryptography, like zero-knowledge proofs, to secure sensitive information.

Additionally, AI systems need regular fine-tuning to remain effective and relevant. This process would involve an ongoing cycle of feedback from teachers, AI developers and education experts to ensure the AI evolves in line with the changing dynamics of classrooms and the educational landscape. For instance, updates in curriculum, pedagogical strategies and teaching methods should be reflected in the AI’s feedback and coaching suggestions.

Ultimately, AI is a tool, rather than a replacement for human connection and judgment: Decisions must remain with the educators and administrators. AI can provide data-driven insights and recommendations, but it’s the teachers and administrators who will interpret this information in the context of their unique classroom environments and make the final decisions.

The pioneering work by Spokane Public Schools represents a novel attempt to solve the longstanding challenge of the deterioration in student learning outcomes driven, at least in large part, by the decline in teacher quality and absence of incentives. With careful management and continuous refinement, AI systems could revolutionize teacher coaching, significantly improving the quality of K-12 education across the nation.

While challenges remain, the path forward shows immense promise, offering hope to educators and students alike.

Goldy Brown III is an associate professor in the Graduate School of Education at Whitworth University in Spokane. He is also the director of Whitworth University’s Educational Administration Program. He was a school principal for almost a decade. Schools that he administered earned four state recognition awards for closing the achievement gap between low-income and affluent students.

Christos A. Makridis is a research affiliate at Stanford University’s Digital Economy Lab and COO/co-founder of Living Opera, a multimedia startup focused on empowering and educating performing artists. He holds doctoral and master’s degrees in economics and in management science and engineering from Stanford University.


Should we ban ransomware payments? It’s an attractive but dangerous idea

This article was originally published on Cointelegraph Magazine.

A successful cyberattack on critical infrastructure — such as electricity grids, transportation networks or healthcare systems — could cause severe disruption and put lives at risk. 

Our understanding of the threat is far from complete since organizations have historically not been required to report data breaches, but attacks are on the rise, according to the Privacy Rights Clearinghouse. A recent rule from the United States Securities and Exchange Commission should help clarify matters further by requiring that organizations “disclose material cybersecurity incidents they experience.”

As the digital world continues to expand and integrate into every facet of society, the looming specter of cyber threats becomes increasingly more critical. Today, these cyber threats have taken the form of sophisticated ransomware attacks and debilitating data breaches, particularly targeting essential infrastructure.

A major question coming from policymakers, however, is whether businesses faced with crippling ransomware attacks and potentially life-threatening consequences should have the option to pay out large amounts of cryptocurrency to make the problem go away. Some believe ransom payments should be banned for fear of encouraging ever more attacks.

Following a major ransomware attack in Australia, its government has been considering a ban on paying ransoms. The United States has also more recently been exploring a ban. But other leading cybersecurity experts argue that a ban does little to solve the root problem.

Ransomware and the ethical dilemma of whether to pay the ransom

At the most basic level, ransomware is simply a form of malware that encrypts the victim’s data and demands a ransom for its release. A recent study by Chainalysis shows that crypto cybercrime is down by 65% over the past year, with the exception of ransomware, which saw an increase. 

“Ransomware is the one form of cryptocurrency-based crime on the rise so far in 2023. In fact, ransomware attackers are on pace for their second-biggest year ever, having extorted at least $449.1 million through June,” said Chainalysis.

Even though there has been a decline in the number of crypto transactions, malicious actors have been going after larger organizations more aggressively. Chainalysis continued:

“Big game hunting — that is, the targeting of large, deep-pocketed organizations by ransomware attackers — seems to have bounced back after a lull in 2022. At the same time, the number of successful small attacks has also grown.”

The crippling effect of ransomware is especially pronounced for businesses that heavily rely on data and system availability.

The dilemma of whether to pay the ransom is contentious. On one hand, paying the ransom might be seen as the quickest way to restore operations, especially when lives or livelihoods are at stake. On the other hand, succumbing to the demands of criminals creates a vicious cycle, encouraging and financing future attacks.

Organizations grappling with this decision must weigh several factors, including the potential loss if operations cannot be restored promptly, the likelihood of regaining access after payment, and the broader societal implications of incentivizing cybercrime. For some, the decision is purely pragmatic; for others, it’s deeply ethical.

Should paying ransoms be banned?

The increasing incidence of ransomware attacks has ignited a policy debate: Should the payment of ransoms be banned? Following a major ransomware attack on Australian consumer lender Latitude Financial, in which millions of customer records and IDs were stolen, some have begun to advocate for a ban on paying the ransom as a way of deterring attacks and depriving cybercriminals of their financial incentives. 

In the United States, the White House has voiced its qualified support for a ban. “Fundamentally, money drives ransomware and for an individual entity it may be that they make a decision to pay, but for the larger problem of ransomware that is the wrong decision… We have to ask ourselves, would that be helpful more broadly if companies and others didn’t make ransom payments?” said Anne Neuberger, deputy national security advisor for cyber and emerging technologies in the White House.

While proponents argue that a ban would deter criminals and reorient priorities for C-suite executives, critics warn that it might leave victims in an untenable position, particularly when a data breach could lead to loss of life, as in the case of attacks on healthcare facilities.

“The prevailing advice from the FBI and other law enforcement agencies is to discourage organizations from paying ransoms to attackers,” Jacqueline Burns Koven, head of cyber threat intelligence for Chainalysis, tells Magazine.

“This stance is rooted in the understanding that paying ransoms perpetuates the problem, as it incentivizes attackers to continue their malicious activities, knowing that they can effectively hold organizations hostage for financial gain. However, some situations may be exceptionally dire, where organizations and perhaps even individuals face existential threats due to ransomware attacks. In such cases, the decision to pay the ransom may be an agonizing but necessary choice. Testimony from the FBI recognizes this nuance, allowing room for organizations to make their own decisions in these high-stakes scenarios, and voiced opposition to an all out ban on payments.” 

Another complicating factor is that an increasing number of ransomware attacks, according to Chainalysis, may not have financial demands but instead focus on blackmail and other espionage purposes. 

“In such cases, there may be no feasible way to pay the attackers, as their demands may go beyond monetary compensation… In the event that an organization finds itself in a situation where paying the ransom is the only viable option, it is essential to emphasize the importance of reporting the incident to relevant authorities.” 

“Transparency in reporting ransomware attacks is crucial for tracking and understanding the tactics, techniques and procedures employed by malicious actors. By sharing information about attacks and their aftermath, the broader cybersecurity community can collaborate to improve defenses and countermeasures against future threats,” Koven continues.

Could we enforce a ban on paying ransomware attackers?

Even if a ban were implemented, a key challenge is the difficulty in enforcing it. The clandestine nature of these transactions complicates tracing and regulation. Furthermore, international cooperation is necessary to curb these crimes, and achieving a global consensus on a ransom payment ban might be challenging. 

While banning ransom payments could encourage some organizations to invest more in robust cybersecurity measures, disaster recovery plans and incident response teams to prevent, detect and mitigate the impact of cyberattacks, it still amounts to penalizing the victim and making the decision for them.

“Unfortunately, bans on extortions have traditionally not been an effective way to reduce crime — it simply criminalizes victims who need to pay or shifts criminals to new tactics,” says Davis Hake, co-founder of Resilience Insurance, who notes that claims data over the past year show that while ransomware is still a growing crisis, some clients are already taking steps toward becoming more cyber-resilient and able to withstand an attack.

“By preparing executive teams to deal with an attack, implementing controls that help companies restore from backups, and investing in technologies like EDR and MFA, we’ve found that clients are significantly less likely to pay extortion, with a significant number not needing to pay it at all. The insurance market can be a positive force for incentivizing these changes among enterprises and hit cybercriminals where it hurts: their wallets,” Hake continues.

The growing threat and risk of cyberattacks on critical infrastructure

The costs of ransomware attacks on infrastructure are often ultimately borne by taxpayers and municipalities that are stuck with cleaning up the mess.

To understand the economic effects of cyberattacks on municipalities, I released a research paper with several faculty colleagues, drawing on all publicly reported data breaches and municipal bond market data. We found that a 1% increase in county-level cyberattacks covered by the media leads to an increase in offering yields ranging from 3.7 to 5.9 basis points, depending on the level of attack exposure. Evaluating these estimates at the average annual issuance of $235 million per county implies $13 million in additional annual interest costs per county.
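For readers unfamiliar with bond math, the sketch below shows the mechanics of converting a basis-point increase in offering yields into added annual interest cost on a single year’s issuance. It illustrates the units only; the paper’s aggregate $13 million figure comes from the full set of estimates there, not from this one-line calculation.

```python
# Back-of-the-envelope conversion: basis points of extra yield on a bond
# issuance into added annual interest cost. Illustrative of units only.
def added_interest_cost(issuance_dollars: float, yield_increase_bp: float) -> float:
    """One basis point = 0.01 percentage points = 0.0001 in decimal terms."""
    return issuance_dollars * yield_increase_bp * 1e-4

issuance = 235_000_000  # average annual issuance per county, per the paper
for bp in (3.7, 5.9):
    print(f"{bp} bp on ${issuance:,} -> ${added_interest_cost(issuance, bp):,.0f} per year")
```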

One reason for the significant adverse effects of data breaches on municipalities and critical infrastructure stems from all the interdependencies in these systems. Vulnerabilities related to Internet of Things (IoT) and industrial control systems (ICS) increased at an “even faster rate than overall vulnerabilities, with these two categories experiencing a 16% and 50% year over year increase, respectively, compared to a 0.4% growth rate in the number of vulnerabilities overall,” according to the X-Force Threat Intelligence Index 2022 by IBM.

A key factor contributing to this escalating threat is the rapid expansion of the attack surface due to IoT, remote work environments and increased reliance on cloud services. With more endpoints to exploit, threat actors have more opportunities to gain unauthorized access and wreak havoc. 

“Local governments face a significant dilemma… On one hand, they are charged with safeguarding a great deal of digital records that contain their citizens’ private information. On the other hand, their cyber and IT experts must fight to get sufficient financial support needed to properly defend their networks,” says Brian de Vallance, former DHS assistant secretary.

“Public entities face a number of challenges in managing their cyber risk — the topmost is budget. IT spending accounted for less than 0.1% of overall municipal budgets, according to M.K. Hamilton & Associates. This traditional underinvestment in security has made it more and more challenging for these entities to obtain insurance from the traditional market.”

Cybersecurity reform should involve rigorous regulatory standards, incentives for improving cybersecurity measures and support for victims of cyberattacks. Public-private partnerships can facilitate sharing of threat intelligence, providing organizations with the information they need to defend against attacks. Furthermore, federal support, in the form of resources or subsidies, can also help smaller organizations – whether small businesses or municipalities – that are clearly resource-constrained so that they have funds to invest more in cybersecurity.

Toward solutions

So, is the solution a market for cybersecurity insurance? A competitive market to hedge against cyber risk will likely emerge as organizations are increasingly required to report material incidents. A cyber insurance market would still not solve the root of the problem: Organizations need help becoming resilient. Small and mid-sized businesses, according to my research with professors Annie Boustead and Scott Shackelford, are especially vulnerable.

“Investment in digital transformation is expected to reach $2T in 2023 according to IDC and all of this infrastructure presents an unimaginable target for cybercriminals. While insurance is excellent at transferring financial risk from cybercrime, it does nothing to actually ensure this investment remains available for the business,” says Hake, who sees a “huge opportunity” for insurance companies to help clients improve “cyber hygiene, reduce incident costs, and support financial incentives for investing in security controls.”

Encouragingly, Hake has noticed a trend for more companies to “work with clients to provide insights on vulnerabilities and incentivize action on patching critical vulnerabilities.”

“One pure-technology mitigation that could help is SnapShield, a ‘ransomware activated fuse,’ which works through behavioral analysis,” says Doug Milburn, founder of 45Drives. “This is agentless software that runs on your server and listens to traffic from clients. If it detects any ransomware content, SnapShield pops the connection to your server, just like a fuse. Damage is stopped, and it is business as usual for the rest of your network, while your IT personnel clean out the infected workstation. It also keeps a detailed log of the malicious activity and has a restore function that instantly repairs any damage that may have occurred to your data,” he continues.

Ransomware attacks are also present within the crypto market, and there is a growing recognition that new tools are needed to build on-chain resilience. “While preventative measures are important, access controlled data backups are imperative. If a business is using a solution, like Jackal Protocol, to routinely back up its state and files, it could reboot without paying ransoms with minimal losses,” said Eric Waisanen, co-founder of Astrovault.

Ultimately, tackling the growing menace of cyber threats requires a holistic approach that combines policy measures, technological solutions and human vigilance. Whether a ban on ransom payments is implemented, the urgency of investing in robust cybersecurity frameworks cannot be overstated. As we navigate an increasingly digital future, our approach to cybersecurity will play a pivotal role in determining how secure that future will be.

Emory Roane, policy counsel at the Privacy Rights Clearinghouse, says that mandatory disclosure of cyber breaches and offering identity theft protection services are essential, but it “still leaves consumers left to pick up the pieces for, potentially, a business’ poor security practices.”

But the combination of mandatory disclosure and the threat of getting sued may be the most effective approach. He highlights the California Consumer Privacy Act.

“It provides a private right of action allowing consumers to sue businesses directly in the event that a business suffers a data breach that exposes a consumer’s personal information and that breach was caused by the business’ failure to use reasonable security measures,” Roane explains. That dovetails with a growing recognition that data is an important consumer asset that has long been overlooked and transferred to companies without remuneration.

Greater education around cybersecurity and data sovereignty will not only help consumers stay alert to ongoing threats — e.g., phishing emails — but also empower them to pursue and value more holistic solutions to information security and data sharing so that the incidence of ransomware attacks is lower and less severe when they do happen.

Bans rarely work, if for no other reason than enforcement is either physically impossible or prohibitively expensive. Giving in to ransom demands is not ideal, but neither is penalizing the entity that is going through a crisis. What organizations need are better tools and techniques – and that is something that the cybersecurity industry, in collaboration with policymakers, can help with through new technologies and the adoption of best practices.


Data as Currency

This article was originally published in Wall Street Journal (with Joel Thayer).

America’s antitrust policies are stuck in the 1980s. That was when courts and regulators began relying on what’s called the consumer-welfare standard. Articulated in Robert Bork’s 1978 book, “The Antitrust Paradox,” the standard replaced classical antitrust analysis, which focused primarily on promoting competition. Courts and regulators are supposed to take into account a variety of consumer benefits, including lower prices, increased innovation and better product quality.

But scholars, courts and regulators have ignored Bork’s multifaceted tests and obstinately focused on price alone. The result, 40 years later, is that a few tech giants have been able to dominate the market. The problem is that their offering of free services presents a new challenge for measuring anticompetitive harm and consumer welfare. If price alone is our measure, it’s hard to argue that free services are bad for consumers.

Legal analysts have difficulty applying nonprice factors to tech companies even when confronted with such demonstrations of monopoly power as viewpoint-based censorship and the imposition of rents on developers of apps and ad tech—or even such demonstrations of actual consumer harm as privacy violations or pass-through costs on digital goods.

These tech platforms have enabled instant communication, e-commerce, information search and political engagement. In exchange for these services, customers provide data. In a new working paper, we argue that data is the new currency that these tech behemoths are capitalizing on. Every click, every interaction and every transaction feeds the digital economy.

In this light, the concept of free services is misleading, because consumers do pay a price by giving away their data. Worse, they do so often without understanding the full implications. These facts demand recalibration of the consumer-welfare standard to protect consumers’ rights and promote competitive markets. Data is more than just a digital footprint. It is a resource that tech companies exploit to amass control and wealth. The power dynamics in this exchange remain unbalanced, with consumers often unaware of the value of their data.

Some courts and scholars have argued that these harms are speculative and difficult to quantify. But there is a metric by which we can more accurately measure whether consumer welfare is served by tech companies: the amount of data they collect in exchange for those free services. Our paper explains several methods for deriving the value of data, especially methods based on financial markets and structural estimation. In general, these methods look at the role data plays in the production of goods and services.
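As a toy illustration of the financial-markets flavor of such methods, one can back out an implied per-user “data price” from advertising revenue. Every figure below is a hypothetical placeholder, not any company’s actual financials, and the paper’s methods are considerably richer.

```python
# Toy "data as price" calculation: if a free service monetizes user data
# through ads, the implied per-user price is revenue per user, and the
# implied value per data point divides that by data collected per user.
# All numbers are hypothetical placeholders.
ad_revenue = 200e9           # hypothetical annual ad revenue, dollars
active_users = 2.5e9         # hypothetical active users
datapoints_per_user = 5000   # hypothetical data points collected per user

implied_price_per_user = ad_revenue / active_users
implied_value_per_datapoint = implied_price_per_user / datapoints_per_user
print(f"implied annual 'price' paid in data per user: ${implied_price_per_user:.2f}")
print(f"implied value per data point: ${implied_value_per_datapoint:.5f}")
```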

Google, for example, required few data points from users when it made its search service available in 1997. Today it requires near-constant access to its users’ geolocation, spending habits and time spent on other sites. A judge could evaluate whether Google is arbitrarily requiring its users to provide more data—akin to raising the price of a product—solely to avail itself of ad revenues and more market share. To do so would be to engage in anticompetitive behavior. Antitrust law doesn’t allow this type of behavior in any other context.

Consumers bear more risk and gain little new benefit every time tech companies pilfer more data from them. Even with the increase in data they obtain, the quality of their services remains virtually unchanged. These companies collect this data with few safeguards. And thanks to their buying out or merging with other companies, they lack any meaningful competitors.

Big Tech has, in effect, made data a new currency, which functions as the basis of many Big Tech companies’ business models. In the face of today’s data-driven digital markets, the fact that data is currency should compel us to revisit how we think about antitrust harm and what constitutes a competitive tech market.


Men over 45 are working fewer hours, new research shows

This article was originally published in Fast Company.

There is no shortage of anecdotes when it comes to people sharing strong opinions about remote work and its effects on productivity and the tendency to slack off. These narratives are important, but they may not tell the whole story. Fortunately, newly available data from the American Time Use Survey (ATUS) by the Bureau of Labor Statistics provides some insight.

MEASURING TIME SPENT IN DIFFERENT ACTIVITIES

The ATUS is the only federal survey providing data on the full range of nonmarket activities, including the amount of time people spend on paid work, childcare, volunteering, and socializing. Individuals in the ATUS are drawn from the sample of respondents exiting the Current Population Survey.

One of the major benefits of the ATUS is that it measures a wide array of activities, not just time at work, unlike many existing surveys. In my research, this allowed me to differentiate between work, leisure, household chores, childcare, and more.

Another major benefit of the ATUS is that it collects detailed 24-hour time diaries in which respondents report all the activities from the previous day in time intervals. These records are not only more detailed but also more reliable than standard measures of time allocated to work available in other federal datasets that require respondents to recall how much they worked over the previous year or week. These diaries contain much less noise than typical survey results.

UNCOVERING CHANGES IN TIME USE AMONG REMOTE WORKERS

Drawing on ATUS data from 2019 to 2022 among employed workers between the ages of 25 and 65, my new research paper documents new trends in time use, distinguishing between those in more- versus less-remote jobs.

To measure remote work, I use an index by professors Jonathan Dingel and Brent Neiman at the University of Chicago, reflecting the degree to which tasks in a given occupation can be done remotely versus in person.
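As a rough sketch of that classification step, the code below merges a hypothetical occupation-level teleworkability index with ATUS-style diary records and compares work minutes across more- versus less-remote groups. The column names, values, and cutoff are illustrative assumptions, not the actual ATUS or Dingel-Neiman file formats.

```python
# Sketch: classify respondents by the remote-work feasibility of their
# occupation, then compare daily work minutes across groups and years.
import pandas as pd

# Hypothetical occupation-level index (share of tasks doable remotely).
remote_index = pd.DataFrame({
    "occ_code": [1010, 2020, 3030],
    "teleworkable": [0.95, 0.40, 0.05],
})

# Hypothetical ATUS-style records: one row per respondent-day.
atus = pd.DataFrame({
    "person_id": [1, 2, 3, 4],
    "occ_code": [1010, 2020, 3030, 1010],
    "work_minutes": [420, 480, 510, 400],
    "year": [2019, 2019, 2022, 2022],
})

merged = atus.merge(remote_index, on="occ_code")
merged["more_remote"] = merged["teleworkable"] >= 0.5  # illustrative cutoff

# Average daily work minutes by remote-feasibility group and year.
print(merged.groupby(["more_remote", "year"])["work_minutes"].mean())
```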

WORK TIME SHRANK BY NEARLY AN HOUR

The first main result is that time allocated to work activities declined by nearly an hour among remote workers in 2022, relative to their 2019 trend before the pandemic, and time allocated toward leisure grew by about 40 minutes. The remainder of the time appears to have gone toward activities that are not otherwise classified, which might reflect scrolling on social media.

Your first instinct might be that time at work, of course, declined, but that’s because people are simply spending less time on their commutes. While that is true, it doesn’t explain the sustained decline in time at work and increase in leisure from 2020 to 2022.

Furthermore, I ran separate models to differentiate between “pure work” and “work-related activities”—the latter including travel time to work. All of the changes in time at work come from “pure work,” rather than other categories related to travel or other income-generating activities.

But what’s even more striking is that the decline in work and rise in leisure is concentrated among males, singles, and those without children. In fact, single males over the age of 45 in remote jobs experienced a nearly two-hour decline in time allocated to work in 2022, relative to 2019, and over an hour increase in time allocated to leisure. This demographic divergence demonstrates the heterogeneity in responses to remote work.

Compare these patterns with those among women and caregivers. I found that college-educated women allocated an additional 50 minutes per day to work in 2022, relative to 2019. Among non-college-educated women, there were no statistically significant changes. I also found a nearly 30-minute-per-day increase in work among women with children. At least some of that increase in work is coming from a decline in home production activities, such as taking care of children and doing chores around the house, among the college-educated women.

IMPLICATIONS FOR PRODUCTIVITY AND THE LABOR MARKET

Do these results on remote work—especially for single males—simply reflect the phenomenon of quiet quitting, where employees disengage from work while remaining employed?

While more research is needed, the short answer appears to be no. In fact, I found that remote workers reported higher satisfaction with their lives and felt better rested. Remote workers also did not report more time allocated toward interviewing for other jobs. Cumulatively, these facts imply that changes in time use—at least since 2019—are not driven by disengagement.

These results have important implications for the debate about productivity. My other research has found that hybrid work arrangements may offer the best of both worlds.

For example, my research with Jason Schloetzer at Georgetown University using data from Payscale shows that the positive relationship between remote work and job satisfaction is statistically significant for hybrid workers only after accounting for differences in corporate culture. And even then, corporate culture dwarfs the economic significance of remote work.

Similarly, my work with Raj Choudhury, Tarun Khanna, and Kyle Schirmann at Harvard Business School using data from a randomized experiment in Bangladesh shows that workers on a hybrid schedule—working some days at home and some in the office—are more creative, send more emails, and feel like they have a better work-life balance relative to their fully remote or fully in-person peers.

It’s clear that remote work is not a one-size-fits-all phenomenon. While there are many benefits of remote work that come in the form of breaking down barriers and heightened flexibility, there are also new challenges that must be managed.

Crucially, we must take responsibility for putting into practice the right habits and processes to manage our time so that it does not drift away. Business leaders should help inculcate a culture of excellence by focusing on outcomes—not simply measures of hours worked—and lead by example.


The transformative role of water markets for a climate-changed future

This article was originally published in the Global Water Forum.

Water markets provide a mechanism for the efficient allocation of water resources based on market principles. In a water market, water rights can be bought and sold, allowing water to flow from areas of low value to areas of high value. Could this mechanism also play a significant role in addressing the challenges of transboundary water governance?

Enhancing efficiency

Newly released research published in the American Economic Review by Professor Will Rafey at the University of California, Los Angeles, provides valuable insights into the functioning and benefits of water markets (Rafey, 2023). Drawing on data from the largest water market in history, located in southeastern Australia, Rafey finds that water trading increased output by 4-6% from 2007 to 2015, equivalent to avoiding an 8-12% uniform decline in water resources. This indicates that water markets can significantly enhance the efficiency of water allocation and usage.

While there is a large body of research attempting to estimate the value of trading water rights, most studies have run into at least three challenges. First, there are practical realities that are tough to model with river systems, such as the costly and uncertain flow constraints. Second, there are also geographic and hydrological constraints, including changes in the ecosystem and climate that affect supply and behavior. Third, the set of feasible trades in the water network are subject to many constraints, such as the cost of moving water and the direction it flows.

Rafey takes a two-step approach that begins by estimating the production functions for water, which map irrigation volumes into agricultural output using producer-level longitudinal data on irrigation, physical output, and local rainfall. To address the traditional concern that some farms might be systematically more productive than others, thereby confounding the relationship between inputs and outputs due to unobserved differences, Rafey leverages the longitudinal nature of the data and the heterogeneity in how water sharing rules, also known as diversion formulas, evolve nonlinearly across space and time. Crucially, these diversion rules are not within the control of any individual farm, so they provide an external stimulus to study how output evolves. Then, Rafey links the water trading data with the production functions to estimate the realized value of trades, thereby sidestepping having to parameterize and specify the set of feasible trades and all the many constraints that go into water systems.
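The stylized sketch below conveys the flavor of that two-step logic on simulated data: first estimate a log-log production function with farm fixed effects, then value a reallocation by the gap in estimated marginal products between buyer and seller. It is a simplification for intuition only, not Rafey’s actual estimator.

```python
# Stylized two-step sketch: (1) estimate a water production function with
# farm fixed effects on simulated panel data; (2) value a trade by the gap
# in marginal products between buyer and seller. Illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
farms, years = 50, 8
df = pd.DataFrame({
    "farm": np.repeat(np.arange(farms), years),
    "water": rng.uniform(50, 200, farms * years),  # irrigation volume
    "rain": rng.uniform(200, 600, farms * years),  # local rainfall
})
farm_prod = rng.normal(0, 0.3, farms)  # unobserved farm productivity
df["log_output"] = (0.35 * np.log(df["water"]) + 0.20 * np.log(df["rain"])
                    + farm_prod[df["farm"]] + rng.normal(0, 0.1, len(df)))

# Step 1: farm fixed effects absorb time-invariant productivity differences.
fit = smf.ols("log_output ~ np.log(water) + np.log(rain) + C(farm)", df).fit()
beta_w = fit.params["np.log(water)"]  # estimated output elasticity of water

# Step 2: marginal product of water is beta_w * output / water, so a trade
# creates value when water moves from a low-MP farm to a high-MP farm.
df["output"] = np.exp(df["log_output"])
df["mp_water"] = beta_w * df["output"] / df["water"]
seller, buyer = df.iloc[0], df.iloc[-1]
print("gain per unit of water traded:", buyer["mp_water"] - seller["mp_water"])
```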

Policy implications

Rafey’s research is important for both methodological reasons and policy guidance. Methodologically, it shows how to estimate the value of trading in a setting where there is substantial stochasticity, absence of a complete market, and dynamic game-theoretic interactions without having to specify all these ingredients explicitly in the model. Instead, the two-step approach allows him to flexibly estimate the value of water trading.

In respect of policy, his results suggest:

  • There is growing institutional, including governmental, support for water markets. While market power and other frictions may exist, water markets have been proven to raise allocative efficiency. The estimated total gains from trade provide a lower bound on the value of maintaining the infrastructure required for water markets.

  • Australia’s experience with setting up and running water markets provides a template for other countries. It demonstrates that efficiency gains are possible using modern monitoring technology in an arid region. The extent of a river system’s underlying hydrological variability, which can be measured directly from historical river inflows and rainfall, is identified as an important source of water markets’ prospective value.

  • Especially in the presence of climatic change, water rights can play a substantial role in facilitating adaptation. Efficient annual trade should reallocate water from places of relative abundance to places of relative scarcity, lowering the costs of idiosyncratic variability across the river trading network. By increasing the productive efficiency of a basin’s aggregate water endowment, a water market makes drier years less costly, helping irrigators adapt to aggregate shocks. “Without water reallocation through the annual market, output would fall by the same amount as if farms faced a uniform reduction in water resources of 8–12 percent. By comparison, government climate models for this region predict surface water resources to decline by 11 percent in the median year under a 1°C increase in temperature by 2030,” said Rafey.

Although we have long known that water markets are important mechanisms for ensuring the efficient allocation of water resources, we have not known how much value they create and how that value depends on different conditions, such as varying diversion rules and a changing climate. This research provides the latest comprehensive evaluation of the importance of water markets and their value in the years ahead to help manage scarce resources in a stochastic world.

The role of water markets in transboundary governance?

Transboundary water governance is a complex social, political, and economic issue involving the management and allocation of water resources across political boundaries. It is a critical aspect of international relations, as water is a vital resource that is unevenly distributed across the globe. The governance of these resources is fraught with at least two major challenges.

First, water is a shared resource that does not respect political boundaries. Rivers, lakes, and aquifers often span multiple countries, making it challenging to manage and allocate these resources equitably. Furthermore, the governance of transboundary water resources involves a multitude of stakeholders (e.g., governments, local communities, non-governmental organizations, and private entities), each with different interests, priorities, and perceptions of how water resources should be managed, leading to conflicts and disagreements.

Second, the governance of transboundary water resources is further complicated by climate change, population growth, economic development, and more. These factors increase the demand for water and exacerbate the challenges of managing and allocating these resources.

The creation of water markets has the potential to help water managers meet these challenges by allocating supply and demand efficiently and quickly without central planning and in the face of a wide array of uncertainty, ranging from climatic change to macroeconomic shocks. Water managers and policymakers across the world should work together to build upon the successful lessons learned from Australia’s example in the Murray-Darling Basin.


Single, Remote Men Are Working Less

This article was originally published in City Journal.

The Covid-19 pandemic utterly transformed the world of work. But while employees across the globe have adapted to conducting business from their living rooms, CEOs and business leaders have struggled with this seismic shift, openly voicing their concerns about the impact of remote work on productivity, employee engagement, and corporate culture.

Some business leaders have come out strongly against working from home. “Remote work virtually eliminates spontaneous learning and creativity, because you don’t run into people at the coffee machine,” said Jamie Dimon, CEO of JPMorgan Chase. Others are more optimistic: “People are more productive working at home than people would have expected,” said Mark Zuckerberg, CEO of Facebook. And still others remain cautious: “Working from home makes it much harder to delineate work time from personal time. I encourage all of our employees to have a disciplined schedule for when you will work, and when you will not, and to stick to that schedule,” said Dan Springer, CEO of DocuSign.

But what do the data actually say? I recently released a paper, “Quiet Quitting or Noisy Leisure? The Allocation of Time and Remote Work, 2019-2022,” which documents trends by drawing on the latest data from the Bureau of Labor Statistics’ American Time Use Survey (ATUS).

Since there is no direct measure of fully remote, hybrid, or fully in-person work arrangements in the ATUS, I focus on an index, introduced in 2020 by the University of Chicago’s Jonathan Dingel and Brent Neiman, that measures the degree to which tasks within an occupation can be done remotely. The index also happens to do a good job of identifying what sorts of jobs people are probably working remotely in—with the caveat that an employee at a company in Texas could differ in their work arrangement from a New York worker with the same occupation but a different employer.

I discovered three things. First, remote workers allocated roughly 50 minutes less per day to work activities and 37 more minutes per day to leisure activities in 2022, relative to 2019. Time allocated to home production, such as chores and caring for other household members, did not change.

Second, and perhaps more importantly, these declines are concentrated among males, singles, and those without children. In fact, single males over the age of 45 working remotely spent more than two hours less per day on work activities in 2022, relative to 2019. If anything, college-educated females are the ones who have increased their time at work slightly.

Third, changes in the allocation of time cannot be explained by job-search activity or declines in well-being. If these declines in labor hours were driven by “quiet quitting,” then remote workers would be spending more time searching for other jobs or would feel worse about life overall.

These findings underscore the complexity of the remote-work revolution. It is not merely a binary shift from the office to the home but a complex reordering of our daily lives with far-reaching implications. For businesses, understanding these changes—and especially recognizing the challenges that different demographic brackets are struggling with—is critical for managing workforce expectations and productivity. As we navigate this new landscape, it’s essential to look beyond the surface-level changes and grapple with the deeper shifts in how we allocate our time.


How Will the Rise of AI Impact White-Collar Jobs?

This article was originally published in Nasdaq.

There is growing fear that AI and other emerging technologies will destabilize white-collar jobs. What are your thoughts?

While it is true that generative AI can displace many tasks — and that's true for any technology — the big question is what new tasks and workflows it enables. Recently released research that I've conducted introduces an occupational measure of how much coordination is required within a job and relates it to ChatGPT exposure, based on a new index that OpenAI released.

Occupations requiring more coordination have higher wages and are actually less exposed to generative AI, suggesting that generative AI might actually play an important role in breaking down barriers and easing the completion of complex work.
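A minimal sketch of that occupation-level comparison, with made-up numbers standing in for the real coordination measure and OpenAI's exposure index:

```python
# Sketch: correlate an occupational coordination measure with generative-AI
# exposure and wages. The rows are invented; the real analysis uses
# occupation-level measures and OpenAI's exposure index.
import pandas as pd

occ = pd.DataFrame({
    "occupation": ["analyst", "manager", "clerk", "engineer", "cashier"],
    "coordination": [0.6, 0.9, 0.3, 0.7, 0.2],  # hypothetical 0-1 scale
    "ai_exposure": [0.7, 0.3, 0.8, 0.5, 0.9],   # hypothetical 0-1 scale
    "median_wage": [75000, 110000, 42000, 95000, 30000],
})

# The pattern described above: coordination correlates positively with
# wages and negatively with exposure in these illustrative data.
print(occ[["coordination", "ai_exposure", "median_wage"]].corr())
```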

I've also published research looking at how the expansion of AI jobs within cities has impacted average well-being, and found that the effect has been positive, particularly in cities with more professional services. The reason? Increases in productivity, such as real GDP and income.

But ultimately how these new technologies will affect employment and wages is a policy design choice. If governments impose heavy regulation and high taxes on labor, that encourages companies to substitute away from human labor towards capital to save on costs. And that's what we see in many European countries — high labor tax rates are correlated with higher capital to labor ratios, and that in turn leads to lower labor productivity growth and less of the total surplus in an economy going towards labor.

How will these technologies transform or impact white-collar jobs?

There is an open question about how generative AI will affect productivity, and whether it may accelerate income inequality and polarization in the labor market. Preliminary evidence from OpenAI indicates that 80% of the U.S. workforce could have at least 10% of their work tasks affected by the introduction of large language models (LLMs), and 19% of workers may see at least 50% of their tasks impacted.

However, my research offers an optimistic view of generative AI by showing that occupations requiring greater degrees of coordination over complex work are less – not more – exposed to displacement by generative AI. In other words, generative AI might actually end up serving as a complement to labor in occupations requiring greater degrees of coordination. Since so much work requires tacit information that is not easy to formalize and systematize, large language models can digest vast quantities of information and convert it into actionable recommendations and instructions for other team members to review and act upon.

How can consumers and business best prepare for the rise of AI technology? What are some advantages and what are some risks?

My research has highlighted the importance of intellectual tenacity as a personality trait in becoming resilient to technological change. That requires perseverance and curiosity to thrive amid challenges and continue learning even after formal schooling ends. We have so many tools at our disposal for living more effectively and productively, but sometimes inertia keeps us doing business as usual. A practical suggestion is to allocate some time every week to evaluating the inventory of work and strategizing internally — or even with ChatGPT as a sparring partner — about how to work smarter.

As demand and use cases around generative AI grow, what should investors keep in mind?

Investors should think about breakthrough innovations, rather than marginal ones. The highest-value companies, ranging from Tesla to Apple, were the companies that did things people thought were impossible. That means having a great understanding of pain points among consumers and an eye for solutions that are just crazy enough to work.

Investors should also place a premium on companies that have versatile and experienced management: startups and young founders can be great, and there are certainly many who succeed, but many more fail due to lack of experience and hubris, which should prompt investors to be prudent when evaluating a team's likelihood of success in execution (and not just the novelty of the idea).

How will these new technologies impact different sectors?

Each sector has its own challenges and pain points. On one hand, healthcare is burdened by an insurance sector that charges high premiums and delivers low customer service, coupled with pharmaceutical companies that have a tendency to overmedicate rather than encourage preventative behavior. In this sense, AI has the potential to personalize behavioral recommendations that are likely to improve health and well-being, as well as automate otherwise mundane and time-intensive activities among insurers that would allow them to pass cost savings on to their customers.

On the other hand, education has been increasingly failing to deliver for students, as evidenced by K-12 math and reading test scores reaching their lowest levels in 2022 and a college wage premium that has flattened over the past 15 years.

AI has the potential to transform the delivery of educational services by personalizing learning to each person's unique learning style. AI can also help break down barriers that may have traditionally discouraged an individual from continuing education. In short, AI is a general-purpose technology, and it will affect each sector differently.


An Enduring Need for Choice

This article was originally published in City Journal with Goldy Brown III.

This year marks the 40th anniversary of the publication of “A Nation at Risk.” Released by Ronald Reagan’s education secretary Terrel Bell and prompted by the international underperformance of American students, the report challenged schools to make dramatic improvements. Education policy was to focus on standards, accountability, and equity. Yet after four decades and countless new initiatives, the record shows that choice is a critical, but neglected, factor for success.

Every state has some form of academic standards, but these alone have not improved student achievement or closed achievement gaps. The standards movement began to take shape in 1989, when President George H. W. Bush convened governors at the Charlottesville Education Summit to discuss educational programming. Bill Clinton would later introduce his education initiative, “Goals 2000,” after winning the presidency in 1992, requiring states to make high school graduation requirements more rigorous. The most recent installment came in the form of Common Core, whose prescriptions were approved by 40 states only to be repealed by many.

The push for accountability and equity, meantime, ramped up when closing the achievement gap became a federal mandate in George W. Bush’s 2001 No Child Left Behind law. The policy obliged schools to show “adequate yearly progress” on statewide reading and math assessments for all students and to close the gap between certain subgroups. Schools failing to meet these objectives were sanctioned. The Obama administration then conditioned Race to the Top funding on states adopting Common Core standards and their own accountability measures.

At the end of this 40-year effort, what has changed? Compared with other countries, the U.S. is not gaining academically. Domestically, gaps between racial and income groups have not just persisted but widened. In fact, student test scores in math and reading fell to their lowest levels in 2022, according to the National Assessment of Educational Progress. If the U.S.’s decentralized approach to education is going to be an asset, we need to learn what works and what doesn’t. Though states would seem to be a perfect laboratory for such healthy experimentation, things have often not worked out that way.

Our recent book, The Economics of Equity in K-12 Education, establishes best practices for states and local governments. Besides the family, the local community has the most significant influence over a child’s education and future. Local stakeholders make decisions about teacher pay, teacher training, curriculum, programming for local students, and budgets. These decisions vary by district.

Yet national trends have further stifled progress on local education policy. Our research finds that school closures led to a deterioration in parental mental health that ultimately affected students—even beyond the learning losses arising from remote instruction—and that many families decided to switch permanently to homeschooling, even after schools began reopening. That so many families decided to homeschool highlights the increasing preference for school choice. Indeed, until policymakers confront the unambiguous evidence that school choice can improve learning outcomes and close the achievement gap, they will repeat the same mistakes we’ve seen over the last 40 years.

What defense do children have from school boards, unions, or other forces that fail to look out for their best interests? A child’s education is one of the most critical indicators of future success. Families need options, especially now that the pandemic has subsided. More states are concluding that school choice is necessary; nationally, we need to expand the role of choice in educational policy. Choice is the only policy that can address our biggest challenge: helping a decentralized system meet the needs of our pluralistic nation’s population.


Gary Vaynerchuk: Pop Culture And Innovation's Role In Business Growth

This article was originally published in Forbes.

Entrepreneur and social media icon Gary Vaynerchuk is redefining the rules of the game by bringing culture, innovation, and business together. Building on the huge success of last year’s VeeCon 2022, held in Minneapolis, Vaynerchuk is at it again, this time in Indianapolis on May 18-20 with a new lineup of speakers and talks.

Access to VeeCon, however, is tokengated for the VeeFriends community, the holders of his NFT collection consisting of 10,255 tokens. “Eventually, we will all interact with NFTs because they will be our airline tickets, membership cards, and more,” Vaynerchuk said.

Vaynerchuk's interest in digital assets goes beyond mere fascination: he sees them as a pivotal mechanism for societal interaction and a transformative tool for brand-consumer relationships. These digital assets, unique by design and stored on the blockchain, offer a new way for businesses to engage their consumers, especially those deeply ingrained in the digital landscape. Vayner3 has been working with many brands to integrate Web3 strategies into their business models in a way that drives business efficiency and customer engagement.

A good example of this Web3 strategy work is the firm's collaboration with Crown Royal, which it advised on the execution of a digital collectibles launch in November 2022. In particular, the “Chain of Gratitude” initiative aimed to spread generosity following Super Bowl Sunday: Crown Royal launched digital collectibles through a game that awarded prizes based on the sharing of gratitude from one person to another. “Our mission is to empower brands with cutting-edge strategies to navigate the next era of the internet, keeping them at the forefront of digital innovation and consumer attention,” said Avery Akkineni, president of Vayner3.

Vaynerchuk leverages the power of popular culture by bringing together people who would not normally interact with one another but who, when they do, have the potential to build lasting and impactful relationships. He understands that consumer behavior is a reflection of cultural trends, and that by tuning into these trends, brands can effectively resonate with and serve their audience.

The speaker roster at VeeCon is incredibly diverse, ranging from Tim Tebow to Arianna Huffington to Busta Rhymes. Vaynerchuk’s goal is not to attract a group of people who all think the same, but rather to expose attendees to different ideas and serious conversations with people they may not normally meet. “Do you know how happy I am that people try things that they haven’t tried before this conference? It was the same thing in the wine business: people find a wine they like and they keep on drinking it. I want people to think about different stuff. I have only innovated by looking at random things from a different perspective and those were the breadcrumbs for innovation,” Vaynerchuk continued.

Last year, VeeCon 2022 brought together all genres of people, ranging from pure Gary fans who bought into NFTs to marketing executives at enterprises and foundations. “As someone who oversees internal comms for a community of orgs – over 1000 employees in all – I’m curious about ways to deploy NFTs as tools within an organization to foster culture, connection, and motivation… There’s an enormous opportunity to experiment in this space – it’s pretty wide open right now, and I want to be a trailblazer. My employer both sees the opportunity and encourages me to explore it, which is why I’m here. For folks like me who are interested in comms, marketing, innovation, and the consumer blockchain, VeeCon is basically the center of the universe,” said Rob Raffety, director of internal communications at Stand Together.

The textbook approach in business often pays lip service to understanding the consumer and competitive differentiation, but its implementation breaks down in practice. “I hated school because it did not allow for serendipity and let people think. It was based on memorization,” said Vaynerchuk. Instead, VeeCon 2023, and Vaynerchuk’s media enterprises more broadly, encourage mixing, matching, and engagement with pop culture even when there is no explicit destination in mind.

Love it or hate it, consider the rise of TikTok and the subsequent explosion of short-form video content across platforms. Vaynerchuk, an early adopter, leveraged this trend in his own media company by building out short-form content, advising businesses to do the same and harness the power of evolving consumer behavior for brand success.

Innovation has been a constant theme in Vaynerchuk's career. He understands that innovation isn't just about improving products or services—it's about shaping culture and identity—and it’s evident in the way technology and business are increasingly intertwined. Vaynerchuk contends that the evolving relationship between the two is not merely transactional. Instead, it's a symbiotic relationship where advancements in technology drive business innovation, and conversely, the demands of business push technology to new frontiers.

Vaynerchuk's unique perspective on Web3, pop culture, cross-industry collaboration, and innovation positions him at the forefront of a new business era. His insights, coupled with his relentless drive, pave the way for new avenues of consumer engagement and business growth. In a world where technology and culture are in constant flux, his approach to and implementation of VeeCon serve as a novel NFT use-case that businesses can learn from to thrive.

Gary Vaynerchuk’s vision is not just about predicting the future—it's about creating it. As he navigates the new business landscape with a keen eye on consumer trends and technological advancements, Vaynerchuk exemplifies the power of embracing change, fostering innovation, and leveraging the convergence of disciplines. His journey serves as a testament to the transformative potential of a truly innovative mindset in the world of business.

Christos Makridis

Airdrops are great, but be aware of the risks

This article was originally published in Cointelegraph.

Airdrops have emerged as a powerful tool for token distribution, user acquisition and community building as the blockchain industry has grown. They provide a unique opportunity for projects to distinguish themselves, incentivize desired behaviors and foster long-term relationships with their user base. But the question remains: Do airdrops work?

Based on my prior research in the Journal of Corporate Finance, the answer — at least according to the data so far — is “yes.” But my new research with Kristof Lommers and Lieven Verboven highlights that their efficacy hinges on thoughtful design, clear objectives and strategic execution.

At the heart of a successful airdrop lies the careful selection of eligibility criteria and incentives. These criteria can range from simple (like owning a specific token) to more complex (like exhibiting certain behaviors on-chain), but they should be aligned with the airdrop’s objectives. For instance, if the goal is to reward loyal users, then the eligibility criteria could include users who have held a certain token for a specific period. Similarly, if the aim is to promote a new protocol, then the criteria could be interacting with it.
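As a concrete illustration, here is a minimal Python sketch of how a team might encode such criteria off-chain when building a distribution list. The thresholds, field names, and even-split reward rule are hypothetical assumptions for illustration, not any project's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical snapshot record for one wallet; all field names are illustrative.
@dataclass
class WalletSnapshot:
    address: str
    token_balance: float        # balance of the project's token
    days_held: int              # how long the wallet has held the token
    protocol_interactions: int  # on-chain interactions with the new protocol

def is_eligible(w: WalletSnapshot,
                min_balance: float = 100.0,
                min_days_held: int = 180,
                min_interactions: int = 3) -> bool:
    """Reward loyal holders OR active users of the new protocol."""
    loyal_holder = w.token_balance >= min_balance and w.days_held >= min_days_held
    active_user = w.protocol_interactions >= min_interactions
    return loyal_holder or active_user

def build_distribution(snapshots: list[WalletSnapshot],
                       total_tokens: float) -> dict[str, float]:
    """Split a fixed airdrop pool evenly among eligible wallets."""
    eligible = [w.address for w in snapshots if is_eligible(w)]
    if not eligible:
        return {}
    share = total_tokens / len(eligible)
    return {addr: share for addr in eligible}
```

The key design point is that the eligibility predicate directly mirrors the stated objective: loyalty rewards key off holding period, while protocol-promotion rewards key off interactions.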

Incentives, on the other hand, can take various forms — from direct token rewards to exclusive access to new features or services. The key is to strike a balance between being attractive enough to engage users and remaining economically viable for the project. For example, the Blur airdrop integrated social media activity into its eligibility criteria. Instead of just providing tokens to existing users or holders of a certain token, Blur incentivized users to share the airdrop on social media platforms and encouraged referrals among their networks to gain extra tokens. This method not only broadened the reach of its airdrop but also fostered a sense of community as users actively participated in spreading the word about Blur.

Timing also plays a crucial role. Launching an airdrop too early in a project’s lifecycle might lead to token distribution among users who lack genuine interest, while a late-stage airdrop might fail to generate the desired buzz. The optimal timing often coincides with a project’s token launch, creating initial distribution and liquidity. As prior research by Yukun Liu and Aleh Tsyvinski highlighted, momentum in the market plays a big role in explaining token prices.

However, airdrops are not without their challenges. One of the most serious risks is Sybil attacks, where malicious actors create multiple identities to claim a disproportionate share of tokens. Mitigating this risk requires a blend of strategies, including upfront whitelisting of users, raising barriers to entry and implementing Sybil attack detection mechanisms.
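To give a flavor of what detection can look like, the sketch below implements one common heuristic: flagging clusters of freshly created wallets funded from a single source. Real Sybil defenses combine many more signals (activity history, graph analysis, proof-of-personhood); the thresholds here are illustrative assumptions.

```python
from collections import defaultdict

# Each record: (wallet, funding_source, wallet_age_days); values are illustrative.
def flag_sybil_suspects(wallets: list[tuple[str, str, int]],
                        max_age_days: int = 7,
                        cluster_size: int = 10) -> set[str]:
    """Flag young wallets whose common funder seeded an unusually large cluster."""
    by_funder = defaultdict(list)
    for wallet, funder, age in wallets:
        if age <= max_age_days:           # consider only freshly created wallets
            by_funder[funder].append(wallet)
    suspects = set()
    for funder, cluster in by_funder.items():
        if len(cluster) >= cluster_size:  # one source funded many new wallets
            suspects.update(cluster)
    return suspects
```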

Especially in the past two years, projects must take into account the regulatory environment. Although nonfungible tokens (NFTs) have been largely exempt from strict regulatory enforcement action by the Securities and Exchange Commission, fungible tokens have been more in its line of sight, and the distribution of tokens coupled with an expectation of future profit could increase legal risk. Given the regulatory gray zone around tokens, projects must ensure they’re not inadvertently issuing securities. And with most large blockchain networks being public, privacy concerns may arise, potentially revealing sensitive information about airdrop recipients.

So, how much of a token supply should be allocated to an airdrop? There’s no one-size-fits-all answer. A project’s unique goals and strategies should guide this decision. However, research indicates that teams allocate 7.5% of their token supply to community airdrops on average.

One of the often-overlooked aspects of airdrops is their potential to harness the power of network effects. By incentivizing sharing, airdrops can amplify their impact, attracting more users to a project’s ecosystem and creating a self-reinforcing cycle of growth and value creation.

A final consideration to keep in mind is the simplicity of the airdrop. Convoluted eligibility criteria will confuse people, even if they are intelligently and rationally designed. An airdrop should be a straightforward and enjoyable experience for users, particularly for non-crypto natives. Collaborating with wallet providers can simplify the process for such users, making the airdrop more accessible and attractive.

A good analogy is in the context of monetary policy. When the United States Federal Reserve articulates simple policy rules about how it will deal with inflation, and then sticks to them, markets react much more positively than when it deviates from those rules. The same is true of airdrops: Design them carefully, but keep them simple and transparent.

Airdrops can indeed work wonders when designed and executed well. They offer an exciting avenue for projects to stand out in the crowded blockchain landscape, encouraging user engagement and community development.

But their success is not a matter of chance — it’s a product of thoughtful design, clear objectives and strategic execution. Especially as many potential airdrops loom on the horizon with Sei Network, Sui, Aptos and more, understanding and harnessing the power of airdrops will become increasingly crucial for projects aiming to thrive in this dynamic space.

Christos Makridis

Opera Streaming Revolution: Boston Baroque, IDAGIO, And GBH Music Unveil Digital Innovation For Global Audiences

This article was originally published in Forbes.

On the heels of its 50th anniversary, Boston Baroque, in partnership with IDAGIO and GBH Music, premiered an incredible series of performances of the opera Iphigénie en Tauride by Christoph Willibald Gluck via digital stream on April 20th; the production will remain available for $9 per stream until May 21st. Under the musical direction of Martin Pearlman and the leadership of Jennifer Ritvo Hughes, Boston Baroque has become a leading cultural institution and innovator.

Technological innovation in classical music

Although all sectors were adversely affected by the onset of Covid-19 and the associated lockdowns, none was more affected than arts and entertainment, which saw an over 50% decline in employment between February 15 and April 25, 2020, and remained depressed well into the summer and into 2021, according to research published by the Brookings Institution. The same phenomenon held across other developed countries, according to the OECD. The loss in employment led to a significant deterioration in mental health and well-being among performers and other workers in cultural institutions, according to professors Samantha Brooks and Sonny Patel in a 2022 article published in Social Sciences & Humanities Open.

However, some cultural institutions responded to these challenges with substantial innovation. Following the onset of Covid-19, Boston Baroque began working with GBH – the leading multiplatform public media producer in America – to digitally stream performances across the world. GBH Music became a production collaborator and presenter of Boston Baroque, among other celebrated music organizations, allowing musical performances to continue. Even though streaming of performances increased across the board, GBH Music stood out, most notably for its excellent production quality, which resembled an in-person experience as closely as possible.

"When GBH Music first met with Boston Baroque to explore the presentation of opera in our Calderwood Studio, we agreed that the goal was to find a new, innovative way to present these amazing works by taking advantage of the technology and talents we have at our disposal. And the results have been exceptional. The in-studio and on-line experiences bring audiences closer to the music, to the singers, and helps breathe new life into this centuries-old music. The visual and aural connection between artists and audiences is unique. Seeing musical talent, stunning production values, high-quality audio all come together to benefit opera, is thrilling,” said Anthony Rudel, General Manager of GBH Music.

Now that in-person performances have resumed, digital streaming has become a complement, rather than a substitute, for Boston Baroque, enlarging its reach and strengthening its world-renowned brand as a staple cultural institution. “The arts often don’t place enough value on how people want to consume what we have to offer—we miss out on key opportunities to grow revenue and reach… In a traditional performing arts business model, the opportunity for return on investment ends when the concert downbeat begins due to a bias for in-person performance. At Boston Baroque, we’ve used digital innovation to disrupt this core business model constraint, providing unique to market, compelling content that consumers value,” said Jennifer Ritvo Hughes, the Executive Director of Boston Baroque.

Economists have long pointed to technology as the primary driver of productivity growth in society, but whether it translates into improvements in well-being and flourishing depends on whether and how society integrates technology as a complement to, not a substitute for, humans.

“Through partnerships with GBH, IDAGIO, and others, we’ve built a model for developing and delivering content that audiences are asking for, while paying artists for their work in the digital concert hall. A reviewer once called what we do an ‘authentically hybrid experience,’ where we simultaneously deliver in person programming while capturing digital content of a high enough quality to monetize and distribute on platforms with a global reach… One year after going digital, our market grew from 4,000 regional households to 35,000 households in the US and the UK. At the close of our 22/23 season, we now have audiences in 55 countries on 6 continents and counting. We’re just beginning to explore the potential of digital—many possibilities lie ahead,” Hughes continued.

“When I founded Boston Baroque 50 seasons ago, it was the first period-instrument orchestra in North America, and so it was quite an experiment. Everything that has come since then—being the first period-instrument orchestra to perform in Carnegie Hall, being nominated for 6 GRAMMY® Awards for our recordings, and now streaming our concerts on six continents—has been the wonderful and unexpected outcome of a simple desire to make music in a free and authentic way,” said Martin Pearlman, the Founding Music Director.

Iphigénie En Tauride

Iphigénie en Tauride is based on a drama by the playwright Euripides, written between 414 BC and 412 BC, detailing the mythological account of the young princess who avoided death by sacrifice at the hands of her father Agamemnon thanks to the Greek goddess Artemis, who intervened and replaced Iphigenia on the altar with a deer. Now a priestess at the temple of Artemis in Tauris, she is responsible for ritually sacrificing foreigners who come into the land. The opera revolves around her forced ritualistic responsibility on the island, coupled with an unexpected encounter with her brother, Orestes, whom she had thought was dead.

Led by stage director Mo Zhou and conductor Martin Pearlman, Soula Parassidis, a Greek-Canadian dramatic soprano, played the title role of Iphigénie on April 20, 21, and 23 in Boston, accompanied by an outstanding cast of distinguished international artists, including William Burden, Jesse Blumberg, David McFerrin, and Angela Yam, among others. The performance has amassed a wide array of glowing reviews.

Because the same opera is replayed many times over even within the same year, stage directors bear significant responsibility to bring a new perspective each time, particularly in an era of limited attention spans. “The classical music field is going through a schismatic change right now. As a practitioner and an educator, I ask myself and my students this question every day: ‘In the age of Netflix and Hulu, how can we make our art form more accessible?’ We’ve been putting ourselves on a high pedestal for a long time. If we do not adapt, we will gradually lose touch with the new generation of audiences,” said Mo Zhou, the Stage Director.

In contrast to the regietheater style, in which the director is encouraged to diverge from the original intentions of the playwright or composer, this production stayed true to its roots, featuring Iphigénie in a beautiful gown and highlighting the intense pain Iphigénie felt when she was asked to continue the sacrifices to the gods, and her subsequent intense joy when she discovered that her brother, Orestes, was alive.

“I found this process of working on Iphigénie en Tauride with Boston Baroque, GBH and IDAGIO extremely fulfilling and refreshing. I think this production has presented a feasible formula where we keep the unique experience of the “live” performance, but also capture the ephemeral moments on stage and make it available to a broader audience across the world… In addition to thinking about the character building, stage composition and visual design like a conventional stage production, I also incorporated the notion of camera angles into my pre-blocking and design process, which shows in our end result. It demands a lot of advanced planning and the clarity of your dramatic and visual intention. It’s a beautiful collaboration between myself and our livestream director, Matthew Principe,” Zhou continued.

Future of the arts and the metaverse

IDAGIO has pioneered an incredible service for classical music and the performing arts, giving thousands more people across the world access to top-tier performances. “We are offering the infrastructure to any partner who is interested in sharing media content with audiences online. Recording and producing a concert is one thing. Distributing them and making them available to committed audiences around the world is another. That’s what we enable and what we love to do,” said Till Janczukowicz, CEO and Founder of IDAGIO.

The response to opening up in-person performances to digital audiences has been overwhelming. “IDAGIO has over 50,000 reviews on the App Store averaging 4.7/5 stars. Users and artists love IDAGIO for many reasons, also because of our fair pay-out model: we remunerate by second and per user. This is as fair as you can get in audio streaming,” Janczukowicz continued. In many ways, digital streaming of performances is an early use-case of metaverse applications that aim to provide users with more immersive experiences and connectivity between physical and digital assets.

While we have yet to see many truly immersive and fully-fledged metaverse use-cases, there is substantial interest from consumers and metaverse companies alike, particularly for changing the way that people engage with the arts by giving artists a more experiential mechanism of performing for and connecting with their audience.

“Rapper Royce 5’9” is a great example of this. Even with a successful 20-year career under his belt, he has sought out better platforms to engage with aspiring artists and his community. With Passage, he’s creating a beautiful 3D space called the Heaven Experience to host exclusive songwriting and studio sessions, interviews with music industry veterans, live performances, and more. These types of interactions simply wouldn’t be possible on something like a Zoom call or traditional livestream,” said Caleb Applegate, CEO of Passage.

During an era of intense technological change, the arts play a more important role than ever, and technology has the potential to augment, not replace, in-person performances.

Christos Makridis

A management scientist explains why personality matters as much as skill for the future of work

This article was originally published in Fast Company.

The vast majority of discussions about the future of work focus on “reskilling”—that is, equipping workers with knowledge and skills that are in demand and at the technology frontier. From the OECD (e.g., “The Case for 21st Century Learning”) to the U.S. Office of Science and Technology Policy (e.g., the “Interagency Roadmap to Support Space-Related STEM Education and Workforce”), there is bipartisan, interagency, and international support for reskilling.

To be clear, these aims are important and filled with good intentions, but focusing on skills, especially technical skills, risks overlooking what’s right in front of us: personality. There is no debate that skills matter in the workplace—an industrial engineer without the technical know-how could make an error that causes a building to become structurally unsound.

But overemphasizing skills, which are easily attainable, oversimplifies the career journey by creating a moving target: Today it’s AI that’s in demand; tomorrow it’s blockchain. An alternative strategy is to focus on the personality characteristics that lead people not only to acquire the requisite technical know-how, but also to work well with others and persevere through trials.

My newly released research shows that personality matters at least as much as skills in explaining differences in compensation across jobs and over time. Using data from the Department of Labor that measures 16 occupational personality requirements—that is, personality constructs that can affect how well someone performs a job—we constructed two general indices that we refer to as intellectual tenacity and social adjustment.

On one hand, intellectual tenacity encompasses achievement/effort, persistence, initiative, analytical thinking, innovation, and independence. For simplicity, let’s call this attribute persistence. On the other hand, social adjustment encompasses emotion regulation, concern for others, social orientation, cooperation, and stress tolerance. These span the spectrum of personality traits and their construction is anchored in a mountain of research from the psychology literature.

We subsequently linked these data on personality requirements across occupations with data on over 10 million individuals between 2006 and 2019 to study how differences in personality requirements are valued across occupations. Crucially, we found that individuals working in occupations that rank higher in persistence earn substantially more than their counterparts, and that the economic return—measured through annual earnings—was increasing over time. We did not, however, find similar effects for individuals working in occupations that rank higher in social adjustment, although occupations that rank high in both persistence and social adjustment earn the most.

One concern is that we are not comparing apples to apples. People who work in occupations ranking higher in persistence differ in other ways from those who work in occupations ranking higher in social adjustment. While that is true, we control for a wide array of demographic factors, including age, education, race, gender, and family size. Furthermore, we restrict comparisons to people in similar industries and broad occupation groups, ensuring that we are not comparing, for example, CEOs with cashiers.

Another concern is that occupations that rank higher in persistence also rank higher in their skill requirements, so our focus on personality is simply a mask for skills. However, we also control for differences in cognitive skill requirements across occupations and continue finding a strong positive association between persistence and annual earnings. And the link between persistence and earnings is roughly as large as the association with cognitive skills. The relative and growing importance of persistence is especially striking given a slowdown in the returns to cognitive skills.
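For readers who want to see the shape of such an exercise, here is a stylized Python sketch of an earnings regression with demographic controls and industry/occupation fixed effects. The file name and variable names are invented for illustration; this is not the actual specification or data behind the study.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Assumed layout: one row per worker, with log earnings, the two occupational
# personality indices, a cognitive-skill requirement, demographics, and
# industry/occupation group identifiers. All names here are hypothetical.
df = pd.read_csv("workers.csv")

model = smf.ols(
    "log_earnings ~ persistence + social_adjustment + cognitive_skills"
    " + age + education_years + C(race) + C(gender) + family_size"
    " + C(industry) + C(occupation_group)",  # fixed effects: compare workers
    data=df,                                 # within similar industries/jobs
).fit(cov_type="HC1")                        # heteroskedasticity-robust errors

print(model.summary())
```

The fixed-effect terms are what prevent the CEO-versus-cashier comparison: coefficients on persistence and social adjustment are identified from variation within industry and broad occupation groups.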

What do these results mean for policymakers and managers?

Strengthen Persistence

First, increasing persistence is a promising avenue for workforce development and education to help workers become “future proof” in the emerging digital economy. Economic and labor market outcomes depend on the capacity of individuals to learn and adapt in the face of automation and artificial intelligence.

While personality is commonly misperceived as fixed, it continues to evolve throughout the lifespan. Interventions aimed at developing the mindsets, skill sets, and contexts that encourage persistence are timely targets for education reform and workforce development, which may have the greatest impact in the early stages of childhood development.

In fact, my recent book with Goldy Brown III investigates a wide array of best practices for strengthening persistence in early childhood development. For example, after-school programs that allow children to practice skills outside of the classroom can be effective in cultivating good habits and keeping children, especially those at risk in low-income neighborhoods, out of otherwise dangerous situations. Similarly, my recent handbook chapter highlights the role of music education in early childhood in building the habit of persistence and cognitive skills.

Second, persistence is an important developmental target for everyone—not just skilled workers or the more educated. The effect of occupational persistence requirements on earnings was consistent among both college graduates and individuals without college degrees. Industrial-worker jobs are still valued in the economy as long as they require persistence. That means organizations, even those that employ less-skilled workers, should be mindful of inculcating a culture of continuous learning and improvement, independent of the degree of digital intensity of the tasks.

Formally Assess Personality

Third, in addition to assessing relevant skills when hiring, organizations may also find it useful to conduct formal assessments of personality. While that insight is not new, and indeed The Gallup Organization (among others) has developed a sophisticated assessment, personality is likely to play an increasingly important role in technology organizations, especially as more work is done remotely and the need for clear and cohesive communication grows.

There has been much discussion and debate about how to reskill the labor force. That discussion is good and important, but organizations and policymakers should not make technical skills the priority at the expense of the underlying personality traits that sustain life-long learning and resilience to trials and adversity. We always knew personality mattered. Now we finally have robust quantitative evidence on exactly what dimensions matter most for career progression.

Christos A. Makridis is a research affiliate at Stanford University, among other institutions, and holds doctorates in economics and management science and engineering from Stanford University.

Christos Makridis

Beyond the TikTok mess, creators have options to protect data and privacy

This article was originally published in Dallas Morning News with Soula Parassidis.

The recent congressional hearings around a potential ban of TikTok in the United States have revitalized fundamental questions about both the positive and negative effects of social media on people and the economy. Setting aside the legitimate national security considerations of TikTok and the way that China leverages data from the platform, we need to have a broader discussion about digital platforms, their effects on creators, and the way forward.

On one hand, social media platforms have provided creators with tools to build communities and sell desirable goods and services. Marketing has traditionally required large budgets that go toward paid advertisements. Today, creators can speak directly to potential consumers and fans without having to engage expensive press relations companies.

On the other hand, these platforms hold monopoly power with few checks and balances and produce an array of adverse side effects on mental health and even physical safety. For example, a platform might change its algorithm, or even engage in censoring, and catch a creator completely off guard. Further, malicious users — including human traffickers — are known to exploit these platforms (e.g., Meta, Instagram, TikTok) by using them to solicit children.

Advocates of the digital platform incumbents often engage in binary thinking — that is, society either has to live with the big tech monopoly as it currently exists, or we lose an open and free internet and creators are out of a job forever. The reality is that there is a large middle ground, ranging from policy reforms that introduce additional structure to new technological tools that provide alternative ways for creators to reach their fans.

Let’s first consider policy reforms. Section 230 of the Communications Decency Act was written in a completely different era: the internet’s infancy, roughly a decade before any trace of social media. Sadly, Section 230 has provided what is known as a liability shield around big tech companies, allowing them to abdicate responsibility even in the presence of demonstrated harm on the grounds that they are mere platforms, not publishers, even as they exercise publisher-like editorial discretion. But simultaneously, these companies claim that they are creating an open and free internet that promotes well-being.

You cannot have it both ways.

The result is an unpredictable technological and business landscape in which creators can have their content censored or even inadvertently buried by algorithmic changes. For example, Instagram’s recent announcement that it will sell verification as a service devalues the verification that notable figures previously had. Furthermore, paying customers receive more visibility for their content than figures who were already verified. That volatility for creators counters the very argument that advocates use to defend big tech platforms.

But fortunately, distributed ledger technologies, or DLTs for short, are offering creators new ways to reach their fans while maintaining sovereignty over their data and more privacy. Even though current social media platforms come across as “free,” they are not: users implicitly forgo the value of their data, which the digital platforms effectively securitize and sell to advertisers for targeted ads. In other words, the user is the product!

Our recently published research in the journal Frontiers in Blockchain Economics points out use-cases of distributed ledger technologies in the creator economy, most notably the use of non-fungible tokens, which allow creators to publicly record and signal their content on an immutable blockchain. NFTs provide proof of ownership using unique identifiers and metadata stored on the blockchain that can be publicly accessed and verified across the entire spectrum of content, ranging from a deed on a house to a digital drawing to a classical music recording. They also allow for the secure transfer of ownership and royalties on secondary marketplaces, reducing content creators’ dependence on intermediaries and the scope for disputes and expensive legal fees. There is much more still to experiment with and build, but the infrastructure and possibilities already exist.
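As a sketch of how that public verification works in practice, the snippet below reads the current owner and metadata pointer of a token on a standard ERC-721 contract using the web3.py library (v6-style names). The RPC endpoint, contract address, and token ID are placeholders, not a real deployment.

```python
from web3 import Web3

# Minimal ABI: just the two standard ERC-721 read functions we call.
ERC721_ABI = [
    {"name": "ownerOf", "type": "function", "stateMutability": "view",
     "inputs": [{"name": "tokenId", "type": "uint256"}],
     "outputs": [{"name": "", "type": "address"}]},
    {"name": "tokenURI", "type": "function", "stateMutability": "view",
     "inputs": [{"name": "tokenId", "type": "uint256"}],
     "outputs": [{"name": "", "type": "string"}]},
]

w3 = Web3(Web3.HTTPProvider("https://example-rpc.invalid"))  # placeholder RPC
contract = w3.eth.contract(
    address=Web3.to_checksum_address(
        "0x0000000000000000000000000000000000000000"),  # placeholder contract
    abi=ERC721_ABI,
)

token_id = 1  # placeholder token ID
owner = contract.functions.ownerOf(token_id).call()          # on-chain owner
metadata_uri = contract.functions.tokenURI(token_id).call()  # points to content
print(owner, metadata_uri)
```

Because these reads hit public chain state, anyone can run the same two calls and arrive at the same answer, which is exactly the verifiability property the article describes.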

We continue to investigate and experiment with emerging online tools, especially to figure out how to resolve conflicting NFTs that exist on different blockchains. This is not a new problem — countries have resolved international disputes for years, and the beauty of the blockchain is that it has the potential to confer greater transparency and clarity to facilitate the resolution of these disputes. Whether policy reforms happen in the short or medium run, creators can confidently begin working with these tools and begin empowering themselves.

Christos Makridis is chief operating officer and Soula Parassidis is CEO at LivingOpera.org. They wrote this column for The Dallas Morning News.

Christos Makridis

In global technology race, Virginia holds the key

This article was originally published in Richmond Times-Dispatch with Joshua de Salis Sophrin.

The recent congressional hearings over TikTok underscore the growing bipartisan concerns about the Chinese Communist Party (CCP) and how the United States’ technological exposure to Chinese interests could impair national security. But national security is also impacted by economic competitiveness, particularly in strategic industries that have dual-use applications in both the private and public sectors. If the U.S. is to maintain its competitive edge against the CCP and stand as a source for freedom in the world, it needs a compelling industrial policy.

Fortunately, there is growing bipartisanship behind certain elements of industrial policy, which led to the passage of the CHIPS and Science Act in 2022, designed to promote the U.S. semiconductor industry and its competitiveness across the globe, especially vis-à-vis China. However, more work is needed to develop regional partnerships and clusters between innovators in the private sector and the national security funding and technology apparatus.

Semiconductors are the fourth-largest U.S. export, and the industry directly employs 300,000 Americans and touches an additional 1.6 million, according to a 2022 report by the President’s Council of Advisors on Science and Technology. Furthermore, the global semiconductor industry grew by more than 300% from $139 billion in sales in 2001 to $573.5 billion in 2022, and unit sales of semiconductors grew by over 290%. Demand grew substantially during the height of the COVID-19 pandemic due to the increase in consumption of digital goods and services. However, the sector struggled to meet demand due to fragmented and thin supply chains. Although Taiwan and South Korea are both large producers of semiconductors, China committed $150 billion over a 10-year period to the sector and has been aggressively making inroads to monopolize the industry by 2030.

In 2023, Virginia will receive over $8 billion in infrastructure investments to build on that technological cutting edge. This investment will include $6.22 million for increased access to broadband internet in rural areas to provide everyone with the tools they need to succeed. In addition, Virginia will receive $106 million to create a new electric vehicle charging grid, in addition to other investments in smart infrastructure. All Virginia residents will gain access to new opportunities through these programs even if they do not work in tech directly.

Clearly our adversaries are aware of the sector’s strategic importance, and they have publicly proclaimed it. While there are many tools at our disposal to promote innovation, the U.S. must choose carefully how to respond in order to avoid escalation and a spiral of reactionary policies.

Every state has a role to play in confronting this challenge, but perhaps the most obvious is Virginia because of its proximity to the national security community in Washington, D.C. Virginia can contribute to safeguarding the U.S. semiconductor industry and its role in ensuring our national security by mapping its economic activities to similar efforts and applying CHIPS Act federal funds toward initiatives that drive the further development of capabilities in the design and manufacturing of chips. However, further investments — and not just financial — are needed.

The national security community has long acknowledged the challenge of engaging and supporting smaller R&D groups that have high impact but operate independently. Despite their best efforts, these groups still struggle and are not able to easily tap into federal awards for emerging technologies, which traditionally go to the more established incumbents. To remain innovative and competitive, the federal government will need to support these smaller organizations and help them overcome the “valley of death.” Three reforms would help:

1. Build a protected sandbox where only small businesses can initially participate — and not simply as subcontractors to larger groups that have already received an award — with further refinements to the System for Award Management (SAM) registration process. We must build confidence and trust by incentivizing the Small Business Administration and allied federal agencies to move more quickly and efficiently.

2. Eliminate the practice of government agencies using requests for information and requests for proposals as purely a market-intelligence exercise. Using these as mechanisms to survey cutting-edge technologies not only wastes the time of entrepreneurs and small business owners, but also undermines trust when the perception of having a chance at winning an award is yanked out from under them. Leveraging local proximity and innovation clusters in Virginia to cultivate relationships, rather than relying on highly formal and impersonal communication patterns, would motivate entrepreneurs.

3. Establish a better contracting infrastructure that focuses on continuous development and delivery and avoids generational gaps over long-term investment cycles in how technology innovation is made available to the government. Too often, one contractor provides a generation-1 technology, and separate software is later released that requires generation-4 technology, but the government has no support to run the new software, and the result is failure in implementation. This is especially relevant in the production of semiconductors, where there is a longer timeline for developing both the hardware manufacturing facilities and machining capabilities, which are in turn tightly integrated into the contemporary or future design of chips as part of hardware/software co-design methodologies.

As we navigate the intricacies of our contemporary foreign policy with the CCP and its impact on our nation’s ability to compete globally, states like Virginia hold the essential physical and human capital resources to strengthen the U.S. industrial base and catalyze a resurgence of domestic manufacturing capabilities. While the CHIPS Act is an important start, implementing and maintaining a long-term strategic effort will require further educating state, federal, and local leaders to develop a more holistic understanding of the disparate array of economic and technical inputs that must be continuously balanced to ensure the national security position of the U.S. well into the future.

Dr. Christos A. Makridis is a professor, entrepreneur and adviser, and holds doctorate degrees in economics and management science and engineering from Stanford University. Contact him at cmakridi@stanford.edu.

Joshua De Salis Sophrin is the owner of Delaware-based JDSS, a family business that invests in technology and infrastructure. Contact him at information@assoc.jdss.com.

Christos Makridis

Crypto security audits and bug bounties are broken: Here’s how to fix them

This article was originally published in Cointelegraph Magazine.

Blockchain exploits can be extremely costly: poorly designed smart contracts leave decentralized apps and bridges open to attack time and time again.

For example, the Ronin Network experienced a $625-million breach in March 2022 when a hacker stole private keys, generated fake withdrawals, and transferred hundreds of millions out. The Nomad Bridge experienced a $190-million breach in August of that year when hackers exploited a bug in the protocol that allowed them to withdraw more funds than they had deposited.
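To see how the Nomad-style failure mode works in the abstract, consider the deliberately simplified Python sketch below: a bridge that mishandles proof validation releases funds that were never deposited. This illustrates the general bug class only; it is not the actual Nomad or Ronin code, and the root value and method names are invented.

```python
class ToyBridge:
    """Deliberately simplified bridge ledger illustrating a proof-validation bug."""

    TRUSTED_ROOT = "0xabc123"  # hypothetical committed Merkle root

    def __init__(self):
        self.balances: dict[str, float] = {}

    def deposit(self, user: str, amount: float) -> None:
        self.balances[user] = self.balances.get(user, 0.0) + amount

    def withdraw_buggy(self, user: str, amount: float, proof_root: str) -> bool:
        # BUG: an uninitialized/zero root is also treated as valid, so an
        # unproven message passes the check and funds are released without
        # any record of a matching deposit.
        if proof_root in (self.TRUSTED_ROOT, "0x0"):
            self.balances[user] = self.balances.get(user, 0.0) - amount
            return True
        return False

    def withdraw_fixed(self, user: str, amount: float, proof_root: str) -> bool:
        # FIX: require a valid proof AND a sufficient recorded balance.
        if proof_root != self.TRUSTED_ROOT:
            return False
        if self.balances.get(user, 0.0) < amount:
            return False
        self.balances[user] -= amount
        return True
```

The buggy path shows why audits that check only syntax or isolated functions can miss the flaw: each line is individually plausible, and the vulnerability lives in the accepted-inputs logic.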

These vulnerabilities in the underlying smart contract code, coupled with human error and lapses of judgment, create significant risks for Web3 users. But how can crypto projects take proactive steps to identify the issues before they happen?

There are a couple of major strategies. Web3 projects typically hire companies to audit their smart contract code and review the project to provide a stamp of approval.

Another approach, which is often used in conjunction, is to establish a bug bounty program that provides incentives for benign hackers to use their skills to identify vulnerabilities before malicious hackers do.

There are major issues with both approaches as they currently stand. 

Web3 auditing is broken

Audits, or external evaluations, tend to emerge in markets where risk can rapidly scale and create systemic harm. Whether a publicly traded company, sovereign debt or a smart contract, a single vulnerability can wreak havoc.

But sadly, many audits – even when done by an external organization – are neither credible nor effective because the auditors are not truly independent. That is, their incentives might be aligned toward satisfying the client over delivering bad news.

“Security audits are time-consuming, expensive and, at best, result in an outcome that everything is fine. At worst, they can cause a project to reconsider its entire design, delaying the launch and market success. DeFi project managers are thus tempted to find another, more amenable auditing company that will sweep any concerns under the carpet and rubber-stamp the smart contracts,” explains Keir Finlow-Bates, a blockchain researcher and Solidity developer.

“I have had first-hand experience with this pressure from clients: arguing with developers and project managers that their code or architecture is not up to scratch receives push-back, even when the weaknesses in the system are readily apparent.”

Principled behavior pays off in the long run, but in the short term, it can come at the cost of profitable clients who are eager to get to market with their new tokens. 

“I can’t help noticing that lax auditing companies quickly build up a more significant presence in the auditing market due to their extensive roster of satisfied customers… satisfied, that is, until a hack occurs,” Finlow-Bates continues.

One of the leading companies in Web3 auditing, CertiK, provides “trust scores” to projects that it evaluates. However, critics point out that it has given a stamp of approval to projects that failed spectacularly. For example, while CertiK was quick to share on Jan. 4, 2022, that a rug pull had occurred on the BNB Smart Chain project Arbix, it “omitted that they had issued an audit to Arbix 46 days earlier,” according to Eloisa Marchesoni, a tokenomics specialist, on Medium.

But the most notable incident was CertiK’s full-scope audit of Terra, which later collapsed and brought half the crypto industry down with it. The audit has since been taken down as the company has adopted a more reflective approach, but bits and pieces remain online.

Terra-fied

Zhong Shao, co-founder of CertiK, said in a 2019 press release:

“CertiK was highly impressed by Terra’s clever and highly effective design of economy theory, especially the proper decoupling of controls for currency stabilization and predictable economic growth.”

He added, “CertiK also found Terra’s technical implementation to be of one of the highest qualities it has seen, demonstrating extremely principled engineering practices, mastery command of Cosmos SDK, as well as complete and informative documentations.”

This certification played a major role in Terra’s increased international recognition and receipt of investment. The recently arrested Do Kwon, co-founder of Terra, said at the time:

“We are pleased to receive a formal stamp of approval from CertiK, who is known within the industry for setting a very high bar for security and reliability. The thorough audit results shared by CertiK’s team of experienced economists and engineers give us more confidence in our protocol, and we are excited to quickly roll out our first payment dApp with eCommerce partners in the coming weeks.”

For its part, CertiK argues its audits were comprehensive and the collapse of Terra was not down to a critical security flaw but human behavior. Hugh Brooks, director of security operations at CertiK, tells Magazine:

“Our Terra audit did not come up with any findings that would be considered critical or major because critical security bugs that could lead a malicious actor to attacking the protocol were not found. Nor did this happen in the Terra incident saga.”

“Audits and code reviews or formal verification can’t prevent actions by individuals with control or whale’s dumping tokens, which caused the first depeg and subsequent panicked actions.”

Giving a stamp of approval to something that later turns out to be dodgy is not confined to the blockchain industry and has repeated itself throughout history, ranging from Big Five public accounting firm Arthur Andersen giving the nod to Enron’s books (and later destroying parts of the evidence) to rating agency Moody’s paying out $864 million for the dodgy, optimistic bond ratings that fueled the housing bubble and contributed to the 2008–2009 Global Financial Crisis.

So, it’s more that Web3 audit companies face similar pressures in a much newer, faster-growing, and less regulated industry. (In the past week, CertiK released its new “Security Scores” for 10,000 projects.)

The point here is not to throw CertiK under the bus – it is staffed with well-intentioned and skilled workers – but rather that Web3 audits don’t look at all of the risks to projects and users and that the market may need structural reforms to align incentives.

“Audits only check the validity of a contract, but much of the risk is in the logic of the protocol design. Many exploits are not from broken contracts, but require review of the tokenomics, integration and red-teaming,” says Eric Waisanen, tokenomics lead at Phi Labs.

“While audits are generally very helpful to have, they are unlikely to catch 100% of issues,” says Jay Jog, co-founder of Sei Networks. “The core responsibility is still on developers to employ good development practices to ensure strong security.”

Stylianos Kampakis, CEO of Tesseract Academy and tokenomics expert, says projects should hire multiple auditors to ensure the best possible review.

“I think they probably do a good job overall, but I’ve heard many horror stories of audits that missed significant bugs,” he tells Cointelegraph. “So, it’s not only down to the firm but also the actual people involved in the audit. That’s why I wouldn’t ever personally trust the security of a protocol to a single auditor.” 

zkSync agrees on the need for multiple auditors and tells Magazine that before it launched its EVM-compatible zero-knowledge rollup Era on mainnet on March 24, it was thoroughly tested in seven different audits from Secure3, OpenZeppelin, Halborn, and a fourth auditor yet to be announced.

White hat hackers and bug bounties

Rainer Böhme, professor for security and privacy at the University of Innsbruck, wrote that basic audits are “hardly ever useful, and in general, the thoroughness of security audits needs to be carefully tailored to the situation.” 

Instead, bug bounty programs can provide better incentives. “Bug bounties offer an established way to reward those who find bugs… they would be a natural fit for cryptocurrencies, given they have a built-in payment mechanism,” Böhme continued.

White hat hackers are those who leverage their talents to identify vulnerabilities and work with projects to fix them before malicious (“black hat”) hackers can exploit them.

Bug bounty programs have become essential to discovering security threats across the web, generally curated by project owners who want talented programmers to vet and review their code for vulnerabilities. Projects reward hackers for identifying new vulnerabilities and for helping maintain the upkeep and integrity of a network. Historically, bugs in open-source smart contract languages — e.g., Solidity — have been identified and fixed thanks to bug bounty hackers.

“These campaigns began in the ‘90s: there was a vibrant community around the Netscape browser that worked for free or for pennies to fix bugs that were gradually appearing during development,” wrote Marchesoni.

“It soon became clear that such work could not be done in idle time or as a hobby. Companies benefited twice from bug bounty campaigns: in addition to the obvious security issues, the perception of their commitment to security also came by.”

Bug bounty programs have emerged across the Web3 ecosystem. For example, Polygon launched a $2-million bug bounty program in 2021 to root out and eliminate potential security flaws in the audited network. Avalanche Labs operates its own bug bounty program, which launched in 2021, via the HackenProof bug bounty platform.

However, there is tension between the extent of the security gaps white hats believe they have found and how seriously projects take those issues.

White hat hackers have accused various blockchain projects of gaslighting community members, as well as withholding bug-bounty compensation for white hat services. It should go without saying that actually following through with the payment of rewards for legitimate service is essential to maintaining incentives.

A team of hackers recently claimed that it was not compensated for its bug bounty services to the Tendermint application layer and Avalanche.

On the other side of the fence, projects have found some white hat hackers are really black hats in disguise.

Tendermint, Avalanche and more

Tendermint is a tool that lets developers focus on higher-level application development without having to deal directly with the underlying communication and cryptography. Tendermint Core is the engine that facilitates the P2P network via proof-of-stake (PoS) consensus. The Application BlockChain Interface (ABCI) is the layer through which public blockchains link to the Tendermint Core protocol.
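To make that division of labor concrete, here is a schematic Python sketch of the shape of an ABCI application: Tendermint Core handles networking and consensus, then calls hooks like these on the application. The method names mirror the classic ABCI hooks (CheckTx, DeliverTx, Commit), but this standalone class is an illustration only; a real integration would run behind an actual ABCI server library.

```python
import hashlib
import json

class ToyKVStoreApp:
    """Schematic ABCI-style application; Tendermint Core would call these hooks."""

    def __init__(self):
        self.state: dict[str, str] = {}

    def check_tx(self, tx: bytes) -> int:
        # Mempool admission: cheap validation only (0 = accept, nonzero = reject).
        try:
            key, _value = tx.decode().split("=", 1)
            return 0 if key else 1
        except (UnicodeDecodeError, ValueError):
            return 1

    def deliver_tx(self, tx: bytes) -> int:
        # Consensus-ordered execution: apply the transaction to app state.
        key, value = tx.decode().split("=", 1)
        self.state[key] = value
        return 0

    def commit(self) -> bytes:
        # Hash of app state; Tendermint includes it in the next block header.
        encoded = json.dumps(self.state, sort_keys=True).encode()
        return hashlib.sha256(encoded).digest()
```

The point of the split is that the consensus engine and the application trust boundary meet exactly at these calls, which is also why an RPC-level bug in the shared engine can ripple across every chain built on it.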

In 2018, a bug bounty program for the Tendermint and Cosmos communities was created. The program was designed to reward community members for discovering vulnerabilities with rewards based on factors such as “impact, risk, likelihood of exploitation, and report quality.” 

Last month, a team of researchers claimed to have found a major Tendermint security exploit: a Remote Procedure Call (RPC) vulnerability that crashes services via the remote API and impacts over 70 blockchains. The exploit would have a severe impact and could potentially extend to over 100 peer-to-peer and API vulnerabilities, since the blockchains share similar code. Ten blockchains in the top 100 of CertiK’s “Security Leaderboard” are based on Tendermint.

However, after going through the proper channels to claim the bounty, the hacker group said it was not compensated. Instead, what followed was a string of back-and-forth events, which some claim was a stalling attempt by Tendermint Core while it quickly patched the exploit without paying the bounty hunters their dues.

This type of vulnerability, among others that the group has supposedly documented, is known as a zero-day exploit.

“The specific Tendermint denial-of-service (DoS) attack is another unique blockchain attack vector, and its implications aren’t yet fully clear, but we will be evaluating this potential vulnerability going forward, encouraging patches and discussing with current customers who may be vulnerable,” said CertiK’s Brooks.

He said the job of security testing was never finished. “Many see audits or bug bounties as a one-and-done scenario, but really, security testing needs to be ongoing in Web3 the same way it is in other traditional areas,” he says. 

Are they even white hats?

Bug bounties that rely on white hats are far from perfect, given how easy it is for black hats to put on a disguise. Ad hoc arrangements for the return of funds are a particularly problematic approach.

“Bug bounties in the DeFi space have a severe problem, as over the years, various protocols have allowed black hat hackers to turn ‘white hat’ if they return some or most of the money,” says Finlow-Bates.

“Extract a nine-figure sum, and you may end up with tens of millions of dollars in profit without any repercussions.” 

The Mango Markets hack in October 2022 is a perfect example, with a $116-million exploit and only $65 million returned and the rest taken as a so-called “bounty.” The legality of this is an open question, with the hacker responsible charged over the incident, which some have likened more to extortion than a legitimate “bounty.”

The Wormhole Bridge was similarly hacked for $325 million of crypto, with a $10-million bounty offered in a white hat-style agreement. However, this was not large enough to entice the hacker to accept the agreement.

“Compare this to true white hat hackers and bug bounty programs, where a strict set of rules are in place, full documentation must be provided, and the legal language is threatening, then failure to follow the directions to the letter (even inadvertently) may result in legal action,” Finlow-Bates elaborates. 

Organizations that enlist the support of white hats must realize that not all of them are equally altruistic – some blur the lines between white and black hat activities – so building in accountability, giving clear instructions, and actually paying out promised rewards all matter.

“Both bug bounties and audits are less profitable than exploits,” Waisanen continues, remarking that attracting white hat hackers in good faith is not easy.

Where do we go from here?

Security audits are not always helpful and depend crucially on their degree of thoroughness and independence. Bug bounties can work, but equally, the white hat might just get greedy and keep the funds. 

Are both strategies just a way of outsourcing, and thereby avoiding, responsibility for good security practices? Crypto projects may be better off learning how to do things the right way in the first place, argues Maurício Magaldi, global strategy director for 11:FS.

“Web3 BUIDLers are generally unfamiliar with enterprise-grade software development practices, which puts a number of them at risk, even if they have bug bounty programs and code audits,” he says. 

“Relying on code audit to highlight issues in your application that aims to handle millions in transactions is a clear outsourcing of responsibility, and that is not an enterprise practice. The same is true for bug bounty programs. If you outsource your code security to external parties, even if you provide enough monetary incentive, you’re giving away responsibility and power to parties whose incentives might be out of reach. This is not what decentralization is about,” said Magaldi.

An alternative approach is to follow the process of the Ethereum Merge. 

“Maybe because of the DAO hack back in the early days of Ethereum, now every single change is meticulously planned and executed, which gives the whole ecosystem a lot more confidence about the infrastructure. DApp developers could steal a page or two from that book to move the industry forward,” Magaldi says.

Five lessons for cybersecurity in crypto

Let’s take stock. Here are five broad philosophical lessons we can take away.

First, we need more transparency around the successes and failures of Web3 cybersecurity. There is, unfortunately, a dark subculture that rarely sees the light of day since the audit industry often operates without transparency. This can be countered by people talking – from a constructive point of view – about what works and what does not work. 

When Arthur Andersen failed to correct and flag fraudulent behavior by Enron, it suffered a major reputational and regulatory blow. If the Web3 community cannot at least meet those standards, its ideals are disingenuous.

Second, Web3 projects must be committed to honoring their bug bounty programs if they want the broader community to obtain legitimacy in the world and reach consumers at scale. Bug bounty programs have been highly effective in the Web1 and Web2 landscapes for software, but they require credible commitments by projects to pay the white hat hackers.

Third, we need genuine collaborations among developers, researchers, consultancies and institutions. While profit motives may influence how much certain entities work together, there has to be a shared set of principles that unite the Web3 community – at least around decentralization and security – and lead to meaningful collaborations.

There are already many examples; tools like Ethpector are illustrative because they show how researchers can provide not only careful analysis but also practical tooling for blockchains.

Fourth, regulators should work with, rather than against or independently of, developers and entrepreneurs.

“Regulators should provide a set of guiding principles, which would need to be accounted for by developers of DeFi interfaces. Regulators need to think of ways to reward developers of good interfaces and punish designers of poor interfaces, which can be subject to hacking and expose the underlying DeFi services to costly attacks,” says Agostino Capponi, director of the Columbia Center for Digital Finance and Technologies.

By working collaboratively, regulators are not burdened with being subject matter experts on every emerging technology – they can outsource that expertise to the Web3 community and play to their own strength: building scalable processes.

Fifth, and most controversially, DeFi projects should work toward a middle ground in which users go through some level of KYC/AML verification to ensure that malicious actors are not leveraging Web3 infrastructure for harmful purposes.

Although the DeFi community has always opposed such requirements, every community requires some degree of structure, and there should be a process for ensuring that unambiguously malicious users are not exploiting DeFi platforms.

Decentralization is valuable in finance. As we have seen once again with the collapse of Silicon Valley Bank, centralized institutions are vulnerable, and their failures create large ripple effects for society.

My research in the Journal of Corporate Finance also highlights how DeFi is perceived as offering greater security: following a well-known data breach at the centralized exchange KuCoin, for example, transactions grew 14% more on decentralized exchanges than on centralized exchanges. But more work remains for DeFi to become broadly accessible.

Ultimately, building a thriving ecosystem and market for cybersecurity in the Web3 community is going to require good-faith efforts from every stakeholder. 

Christos Makridis

Eroding America’s Cultural Capital

This article was originally published in City Journal.

For people the world over, the United States remains the destination of choice. People flock to the U.S. because it has offered freedom, stable institutions, and property rights, with few barriers to individual achievement. In turn, high-skilled immigration has strengthened economic and social development in the U.S. But now the Department of Homeland Security proposes fee increases for visas, especially O-1B visas, reserved for artists, athletes, and performers of extraordinary ability. DHS’s move is misguided.

Recent research by the University of Pennsylvania’s Britta Glennon demonstrates that U.S. restrictions on H-1B visas (for foreign workers in specialty occupations) have caused multinational firms to increase their employment of these skilled workers at their foreign affiliates and to open new establishments to employ them, instead of deploying them in the American market. The U.S. has experienced so much offshoring partly because of such restrictions.

No data are publicly available on O-1B visas, but we do know a few things. First, arts institutions will struggle to pay fee increases. Though philanthropic donations in general continue to grow, the arts and humanities make up one of the smallest categories of giving, according to the Arts Consulting Group. The mandated closure of theaters for two years during Covid-19 devastated the performing arts, and nearly all institutions are still trying to recover. A fee hike now would further stifle these efforts.

Second, even when arts institutions get funding, that money rarely flows to individual artists. Using millions of observations from the American Community Survey between 2006 and 2021, my research shows that real wages among artists, relative to non-artists, have been declining since 2006. Controlling for factors including age, race, gender, and education, artists earned nearly 30 percent less than their non-artist peers in 2021. Arts institutions are likely to pass on the costs of higher visa fees to artists in the form of lower wages.

Third, the vast majority of talented performing artists are located in Europe, because that is where demand for their services is greatest. Raising application fees for O-1B visas would further reduce the probability that U.S. theaters will attract the top talent. The effects will accrue over time and degrade the American cultural environment.

True, the U.S. Citizenship and Immigration Services (USCIS) needs additional funding. Currently, nearly all its revenue comes from application fees. But it isn’t clear why that should be the case. If USCIS truly is the first line of defense for the nation when it comes to immigration, then the federal government should find another way to fund it. The easiest option would be through the National Defense Authorization Act (NDAA), followed by further appropriations through the homeland security subcommittee. After all, the aims of USCIS are intimately tied to national security: deciding who gets into the United States constitutes our first line of defense.

The current dependence on application fees for revenue creates several major challenges. The USCIS has appeared underfunded even in the best of times, and if anything, the agency is struggling even more now. As the Manhattan Institute’s Daniel Di Martino has suggested, providing expanded premium processing options for applicants in exchange for higher fees would boost revenues and expedite visa processing—at least in the short term. But as long as USCIS funding is fee-based, the agency will have less accountability to Congress and taxpayers. Lawmakers should be asking hard questions about how the agency can improve its operations.

USCIS’s reliance on application fees also creates perverse incentives. Bureaucratic rules push businesses to apply for H-1B visas rather than the status they would prefer to sponsor, such as employment-based green cards. The process for receiving a permanent-residence visa usually takes two years or more, and few candidates or businesses can wait that long to migrate and begin working in the U.S., so businesses opt for temporary work visas such as H-1Bs.

And since the H-1B lottery is random, many large firms sponsor more migrants than they need in hopes of gaming the system. These factors have the unintended consequence of causing the H-1B visa program to subsidize other areas of the immigration process. Since USCIS is chronically underfunded, this distortion is tolerated.

Putting the USCIS budget under the purview of the NDAA and providing further appropriations through the homeland security subcommittee would constitute a more sustainable strategy than another increase in visa fees.

Christos Makridis

On Maria Callas Centennial, Stage Is Set For Economic Development In Greece

This article was originally published in Forbes.

Despite socio-political turmoil in Greece, Athens premiered a new rendition of Christoph Willibald Gluck’s masterpiece, Iphigénie en Tauride, at the Pallas Theater on March 18th. The debut comes in the centennial year of Maria Callas, the cultural icon and Greek soprano who performed the role in 1957 at La Scala in Milan.

The Callas anniversary comes at an especially important moment: with many eyes on the recent advances of AI-driven tools like ChatGPT, it reminds people that there is no substitute for carefully executed, engaging live performance. While opera is often viewed as the highest form of luxury entertainment, such artistic expression, as demonstrated by the legacy of Callas, has the potential to galvanize support across many different spheres of influence. One hundred years after her birth, the life of Maria Callas continues to touch sectors outside of opera, including fashion, art, business, and tourism within and outside of Greece.

The Life Of Maria Callas, La Divina

Callas — also known as “La Divina” or “The Divine One” — was best known for her unique vocal ability and realistic characterizations. “Maria Callas’ contribution to the opera world was revolutionary, not just because of her vocal perfection, but because her performative precision and passion made her recitals historical,” said Lina Mendoni, the culture minister for Greece.

Callas’ centennial will be included on UNESCO’s celebratory list of anniversaries for 2023, which is limited to “personalities of genuinely universal stature, nominated posthumously only” who “must be indisputably known outside the borders of their own country, in order to reflect the ideals, values, cultural diversity and universality of the organization,” according to UNESCO.

Maria Callas had a challenging upbringing. In addition to a difficult family life with a mother with whom she was often at odds emotionally and verbally, Callas moved from New York City to Athens at the age of 13, experiencing poverty, personal humiliation, and, during the World War II years, even threats to her life, according to Paul Wink in his book Prima Donna: The Psychology of Maria Callas. “Poverty and conflicted relations at home with her mother and sister failed to compensate Callas for hostility at work. A significant gain in weight further undermined her self-confidence,” wrote Wink.

And yet, Callas demonstrated significant resilience amid difficult economic, social, and family conditions. “Although conceived in Greece, Maria Callas was born in New York City, and returned to Greece in adolescence as a social and cultural outsider. Her phenomenal rise to stardom was the result of an unwavering belief in her talent, independence in judgment, and a sense of destiny. These characteristics allowed her to realize an artistic vision unimpeded by opinions of others including the press or social media in today's parlance,” said Wink.

Iphigénie En Tauride

Gluck’s opera is based on a drama by the playwright Euripides written between 414 BC and 412 BC. It details the mythological account of the young princess Iphigenia, who avoided death by sacrifice at the hands of her father Agamemnon thanks to the Greek goddess Artemis (Diana), who intervened and replaced Iphigenia on the altar with a deer. Now a priestess at the temple of Artemis in Tauris, she is responsible for ritually sacrificing foreigners who come into the land. The opera revolves around her forced ritualistic responsibility on the island, coupled with an unexpected encounter with her brother, Orestes, whom she had thought dead.

In the production, led by stage director Thanos Papakonstantinou and conductor George Petrou, Soula Parassidis, a Greek-Canadian dramatic soprano, sang the title role of Iphigénie on March 18th and 19th in Athens, joined by a cast of distinguished international artists, including tenor Juan Francisco Gatell and baritone Philippe-Nicolas Martin, among others.

“Iphigénie en Tauride is, together with “Medea,” one of the two main roles inspired by Ancient Drama that Callas sung with great success at La Scala in Milan in 1957... moreover, for me, this is Gluck's foremost dramatic piece with wonderful and touching music,” said Olivier Descotes, director of the Olympia (Maria Callas) Theater.

“My theatre work has always been about finding a way to contemporize Ancient Greek myths. With Iphigenie, my process began with the protagonist as a doppelgänger of the goddess Artemis (Diane in Gluck’s opera). Inspired by the rituals of the “cult” of Artemis, a parallel narrative about the rites of passage emerged, encompassed by the brutality of life, which is represented by the Scythians in the opera. If we were to reduce the pathos of the opera into a single sentence, it would be that while the storms of life are inevitable, the transformation they bring to our journey can result in something beautiful if we can withstand trial and tragedy. I would be very happy if someone watched our performance and this sense was conveyed to them,” said Thanos Papakonstantinou, the stage director.

Callas emphasized the dramatic intent of her characters by rooting her interpretations within the scope of the music, recalling remarks from Tullio Serafin, a renowned Italian conductor: “When one wants to find a gesture, when you want to find how to act onstage, all you have to do is listen to the music. The composer has already seen to that. If you take the trouble to really listen with your Soul and with your Ears—and I say ‘Soul’ and ‘Ears’ because the Mind must work, but not too much also—you will find every gesture there.”

“Maria Callas is one of the most iconic figures in the history of the performing arts... she managed to convince 20th century audiences that opera, drama, and music go hand-in-hand — that is exactly what Iphigénie en Tauride is about: the perfect combination of music and drama, one serving the other as equals,” said conductor George Petrou.

“For any soprano, particularly any Greek soprano, the immense weight and shadow of Callas is always equally looming over one’s shoulder, while simultaneously inspiring one’s creative soul. It takes a certain level of courage to interpret any role made famous by Callas, let alone on her centennial. For my part, I can only bring my unique gifts and talents to the work and be satisfied by them, no matter my own limitations as an artist. The work of fully embodying a character is never completely done even when we’ve reached opening night - there is always another vocal color to employ, another gesture, another spark of inspiration to imbue. I think La Divina would agree that this unattainable perfection is part of what makes being a live performer so thrilling,” said Soula Parassidis, an international soprano and entrepreneur, who sang the lead role of Iphigénie.

Learning From La Divina: A Catalyst For Economic Revival

“Arts and cultural economic activity accounted for 4.4 percent of gross domestic product (GDP), or $1.02 trillion, in 2021,” according to the U.S. Bureau of Economic Analysis. Cultural and creative industries account for a strikingly similar share of GDP in Europe, according to the European Association of Communication Agencies. “This sector, an economic heavyweight, which is at the heart of Europe’s social fabric, could become the number one ally of an economic revival. It showcases the power of culture, its dynamism and its contribution to the EU’s global influence,” said Marc Lhermitte, a partner at Ernst & Young.

These estimates, however, understate the social contributions of the arts that are not priced directly in the market, often referred to by economists as “positive externalities.” It may not come as a surprise, therefore, that areas with greater degrees of social capital — that is, trust, networks, and norms — are those whose arts and culture sectors declined less during COVID-19.

The effects of the arts are also visible at a microeconomic level: they provide a platform for expression, enjoyment, and creativity. “Art induces inspiration, which in turn facilitates performance on creative tasks,” according to professors Donghwy An and Nara Youn in a 2018 study published in the Journal of Business Research. “Individuals with higher openness toward aesthetics were inspired more frequently and deeply in their daily lives and showed greater creativity in an idea-generation task and Remote Associates Test (RAT) scores,” they continued.

However, the arts only exist if there are creative and persevering artists who can overcome obstacles and give expression to the broader, timeless themes that unite people across borders and cultures. And sadly, artists across the world are struggling with their mental health and finances, particularly following the closure of theaters in 2020.

Cultural workers contribute only 1.4 percent to the nation’s GDP even though they make up 3.2 percent of the workforce. To be sure, the arts in Greece need many more reforms to attract talent from outside the country and retain talent within it. Consider the following example. “Greece imports annually 181 million euros of cultural products and only exports 110 million euros. However, Greek museums, which are currently showing only 7% of their collections (with the rest of the artifacts in storage) cannot loan out to international institutions against a fee... they cannot exploit their collections to bring in money to develop themselves and offer working opportunities to newly qualified staff,” according to Yerassimos Yannopoulos, managing partner at Zepos & Yannopoulos.

While Maria Callas’ legacy is anchored in the arts, her influence extends into many other sectors. The conditions surrounding her early Athenian career before World War II were more challenging than those artists face in Athens today; the need for continued reform should not stop people from persevering through adversity and reaching their potential.

An optimistic example of such reforms is the emerging Greek Tech Visa program led by Endeavor Greece, which will help source talent and, therefore, pave the way for more startups and established companies to build hubs in Greece. Such reforms not only increase economic productivity within the country but also make Greece a year-round destination — not just for tourism over the summer — and, in the process, demand for the arts will grow.

Successfully producing an opera involves hundreds of moving pieces on and off stage, ranging from set design to costumes to lighting to stage management to the performers themselves executing the vision laid out by the director, all with a common objective: to wow the audience and stimulate creativity long after the performance has ended. And so it goes in any economy: each part must be synchronized with the others for the whole to reach its intended objective, which is ultimately human flourishing in the midst of change, whether in times of stability or precarity.

If a single singer can become a symbol of national pride, hope, and creativity 100 years after her birth, what could a whole nation do operating in concert with one another?

Christos Makridis

Silicon Valley Bank was the tip of a banking iceberg

This article was originally published in Cointelegraph.

Traditional financial institutions take deposits from customers and use them to make loans. But they lend out much more than they keep on hand at any given point in time — a practice known as fractional-reserve banking. The difference between the interest earned on loans and the interest paid to depositors is the net interest margin, which determines a bank’s profitability. The difference between assets and liabilities is the bank’s equity, which determines its resilience to external shocks.

Before the latest run on the bank, SVB was viewed as not only a profitable banking institution but also a safe one, because it held $212 billion in assets against roughly $200 billion in liabilities. That means it had a cushion of $12 billion in equity, or 5.6% of assets. That’s not bad, although it is roughly half the 11.4% average among banks.
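To make that arithmetic concrete, here is a minimal sketch in Python using only the figures cited above (the rounded numbers are the reported ones; nothing else is assumed):

```python
# Balance-sheet arithmetic for SVB, using only the figures cited above.
assets = 212e9        # total assets, in dollars
liabilities = 200e9   # total liabilities, in dollars

equity = assets - liabilities    # the $12 billion cushion
equity_ratio = equity / assets   # about 5.7%, which rounds in the text to 5.6%

print(f"Equity cushion: ${equity / 1e9:.0f}B ({equity_ratio:.1%} of assets)")

# The December disclosure of $15 billion in unrealized losses exceeds the cushion:
unrealized_losses = 15e9
equity_after_losses = equity - unrealized_losses
print(f"Equity net of unrealized losses: ${equity_after_losses / 1e9:.0f}B")  # negative
```

The point of the exercise is that a 5.6% cushion leaves little room: a mark-to-market loss slightly larger than the cushion flips the bank’s equity negative on paper.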

The problem is that recent actions by the United States Federal Reserve reduced the value of long-term debt, to which SVB was heavily exposed through its mortgage-backed securities (roughly $82 billion). When SVB flagged to its shareholders in December that it had $15 billion in unrealized losses, enough to wipe out the bank’s equity cushion, it prompted many questions.

On March 8, SVB announced it had sold $21 billion in liquid assets at a loss and stated that it would raise money to offset the loss. But the announcement that it needed to raise more money — and had even considered selling the bank — significantly concerned investors, leading to roughly $42 billion in attempted withdrawals. Of course, SVB did not have sufficient liquidity, and the Federal Deposit Insurance Corporation took over on March 10.

The macro-finance literature has a lot to say about these situations, but a good summary is to expect highly non-linear dynamics — that is, small changes in inputs (the equity-to-asset ratio) can produce substantial changes in outputs (liquidity). Banks are more prone to runs during recessions, and runs have large effects on aggregate economic activity.
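A stylized sketch of that non-linearity follows. The 3% threshold and the withdrawal shares are assumptions chosen purely for illustration, not estimates of real depositor behavior:

```python
# Stylized run dynamics: depositors stay calm until the equity cushion
# falls below a perceived-solvency threshold, then withdrawals spike.
# The 3% threshold and the 2%/80% withdrawal shares are assumptions
# chosen only to illustrate the non-linearity.
def share_withdrawn(equity_ratio: float, threshold: float = 0.03) -> float:
    return 0.02 if equity_ratio >= threshold else 0.80

for ratio in (0.056, 0.035, 0.029, 0.010):
    print(f"equity/assets = {ratio:.1%} -> deposits withdrawn: {share_withdrawn(ratio):.0%}")
```

A small move in the input — the equity ratio slipping just below the threshold — produces a discontinuous jump in the liquidity demanded, which is the signature of a run.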

Pursuing structural solutions

To be sure, SVB is not the only bank with outsized, risky exposure to macroeconomic conditions, such as interest rates and consumer demand; it was merely the tip of the iceberg that hit the news over the past week. And we’ve seen this before — most recently during the 2007–2008 financial crisis with the collapse of Washington Mutual. The aftermath led to a surge in financial regulation, largely through the Dodd–Frank Act, which expanded the Federal Reserve’s authority to regulate financial activity and authorized new consumer protection guidelines, including the launch of the Consumer Financial Protection Bureau.

Of note, the DFA also enacted the “Volcker Rule,” restricting banks from proprietary trading and other speculative investments, largely preventing banks from functioning as investment banks using their own deposits to trade stocks, bonds, currencies and so on.

The rise of financial regulation led to a sharp change in the demand for science, technology, engineering and math (STEM) workers, or “quants” for short. Financial services are especially sensitive to regulatory changes, with much of the burden falling on labor, since regulation raises banks’ non-interest expenses. Banks realized that they could reduce compliance costs and increase operational efficiency by increasing automation.

And that’s exactly what happened: The proportion of STEM workers grew by 30% between 2011 and 2017 in financial services, and much of this was attributed to the increase in regulation. However, small and mid-sized banks (SMBs) have had a more challenging time coping with these regulations — at least in part due to the cost of hiring and building out sophisticated dynamic models to forecast macroeconomic conditions and balance sheets.

The current state of the art in macroeconomic forecasting remains stuck in 1990s-era econometric models that are highly inaccurate. While forecasts are often adjusted at the last minute to appear more accurate, the reality is that there is no consensus workhorse model or approach to forecasting future economic conditions, setting aside some exciting and experimental approaches by, for example, the Atlanta Federal Reserve with its GDPNow tool.
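For readers who want to inspect these nowcasts directly, the Atlanta Fed’s headline GDPNow estimate is republished on FRED. A minimal sketch, assuming the `pandas_datareader` package is installed and that FRED still carries the series under the id `GDPNOW` (worth verifying before relying on it):

```python
# Pull the Atlanta Fed's GDPNow nowcast (annualized real GDP growth for the
# current quarter) from FRED. The series id "GDPNOW" is an assumption worth
# checking at https://fred.stlouisfed.org before use.
import pandas_datareader.data as web

gdpnow = web.DataReader("GDPNOW", "fred")  # returns a pandas DataFrame
print(gdpnow.tail())  # the most recent nowcast values
```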

But even these “nowcasting” tools do not incorporate vast quantities of disaggregated data, which makes the forecasts less germane for SMBs that are exposed to certain asset classes or regions and less interested in the national state of the economy per se.

We need to move away from forecasting as a “check-the-box” regulatory compliance measure toward a strategic decision-making tool that is taken seriously. If the nowcasts do not perform reliably, either stop producing them or figure out a way to make them useful. The world is highly dynamic, and we need to use all the tools at our disposal, ranging from disaggregated data to sophisticated machine learning tools, to help us understand the times we’re in so that we can behave prudently and avoid potential crises.

Would better modeling have saved Silicon Valley Bank? Maybe not, but better modeling would have increased transparency and the probability that the right questions would be asked to prompt the right precautions. Technology is a tool — not a substitute — for good governance.

In the aftermath of Silicon Valley Bank’s collapse, there has been a lot of finger-pointing and rehashing of the past. More importantly, we should be asking: Why did the bank run happen, and what can we learn?

Christos Makridis

Corporate America Shouldn't Let Politics Get In the Way of Company Morale

This article was originally published in Real Clear Markets with Jeremy Tedesco.

In the age of social justice activism and virtue signaling, a growing number of corporations are caving to external and internal activists’ demands that they use their resources and brands to advocate particular political outcomes on contentious social issues.

Whatever near-term points corporations think they gain from pandering to political activists, a recent study and poll indicate that such short-sighted behavior risks lasting alienation of their current and prospective employees (and their consumers, too).

The 2023 Viewpoint Diversity Score Freedom at Work Survey, conducted by Ipsos, surveyed over 3,000 American adults employed across a wide variety of professions. A research paper analyzing the results concludes that “companies could increase employee engagement and trust over their products and services by creating a climate where people feel comfortable expressing themselves without fear of unintended consequences on their career and life.”

Among the survey’s wide-ranging questions, respondents were asked whether they support parental-rights-in-education laws—like the one Florida adopted last year—that “protect the freedom of parents to decide what their kindergarten through 3rd grade children are taught in the classroom about sex and gender identity by limiting what teachers can discuss and requiring notification and consent of parents before sensitive topics can be addressed.”

Fifty-five percent of participants supported such legislation, compared with only 14 percent who opposed it. As our paper discusses, even when respondents who were initially unaware of the Florida parental rights legislation learned more through an information treatment, they reported increased discomfort with current or prospective employers taking stands against parental rights. Simply put, the more participants learned about parental rights bills, the more they supported them.

This broad employee support for parental rights laws like Florida’s is notably out of sync with the widespread corporate opposition to them. Two hundred eighty-four large corporations (including The Walt Disney Company, Starbucks, Target, Apple, and financial institutions such as Deutsche Bank and PNC Financial Services Group) signed the Human Rights Campaign’s statement opposing such state-level legislation.

The disconnect between the C-Suite and employees on social issues can seriously harm a firm’s ability to retain and recruit employee talent. A plurality of employees (44%) say they are uncomfortable with their employer taking a stance on a controversial cultural issue that contradicts the views of many employees and customers. Forty-two percent say that perceptions of hostility against religious or political views make them much less likely to apply to a company. And 30% say they have considered changing jobs to live in a state or region that is more tolerant of their values.

The survey also suggests that corporate political activism is spilling over into the workplace, creating an impression of an intolerant culture where employees fear they will lose their jobs if their heterodox religious or political views become known. Large majorities (60% and 64%) say that respectfully expressing religious or political viewpoints would “likely or somewhat likely” have negative consequences on their employment.

Nearly half of those surveyed have not shared their personal views about a social or political issue because of fear that sharing them would harm their career. Roughly a fifth have encountered negative treatment or discrimination for respectfully communicating their religious or political views. And 54% say they are very or somewhat concerned that sharing political content on their own social media accounts could result in negative consequences in the workplace.

Employees shouldn’t fear that their religious or political views could cost them their job. But the Freedom at Work Survey shows that a significant number of employees do.

If corporate America wants to earn and retain the trust of employees, companies will have to do far better when it comes to respecting their diverse religious and political views. They can start by adopting several Viewpoint Diversity Score standards and best practices that received significant support from survey respondents.

First, companies should include respect for a wide range of religious and political beliefs as part of their commitment to diversity (66% support). Second, they should adopt a policy that commits to respecting viewpoint diversity in the workplace (49%). Third, they should adopt a policy that respects the freedom of employees to engage in political activity on their own time, without fear of repercussions at work (48%). And companies cannot just pay lip service to these principles; if they don’t genuinely allow freedom of expression for all, they will bear the costs.

It's no secret that America is deeply divided, but corporations shouldn’t fan those flames. Instead, they should create a workplace culture based on mutual trust and respect for religious, political, and other differences. The employees who participated in the Freedom at Work Survey charted a path for companies that want to build back trust. The only question that remains is, will business leaders listen?
