High Gasoline Prices Signal Trouble to Older Americans
This article originally appeared in the Wall Street Journal.
Americans haven’t forgotten 1973. Every time oil prices spike, people feel worse about the economy almost immediately, long before any paycheck shrinks or layoff notice arrives. With one-year West Texas Intermediate futures having climbed from about $55 to $75 a barrel since early 2025, that sentiment deterioration is already under way—and because sentiment not only reflects but shapes spending, the consequences are tangible.
Scaled to the current oil-price move and the share of Americans for whom the 1970s remain the economic reference point, the implied drag on consumer spending over the coming months runs to something in the range of 8% to 12% for that cohort—but closer to 3% to 5% for younger consumers, whose sentiment is less tightly coupled to gasoline prices and whose spending responds accordingly.
The evidence behind those numbers reveals something important about economic psychology. Using 2008-17 data from the Gallup Daily Poll of more than 1.7 million Americans, one of us (Mr. Makridis, with Carola Binder) published research showing that consumer sentiment darkens within days of a gasoline price increase—and the effect is roughly 50% stronger among Americans old enough to have lived through the oil crises of the 1970s. People who watched unemployment climb from 4.6% to 9% in 18 months, who sat in gas lines, who absorbed the message that rising energy prices mean recession never quite unlearned it. When they see $4.50 at the pump, they sense trouble.
Sentiment also shapes economic reality. When people expect hard times, they spend less—and that helps bring about the hard times. Research from Mr. Makridis linking consumer beliefs to nondurable spending finds that a one-standard-deviation deterioration in economic confidence is associated with a 17% to 25% decline in consumption expenditures. The current oil price move—roughly 40% since early 2025—is associated with approximately a 0.15- to 0.20-standard-deviation decline in sentiment based on estimated elasticities, implying a consumption drag of 3% to 5% among affected households before any change in their actual financial circumstances.
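For readers who want to trace the arithmetic, here is a minimal back-of-envelope sketch. The inputs are the estimated ranges cited above; multiplying the endpoints is an illustrative simplification, not the underlying econometric model.

```python
# Back-of-envelope version of the consumption-drag arithmetic above.
# Inputs are the ranges cited in the text; multiplying endpoints is an
# illustrative simplification, not the underlying econometric model.

sentiment_drop_sd = (0.15, 0.20)  # sentiment decline implied by the ~40% oil move, in SDs
spending_per_sd = (0.17, 0.25)    # consumption decline per 1-SD drop in confidence

low = sentiment_drop_sd[0] * spending_per_sd[0]
high = sentiment_drop_sd[1] * spending_per_sd[1]
print(f"Implied consumption drag: {low:.1%} to {high:.1%}")  # ~2.6% to 5.0%, bracketing the 3%-5% range
```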
Gasoline prices are well-suited to produce this effect. They are posted on signs at every intersection, updated daily, visible even when your tank is full.
The same price move reshapes housing markets more slowly and far less uniformly—rippling through the country, lifting some places and pressing down on others. In cities whose economic base is energy—Houston, Oklahoma City, Midland-Odessa, Texas—the income channel dominates. Wages rise, local employment expands, and housing markets follow. Research by one of us (Mr. Larson, with Weihua Zhao) tracking four decades of ZIP Code-level house prices estimates that for a city with roughly half its export employment in oil-related sectors, a 50% rise in oil prices produces roughly 15% higher citywide house-price appreciation over five years compared with non-oil cities.
Everywhere else, rising oil prices function as a tax on transportation, and that tax compounds with distance. Suburbs beyond 15 miles from a city center face the steepest relative losses as higher commuting costs get capitalized into home values. The research finds a doubling of oil prices produces roughly 1.5% to 3% of relative house price underperformance for suburban properties vs. center-city ones. A 40% oil-price move—close to what we’ve seen since early 2025—implies perhaps 1 to 2 percentage points of relative suburban drag over the next several years.
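The scaling behind that last sentence can be made explicit. Under the simplifying assumption that the estimated effect scales linearly with the size of the oil move, the arithmetic runs as follows.

```python
# Linear scaling of the suburban house-price gradient to the current move.
# Cited estimate: a doubling (100% rise) of oil prices implies roughly
# 1.5%-3% relative underperformance for far suburbs vs. the center city.
# Strict linearity is an illustrative assumption.

effect_per_doubling = (0.015, 0.03)
oil_move = 0.40  # ~40% rise since early 2025

low, high = (e * oil_move for e in effect_per_doubling)
print(f"Implied relative suburban drag: {low:.1%} to {high:.1%}")
# ~0.6% to 1.2% under strict linearity, the same order of magnitude as
# the 1-to-2-point figure above once estimate uncertainty is allowed for.
```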
When oil prices rise, the gains in energy-producing cities are concentrated and relatively swift. When they fall, those same cities tend to give back appreciation sharply—Houston lost roughly 30% of its real house value when oil crashed in the 1980s. Whatever appreciation is accumulating in Midland and in Williston, N.D., today carries that tail risk embedded within it.
None of this constitutes a forecast of local booms or national recession. If the supply-side risks around the Strait of Hormuz are resolved, we may well see a corresponding fall in oil prices and an upward swing in consumer sentiment by summer. Additionally, the channels through which oil affects the economy have shifted considerably since the 1970s, and the U.S. is a meaningfully different energy producer today than it was then. But the behavioral channel, the one running from gasoline prices through sentiment to spending, doesn’t require structural damage to the economy to operate. It requires only that enough people, seeing a number on a sign, conclude that something familiar and bad is beginning again.
That connection, forged in experience 50 years ago, has proved remarkably durable. It is being tested again.
Why Moving Career Pathways to the Labor Department Is an Opportunity
This article originally appeared in The 74 Million.
The recent federal transition that shifted day-to-day administration of career-oriented pathways and career and technical education into the U.S. Department of Labor reflects a growing recognition that workforce preparation fails when it is governed as schooling alone rather than as a pipeline into jobs, wages and advancement.
There is no shortage of credentials in the U.S. labor market. There is a shortage of matched skills and reliable pathways. Job openings remain historically elevated despite cooling in some places. Even so, a persistent share of young adults remains neither enrolled nor working, signaling weak attachment to both employment and further training.
This gap is not a 2025 phenomenon. For decades, policymakers have invested in education while assuming that labor market integration would follow. The evidence suggests otherwise. The impacts of education vary sharply by field, institution and completion status, and many credentials deliver little labor market value relative to their cost. Treating enrollment and completion as success metrics has obscured whether programs actually improve employment and earnings.
CTE was intended to create clearer routes into work. The evidence shows positive effects on several high school outcomes, limited and uneven evidence on postsecondary and earnings outcomes, and large gaps in what has been rigorously evaluated.
Many CTE programs are well intentioned and well funded, yet weakly connected to labor demand. Program offerings frequently lag local employer needs. Credentials are not always portable across firms or regions. Accountability focuses on compliance and participation rather than job placement and earnings. Students complete programs without clear signals about whether those credentials will translate into work. Employers remain skeptical of what certificates represent.
These outcomes are not accidental. They reflect governance. The Department of Education was designed to administer grants, regulate institutions and ensure access. These are necessary functions, but they are not sufficient for building labor market pathways. Education agencies are not structured to continuously track employer demand, validate occupational skill standards or adjust programs based on employment outcomes.
By contrast, the Department of Labor already operates systems that define success in labor market terms, including placement, earnings and retention under the Workforce Innovation and Opportunity Act.
Shifting CTE administration toward Labor aligns authority with objective. To ensure that this move is not just symbolic, policy should be governed by institutions that measure and manage those outcomes. The lesson for CTE is not ideological. It is operational. Here are three design choices.
First, employer leadership must be real, not advisory in name only. Employers should hold decision-making authority over occupational standards, credential validation and program relevance, with transparent governance and conflict-of-interest rules. Without employer control of and input into the curriculum, pathways drift toward provider convenience.
Second, funding must be tied to outcomes that matter. Completion alone is insufficient. Programs should be evaluated on job placement, earnings, retention and progression, adjusted for local labor markets. Chronic underperformance should lead to canceling or revising programs.
Third, the system must allow for competition among multiple providers. Community colleges, employer consortia, nonprofits, high schools and high-quality private providers should operate on equal footing, even if they pursue those goals differently.
Of course, pathway rules should be periodically reviewed and reauthorized, and the Labor Department is well suited to conduct that review. Labor markets change faster than education systems. Sunset provisions force adaptation and prevent regulatory accumulation that freezes outdated models in place.
Critics often argue that tighter alignment with labor markets narrows education and reduces flexibility. The evidence suggests the opposite. The current system narrows options by steering students toward debt-financed pathways with uncertain payoffs while offering limited transparency about outcomes. Clear labor market signals expand choice by allowing students to compare pathways based on real consequences rather than marketing or tradition.
A well-designed pathway system does not lock individuals into a single occupation. It creates stackable credentials, portable skills and bridges to further education. It treats employment not as the endpoint of learning but as a core component of it.
The research on education governance offers a cautionary lesson: Incentives matter. Systems respond to what is measured and rewarded. When accountability emphasizes inputs and compliance, organizations optimize for those metrics, even when outcomes suffer.
The federal transition to Labor creates a rare policy opening. It acknowledges that education policy cannot substitute for labor market policy when the objective is work. Whether that acknowledgment leads to better outcomes depends on follow through. Structure matters. Incentives matter. Governance matters.
If CTE continues to be governed as education with different labels, results will not change. If it is governed as labor market infrastructure, it can finally function as intended.
Arguing for AI in the Classroom
This article originally appeared in Education Next.
Standardized testing plays an important role in American education. National and state tests make it possible to compare academic achievement across schools, identify gaps, and track progress over time. However, many of the skills that families and employers say they want students to have, especially soft skills, decision-making, intellectual curiosity, and grit, are hard to capture in a short, fixed-response format.
One practical way to teach and assess those skills is structured debate. Debate guides students to make a claim, support it with evidence, and engage seriously with counterarguments, while also providing an element of entertainment that drives engagement.
Because debate takes a lot of work to execute, applying it at scale is difficult. Organizations like the National Association for Urban Debate Leagues help expand opportunities, but many students are not exposed to debate or opt not to participate in it as an extracurricular activity. Enter artificial intelligence. AI has the potential to facilitate debate-centered instruction, mitigate the high costs of participation, and improve students’ learning experiences during the school day. Testing remains essential to learning and assessment, but AI could become a complementary tool for cultivating reasoning and communication skills in the classroom.
The Need for a New Learning Model
In many classrooms, student assessment still centers on questions that can be graded quickly and consistently. That structure is understandable but can leave little room for activities that require students to explain their thinking in full sentences and revise their views in light of new information. Debate can help fill that gap. It has long been a powerful way to build cognitive flexibility and communication skills. Yet many students engage in debate only as an extracurricular option rather than a standard component of learning.
International organizations, such as UNESCO with its 2030 framework, have urged schools to invest in honing skills like critical thinking and collaboration because students will face complex social and economic challenges as adults. But access to structured argumentation remains uneven. Schools with more resources are more likely to offer debate programs, while students in rural or low-income communities often have fewer opportunities for sustained practice in reasoned discourse.
While debate-centered instruction has traditionally been constrained by the cost of scaling across students, augmenting the pedagogy with AI would allow teachers to run structured debates more often and use the resulting student work as one input into a broader picture of learning. AI can support the teacher before class, during discussion, and after class. The key design choice is that the teacher sets the goals, the rules, and the norms, while AI handles some of the repetitive preparation and summarization work.
Here are four ways AI can make debate-based instruction more feasible, along with reasons to be cautious about how it is used:
1. A practice partner, not a substitute for peers. An AI tool can pose questions to students, ask them to clarify a claim, and prompt them to consider an alternative explanation. Used well, AI can help students arrive at class better prepared for a face-to-face discussion. Used poorly, it can crowd out the social learning that comes from engaging with other students. Schools should treat AI as a tool for preparation and evaluation, not the main event.
2. Step-by-step support for different skill levels. Teachers often face a wide range of confidence and skill in one classroom. AI can respond to that variation by generating custom sentence starters, examples of evidence, and progressively harder follow-up questions. Personalization makes it easier for hesitant students to participate while still challenging advanced students to refine their reasoning.
3. Better exposure to counterarguments. Students learn more when they must respond to thoughtful objections. AI can help a teacher quickly assemble a balanced set of counterpoints that includes perspectives students might not encounter in their immediate community. Because AI systems can generate errors or misleading claims, teachers should anchor debates in vetted materials and make source checking part of the exercise.
4. Feedback teachers can use, with clear limits. AI tools can summarize a debate, flag where evidence is missing, and highlight patterns in student reasoning that a teacher might miss in real time. That can save time and help target instruction. AI should not be treated as an authoritative grader. Teachers remain the final arbiters of student performance, and schools should be transparent about what student data is collected and how it is used.
How AI Can Reduce Screen Dependence
One concern about classroom technology is that it can increase students’ time on screens and shift their attention away from discussion. Debate-centered instruction offers a useful counterweight to that concern because debate is, at its core, speaking, listening, and reasoning in person. Schools can also design AI-supported debate in a way that keeps devices closed during class.
In a limited-screen version of augmented debate-centered instruction, the teacher uses AI before class to draft a debate prompt, assign roles, and prepare a short packet of readings or evidence. Students annotate those materials on paper and engage in preliminary debate in small groups or as a whole class. After class, the teacher can use AI to organize students’ notes, summarize what claims they made, and identify common gaps to address in the next lesson. Although debate would be facilitated through an AI-supported platform, and the resulting exercise would occur with the help of software and hardware, the final result would be more social interaction and less dependence on technology.
Consider a middle school civics lesson on whether cities should restrict short-term rentals. Students receive a reference sheet of evidence and then open an AI-supported debate platform that assigns roles and keeps time. As students speak, the platform provides quiet, student-specific prompts on screen such as “State your claim in one sentence,” “Cite one piece of evidence from the sheet,” or “Answer the strongest objection you just heard.” Students who need more structure see more prompts. Students who are better prepared see fewer. The teacher circulates and listens. After the debate, the platform generates a brief, student-level report aligned to a simple rubric: clarity of claim, use of evidence, engagement with counterarguments, and civility as defined by class-specific norms. The teacher reviews it, makes judgments where needed, and uses the results to plan a short follow-up writing task that asks students to revise their position based on the best opposing argument they encountered.
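To make the report concrete, here is a hypothetical sketch of the student-level record such a platform might produce. The rubric categories mirror the ones named above; the class, field names, and scoring scale are illustrative assumptions, not any real product’s format.

```python
# Hypothetical sketch of the student-level rubric report described above.
# Categories mirror the text; the 1-4 scale and flagging rule are assumptions.
from dataclasses import dataclass

@dataclass
class DebateReport:
    student: str
    clarity_of_claim: int            # 1-4 rubric scale (assumed)
    use_of_evidence: int
    counterargument_engagement: int
    civility: int                    # judged against class-defined norms

    def needs_followup(self) -> bool:
        """Flag low scores in any category for the teacher's review."""
        return min(self.clarity_of_claim, self.use_of_evidence,
                   self.counterargument_engagement, self.civility) <= 2

report = DebateReport("Student A", clarity_of_claim=3, use_of_evidence=2,
                      counterargument_engagement=4, civility=4)
print(report.needs_followup())  # True: evidence use suggests a follow-up task
```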
What Debate-Centered Learning Builds
Debate pushes students beyond passive recall and toward higher-order thinking. When students argue a position, they practice organizing ideas, choosing relevant evidence, and anticipating challenges. Over time, debate can improve skill transfer because students develop a general method for weighing evidence and revising stances rather than memorizing a single answer.
Debate also strengthens communication. Students must listen closely, speak clearly, and respond under time pressure. The value of this skill-building is supported by longitudinal research. A ten-year study of the Chicago Urban Debate League found that students who participated in policy debate were more likely to graduate high school and scored higher on standardized reading and writing assessments than non-debaters. These gains persisted after controlling for prior achievement and socioeconomic background, suggesting debate had a unique impact on academic skills. Participants also reported greater resilience and academic self-efficacy, consistent with the idea that regularly defending and revising arguments builds perseverance.
Under transparent rules and clear norms, debate can also develop social and civic engagement. Students learn to separate a person from an idea, critique claims respectfully, and treat disagreement as a normal part of learning. In polarized environments, practicing structured disagreement can help students engage contrary viewpoints with curiosity rather than contempt.
Expanding Access to Debate
A major appeal of AI-supported debate is that it can affordably offer argumentation practice to more students. Historically, rigorous debate training has been concentrated in selective programs or well-resourced schools. If AI reduces preparation time, more teachers can incorporate short debates into existing units in English language arts, history, science, and civics.
Of course, implementing AI-supported debate is not automatic. Schools need training, clear formats, and strong classroom norms. But the potential is real in three areas:
1. Wider reach. Teachers in rural and urban schools alike can use AI to generate debate prompts, role cards, and evidence packets that match their curriculum and context, even when local coaching expertise is limited.
2. Flexible formats. Debates can be short and frequent—such as five-minute claim-and-rebuttal exercises—or longer, structured exchanges that span an entire class period. They can also be oral, written, or mixed, depending on student age and classroom goals.
3. Bridging opportunity gaps. Lower-income schools often face staffing shortages, fewer enrichment opportunities, and limited resources for extracurricular activities. If teachers can integrate argumentation into regular instruction at modest added cost, more students can build the skills that debate programs have long delivered to a smaller subset of students.
Debate-based instruction is not new, but as a means of equipping and evaluating students it can complement the tradition of standardized testing. Tests remain useful for assessing what students know and can do. The question is how education systems can strengthen the parts of learning that are hardest to measure at scale, like students’ ability to reason through contested questions and communicate their views clearly.
AI-supported debate is one promising tool. When designed carefully, it can help teachers run structured discussions more often, give students more practice responding to counterarguments, and reduce in-class screen time by shifting AI work to preparation and reflection. Given the ubiquity of AI, including in schools, the question is no longer whether to use it but how it can augment the learning experience. Debate provides a natural use case and has the added benefit of improving the way instructors assess competency over an array of metrics.
The most sensible path is to pilot this method, measure its impact on student writing and reasoning, and adopt clear guardrails on accuracy, privacy, and the role of teacher judgment.
Hybrid workers are putting in 90 fewer minutes of work on Fridays – and an overall shift toward custom schedules could be undercutting collaboration
This article originally appeared in The Conversation, Fortune, Fast Company, and more.
Do your office, inbox and calendar feel like a ghost town on Friday afternoons? You’re not alone.
I’m a labor economist who studies how technology and organizational change affect productivity and well-being. In a study published in an August 2025 working paper, I found that the way people allocate their time to work has changed profoundly since the COVID-19 pandemic began.
For example, among professionals in occupations that can be done remotely, 35% to 40% worked remotely on Thursdays and Fridays in 2024, compared with only 15% in 2019. On Mondays, Tuesdays and Wednesdays, nearly 30% worked remotely, versus 10% to 15% five years earlier.
And white-collar employees have also become more likely to log off from work early on Fridays. They’re starting the weekend sooner than they did before the pandemic, whether they work at an office or remotely. Why is that happening? I suspect that remote work has blurred the boundary between the workweek and the weekend – especially when employees aren’t working at the office.
The changing rhythm of work
The American Time Use Survey, which the U.S. Labor Department’s Bureau of Labor Statistics conducts annually, asks thousands of Americans to recount how they spent the previous day, minute by minute. It tracks how long they spend working, commuting, doing housework and caregiving.
Because these diaries cover both weekdays and weekends, and include information about whether respondents could work remotely, this survey offers the most detailed picture available of how the rhythms of work and life are changing. This data also allows me to see where people conduct each activity, making it possible to estimate the share of time American professionals spend working from home.
When I examined how the typical workday changed between 2019 and 2024, I saw dramatic shifts in where, when and how people worked throughout that period.
Millions of professionals who had never worked remotely suddenly did so full time at the height of the pandemic. Hybrid arrangements have since become common; many employees spend two or three days a week at home and the rest in the office.
I found another change: From 2019 to 2024, the average number of minutes worked on Fridays fell by about 90 minutes in jobs that can be done from home. That change accounts for other factors, such as a professional’s age, education and occupation.
The decline for employees with jobs that are harder to do remotely was much smaller.
Even if you just look at the raw data, U.S. employees with the potential to work remotely were working about 7½ hours per weekday on average in 2024, down about 13 minutes from 2019. These averages mask substantial variation between those with jobs that can more easily be done remotely and those who must report to the office most of the time.
For example, workers in the more remote-intensive jobs spent 7 hours, 6 minutes working on Fridays in 2024, down from 8 hours, 24 minutes in 2019.
In the raw data, then, these Americans were working 78 fewer minutes on Fridays in 2024 than five years earlier. Controlling for other factors, such as demographics, widens the gap to 90 minutes for employees who can do their jobs remotely.
In contrast, those employees were working longer hours on Wednesdays. They worked 8 hours, 24 minutes on Wednesdays in 2024, half an hour more than the 7 hours, 54 minutes logged on that day of the week in 2019. Clearly, employees are shifting some of their Friday hours to other weekdays.
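The raw gaps are easy to verify by converting the reported times to minutes, as in the small check below (all figures are the ones quoted above).

```python
# Converting the reported hours-and-minutes figures to check the raw gaps.
friday_2019 = 8 * 60 + 24     # 504 minutes
friday_2024 = 7 * 60 + 6      # 426 minutes
wednesday_2019 = 7 * 60 + 54  # 474 minutes
wednesday_2024 = 8 * 60 + 24  # 504 minutes

print(friday_2019 - friday_2024)        # 78 fewer Friday minutes
print(wednesday_2024 - wednesday_2019)  # 30 additional Wednesday minutes
```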
Fridays have long been a little different
Although employees are shifting some of this skipped work time to other days of the week, most of the reduction – whether at the office or at home – has gone to leisure.
To be sure, Fridays have always been a little different from other weekdays. Many bosses allowed their staff to dress more casually on Fridays and permitted people to depart early, long before the pandemic began. But the ability to work remotely has evidently amplified that tendency.
This informal easing into the weekend, once confined to office norms, can be a morale booster. But as it has expanded, it’s become more individualized through remote and hybrid arrangements.
Workers in remote-intensive occupations who are single, young or male reduced their working hours the most relative to 2019, although their time on the job rebounded a bit in 2024.
The benefits and limits of flexibility
There are a few causal studies on the effects of remote work on productivity and well-being in the workplace, including some in which I participated. A general takeaway is that people tend to spend less time collaborating and more time on independent tasks when they work remotely.
That’s fine for some professions, but in roles that depend on frequent coordination, that pattern can complicate communication or weaken team cohesion. Colocation – being physically present with your colleagues – does matter for some types of tasks.
But even if productivity doesn’t necessarily suffer, every hour of unscheduled, independent work can be an hour not spent in coordinated effort with colleagues. That means what happens when people clock out or log off early on a Friday – whether at home or at their office – depends on the nature of their work.
In occupations that require continuous handoffs – such as journalism, health care or customer service – staggered schedules can actually improve efficiency by spreading coverage across more hours in the day.
But for employees in project-based or collaborative roles that depend on overlapping hours for brainstorming, review or decision-making, uneven schedules can create friction. When colleagues are rarely online at the same time, small delays can compound and slow collective progress.
The problem arises when flexible work becomes so individualized that it erodes shared rhythms altogether. The time-use data I analyzed suggests that remote-capable employees now spread their work more unevenly across the week, with less overlap in real time.
Eventually, that can make it harder to sustain the informal interactions and team cohesion that once happened organically when everyone left the office together at the end of the week. As some of my other research has shown, that also can reduce job satisfaction and increase turnover in jobs requiring greater coordination.
The future of work
To be sure, allowing employees to do remote work and have some scheduling flexibility on any day of the week isn’t necessarily bad for business.
The benefits – in terms of work-life balance, autonomy, recruitment and reducing turnover – can be very real.
Flexible and remote arrangements expand the pool of potential applicants by freeing employers from strict geographic limits. A company based in Chicago can now hire a software engineer in Boise or a designer in Atlanta without requiring relocation.
This wider reach increases the supply of qualified candidates. It can – particularly in jobs requiring more coordination – also improve retention by allowing employees to adjust their work schedules around family or personal needs rather than having to choose between relocating and quitting.
What’s more, many women who might have had to exit the labor force altogether when they became parents have been able to remain employed, at least on a part-time basis.
But in my view, the erosion of Fridays may go beyond what began as an informal tradition – leaving the office early before the weekend begins. It is part of a broader shift toward individualized schedules that expand autonomy but reduce shared time for coordination.
Stablecoins strengthen the dollar and empower the developing world
This article was originally published in Cointelegraph.
Stablecoins received a real boost when US President Donald Trump signed the GENIUS Act earlier this year — and now European banks are trying to get into the act by issuing stablecoins of their own.
Their envy of the US dollar’s supremacy, a long-standing pillar of American economic strength, is understandable. In the wake of the GENIUS Act, dollar-backed, privately issued stablecoins are surging in popularity, presenting a strategic opportunity for the United States.
By creating an environment that enables stablecoins and operating under the umbrella of US banking infrastructure, the US can reinforce the dollar’s global dominance while democratizing access to finance abroad, particularly in developing countries.
These “digital dollars” have numerous benefits. They can cut fees, shorten settlement cycles, counter local inflation and widen access to trade and finance for smaller companies that struggle with correspondent banking.
The stablecoin surge
Stablecoins have surged, with total market capitalization exceeding $265 billion. Nearly all of that value rides on dollars. Safe assets back each dollar stablecoin, so stablecoin issuers must hold large reserves of US dollars and Treasury bills. Stablecoin reserve demand shifts Treasury bill ownership from bank deposits and money market funds to issuers; the larger ripple effects would arise if this infrastructure facilitates more commerce.
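A stylized sketch of that full-reserve mechanic may help: every token minted is matched by a dollar of reserves, and redemptions burn tokens against those reserves. The class below is a simplified illustration, not a description of any actual issuer’s system.

```python
# Stylized full-reserve stablecoin ledger: tokens are minted 1:1 against
# dollar reserves (cash and Treasury bills) and burned on redemption.
# A simplified illustration, not any real issuer's implementation.

class StablecoinIssuer:
    def __init__(self) -> None:
        self.reserves_usd = 0.0        # dollars and T-bills held in reserve
        self.tokens_outstanding = 0.0

    def mint(self, dollars_in: float) -> float:
        """Accept dollars into reserves and issue tokens one-for-one."""
        self.reserves_usd += dollars_in
        self.tokens_outstanding += dollars_in
        return dollars_in

    def redeem(self, tokens_in: float) -> float:
        """Burn tokens and pay out reserves one-for-one."""
        assert tokens_in <= self.tokens_outstanding
        self.tokens_outstanding -= tokens_in
        self.reserves_usd -= tokens_in
        return tokens_in

issuer = StablecoinIssuer()
issuer.mint(1_000_000)  # $1M in, 1M tokens out
assert issuer.reserves_usd == issuer.tokens_outstanding  # fully backed
```

The reserve side of this ledger is why stablecoin growth translates into demand for dollars and Treasury bills.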
Federal Reserve Governor Christopher Waller noted that if regulators “allow these things to go out, this will only strengthen the dollar as a reserve currency,” since greater stablecoin use means higher demand for dollars and US debt. Treasury Secretary Scott Bessent has been even more blunt: “We are going to keep the US [dollar] the dominant reserve currency in the world, and we will use stablecoins to do that.”
Stablecoins and the developing world
For developing countries, integrating with the dollar via stablecoins can unlock sorely needed economic activity. Many of these nations suffer from volatile currencies, high inflation and patchy banking systems. Their citizens often seek refuge in dollars — a phenomenon economists call “dollarization” — but until now, that meant physical cash or costly wire transfers.
Stablecoins change the game by making dollars accessible to anyone with a cell phone. Instead of waiting at a bank and paying high exchange fees, a farmer or shopkeeper can instantly hold digital dollars in a smartphone wallet. Stablecoins are making the world’s most in-demand asset – the US dollar – available on demand, globally.
This has profound implications for financial inclusion. Approximately 1.4 billion adults worldwide remain unbanked, with a substantial proportion residing in Africa and Asia. Stablecoins enable users to save in a stable currency and transact globally without a bank account, thereby bypassing traditional barriers such as ID checks and branch access.
Financial inclusion through stablecoins
In Sub-Saharan Africa, for instance, dollar stablecoins have become a vital tool for payments, savings and commerce amid currency instability. Over 40% of all cryptocurrency transaction volume in Africa is now in stablecoins. Users are even willing to pay a premium for stablecoins; businesses and individuals in emerging markets sometimes pay 5% or more above face value just to obtain digital dollars, which demonstrates their desperate need for a reliable store of value.
Crucially, stablecoins also facilitate commerce. Consider the example of remittances — the lifeblood of many developing economies. Africans abroad sent home $54 billion in remittances in 2023, but traditional channels charge senders an average of nearly 8% in fees. Stablecoins can slash these costs.
In one Kenyan pilot, using stablecoins for cross-border micropayments reduced fees from 28.8% to just 2%, allowing gig workers to keep more of their earnings. Global consultants estimate that over $12 billion a year could be saved in remittance fees if stablecoins replaced wire transfers — money that goes straight into local households and consumption.
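A rough calculation using only the figures above shows the scale for Africa by itself; the $12 billion estimate applies to remittance corridors worldwide, and extrapolating the Kenyan pilot’s fee continent-wide is an assumption made only for illustration.

```python
# Rough arithmetic from the remittance figures cited above.
remittances_africa_2023 = 54e9  # dollars sent home by Africans abroad in 2023
traditional_fee = 0.08          # ~8% average fee via traditional channels
stablecoin_fee = 0.02           # ~2% fee in the Kenyan pilot (assumed to generalize)

savings = remittances_africa_2023 * (traditional_fee - stablecoin_fee)
print(f"${savings / 1e9:.1f}B per year for Africa alone")  # ~$3.2B
# The $12B global estimate covers all corridors, not just African ones.
```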
Where local banks perceive too much risk or too little profit to lend, stablecoin-based financing and decentralized finance can help fill the credit gap, playing a vital role in facilitating entrepreneurship and growth for African small and medium-sized enterprises.
Stablecoins and their superpowers
Wider adoption of stablecoins in developing countries could also counter the influence of players like China, which has spent years extending loans to poorer nations under onerous terms. As part of the Belt and Road Initiative, Beijing’s overseas lending has left dozens of countries saddled with debts they struggle to repay. In extreme cases, defaulting nations have had to relinquish strategic assets, such as ports and power plants, to Chinese control.
This “debt-trap diplomacy” thrives when nations lack alternative financing options.
By embracing dollar stablecoins and digital finance more broadly, developing countries can raise capital in new ways and unshackle themselves from such predatory arrangements.
Another promising path is tokenizing sovereign debt. Rather than relying exclusively on large foreign creditors, governments can issue bonds in smaller denominations on blockchain platforms, making it easier for local citizens and diaspora investors to participate.
Governments from Kenya to Brazil are already exploring tokenized bonds and Treasury bills that can be purchased and traded via digital wallets. Such decentralized fundraising could help countries refinance or buy back expensive foreign loans — effectively crowd-funding their way out of China’s shadow. Every dollar raised from a diaspora bond or global crypto investor is a dollar that doesn’t have to be borrowed from Beijing on tough terms.
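To see why denomination matters, consider a hypothetical issue. Every figure below is illustrative, not drawn from any actual offering.

```python
# Hypothetical illustration of how tokenization widens the investor base.
issue_size = 200_000_000   # $200M sovereign bond issue (hypothetical)
traditional_lot = 100_000  # typical institutional minimum lot (assumed)
token_denomination = 50    # per-token face value on-chain (assumed)

print(f"{issue_size // traditional_lot:,} institutional-sized claims")    # 2,000
print(f"{issue_size // token_denomination:,} retail-sized token claims")  # 4,000,000
```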
CBDCs in the corner
Central banks have also spotted these opportunities. Dozens of central banks are developing central bank digital currencies (CBDCs) as state-controlled alternatives to private stablecoins. Proponents argue that a government-issued digital currency can increase financial inclusion and modernize payments, but the early evidence is underwhelming.
Nigeria’s eNaira, one of the first retail CBDCs, has flopped – 98% of Nigerians who opened eNaira wallets stopped using them by the end of 2023. Meanwhile, Nigerians continue to flock to dollar-backed stablecoins as a hedge against the plunging naira. This story repeats elsewhere: Enthusiasm for CBDCs often comes from the top down, while stablecoins gain adoption from the bottom up by meeting real user needs. Even China has had limited success getting other countries to use its digital yuan, especially when dollar stablecoins already have a considerable head start globally.
Academic research suggests that when central bankers promote CBDC plans, stablecoin activity drops — evidence that rhetoric alone can siphon momentum from the private sector. That might please officials wary of competition, but it can deprive consumers of better services.
Moreover, research comparing countries that have adopted CBDCs with those that have not, both before and after adoption, finds no effects on macroeconomic outcomes, such as GDP per capita or inflation, and adverse effects on financial well-being. In short, CBDCs have yet to deliver breakthrough improvements in financial access or efficiency, whereas stablecoins are already doing so.
Encouraging developing countries to use dollar-backed stablecoins is a win-win proposition, with digital dollars playing a role much like the one printed dollars assumed after gold’s supremacy ended. For the US, it means expanding the influence of the dollar, reinforcing its reserve currency status in the digital era and countering rivals who seek to promote alternative spheres of monetary control.
For developing nations, it means greater access to a stable currency, new pathways for investment, lower transaction costs, and escape hatches from heavy-handed creditors. In an increasingly tense geoeconomic landscape, digital dollars could become a linchpin of a more democratic and resilient global financial system.
The United States is embracing this opportunity: By championing dollar stablecoins and the open financial networks they run on, America can help unlock growth in emerging economies while buttressing its own economic might.
In the contest for hearts, minds and wallets around the world, a little stable currency could go a long way.
The Flexibility Trap: Remote Work’s Hidden Toll on Young Adults
This article was originally published in the Institute for Family Studies.
The rise of remote and hybrid work has reshaped the workweek, especially for younger professionals, according to a recent working paper of mine. Using the American Time Use Survey, I found that Fridays have effectively become the new weekend for remote-capable workers, especially those without children.
From 2019 to 2024, average minutes worked on Fridays fell by about 90 minutes in jobs that can be done from home, whereas on-site jobs saw little change. This pronounced “Friday effect” is largest among younger workers and those without children. One analysis finds that childless employees cut nearly two hours off their Friday work time (from ~542 minutes in 2019 to 427 in 2024), compared to only a half-hour drop for workers with kids. In other words, many 20- and 30-somethings with remote setups are seizing the chance to log off early, especially at week’s end, enjoying a taste of the weekend before it even starts.
The newfound flexibility goes beyond Fridays. Low-coordination, remote-intensive jobs (roles where tasks do not require constant real-time teamwork) have seen a broad decline in hours worked per day. By 2022–2023, employees in such positions were working 65–92 fewer minutes per day than in 2019, allocating a comparable 65–92 minutes more to leisure instead. This suggests that many young remote workers are using the time saved from commuting or the slack of lighter supervision to relax or attend to personal activities. Prior work of mine has also shown that most of the cutback in work time has flowed straight into leisure (with only a small portion diverted to extra household chores). In effect, the short-term benefit of remote work for these individuals is clear: more free time and autonomy in managing their day.
Less Work, More Leisure — But More Time Alone
What are young adults doing with their newfound “free” time? A closer look reveals a concerning pattern: the extra leisure tends to be solitary. Among remote-heavy workers ages 30-33, the share of leisure time spent with others has declined from 70% in 2019 to 55%. In contrast, those in jobs requiring in-person presence saw no significant change, and they still spend the bulk of their leisure hours with friends or family.
Why might this be happening? One reason is asynchronous schedules. A 28-year-old who signs off early Friday afternoon may find that many friends or family are still working or are geographically scattered. The once-common ritual of coworkers grabbing happy-hour drinks or young professionals unwinding together at week’s end has become rarer for those fully remote. In its place might be a Netflix session, solo gym visit, or just additional screen time at home. What at first sounds like work–life balance—trading an hour of work for an hour of “me time”—can morph into social isolation if that leisure time isn’t shared with others.
Isolation and Young Adult Well-Being
Emerging data on the well-being of young adults raise red flags about this isolation. Even before remote work became widespread, researchers noted a decline in the mental and social health of Millennials and Gen Z. The expansive Global Flourishing Study finds that in many countries—including the United States—mental health is a significant “flourishing deficit” for young adults. In the U.S., for instance, the average self-reported mental health score is only about 5.7 (out of 10) among 18–29-year-olds, versus 8.1 for those in their 60s. In 2025, around 83% of young adults said they had experienced feelings of depression in the past two weeks, a rate nearly two-and-a-half times that of senior citizens. Similarly, about 34% of young adults report feeling lonely frequently, far higher than older groups.
Gallup data likewise show that globally 20% of employees feel lonely, with younger and fully remote workers feeling it most. In fact, Gallup’s workplace research calls Gen Z the “loneliest generation”: Gen Z employees are almost three times as likely as Baby Boomers to say they experienced “a lot” of loneliness the previous day. This loneliness has a direct connection to remote work preferences. Notably, Gen Z is the least interested in fully remote roles, with many young workers actively craving more in-person interaction on the job to combat isolation.
All of this suggests that social connection is not a luxury but a lifeline for flourishing in early adulthood. Psychologists have long known that strong relationships are critical for mental health and life satisfaction. The Global Flourishing data further reinforce this: those in romantic partnerships score significantly higher on well-being than those who are single, largely thanks to the support and sense of belonging that close relationships provide. Conversely, when young people feel adrift or alone, their overall life satisfaction and purpose tend to falter. If remote work arrangements inadvertently lead young adults to spend more time alone, they may amplify exactly those conditions—isolation and disconnection—that feed into poorer mental health.
The consequences of prolonged social isolation in one’s 20s and 30s extend beyond immediate mental health. These years are a formative period for building the foundations of adult life—careers, skills, friendships, and families. A fully remote or highly isolated work style can subtly undermine these developmental milestones in several ways:
Loss of “Social Scaffolding” at Work. In traditional offices, young employees benefit from informal interactions that help them grow. Think of chatting with a senior colleague who offers career advice, or observing how seasoned professionals handle challenges. In a remote setting, much of this osmosis is lost. Unlike older workers who already have established networks and confidence, newcomers depend on guidance and encouragement from managers and mentors. Over time, this could slow their career development and even sap their engagement.
Fewer Pathways to Meet Partners. Young adulthood is also when people tend to form long-term romantic relationships. Historically, the workplace has been one common venue to meet a significant other. In the 1980s, nearly 1 in 5 couples in the U.S. met through work. Today, that figure is just around 10%, a decline driven partly by the rise of online dating but likely to be exacerbated by remote work. Fewer days in the office mean fewer casual conversations that spark a friendship or more. Of course, meeting a partner is increasingly moving to dating apps and social media, but those digital avenues do not fully replace the trust and context that can come from getting to know someone in person over time.
Weakening Community Ties. Beyond the workplace, a fully remote lifestyle can encourage geographical drifting. On one hand, remote jobs enable young adults to move anywhere—often away from hometowns or high-cost cities— which can be positive for affordability. On the other hand, this freedom can also mean that young workers end up living in new locales where they have no built-in community. They might find themselves working from a small apartment in a city where they do not know anyone, or bouncing to a new location every year. The decline in daily in-person interaction may not immediately register as a problem—until one day, the remote worker looks up and realizes he hasn't felt truly known or supported by a community in a long time.
Counting the Cost of Flexibility
None of this is to suggest that we should revert to the 9-to-5, five-days-in-office grind of old. Remote and hybrid work offer genuine benefits: greater flexibility for family needs, less time wasted on commuting, the ability to hire and work from anywhere, and often higher productivity for focused tasks. For example, many parents, especially mothers, who would otherwise have to drop out of the labor force, are able to continue working, even if part-time. But as we embrace the convenience of remote work, we must also count its cost—not just in economic terms, but in human terms. The experiences of younger workers serve as an early warning. They are reporting record-high loneliness, declining mental health, and fewer close relationships.
How might today’s work-from-home norms reshape the social and emotional development of young adults in the long run? Will the 30-year-old of 2030 be more likely to struggle with anxiety and loneliness because they spent their formative work years isolated in a bedroom office? Could the trends we see now (later marriages, fewer friendships, weaker professional networks) accelerate under a regime of minimal in-person engagement? These are not just individual concerns, but societal ones. Fewer connections and lower flourishing among young adults can have ripple effects on community life, civic engagement, and family stability in the years ahead.
To avoid drifting into a future where flexibility comes at the expense of fulfillment, employers and families alike should take these trade-offs seriously, experimenting with initiatives like:
Establishing “in-office anchor days,” with mentoring sessions and team-building activities aimed especially at younger staff.
Crafting “team charters” that ensure hybrid work still includes synchronous collaboration and social interaction, thereby mitigating burnout.
Formalizing mentorship opportunities that pair junior and senior employees around specific projects and measurable goals.
Coordinating volunteer and civic engagement opportunities outside of the workplace to further build trust and camaraderie.
In the end, the measure of a life, or a career, well lived is not only the output we produce, but the relationships and meaning we cultivate along the way. Flexibility in work is a double-edged sword: it can optimize personal comfort and family life, yet also gradually erode the informal bonds and growth experiences that help young adults flourish. As we navigate this grand experiment in remote work, we would do well to remember that efficiency is not enrichment. Only by counting (and correcting for) the social costs of remote work can we ensure that the next generation thrives—not only at work but also in their personal lives.
Remote Staff Hours Fall, but Productivity Steady (For Now)
This article was originally published in Gallup News.
As remote work and hybrid work became mainstream in the wake of the pandemic, many leaders have asked two questions: Are remote workers really working? And what does that mean for productivity?
The answer is nuanced. Remote workers are spending less time working, but the relationship between remote work and productivity is more complex.
Remote Employees Are Working Less
A recent study based on data from the American Time Use Survey (ATUS) from 2019 to 2023 found that full-time employees in remote-capable jobs are spending less time on work and more on personal activities.
By 2022, people in heavily remote roles were working about an hour less per day than in 2019, on average. Of that time, they were redirecting 30 to 60 minutes to leisure, a trend consistent with broader remote work productivity statistics. This decrease goes beyond reduced commute time, with the drop in commuting time accounting for only a small fraction of the reduction in work hours.
Some groups reported working even less. In jobs open to telework, men, unmarried adults, and those without children showed the steepest declines in hours worked and the greatest gains in leisure time. For example, single men over 45 who work remotely clocked over two hours less per day on work activities in 2022 than in 2019. Women saw even larger declines in hours worked, although that is driven largely by those without a college degree.
These findings align with more recent Gallup studies on hours worked and emerging remote work trends in the broader workforce. In 2019, U.S. employees reported working an average of 44.1 hours per week. In 2024, they averaged 42.9 hours.
Productivity Benefit of Remote Work: Increased Talent Pool
Perhaps the greatest concern in boardrooms about reduced work hours is the potential hit to productivity. If employees are working 10% fewer hours, will output or innovation fall by 10%?
Not necessarily.
Using a model in which employees choose jobs based on their capabilities and preferences, the study finds a slight increase in output per worker in the economy. This growth did not result from employees in more remote-intensive jobs working more productively. Instead, people were better able to sort into roles that suited them, and employment shifted toward sectors with higher output per worker.
Organizations that break free from geographic constraints and hire the best-fit talent for each role, regardless of location, experience a boost in remote work productivity. While the ATUS does not capture information on the quality of managers, prior Gallup data show that managers play a vital role in how technological change affects employees. That is, increases in technology tend to have positive effects on workers, but those effects are greater when managers build trust in the workplace. The bar for managerial quality is likely even higher for those leading remote employees. Knowing how to manage a remote team is now a core leadership competency. It affects how clearly organizations set expectations, manage performance and build trust.
Although the model addresses sector-level and overall productivity, the benefits for individual organizations are more complex. They depend not only on the types of tasks being done and how suitable they are for remote work, but also on the makeup of the talent pool. Growing evidence shows that being near coworkers can have positive spillovers for productivity, and the effect of working in the same location on communication depends on the type of work and whether the interaction is between employees or between employees and managers.
These factors are especially important for younger or newer employees who may not yet have established routines or communication patterns within the organization. For them, working in the same location as their coworkers could provide more benefits.
Remote Work Increases Job Satisfaction — If the Boss Is Bad
Even if the productivity benefits of remote work are mixed, employers might offer remote flexibility as a perk to attract and retain quality employees. Gallup data on hybrid work suggest as much: 76% of hybrid workers say “improved work-life balance” is one of the “greatest benefits” of hybrid work. For many, the ability to work from home offers greater autonomy and flexibility. That is consistent with several randomized experiments assessing the effects on retention, including those by Nicholas Bloom and coauthors. But companies need to recognize how fully remote strategies can go awry by attracting people who are less likely to put in discretionary effort.
Company culture, however, has a stronger influence on employees’ feelings about their workplace than location. Another recent study found that workplace factors — such as feeling appreciated and receiving clear communication, among other workplace practices — explain most of the differences in job satisfaction and intent to leave.
Meanwhile, remote work is linked to job satisfaction in the raw data. But that link disappears when accounting for the previously mentioned workplace factors, except for one group: employees who “sometimes work from home.” In other words, workers value some flexibility, but culture and management matter more.
The study also shows that the benefits of remote work are not the same for everyone — they vary based on the type of work and the quality of the manager. This suggests that remote work can feel like a benefit when management falls short, but it does not raise performance on its own.
In these cases, fully remote work arrangements may help individuals make the best of a bad situation, but that is a workaround, not an organizational strategy.
The Best Hybrid Work Model Focuses on Culture and Fit
The future of hybrid work and remote work is already here. What matters now is how much flexibility to allow, how it works in practice and how organizations manage the risks. These choices rest with managers and organizational leadership.
Build a strong workplace culture first. Most variation in job satisfaction and intent to leave comes from how employees view their workplace practices, not from compensation. Hybrid work can support engagement, but it cannot replace sound management. High-quality management remains a competitive advantage.
Assess your workforce and how work gets done. Remote work fits some tasks better than others. Organizations need to understand the factors that influence successful client outcomes and how these are evolving with the economy and technology. Use both remote work and on-site collaboration in ways that elevate performance.
Use remote work to expand your talent options and improve role fit. Productivity increases when people are doing work that suits their talents and strengths. Remote and hybrid work arrangements create more ways to match employees to the right tasks under a clear talent strategy.
Bottom Line
Less time spent working does not automatically mean lower output. If anything, the shift to hybrid and remote models has helped many organizations make better use of each employee’s talents. But the declining trend in time allocated to work, particularly among remote-capable employees, and the deteriorating employee engagement trend Gallup has documented for years indicate a broader risk. With this risk in mind, leaders need to ensure remote work flexibility strengthens — not erodes — long-term engagement and performance.
States’ pushback against ESG finance contains key lessons for powering AI
This article was originally published in The Hill.
Artificial intelligence is not only disrupting labor markets and reshaping geopolitics, but also becoming one of the most energy-intensive technologies of our time. In fact, it is placing an already overloaded power grid under even more strain.
Recent estimates from the International Energy Agency project that global data centers will more than double their electricity consumption by 2026, potentially exceeding 1,000 terawatt-hours. This will be driven not just by the energy needed to operate and cool chips, but also by the vast computational demands of AI model training and deployment.
This surge in demand is occurring just as utilities across the U.S. are warning of capacity shortfalls. Worse, regulatory bottlenecks and climate targets over the last four years made it harder to bring new energy supply online. In short, we are not on track to meet the energy demands of an AI-driven economy.
What happens when national-scale ambitions conflict with local economic needs? One overlooked episode, centered on the state-level backlash against environmental, social and governance investing, known widely as ESG, offers perspective.
ESG investing aimed to encourage corporate responsibility, but it also introduced new constraints on energy financing and infrastructure development. When major financial institutions began limiting support for oil, gas, and other politically disfavored sectors, some U.S. states responded with legislation barring public contracts with those firms.
Texas led the charge in 2021, passing laws that effectively pushed several of the country’s largest municipal bond underwriters, such as JPMorgan and Citigroup, out of the state’s market. Critics warned this would raise borrowing costs for local governments. But in a new working paper, we examined the actual financial impact using comprehensive data on bond yields between 2017 and 2024. We found that even in large and complex deals, where underwriting relationships matter most, the exit of ESG-sensitive firms did not significantly affect pricing.
Texas’s policy led to no systematic increase in borrowing costs. We also found that the same was true in Oklahoma after it adopted a similar policy in 2023.
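To make the research design concrete, here is a minimal difference-in-differences sketch in Python of the kind of comparison the paper describes: Texas municipal bond yields before and after the 2021 law, relative to other states. The file name, column names, and cutoff year are hypothetical placeholders, not the paper's actual data or specification, which includes far richer controls.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel of municipal bond sales; file and column names are
# placeholders, not the paper's actual data.
bonds = pd.read_csv("muni_bond_sales_2017_2024.csv", parse_dates=["sale_date"])
bonds["treated"] = (bonds["state"] == "TX").astype(int)             # Texas adopted its law in 2021
bonds["post"] = (bonds["sale_date"].dt.year >= 2022).astype(int)    # approximate post-adoption period

# Difference-in-differences: the coefficient on treated:post captures any change
# in Texas yield spreads relative to other states after the law took effect.
# A point estimate near zero is the "no systematic increase" pattern described above.
fit = smf.ols("yield_spread ~ treated * post", data=bonds).fit(
    cov_type="cluster", cov_kwds={"groups": bonds["state"]}
)
print(fit.params["treated:post"])
```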
What explains this null result, which runs counter to what some ESG proponents predicted? Partly, it reflects long-term shifts in the structure of the municipal finance market. Underwriting spreads have declined over the last two decades. Competition has also intensified, and many states — especially those with zero income tax — retain robust investor demand.
But the deeper insight is this: When states push back against perceived overreach by firms or ratings agencies, markets often adjust. Other underwriters step in, investors recalibrate and life goes on.
This lesson matters because AI’s energy appetite is forcing a similar reckoning. National climate goals have prioritized decarbonization, often by sidelining traditional energy sources before viable alternatives are ready. Meanwhile, capital has flowed toward green technologies unable to cost-effectively scale — arguably at the expense of system reliability.
Training a single advanced AI model can consume several gigawatt-hours of electricity — roughly equivalent to powering hundreds of U.S. homes for a year. And inference costs (referring to the energy needed to run these models at scale) could dwarf training in the long run. Meeting this challenge will require not only new capacity, but also investments in energy efficiency and grid interoperability, so that power flows can dynamically match shifting demand across regions and time zones.
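A quick back-of-the-envelope check of that comparison, under two assumptions of mine rather than the article's: an average U.S. home uses roughly 10,500 kilowatt-hours of electricity per year (an approximate EIA figure), and "several" gigawatt-hours means three to five.

```python
# Back-of-the-envelope check on the training-energy comparison above.
KWH_PER_HOME_YEAR = 10_500  # assumed average annual U.S. household use, kWh

for training_gwh in (3, 5):
    kwh = training_gwh * 1_000_000  # 1 GWh = 1,000,000 kWh
    homes = kwh / KWH_PER_HOME_YEAR
    print(f"{training_gwh} GWh is about {homes:.0f} home-years of electricity")
# Output: 3 GWh is roughly 286 home-years and 5 GWh roughly 476,
# i.e., hundreds of homes powered for a year.
```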
The National Energy Dominance Council, established by President Trump in February, is a pragmatic step to bridge the gap between national ambition and on-the-ground implementation. By coordinating states and private stakeholders, the council is working to accelerate permitting timelines, identify high-impact transmission corridors, and streamline regulatory processes that often slow energy infrastructure.
Its convening power has also facilitated more transparent negotiations between utilities and major corporate energy buyers, encouraging market-driven investments in renewables while reinforcing grid reliability. Rather than impose top-down mandates, the council is helping align incentives and remove bottlenecks that have historically stalled progress.
In this environment, states cannot afford to be passive. They must act to secure their energy futures both to meet AI-driven demand and to maintain economic competitiveness. Some are already doing so. Arkansas, for example, has launched efforts to fast-track natural gas permitting. Georgia is investing heavily in nuclear and grid modernization. And Texas, despite the fallout from its 2021 winter storm, remains the largest generator of wind power while also expanding natural gas generation to stabilize supply.
Critics may worry that such moves conflict with broader climate goals. But most would agree that the alternative — a brittle grid, rolling blackouts, and politically opaque AI rationing — is worse. Policymakers at all levels should recognize that reliability and capacity are not optional in a digital economy. Nor is energy neutrality a viable long-term position.
If ESG finance taught us anything, it’s that states can and do push back when national trends threaten their core interests. The market consequences of these interventions are not always dire. Sometimes, they are neutral. Occasionally, they even improve outcomes by correcting for one-size-fits-all approaches that misalign incentives.
With the energy transition entering a new phase, and AI accelerating that shift, now is the time to reexamine the balance of power between state priorities and federal, or corporate, initiatives. We cannot afford to sleepwalk into an AI-powered blackout.
AI in the Classroom: It Needs More Than Guardrails—It Needs Purpose
This article was originally published by the Institute for Family Studies.
Recent debates over artificial intelligence in schools have understandably zeroed in on the risks. Educators and parents worry about biased algorithms, invasive data practices, and other harms from rushed AI adoption. These concerns are well-founded. Research shows that AI systems can introduce risks and harms that extend beyond bias and discrimination in education.
But treating AI merely as a threat to contain addresses only half the equation: AI is here to stay, and how we manage its risks and design it to augment human capabilities is what matters most. In this sense, the relevant comparison is not between AI and a perfect world, but between AI and the status quo—one that already includes harms and failures by fallible humans.
Missing from the current conversation is an affirmative vision for what we want AI in education to achieve. Yes, guardrails are needed, but so is a guiding star. AI in education is not just an external force to be fenced in; it is also a tool whose impact will ultimately reflect the values and goals we build into it. The question, then, is not only “How do we prevent harm?” but also “How do we design AI to actively promote the well-being and growth of students?”
The success of AI in schools will depend on whether its implementation remains human-centered. This shift in framing—from damage control to purposeful design—opens the door to a more constructive approach. Rather than aiming for the absence of negatives, we should set our sights on the presence of positives: safer, healthier, more enriching learning experiences.
In a recent working paper, we introduce a framework called “Flourishing by Design” that builds on the Global Flourishing Study (GFS). The GFS was led by Baylor University and the Harvard Flourishing Program, in partnership with Gallup, and included a longitudinal survey of over 200,000 people across 22 countries along six dimensions of human flourishing. Building on this vein of human development and well-being research, our framework can be applied to learner flourishing and the role of AI. We contend that technology ethics should go beyond box-checking and “ethics washing” and instead be embedded into the very fabric of product and policy development, tied directly to multi-dimensional outcomes that matter for students’ lives.
Put differently, when companies—especially in the tech sector—build products or services, they need to think about the end use and the impact on human flourishing from the start. If we had done that from the onset of the internet revolution, we would have set up property rights over data (instead of letting digital intermediaries extract and monetize our digital footprints) and created social media platforms that promote meaningful relationship building (instead of leading to hyper-personalization and “keeping up with the Joneses” phenomena).
One clear area where AI could support flourishing is by cultivating intellectual tenacity, which refers to the willingness to engage with difficult problems, resist premature closure, and revise one’s beliefs when faced with new evidence. Current educational models often reward speed, correctness, and compliance over thoughtful perseverance. AI systems, if intentionally designed, could help reverse this trend. For example, rather than steering students toward the fastest path to the right answer, an AI tutor could detect when a learner is struggling productively and offer prompts that encourage deeper inquiry: “Would you like to explore why this approach didn’t work?” or “Try explaining your reasoning out loud before we move on.” Over time, such personalized nudges—combined with reflection tools and feedback loops—could reinforce habits of intellectual resilience and broader cognitive skills.
A flourishing-by-design approach would require that educational AI tools be evaluated and optimized against these broader outcomes—not just narrow performance metrics. For example, does an AI homework helper improve a student’s understanding and self-confidence? Does an AI tutoring system enhance learning without diminishing curiosity or creativity? These questions elevate flourishing as a core design and accountability principle—rather than treating student well-being as an afterthought or a lucky side-effect.
To be sure, technology is the icing on the cake—it is not the cake itself. If the institutions that lay the foundation for our economy and society, especially family and faith-based organizations, were to deteriorate further, technology would not be a panacea. But it can be an amplifier.
Why propose a new framework when there are already so many, especially following the rise of corporate social responsibility (CSR), socially responsible investing (SRI), and more recently environmental, social and governance (ESG) frameworks? Because current approaches—from tech industry self-regulation to education-specific guidelines—have clear limitations.
While well-intentioned, past frameworks often devolve into check-the-box exercises. Nearly every major company now publishes ESG or “responsible AI” reports, yet tangible changes can be elusive and ratings agencies cannot even agree on what defines a credible ESG score. Compliance-driven frameworks tend to fixate on avoiding liabilities—ensuring an algorithm does not blatantly violate a law or embarrass the company—rather than on maximizing social benefit. They also often compartmentalize issues (privacy versus innovation, bias versus efficiency), instead of seeking solutions that advance multiple values. In education, for instance, debates often pose privacy and equity in opposition, implying a trade-off between protecting student data and using data to help at-risk learners.
But this trade-off can be overcome. For example, new privacy-preserving data practices, such as secure data-sharing via cryptographic techniques, allow schools and vendors to collaborate without exposing sensitive information. Although our paper spells out more detail—and further work is surely needed—the Flourishing by Design framework is abundantly transparent: Does an organization make people better off along the six dimensions of human flourishing measured in the GFS?
The conversation about AI in schools is at a crossroads. Up until now, much of it has oscillated between excitement over AI’s promise and alarm over its perils. What’s needed instead is a unifying vision that channels the innovation toward what truly matters. A flourishing-based model provides that north star. It does not dismiss the real warnings sounded by critics, but rather demands even greater accountability for long-term, human-centered, measurable outcomes. It also urges educators, developers, and regulators to move beyond a defensive crouch. The goal should not be to merely tame AI’s disruptions, but to shape AI in education in such a way that it helps produce healthier, wiser, more fulfilled students.
Meeting this task will require effort: new design methodologies, cross-disciplinary input, updated policy tools, and the Global Flourishing Study and associated Flourishing by Design framework are a start. If we succeed, the narrative of AI in education could shift from one of narrowly averted harms to one of empowering transformation—technology that not only respects the dignity of human beings but actively furthers their flourishing.
The GENIUS Act Ushers Stablecoins Into the Fold, and Banks Into New Competition
This article was originally published in Real Clear Markets.
Congress redefined the playing field for digital assets. Although the recent Senate win was on party lines, the GENIUS Act promises to bring stablecoins – blockchain-based tokens pegged to fiat currency – more squarely into the regulated financial system. This legislative framework is a constructive development for the crypto sector and the banking industry alike: it introduces long-awaited regulatory clarity around stablecoins, establishes clear licensing pathways for issuers, and sets ground rules that could integrate stablecoins into mainstream finance. In doing so, it also forces traditional banks to face a new reality: they will have to compete more directly for deposits in a world where digital dollars provide compelling alternatives.
For years, stablecoin issuers operated in a gray area – treated as money transmitters in some states, eyed warily by federal regulators, and with uncertainty about whether tokens like USDC or USDT might be deemed securities. The GENIUS Act seeks to end this ambiguity, creating the first comprehensive U.S. regulatory framework for stablecoins by defining them as “payment stablecoins” fully backed by safe assets and giving federal agencies like the Office of the Comptroller of the Currency (OCC) clear authority to oversee them. Just as important, the GENIUS Act provides an alternative to Europe’s heavy-handed approach of regulating digital assets and pushing central bank digital currencies, which displaces demand for stablecoins.
Under the Act, only approved issuers could offer stablecoins, whether as banks or as special non-bank entities that obtain a federal license. While there are quibbles about the technical details of how the licenses are given out, the Act carves out a legal path for stablecoin providers to operate under bank-like supervision without being banks. This oversight includes requirements like 1:1 reserve backing (in cash or Treasury bills), segregated reserves, monthly audits, and strict capital and liquidity rules. Such measures aim to bolster trust in stablecoins as a safe medium of exchange, much like deposits, but in digital bearer form.
Some critics may worry that the bifurcated licensing regime could open the door to regulatory arbitrage, though the Act attempts to mitigate this by applying strict reserve, audit, and disclosure requirements across both pathways. Crucially, the Act also specifies what counts as high-quality reserves. Stablecoin issuers would be required to hold only short-term U.S. Treasury bills or equivalent safe assets against their tokens. This not only safeguards the peg (each token truly backed by $1 in liquid assets), but also pulls stablecoins into the orbit of traditional finance. Regulated stablecoins could resemble money market funds or narrow banks with their circulating tokens functioning as a new form of dollar-denominated money, which could allow issuers to become major purchasers of Treasury bills. Circle’s USDC, for example, already keeps the bulk of its $60+ billion reserve in short-term U.S. debt.
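The 1:1 reserve test itself is simple enough to express in a few lines of code, which is part of its appeal. Here is a minimal sketch with made-up figures, not any issuer's actual balance sheet:

```python
# Minimal sketch of the 1:1 reserve test described above: every circulating
# token must be matched by a dollar of cash or short-term Treasurys.
# All figures are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ReserveReport:
    tokens_outstanding: float  # face value of circulating stablecoins, USD
    cash: float                # cash and cash equivalents, USD
    tbills: float              # short-term Treasury bills, USD

    def coverage_ratio(self) -> float:
        return (self.cash + self.tbills) / self.tokens_outstanding

    def is_fully_backed(self) -> bool:
        return self.coverage_ratio() >= 1.0

report = ReserveReport(tokens_outstanding=60e9, cash=5e9, tbills=56e9)
print(report.coverage_ratio(), report.is_fully_backed())  # ~1.017 True
```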
By legitimizing stablecoins, Congress is also pushing banks to become more competitive in how they operate and provide value to borrowers. Banks have long enjoyed an advantage: sticky deposits. Businesses and consumers park trillions in checking and savings accounts that pay little or no interest, providing banks with cheap funding to make loans. But stablecoins change the equation. A regulated stablecoin gives holders a digital dollar that is instantly transferable worldwide and fully backed by interest-earning assets – effectively combining the liquidity of a checking account with the yield of a T-bill investment. While some worry that deposit flight could constrain bank lending, especially for smaller institutions, the more likely scenario is a recalibration: banks may increasingly differentiate themselves through credit expertise and integrated digital services, rather than passive liquidity capture. The rise of dollar-denominated stablecoins could also advance broader U.S. objectives by accelerating de facto dollarization in some emerging markets, even as it complicates local monetary policy and financial stability.
The Treasury Borrowing Advisory Committee (TBAC) has taken notice. In its Q2 2025 report a few weeks ago, the committee noted that stablecoin issuers already hold more than $120 billion in U.S. Treasury bills, and that continued growth of stablecoins could generate up to $900 billion in additional T-bill demand in coming years. That represents a major new source of financing for the government, as well as a corresponding outflow of funds that might previously have sat in bank deposits. While the committee pointed out that surging stablecoin adoption will likely come “at the expense of bank deposits” – and banks are already seeing hints of this shift as tech-savvy customers explore digital dollars for uses ranging from cross-border remittances to parking spare cash – the long-run relationship need not be one of substitution.
Stablecoins need not cannibalize traditional banking; as my new paper points out, banks can respond by issuing digital twins of their own deposits. Several large banks are already exploring stablecoins that would settle on public or permissioned blockchains while remaining inside the bank’s regulatory perimeter. A tokenized deposit lets customers tap the speed and programmability of digital dollars for payroll, trade finance, or cross-border settlement, yet keeps the relationship – and the associated lending, advisory, and treasury services – anchored at the bank. The two can be complementary: stablecoins broaden the menu of money-like instruments while banks compete on service, trust, and credit expertise instead of relying on inert, zero-yield deposits alone.
Shifts in the mechanisms of financing are hardly new historically, but this is the first time technology enables private dollars to move globally and near-instantly with such ease. Rather than resist this shift, the GENIUS Act leans into it, aiming to harness the innovation while corralling the risks. If implemented, it could foster a more efficient and competitive financial system. Stablecoins, with appropriate oversight, can increase competition in deposit markets and improve the efficiency of money-like instruments – making transfers faster and cheaper, and potentially widening access to dollar-based savings in communities poorly served by banks. Banks must revisit how they attract and retain customers’ funds in an era when loyalty can be withdrawn with a few clicks to a digital wallet, and the GENIUS Act is a step toward establishing the rules of the game for digital assets.
Play the Long Game With Human-AI Collaboration
This article was originally published in Gallup News.
C-suite leaders are captivated by AI’s potential to increase productivity, from generating insights to automating routine tasks. But many studies suggest technology works best when it complements human effort, not when it replaces it.
Economic research over several decades confirms that large productivity gains happen when new tools strengthen workers’ skills, judgment and creativity within supportive organizations. In a 2002 study of U.S. firms, economists found that installing information technology had little effect on productivity unless companies simultaneously reorganized work and upskilled employees. Firms that introduced computers alongside complementary changes — like decentralizing decision-making and expanding workers’ responsibilities — saw real gains, whereas those implementing tech without such organizational innovation saw negligible benefits.
“Awesome technology alone is not enough. What you really need is to update your business processes, reskill your workforce, and sometimes even change your business model and organization in a big way,” said Erik Brynjolfsson, a professor of economics and director of the Digital Economy Lab at Stanford University.
Recent research on generative AI shows this in action. One experiment found that professionals given access to ChatGPT were 37% more productive on writing tasks, with the greatest benefits for less-experienced workers. The tool handled first drafts, freeing employees to focus on higher-value editing and idea development. Instead of replacing workers, AI expanded their abilities and narrowed skill gaps. Employees also reported greater satisfaction as tedious work gave way to more engaging work. Other research has replicated this result on larger samples of workers in entirely different settings.
A 2021 study using Gallup data found similar benefits for employees: Technology innovation had a positive effect on worker wellbeing, but mainly for employees who say their boss creates a trusting work environment.
In contrast, pursuing AI without equal focus on people often leads to underwhelming results and added organizational risks. Still, many companies fall into this trap. Last year, Gallup found that 93% of its CHRO roundtable members said their organizations had started using AI, but only 15% of U.S. employees said their employer had communicated a clear plan for integrating AI into their work. What slows digital transformation may not be the tools — it may be how people feel about those leading the change.
Designing a Human-AI Future
A seminal 2024 article led by David Autor, studying the composition of work activities between 1940 and 1980, found that new types of jobs usually appear when technology helps people work more efficiently or when there is a high demand for certain skills. As a result, the tasks that we see today do not reflect the gamut of work next year, and leaders should treat AI as a force multiplier for human ingenuity to unlock the products of the future. So how can leaders use AI to elevate what people do best? Some successful companies offer a guide:
Invest in skills, training and habits of leveling up. Beyond specific skills, organizations benefit when employees develop persistence and resilience. Other research shows that workers in jobs requiring greater intellectual tenacity — broadly referring to grit and ambition — are better protected from major disruptions, even when accounting for skill and education levels. Leaders who model and promote continuous learning and improvement help create a workplace where technology strengthens, rather than sidelines, people.
Redesign jobs and workflows to support human-AI collaboration. Simply adding AI to current processes often results in small gains. To see meaningful improvement, organizations should rethink how work is done. Let AI take on repetitive, data-heavy tasks and allow employees to focus on what people do best: creativity, complex problem-solving, interpersonal communication and nuanced decision-making. Many banks, for instance, now use AI to process paperwork for loans so that bankers can spend more time advising clients.
Foster a culture of curiosity. Make it clear that AI is there to help employees, not reduce staff. Involve workers early by asking how AI could improve their work and the business overall. When employees help shape AI tools, they are more likely to support and use them.
The success of any AI initiative depends on the people, starting with whether employees feel that management creates a trusting work environment that is safe to experiment in. AI can help with tasks and generate insights, and sometimes even play a role in validating results, but humans have a competitive edge in connecting with customers through empathy, applying creativity to shape and use AI-driven ideas, and developing the tasks of the future. While focusing on people might seem to slow initial deployment in favor of confronting structural organizational issues, it sets the stage for real, lasting progress.
Let’s Make NAEP a True National Yardstick for Local Autonomy
This article was originally published in The74 with Goldy Brown.
Student outcomes in K–12 education have largely stagnated over recent decades. Despite incremental improvements in the 1990s and early 2000s, national academic performance peaked around 2013, while progress in closing achievement gaps among subgroups stalled even earlier. Recent developments at the Institute of Education Sciences, particularly the downsizing of staff for the National Assessment of Educational Progress (NAEP), create an opportunity to rethink the role this tool can play.
In particular, the Trump Administration could explore using the NAEP to promote greater transparency among schools, parents, and local communities, as well as to enhance academic rigor and ensure genuine accountability in a comparable way across schools and states. That would mean replacing a disparate collection of state tests with a single national assessment administered to every fourth- and eighth-grade student every year.
Parents, educators, and state leaders agree that more information — not more bureaucracy — is needed to make informed decisions for their children and communities, as well as to foster greater competition. Making the NAEP a truly national assessment would provide this information in a consistent, credible, and actionable manner.
This would require a feasible restructuring of the Institute of Education Sciences (IES) to focus on the annual creation and implementation of the NAEP, in contrast to its previous biennial schedule. Additionally, states already have the infrastructure for standardized testing, as all 50 states administer various assessments.
Some adjustments might be necessary for the reformed IES, which would need to collaborate with state offices responsible for test administration to successfully implement the NAEP on an annual basis for all eligible students, not just the current sample populations. However, there are still many advantages to this approach.
First, NAEP provides a consistent and academically rigorous measure of student performance. Many states report higher proficiency rates on their own assessments than on NAEP, creating a false sense of achievement. If all fourth and eighth grade students in states that receive federal Title I funding were required to take the NAEP annually, the discrepancy between state and national standards would become harder to ignore. States would have a stronger incentive to align their instructional practices with higher expectations.
States such as Mississippi have already shown what’s possible when NAEP results are taken seriously. Mississippi’s so-called “miracle” — its leap into the top half of state rankings in 2020 and 2022 — demonstrates the value of using NAEP-aligned standards as a driver for systemic change. By contrast, allowing states to accept federal funding without comparable transparency has led to low expectations and weak accountability frameworks.
Second, expanding NAEP would provide parents with a more accurate picture of how their children are performing relative to peers nationwide. Calls for greater transparency in education — amplified during and after the pandemic — have made clear that many families want more than vague reassurances from schools. A truly national assessment would offer objective, comparable data without increasing testing burdens year after year. In its current form, NAEP tests only samples of students, providing no real insight into how individual students or schools are doing.
Third, this proposal could significantly reduce unnecessary educational costs. To receive Title I funding under the Every Student Succeeds Act, states must administer annual assessments from grades 3 through 8, a requirement that consumes substantial classroom time and financial and instructional resources.
If Congress eliminated this requirement and recommended that states administer only the NAEP in fourth and eighth grades, that could facilitate more targeted, transparent evaluations and reduce assessment costs for states. Additionally, standardized tests administered from grades 3 to 8 may not be necessary for improving student outcomes. A study of test scores in Texas and Nebraska showed that, on average, a student’s test scores in one year correlated at a rate greater than 0.90 with their next-year performance.
Finally, making NAEP universal would offer a balanced form of federal oversight: less intrusive than programmatic mandates, but more informative than current reporting requirements. If decentralization is the path forward for U.S. education, it must be accompanied by a shared yardstick to assess progress. A national benchmark can support local autonomy while enabling cross-district comparisons that inform parents, educators, and policymakers alike.
Federal initiatives to improve student outcomes have historically produced mixed results. The Obama-era effort to tie teacher evaluations to student performance had little impact at the national level, though districts like Dallas and Washington, D.C., saw promising gains. These cases suggest that policy tools must be both well-designed and responsive to local implementation contexts.
Designating NAEP as the national assessment meets both criteria. It would offer the federal government a low-cost, high-impact mechanism for improving transparency and setting consistent expectations without dictating how states should teach or allocate resources — those decisions would be left up to them.
In an era of educational fragmentation, the NAEP stands out as a uniquely credible and underutilized tool. Repurposing it as the primary national assessment — administered annually to all 4th and 8th graders in states receiving Title I dollars — would promote transparency, reduce redundant testing, and align incentives around higher academic standards. This reform would offer a shared benchmark to evaluate progress across states and districts. At a time when parents, educators, and policymakers are calling for both accountability and flexibility, a restructured NAEP provides a rare opportunity to deliver both.
Occupational licensing stifles economic growth and labor market equity
This article was originally published in the San Diego Tribune.
California consistently ranks as the most regulated state in the country — so much so, in fact, that comedian Bill Maher recently told Gov. Gavin Newsom that he needs to “take a chainsaw” to California’s overgrown regulatory code. And while many Californians might worry about the environmental, social, or safety consequences of deregulation, there is at least one category of state regulation that, with a few easily identifiable exceptions, serves only to protect special interests at the expense of low- and middle-class wage-earners: occupational licensing.
Not long ago, only a small fraction of American workers needed a license to do their jobs. Today, nearly a quarter of the workforce is subject to legally mandated licensing—everything from hair stylists to plumbers to travel guides. That explosion of regulations hasn’t just inconvenienced would-be professionals; our new research published in Humanities and Social Sciences Communications shows it’s impeding employment, especially for low- and middle-wage earners.
In a new study, we collected and analyzed data from every state to measure how occupational licensing restrictions have changed over the last few years. We used machine learning to track the exact language of state regulations—terms like “shall,” “must,” and “required”—to see where and how they apply to specific occupations. We found that occupational restrictions have nearly tripled since 2019, especially in more Democratic-leaning states, and disproportionately target lower-wage fields. In other words, the occupations that already offer relatively modest pay have seen the largest increase in barriers to entry.
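As a stylized illustration of the counting step, here is a toy sketch in Python. The actual study pairs this kind of term matching with machine learning to map regulatory language to specific occupations; the sample sentence is invented.

```python
# Count binding regulatory language ("shall", "must", "required") in a block
# of text -- a simplified version of the restriction-counting idea above.
import re

RESTRICTION_TERMS = re.compile(r"\b(shall|must|required)\b", flags=re.IGNORECASE)

def count_restrictions(regulation_text: str) -> int:
    """Return the number of binding-language matches in the text."""
    return len(RESTRICTION_TERMS.findall(regulation_text))

sample = ("An applicant shall complete 1,500 hours of training and must "
          "pass the required examination before practicing.")
print(count_restrictions(sample))  # 3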
To be fair, these restrictions do raise wages for those who manage to get the license—a result consistent with prior work. At first glance, that might sound like a boon to workers. But it’s also clear that when licensing rules pile up, there are fewer jobs to go around. In our study, a 10% rise in these regulatory barriers caused a 2% increase in hourly wages but a 4% drop in employment in those occupations. That’s not just a theoretical loss. It translates into fewer options for workers—particularly those hoping to enter a new field or move to another state in search of higher pay.
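A little arithmetic shows why that trade-off still leaves the occupation as a whole worse off, applying the paper's estimates directly:

```python
# Applying the estimates above: a 10% rise in licensing restrictions is
# associated with +2% hourly wages but -4% employment in the occupation.
wage_mult = 1.02        # +2% hourly wages for licensed insiders
employment_mult = 0.96  # -4% employment in the occupation

wage_bill_change = wage_mult * employment_mult - 1
print(f"Net change in the occupation's total earnings: {wage_bill_change:+.1%}")
# About -2.1%: insiders gain, but the occupation pays out less overall.
```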
Because most occupational regulation comes from state and local governments, federal attempts at reform face an uphill battle. The Obama White House highlighted the risk licensing requirements pose to mobility and competition, and the Trump administration famously enacted a “1-in, 2-out” (and more recently, a “1-in, 10-out”) rule for federal regulations. But although the Trump Administration could use the bully pulpit, most of the power to reduce the occupational licensing burden resides in state capitals.
Yet there’s a silver lining: Our analysis shows that even partial reforms, such as those enacted in Idaho and Virginia, can boost hiring. If policymakers are unable to remove licensing outright—sometimes for political reasons, sometimes because of valid quality or safety concerns—they can still streamline burdensome procedures that are only tangentially related to actual skill or public safety. Lowering license fees, removing residency requirements, or granting reciprocity for out-of-state licenses are all examples of steps that reduce red tape without sacrificing meaningful standards.
Why does this matter so much for the middle class? Because rigid regulations punish those who can least afford the time and money to fulfill extra mandates. Licensing can mean months or years of courses, registration fees, and exams. For middle-income earners—or entry-level workers looking to climb the ladder—this extra friction can be insurmountable. That’s especially problematic when so many Americans need to adapt to shifts in technology and the economy.
Consider a plumber’s apprentice in Illinois or a young cosmetologist in California. With regulations layered on top of an already challenging training process, people who don’t have the cushion of savings might easily be forced to drop out of the pipeline. And if they do persevere, they’re often locked into the state where they obtained that license—relocating means requalifying, possibly at enormous personal cost. That dampens labor mobility, one of the key drivers of regional economic growth.
This is why state legislatures, especially in the heavily regulated state of California, should prioritize thorough licensing reviews. They can convene regular sunset committees to weed out outdated provisions, encourage reciprocity across state lines, and ensure that the rules are transparently linked to safety or competency concerns—not simply designed to protect entrenched interests. Such reforms would open new job opportunities and, in the long run, reduce a hidden tax on families trying to pursue better-paying work.
When the vast majority of new restrictions concentrate in occupations with modest wages, the net result is clear: Fewer hires, reduced geographic flexibility, and a steeper climb to the middle class. Shedding this regulatory baggage should be a political no-brainer—reforms could unite lawmakers who share a desire for economic growth and more equitable labor markets.
As mentioned before, occupational licensing has a place in some jobs. If you’re seeing a brain surgeon or hiring a structural engineer, there’s obviously a strong case for proven certification. But somewhere along the way, we stretched that logic to more routine jobs, imposing license burdens that can’t be justified on health or safety grounds. Rolling back these requirements would let more Americans seize opportunity—winning a much-needed victory for the low and middle classes. And between the high cost of living, high state income taxes, and the highest level of regulation in the nation, the low and middle classes in California could use any help they can get.
Christos A. Makridis is an associate research professor at Arizona State University, digital fellow at Stanford University, and a visiting faculty at University of Nicosia. Patrick A. McLaughlin is a Research Fellow at the Hoover Institution at Stanford University and a Visiting Research Fellow at the Pacific Legal Foundation. Follow him on X: @econpatrick.
America must harness stablecoins to future-proof the dollar
This article was originally published in Fortune Magazine.
With Congress having just passed the federal budget, lawmakers will have an opportunity to tackle long-term financial challenges outside of crisis mode. One such challenge—and opportunity—is the rise of stablecoins: privately issued digital tokens pegged to fiat currencies like the U.S. dollar. Stablecoins have rapidly grown into a market worth hundreds of billions of dollars, facilitating billions in transactions, but they’ve lacked a comprehensive U.S. regulatory framework. Fortunately, Washington is signaling new openness to digital assets—evidenced by President Trump announcing the establishment of a strategic digital asset reserve for the nation. Creating the requisite clarity will unlock a new era of competition and innovation among banks.
Stablecoins are a strategic extension of U.S. monetary influence. Around 99% of stablecoin volume today is tied to the U.S. dollar, exporting dollar utility onto international, decentralized blockchain networks. A stablecoin market with the right guardrails can strengthen the U.S. dollar’s dominance in global finance. If people around the world can easily hold and transact in tokenized dollars, the dollar remains the go-to currency even in a digitizing economy. Recent congressional hearings echo this point—up to $5 trillion in assets could move into stablecoins and digital money by 2030, up from roughly $200 billion now. If the U.S. fails to act, it risks “becoming the rust belt of the financial industry,” as one fintech CEO warned.
Other jurisdictions aren’t standing still: Europe, the U.K., Japan, Singapore, and the UAE are developing stablecoin frameworks. Some of these could even allow new dollar-pegged tokens issued offshore—potentially eroding U.S. oversight. In short, America must lead on stablecoins or find itself pressured by Europe’s Digital Euro and other central bank digital currencies (CBDCs), which in their strictest form threaten both the private banking ecosystem and individual sovereignty. My research, for example, shows that CBDCs to date have not had any positive effects on growing GDP or reducing inflation, but have had negative effects on individuals’ financial well-being.
Ideally, various regulated institutions—banks, trust companies, fintech startups—could issue “tokenized dollars” under a common set of rules. Before the 1900s, state governments held primary authority over banking. That era brought fragmentation and problems, but with the right federal architecture, blockchain would let banks offer differentiated products—a version of what existed pre-1900, with each bank issuing its own type of stablecoin that differs in security, yield, or other amenities—while still keeping the value pegged to the dollar. More broadly, a large body of academic research shows how stablecoins drive down transaction costs, speed up settlement times, and broaden financial inclusion through new services.
In the absence of federal action, we risk a patchwork of state-by-state rules or even de facto regulation by enforcement, which creates uncertainty for entrepreneurs and consumers alike. The Stablecoin Tethering and Bank Licensing Enforcement (STABLE) Act, introduced in the House in 2020, would require any company issuing a stablecoin to obtain a bank charter and abide by bank regulations, including approval from the Federal Reserve and FDIC before launching a stablecoin, and to hold FDIC insurance or Federal Reserve deposits as reserves, effectively regulating stablecoin issuers like banks to protect consumers and the monetary system.
Preventing government overreach
However, as House Financial Services Committee Chairman French Hill has said, the goal should be to modernize payments and promote financial access without government overreach. Notably, Hill contrasted private-sector stablecoin innovation with the alternative “competing vision” of a government-run digital dollar (a central bank digital currency) that could crowd out private innovation. And the STABLE Act could be too draconian, penalizing non-bank entities. To that end, the recent bipartisan effort in the Senate—the Guiding and Establishing National Innovation for U.S. Stablecoins Act of 2025 (GENIUS Act)—has gained momentum.
In practice, the GENIUS Act could allow a regulated fintech or trust company to issue a dollar stablecoin under state supervision, so long as it complies with stringent requirements mirroring federal bank-like rules on liquidity and risk. This kind of flexibility, paired with robust standards, can prevent market fragmentation by bringing all credible stablecoin issuers under a regulatory “big tent.” It would also prevent any single point of failure: If one issuer falters, others operating under the same framework can pick up the slack, keeping the system stable.
Critics often voice concerns that digital currencies could enable illicit activity. But in reality, blockchain technology offers more transparency, not less, when properly leveraged. Every transaction on a public blockchain is recorded on an immutable ledger. Law enforcement has successfully traced and busted criminal networks by following the on-chain trail—something much harder to do with cash stuffed in duffel bags. In fact, blockchain’s decentralized ledger offers the potential for even greater transparency, security, and efficiency.
Following the momentum from the White House, Congress has a running start on crafting rules that bring stability and clarity to this market now that the budget has passed. Lawmakers should refine and pass a comprehensive stablecoin bill that incorporates the best of both approaches—the prudential rigor of the bank-centric model and the innovation-friendly flexibility of a dual license system. Done right, stablecoin legislation will reinforce the dollar’s role as the bedrock of global finance in the digital age, unlock new fintech innovation and competition domestically, and enhance financial integrity.
Changing Compensation Calls for Updated Social Contract
This article was originally published in Real Clear Markets.
American workers are witnessing a profound shift in how they are compensated. A century ago, a job’s pay was almost entirely a paycheck, but now nearly a third comes as benefits like health insurance, retirement plans, and stock options. Moreover, the growth in benefits is concentrated among wealthier workers, leaving the average American behind in an era of rapid technological change, according to a recent working paper I co-authored with Adam Bloomfield and Travis Cyronek.
The policy focus has recently shifted towards the American worker. “The American Dream is rooted in the concept that any citizen can achieve prosperity, upward mobility, and economic security. For too long, the designers of multilateral trade deals have lost sight of this,” said Treasury Secretary Scott Bessent in a recent talk to the Economic Club of New York. The Trump Administration has pointed out that the average American worker has borne the bulk of the burden, consistent with a large body of empirical evidence on globalization and trade. That is not to say there have not been benefits from low prices, but we need to acknowledge the costs.
Our recent working paper points out that the burden on the American worker may be even more severe than previously thought: while we often talk about wages, total compensation – which also includes benefits – tells an even tougher story for the average worker.
At the turn of the 20th century, benefits made up virtually none of a worker’s compensation, but by the late 1990s, over a quarter of a typical worker’s compensation came from non-wage benefits. Much of this transformation occurred in the post-World War II era: employer-funded “fringe” benefits soared from about 7% of worker compensation in 1947 to 18% by 1979, and now hover around a third of total compensation. While valuable to some workers, expensive benefits often go unused and can risk making workers feel locked into their jobs, while other workers lack even basic benefits.
On the one hand, benefits like health coverage and retirement contributions provide security and long-term value. On the other hand, many workers would prefer or urgently need higher wages instead of benefits they can’t readily spend. But our new paper shows that both wage and benefits growth for middle and low-income workers has lagged behind productivity.
Moreover, millions of low-wage workers get few or no benefits from their jobs – no health plan, no retirement account, no paid time off. As a result, the gap in overall compensation (wages + benefits) between high- and low-paid workers is even wider than the wage gap alone. A cashier or care aide might earn a bare minimum wage with no health coverage, while a higher-paid manager not only earns more per hour, but also gets thousands of dollars in insurance and pension contributions. In other words, the people who can least afford out-of-pocket medical costs are the least likely to get health coverage through their jobs.
To address the challenges posed by benefit-heavy compensation structures, we need to find ways of decoupling basic benefits from a single employer. The expansion of generative artificial intelligence has likely spurred greater self-employment, so now more than ever we need to think through ways of adapting labor market institutions to promote healthy growth.
We also need to consider how to incentivize employers to extend benefits to part-time and low-wage employees. This could involve tax credits for small businesses that provide health insurance or retirement plans to lower-paid staff, or penalties for large companies that leave most workers uncovered. Or, it could involve expanding employee stock ownership plans (ESOPs) that allow employees to share in the firm’s profits so that they can make their own choices about what benefits to purchase on the open market.
The changing nature of compensation in America – from straight wages toward benefit-heavy packages – calls for an updated social contract. Without intervention, the benefits revolution will continue to bypass millions of workers, accelerating income inequality and social fragmentation. By modernizing policies to reflect how people are paid today, we can protect the dignity of work and strengthen the American workforce across all income levels.
Trump’s crypto reserve is being panned by crypto leaders. Here’s why it’s actually a good idea
This article was originally published in Fortune Magazine.
The recent announcement by the United States to establish a strategic crypto reserve featuring Bitcoin (BTC), Ethereum (ETH), XRP, Solana (SOL), and Cardano (ADA) is a major milestone for national security and economic policy. By integrating these digital assets into a formal reserve, the U.S. not only fortifies its national security posture, but also strategically supports and leads the growth of the private digital asset market worldwide.
The announcement received criticism from some crypto leaders, such as Coinbase CEO Brian Armstrong, who had pushed for only including Bitcoin, and 8VC general partner Joe Lonsdale, a Trump supporter who argued the government should stay out of crypto. Some have also suggested that there was insider trading, but these accusations have been speculation so far. Do not forget that there is vast insider trading outside of crypto—so much that there’s even an app called Autopilot that allows retail users to replicate the trades of politicians.
We’ll get to the advantages of having a strategic reserve, but let’s first pause on whether such a reserve should hold only Bitcoin. Armstrong is a laudable leader, and he makes an important point—that we should focus on Bitcoin because of its relative stability and strength. But blockchain is so much more than just Bitcoin. Other tokens have not been around as long, and thus their price volatility is greater, but that doesn’t make them any less strategic.
In fact, the newer generation of tokens often has more sophisticated consensus mechanisms and offers users greater utility, such as ETH supporting decentralized apps and XRP supporting cross-border transactions at scale. We cannot dismiss these because BTC was “first.”
Crypto reserve benefits
Let’s explore the upside of a strategic crypto reserve.
First, the establishment of a crypto reserve provides a hedge against escalating geopolitical risks. Historically, U.S. economic power has relied heavily on the dominance of the dollar, but this dominance has faced challenges—especially lately—from geopolitical rivals seeking alternative financial channels to circumvent U.S.-led financial systems and sanctions. By holding digital assets, the U.S. expands its bargaining power beyond traditional fiat currency, providing an alternative layer of economic leverage. In times of tension or uncertainty, digital assets offer resilience against targeted economic disruptions, sanctions, and currency manipulation.
Moreover, each of the chosen digital assets brings distinct strategic advantages that enhance national security infrastructure. For example, XRP is renowned for its capability to execute rapid cross-border transactions with exceptional speed and minimal transaction costs. Such capabilities are integral during times of crisis requiring immediate international monetary settlements or aid distribution. Similarly, Solana’s high-performance blockchain provides robust support for scalable and secure applications such as secure communications infrastructure or real-time monitoring of critical national assets. Cardano, known for its serious approach to governance, transparency, and security, offers additional prospects for stability and reliability.
But second, here’s a fact that might be overlooked: The formation of this crypto reserve also carries profound implications for private digital asset markets. The recent federal endorsement will serve as a powerful catalyst for market confidence and institutional adoption. Although support for digital assets has already been growing, institutional investors and major financial institutions have still hesitated to engage fully with cryptocurrencies due to regulatory uncertainty and concerns over legitimacy. The launch of an official U.S. crypto reserve sends a powerful signal: These digital assets are not only legitimate, but also strategically valuable.
The strategic crypto reserve contrasts sharply with the alternative scenario of implementing a Central Bank Digital Currency (CBDC), where digital asset management is entirely centralized under government control. Unlike a CBDC, which could displace private banks and the market for stablecoins by monopolizing digital asset flows and potentially stifle innovation through excessive centralization, the strategic crypto reserve enables the government to collaborate alongside private entities, fostering a balanced, vibrant digital asset ecosystem. Other research using cross-country data has also found that CBDCs do little to reduce inflation or boost productivity, and instead reduce financial well-being, particularly among vulnerable populations.
Crypto confidence
This alternative approach will help support the growth of the private market for digital assets. In particular, startups, as well as incumbent financial institutions, can more confidently invest in infrastructure, talent acquisition, and research initiatives knowing they have clear governmental alignment. Clear governmental participation in digital asset markets can streamline regulatory processes, ensuring private entities can innovate securely within well-defined legal boundaries. Countering malicious influences in crypto means bringing more transparency to the market.
The U.S. strategic crypto reserve is a sophisticated approach that addresses both geopolitical vulnerabilities and economic innovation simultaneously. By diversifying its strategic reserves into digital asset holdings, the nation strengthens its national security by broadening economic leverage and creating an alternative financial buffer against external interference. Federal involvement also helps legitimize and invigorate the private digital asset sector, creating conditions for exponential market growth and innovation. While any action necessarily creates new risks, these can and should be managed, but we shouldn’t overlook the potential upsides.
Embracing FinTech: How CFPB Can Unlock the Future of Earned Wage Access
This article was originally published in Real Clear Markets.
The Consumer Financial Protection Bureau (CFPB) has occupied many headlines lately, but the change in leadership largely reflects a different approach to consumer empowerment rather than a departure in priorities. Among the many ways the Trump Administration can improve on the status quo is the treatment of earned wage access (EWA) products. EWA products allow employees to access a portion of their earned wages before payday, often free or for a small fee. The cost to employees is significantly lower than other options, including payday loans, which often carry annual percentage rates (APRs) exceeding 300%. EWA fees typically range from $1 to $5 per transaction or are covered through alternative funding mechanisms like merchant interchange fees.
EWA providers do not charge interest, require collateral, or impose penalties for non-repayment. More importantly, because EWA draws on wages already earned, it does not create new debt obligations for workers. Some providers integrate directly with payroll systems, ensuring that any advance is automatically deducted from the employee’s next paycheck, eliminating default risk. This structure allows EWA fees to remain lower than traditional short-term credit options while offering a more transparent alternative to overdraft fees and high-cost lending.
Companies already serving consumer financial needs are well-positioned to expand into this space. Chime’s MyPay, for instance, enables consumers to access wages on their own schedule without hidden costs by connecting directly to payroll systems and leveraging merchant-funded models. Instead of employers taking the easy way out by pushing costs onto workers (i.e., “paying for their pay”), they can explore partnerships with FinTech providers and challenger banks to drive innovation in benefits delivery. This shift could not only lower costs, but also increase financial stability for employees who currently live paycheck to paycheck.
However, previous CFPB leadership made such FinTech partnerships tougher by classifying EWA programs as a type of consumer loan. That categorization imposed costly regulatory requirements under the Truth in Lending Act (TILA), treating EWA advances as if they were traditional credit products. TILA mandates extensive disclosures, compliance costs, and risk assessments that are unnecessary for a product that simply provides early access to wages. This regulatory burden raises the cost of providing EWA, forcing providers to either pass higher costs onto employees or exit the market altogether, reducing financial flexibility for workers.
With a new CFPB director expected to take a fresh look at these regulations, the opportunity exists to rethink the treatment of EWA in a way that balances consumer protection with financial innovation. There is no doubt that we need some regulations to set guardrails for markets, but the overarching concern is that we have witnessed a proliferation of regulations that do little to advance consumer safety and instead generate unintended consequences, as my work with Alberto Rossi in 2020 has shown. Policymakers should focus on ensuring transparency and cost efficiency and allowing EWA providers to build models that eliminate fees for employees.
One such model leverages merchant interchange fees and employer partnerships to fund EWA services. When employees access their wages through an EWA-linked card, merchants pay a small fee—typically around 1%—which can be reinvested into funding wage advances. This creates a sustainable revenue stream without burdening workers with direct fees. Some fintech firms, like Chime’s MyPay, have already adopted this approach, offering free EWA services by integrating directly with payroll providers and employer benefits programs.
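Here is a toy version of that funding arithmetic. Every number is an illustrative assumption, not any provider's actual economics:

```python
# Toy model of the merchant-funded EWA mechanism described above.
interchange_rate = 0.01        # assumed ~1% merchant fee on card spend
monthly_card_spend = 800.0     # assumed card spend per active user, USD
advances_per_month = 2         # assumed wage advances per user per month
cost_per_advance = 2.50        # assumed processing cost per advance, USD

fee_revenue = interchange_rate * monthly_card_spend   # $8.00 per user
advance_cost = advances_per_month * cost_per_advance  # $5.00 per user
print(f"Margin per user per month: ${fee_revenue - advance_cost:.2f}")
# A positive margin is what lets providers offer advances at no direct fee.
```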
For employers, EWA programs also offer cost savings. Running payroll more frequently is expensive and administratively complex, and EWA provides a way to give employees financial flexibility without increasing payroll cycles. In turn, this reduces reliance on predatory payday lenders, which research has linked to higher bankruptcy rates among low-income workers.
While former director Rohit Chopra’s tenure at the CFPB has ended, the broader goal of improving financial access for workers remains. Regulatory compliance for its own sake is empty; fostering a financial ecosystem where innovation lowers costs for workers and expands economic opportunity should be the priority. Reclassifying EWA as something other than a loan is a first step in that process, and it points to many more opportunities to modernize financial regulations in ways that enhance worker financial stability without stifling innovation.
The secret weapon to fixing our broken immigration system is right in front of us
This article was originally published in Fox News (with Corey DeAngelis).
X owner Elon Musk and entrepreneur Vivek Ramaswamy sparked a debate in December when they advocated allowing more legal immigration for high-skilled workers – for example, through H-1B visas – to make America more competitive. President-elect Donald J. Trump endorsed the policy in a statement to the New York Post shortly after the dispute broke out.
Conservatives on both sides of this discussion should be able to agree on one thing: we would not need to import as much talent if we had a more effective education system.
The latest data from the National Assessment of Educational Progress, also known as the "nation’s report card," shows that fewer than one in four eighth-grade students are proficient in math and fewer than a third are proficient in reading. The latest international assessment ranks us 24th in math – in the middle of the pack – despite our spending nearly $20,000 per public school student each year, more than just about any other country in the world.
U.S. fourth-grade math scores have fallen 18 points since 2019 – a decline larger than that of all but three countries: Azerbaijan, Iran, and Kazakhstan.
We can start fixing the education crisis by improving the efficiency of educational resource allocation. Mountains of empirical evidence in economics indicate that misallocation is one of the greatest impediments to economic growth, both for nations as a whole and within the educational services sector. To that end, improving the efficiency of public education can go a long way toward producing multiplier effects for the nation as a whole.
Trump appointed both Musk and Ramaswamy to head the newly formed Department of Government Efficiency (DOGE) in November. In his statement announcing DOGE’s new leaders, Trump said his administration will "dismantle Government Bureaucracy, slash excess regulations, cut wasteful expenditures, and restructure Federal Agencies."
It’s no secret that waste runs rampant through our public school system. The U.S. spends over $900 billion per year on education for lackluster results. The current system is not serving students well and makes teachers’ lives more difficult, so now is the time to start thinking about how to get a bigger bang for our buck from the Department of Education. We need to inventory where current resources are going and what outcomes they’re driving – plain and simple.
But tackling this apparent low-hanging fruit can only do so much to cut waste. After all, about 90% of all public-school funding comes from state and local sources, not the federal government.
That’s why we have to understand the root cause behind the deteriorating student outcomes. A major potential factor is administrative bloat in American education. The latest data from the National Center for Education Statistics show that student enrollment has only increased by about 5% since 2000, but the number of teachers employed by the system has grown twice as fast as students, by about 10%, over the same period. School district administrative staff has increased by about 95%, or 19 times the rate of student enrollment growth.
We’ve increased inflation-adjusted spending per student by more than 160% since 1970 and the teachers aren’t seeing the money. Teacher salaries have only increased by 3% in real terms over the same period.
The problem is that the public school system operates as a monopoly with weaker incentives to spend money wisely. But public-school unions do have a strong incentive to advocate for hiring more people, particularly in states that do not have right-to-work laws. Additional staffing means more dues-paying members and a larger voting bloc.
Our just-released study provides the first evidence that unions are driving administrative bloat in education. Using data from the National Center for Education Statistics and the American Community Survey between 2006 and 2024, we find a robust positive relationship between union density and staff-to-student ratios, and negative effects of right-to-work (RTW) laws on these ratios. The effects are largely driven by the expansion of administrative and support roles rather than teaching positions, and they are concentrated in non-RTW states.
Specifically, we find that a 10-point increase in teachers union density is associated with a one-point increase in year-to-year staffing growth.
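For readers curious what that one-point estimate corresponds to, here is a schematic of the kind of panel specification involved, run on simulated data. The variable names and magnitudes are illustrative stand-ins; the actual study uses NCES and ACS data with richer controls:

```python
# Schematic of the regression relationship described above, on simulated
# data calibrated so a 10-point rise in union density adds ~1 point of
# year-to-year staffing growth. Illustrative only, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "union_density": rng.uniform(0, 60, n),   # percent of teachers unionized
    "rtw": rng.integers(0, 2, n),             # right-to-work state indicator
})
df["staff_growth"] = (0.10 * df["union_density"]   # slope of 0.10 = 1 point per 10
                      - 1.5 * df["rtw"]
                      + rng.normal(0, 2, n))

model = smf.ols("staff_growth ~ union_density + rtw", data=df).fit()
print(model.params)  # the union_density coefficient should land near 0.10
```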
In Chicago, a union stronghold, staffing has increased by a whopping 20% since 2019 even though student enrollment has plunged 10%. In Texas, one of six states that outlaw collective bargaining for public employees, staffing has increased by 8% over the same period – much closer to its 2% growth in student enrollment. Our study shows that these examples are not merely anecdotal – the pattern holds at scale.
Injecting competition into the K-12 education system would put pressure on school districts to redirect otherwise wasteful spending into the classroom. Trump can help make this happen by getting congressional Republicans in line to pass school choice. The Educational Choice for Children Act already passed out of the House Ways and Means Committee last September, and President-elect Trump said he would sign it.
Improving the efficiency of government should be a non-partisan issue, especially in a sector that hits so close to home for every American – education. It’s now up to Congress to deliver for the parents who put them in office. Allowing parents to direct the upbringing of their children is the right thing to do, but it will also make America more competitive and make education great again.
Making crypto mainstream requires greater efforts to stop fraud
This article was originally published in Cointelegraph.
We find it easy to talk about the benefits of the digital economy, whether the internet or digital assets, but the costs are often overlooked. Whether it’s the surge in human trafficking on social media platforms or the rise of cybersecurity vulnerabilities, the expansion of the digital economy brings new risks to manage.
The digital asset community is no different: to scale and become sustainable, it must confront the prevalence of fraud. And it’s not hard: distributed ledger technologies are already demonstrating their value by solving concrete use cases. This week in Vienna, Austria, the Austrian National Bank — together with the Complexity Science Hub and other sponsors — is hosting a conference on advances in financial technology, with a wide array of presenters who have researched value-enhancing uses of blockchain technology.
Thanks to pioneering work by the Federal Trade Commission’s Consumer Sentinel, we now have basic statistics on the incidence of fraud, the perpetrators, and the countries that exhibit the greatest violations. Using these complaint data, Michel Grosz and Devesh Raval of the FTC show that it is possible to identify countries with excess levels of fraud based on their export volumes and destinations. We need this caliber of data, and the processes to support its collection, to make strides in countering fraud.
Unfortunately, crypto does not have a great reputation on this front. The FTC released data showing $114 million in reported fraud from Bitcoin ATMs (BTMs) in 2023 — and the number of crypto scams has surged in recent years. Of course, we need to view these statistics in perspective: fiat currencies remain the currency of choice for fraud across the world, so we should not compare the worst of crypto with the best of fiat — it’s not an apples-to-apples comparison. Nevertheless, we should still strive to establish the right incentives and processes within the digital asset ecosystem to counter fraud wherever possible.
Fortunately, there is already a wave of blockchain use cases countering fraudulent activity. Consider, for instance, financial auditing, which helps ensure the integrity and transparency of organizations. Currently, auditors lack the ability to cross-check transactions between different organizations, a limitation that could lead to misreporting scandals involving millions of dollars and that leaves many crypto audits as little more than theater. To address this, new blockchain-based protocols, such as Cross Ledger cOnsistency with Smart Contracts (CLOSC) and Cross Ledger cOnsistency with Linear Combinations (CLOLC), are emerging to let auditors verify cross-ledger transactions more efficiently, with built-in privacy and security properties such as transaction-amount privacy and organization-auditor unlinkability.
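To make the idea concrete, here is a toy sketch in the spirit of these protocols: two organizations commit to a transaction amount, and an auditor verifies the two records agree without ever seeing the amount. This is a pedagogical illustration with toy parameters, not an implementation of CLOSC or CLOLC:

```python
# Toy cross-ledger consistency check using Pedersen-style commitments.
# Illustrative only: NOT CLOSC/CLOLC, and parameters are far too small
# to be secure in practice.
import secrets

P = 2**127 - 1   # toy prime modulus (a Mersenne prime)
G, H = 3, 7      # generators, assumed independent for illustration

def commit(value: int, blind: int) -> int:
    """Pedersen-style commitment: C = g^v * h^r (mod p)."""
    return (pow(G, value, P) * pow(H, blind, P)) % P

amount = 1_000_000                     # payment recorded on both ledgers
r_a, r_b = secrets.randbelow(P), secrets.randbelow(P)
c_a = commit(amount, r_a)              # ledger A's commitment (outgoing)
c_b = commit(amount, r_b)              # ledger B's commitment (incoming)

# The parties jointly reveal only the difference of their blinding factors.
delta = (r_a - r_b) % (P - 1)          # exponents live modulo the group order

# Auditor's check: the amounts match iff C_A equals C_B * h^delta (mod p),
# i.e., iff C_A / C_B is a commitment to zero. No amount is ever revealed.
assert c_a == (c_b * pow(H, delta, P)) % P
print("Cross-ledger records are consistent.")
```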
Take scalability as another example; it is widely recognized as a prerequisite for institutional adoption. Layer-2 (L2) solutions such as rollups help solve the scalability problem of L1s by handling transactions off the main blockchain and then posting the results back. A big concern, however, is ensuring the security of these rollups, especially making sure that the data posted back are accurate.
One recent study proposed a "watchtower" system where independent actors (watchtowers) are rewarded for keeping an eye on transactions and raising alarms when something seems wrong. These watchtowers are required to prove that they’ve been diligent in their work through a system called "proof of diligence," which ensures they’ve monitored the transactions properly. They can also challenge false data, and if they catch errors, they earn rewards. A key part of the solution is not just the technology, but also the economics of designing adequate incentives to prevent wrongdoing and promote trust.
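That incentive logic can be captured in a simple expected-payoff comparison. The sketch below uses purely illustrative parameters (stake, rewards, audit probability), not values from the study; the point is the design target that diligence must pay better than shirking:

```python
# Stylized expected-payoff model of watchtower incentives. All parameters
# are illustrative assumptions chosen for exposition.

STAKE        = 100.0  # bonded amount a watchtower forfeits if slashed
MONITOR_COST = 1.0    # per-period cost of actually checking rollup data
CATCH_REWARD = 20.0   # reward for a valid fraud challenge
AUDIT_PROB   = 0.10   # chance a shirking watchtower fails a diligence check
FRAUD_PROB   = 0.02   # chance a period contains invalid rollup data

def payoff_diligent() -> float:
    # Pays the monitoring cost every period; earns the reward when fraud occurs.
    return -MONITOR_COST + FRAUD_PROB * CATCH_REWARD

def payoff_shirking() -> float:
    # Saves the monitoring cost but is slashed when proof of diligence fails.
    return -AUDIT_PROB * STAKE

print(f"diligent: {payoff_diligent():+.2f}")   # -0.60
print(f"shirking: {payoff_shirking():+.2f}")   # -10.00
# The designer tunes stake, rewards, and audit frequency so the first
# payoff exceeds the second for all plausible fraud rates.
```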
Value-enhancing examples abound in the blockchain ecosystem, as the AFT conference in Vienna will showcase, but we need to do a better job of quantifying the benefits of real use-cases and amplifying the integral role that they play in enabling economic and social activity. Indeed, one of the greatest use-cases of blockchain technologies, drawing on its roots from cryptography, is the ability to improve security and counter malicious actors. But we need to get more serious in the way we talk about and pitch blockchain as a solution.
Christos Makridis is a guest columnist for Cointelegraph, an associate research professor at Arizona State University, an adjunct associate professor at University of Nicosia and the founder/CEO of Dainamic Banking. He holds doctoral degrees in economics and management science and engineering from Stanford University.
If your country has adopted a CBDC, you might be suffering
This article was originally published in Cointelegraph.
We’re often told that central bank digital currencies (CBDCs) will promote "financial inclusion" and help people around the globe. However, preliminary research results indicate the opposite could be true: Where CBDCs have been adopted, well-being has declined in recent years — particularly among young people and those with low incomes.
My new research paper provides the first comprehensive evaluation of their early effects on macroeconomic indicators and subjective well-being, utilizing cross-country data between 2019 and 2023. The results suggest that the benefits may be more limited than initially anticipated, coupled with potential negative effects on individual well-being and financial stability.
Limited economic benefits and unintended consequences
Data from the World Bank indicates — contrary to what you may think — higher-income countries are more likely to pilot or launch CBDCs, with these countries having, on average, five percentage points higher per capita GDP. While these countries also tend to have larger populations — largely driven by China and India — there are no significant differences in net migration rates, male unemployment rates, or urban populations.
Despite the enthusiasm surrounding CBDCs, the analysis suggested that their impact on key economic indicators — such as GDP growth and inflation — has been minimal. The study's statistical models compared countries that piloted or launched CBDCs between 2019 and 2023 with those that did not.
Recognizing that countries that pilot or launch CBDCs may be systematically different from their counterparts, I also created a "synthetic control" group that matched countries with CBDCs to others based on a nonlinear combination of controls. In other words, while no single country serves as the control, a weighted combination of characteristics across countries allows for the construction of a "synthetic" counterpart. Where possible, the data were used to trace how outcomes within countries changed after CBDC adoption.
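Schematically, constructing a synthetic control amounts to choosing non-negative donor weights that sum to one and minimize the pre-adoption gap with the treated country. The sketch below uses random placeholder data rather than the study's actual country characteristics:

```python
# Schematic synthetic-control construction: find donor-country weights
# (non-negative, summing to one) that best reproduce the treated country's
# pre-adoption characteristics. Data here are random placeholders.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x_treated = rng.normal(size=5)        # pre-adoption characteristics, CBDC country
X_donors = rng.normal(size=(5, 20))   # same characteristics, 20 donor countries

def gap(w):
    """Squared distance between the treated unit and the weighted donors."""
    return float(np.sum((x_treated - X_donors @ w) ** 2))

k = X_donors.shape[1]
res = minimize(
    gap,
    x0=np.full(k, 1 / k),
    method="SLSQP",
    bounds=[(0, 1)] * k,                                         # weights >= 0
    constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0},  # sum to one
)
weights = res.x  # the "synthetic control" is this blend of donor countries
print(f"pre-period fit (RMSE): {np.sqrt(gap(weights) / 5):.3f}")
```

Post-adoption outcomes for the treated country are then compared against this weighted blend, which stands in for what would have happened absent the CBDC.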
The study found no evidence that CBDCs correlated with greater GDP per capita or lower inflation. These findings challenge the prevailing narrative that CBDCs are a panacea for economic challenges, particularly in low- and middle-income countries.
However, macroeconomic indicators only go so far, especially in developing countries where the data can be less reliable. Gallup’s World Poll — the leading source of data for constructing measures of subjective well-being across countries over time — provided the data for two additional outcomes of interest: whether an individual is thriving, and their financial well-being. The former is based on self-assessed rankings of current life satisfaction and expected life satisfaction five years from now, each on a 0-10 scale. Financial well-being is measured through several self-assessed questions about the ease of paying bills and financial anxiety.
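For reference, Gallup’s standard classification (as I understand its published cutoffs) marks a respondent as "thriving" when current life is rated 7 or higher and expected life in five years is rated 8 or higher. A minimal sketch:

```python
# Gallup's "thriving" classification, per its published cutoffs as I
# understand them: current life >= 7 and expected life in 5 years >= 8,
# both on the 0-10 Cantril ladder.

def is_thriving(current: int, expected: int) -> bool:
    """Both inputs are 0-10 Cantril-ladder ratings."""
    return current >= 7 and expected >= 8

assert is_thriving(7, 8)
assert not is_thriving(6, 10)   # a low current rating rules out "thriving"
```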
Gallup's data indicated that CBDCs negatively correlated with both the probability an individual was thriving and their financial well-being — a result that was concentrated among younger, lower-income populations. These groups, who are often the target audience for financial inclusion initiatives, report feeling less financially secure.
After estimating these statistical models relating well-being with CBDC adoption, country controls, and individual demographics, the data identified where the declines in well-being have been the greatest. The CBDC-interested countries with the largest declines between 2020-23 — in terms of respondents who were "thriving," according to the Gallup World Poll — were South Africa, Sweden, Thailand, and South Korea. (Sweden and South Korea have announced pilot CBDC programs, while South Africa and Thailand started developing their CBDCs in the first quarter of 2024.)
The importance of design and regulation
One of the critical challenges facing central banks is designing CBDCs that maximize benefits while minimizing risks. The risks associated with CBDCs are not trivial. They include potential financial instability through the disintermediation of banks, the erosion of privacy, and the concentration of financial power, which I’ve written about in Cointelegraph before. These risks are particularly pronounced if the central bank directly manages all aspects of the CBDC, which could undermine the traditional role of commercial banks and reduce the availability of credit, as Jesús Fernández-Villaverde and his coauthors showed in a 2021 paper.
Hybrid CBDC models could reduce some of these risks by allowing private-sector intermediaries to interact with customers while a central bank oversees the system, preserving a role for commercial banks and ensuring that CBDCs complement rather than disrupt existing financial systems. Additionally, implementing strong privacy protections and limiting the centralization of power are essential to prevent the potential misuse of CBDCs. That is in stark contrast to the way that some countries have implemented CBDCs, particularly China. However, further work is needed to assess how the architecture of the CBDC affects both economic and social outcomes — not just in theory, but very concretely.