Meet the Man Who Invented Modern Retirement


Sometimes history is made by presidents, revolutionaries, artists, or groundbreaking scientists.

But at least once it was altered by a pension benefits consultant sitting at his desk in Pennsylvania studying the tax code in the late 1970s.

Today, Ted Benna is known as the “father of the 401(k),” the investment vehicle that has become the default retirement plan for 55 million people, a $5 trillion market that has, for better or worse, completely altered the way many Americans spend, save and view aging. So important is the 401(k) that it has been ensnared in political debate over tax reform, with President Donald Trump tweeting assurances that “there will be NO change to your 401(k).”

A 401(k) is essentially a basket of mutual funds intended to help people save for retirement. As pensions fade and qualms about the future of Social Security rise, more and more Americans are relying on the 401(k) plans they typically access through their employers—and not without controversy. Advocates say the plans give everyday employees a way to reap the benefits of compound interest and the stock market’s booms, and to get ahead of inflation and soaring healthcare costs.

Critics argue that they make individual investors vulnerable to twists and turns of daily stock and bond market activity. High fees have lined the pockets of asset management firms, but may leave would-be retirees poorer and with crummy returns. According to a January 2016 report from Pew Charitable Trusts, only one in five Americans believes they’ll have enough money to comfortably retire.

Today, Benna is among the loudest critics of 401(k)s. He still works as a retirement benefit consultant in Pennsylvania, and is somewhat baffled by the outcome, saying he had “no idea” his creation would balloon to what it is today.

Before Benna pitched his idea, retirement in America scarcely existed as a concept. The idea of offering financial support for the elderly began in Europe in the late 19th century, when German chancellor Otto von Bismarck began offering pensions to elderly Germans who no longer worked. Prior to that, and in most places in the world at the time, those fortunate enough to grow old worked until death, moved in with their offspring, or did some combination of the two.

In the U.S., many of those attitudes carried over from Europe, and a handful of government workers and members of the military received pensions, many of which set the retirement age at 65. By the 1920s, some large companies had followed suit. By the middle of the century, retirement culture—exemplified by timeshares in Florida, the golf industry, and AARP membership—was booming. Americans, it turned out, were pretty good at figuring out how not to do anything in their twilight years.

And yet, “the idea of saving for retirement at the time was strange,” Benna says of the 1970s and 1980s. Americans with pensions expected those pensions to cover everything.

Trouble is, many of these pension plans were (and are) underfunded. They often had terms that tethered people to jobs for decades they would have otherwise left, thus locking people into workplace misery. In some cases, men could benefit earlier than women even though both worked the same number of years. People of color or at smaller businesses were often left out of the system altogether.

“There’s a widely-held myth that we once had a wonderful retirement program where everyone got a pension and it was happily ever after,” Benna says. “But it’s just that—a huge myth.”

Benna, then working for the Pennsylvania-based Johnson Cos., had a bank as a client that was trying to create its own retirement plan with an interest in curbing the taxes senior executives paid on their bonuses. Benna knew that by deferring how cash was paid out, employees could reduce the amount of tax they paid.

While Congress had added Section 401(k) to the Internal Revenue Code, which was passed in the fall of 1978, that provision did not become effective until January 1, 1980. Benna’s innovation was adding employer matching contributions and employee pre-tax contributions, neither of which had been included when Section 401(k) was added by Congress. Whether or not those types of contributions could be included, Benna says, was left to Treasury.
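
To make the deferral idea concrete, here is a minimal sketch, with entirely hypothetical salary and tax figures, of the arithmetic Benna was working with: pay routed into the plan pre-tax is not counted as taxable income in the year it is earned.

```python
# Hypothetical illustration of pre-tax deferral: income routed into the plan
# is not taxed in the year it is earned (it is taxed later, on withdrawal).
# All figures below are made up.

def current_year_tax(gross_pay: float, pretax_deferral: float, marginal_rate: float) -> float:
    """Tax owed this year after deferring part of the pay pre-tax."""
    taxable_income = gross_pay - pretax_deferral
    return taxable_income * marginal_rate

pay, bonus, rate = 60_000.0, 5_000.0, 0.30

tax_bonus_as_cash = current_year_tax(pay + bonus, 0.0, rate)
tax_bonus_deferred = current_year_tax(pay + bonus, bonus, rate)

print(f"Tax this year, bonus taken as cash:    ${tax_bonus_as_cash:,.0f}")
print(f"Tax this year, bonus deferred pre-tax: ${tax_bonus_deferred:,.0f}")
# Deferring the $5,000 bonus trims this year's tax bill by $1,500 at a 30% rate;
# the deferred money (plus any employer match) stays invested in the plan instead.
```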

At first, Benna says, lawyers for the bank rejected his pitch, but his company, Johnson, went forward with it in 1981. Other companies found it to be a hard sell. When pitching the idea, Benna was met with stares and comments about how employees didn’t need to save for retirement.

“They were locked into where they were,” Benna says. The mutual fund industry was largely mom and pop operations at the time. “There was a whole industry shift, not only at the employer level, but the financial structure. A lot of it didn’t exist.”

Individual participation in the stock market through 401(k)s helped fuel the go-go days of Wall Street in the 1980s and birthed asset management juggernauts like Fidelity, Vanguard, Pimco, BlackRock, and dozens of others. By the mid-1980s, the mutual fund industry had multiplied many times over, along with the ranks of well-paid professionals on Wall Street peddling the funds and taking high management fees for doing so. More and more Fortune 500 companies began adding the plan and employees poured their assets in.

Today it’s normal for workers to be saddled with arcane mutual fund prospectuses, financial jargon, and potentially conflicted sources of advice when they sign up for their company 401(k)s. But the system has also hurt them.

During the Great Recession of 2008, the average 401(k) balance lost anywhere from 25 to 40 percent of its value. Nobody was harmed more than baby boomers and recent retirees, who, unlike younger workers, didn’t have time for the market to rebound, or were no longer contributing and therefore unable to invest when stocks were cheap. Investors were left without money, but the money management firms still collected their management fees.

“Too many people had the highest risk exposure during their working careers at the wrong time,” Benna says. “These people took a hit they’re never going to recover from. They’re told to hang in there and be alright, but I’ve gone through the math. They may not recover and may not be alright.”

Today, Benna, perhaps ironically, isn’t really retired. He continues to offer retirement advice to individuals and companies and is staggered by his offbeat place in financial history.

“It was never intended to be what it is today,” says Benna. “It wasn’t expected to be a big thing.”


Meet the man who helped create one of the best public pension plans in America


As 80-year-old Gary Gates trundles along the streets of Wisconsin’s capital on a bike half his age, he stops at dumpsters to pick up cans.

It’s a habit the former head of Wisconsin’s public pension fund has kept up for decades. In the first year of his retirement, 30 years ago, Gates collected and recycled around 250,000 cans — a record haul he ascribed to his need to keep his hands busy. He said it has been a perennial itch ever since he was a child at the beach, picking up refuse left by other vacationers.

These traits — being disciplined and public-minded — encapsulate Wisconsin’s approach to managing the pensions of government employees, and help explain why Gates was one of the people who helped design the Wisconsin pension fund in its modern form. He and others turned it into the only state pension plan capable of fully delivering on its retirement pledges to schoolteachers, firefighters and other government employees.

“Wisconsin by the numbers is among the five best funded systems in the country, but what sets it apart even further is just how thoughtful the state has been on paying what has come due, and in laying out policies that plan for the future. In that regard, it’s as good as it gets for a state-run public pension system,” said Greg Mennis, director of public sector retirement systems project at the Pew Charitable Trusts.

For pension experts, there is plenty to learn from Wisconsin, especially when a decadelong bull market and an extended economic expansion have yet to replenish the funding levels of state retirement plans left devastated by the global financial crisis of 2008.

Analysts say poorly funded public pension systems in Illinois, Connecticut and New Jersey are ill-prepared to secure the financial well-being of retirees, in part because inadequate contributions and over-optimistic expected returns have allowed their obligations to outrun the growth of their investments.

In contrast, the assets in Wisconsin’s public pension plan amounted to $104 billion in 2017, leaving it 100% funded, a status it has roughly maintained since 2004. In other words, the current value of the plan’s investments could cover nearly all of the retirement benefits owed to its members. This compares with an average funding level of 73% for the year across all states, according to the Public Plans Database.

And it hasn’t cost local taxpayers much to keep Wisconsin’s pension fund topped up. In 2016, Wisconsin’s government spent around 2.13% of the state’s budget on its public pension fund, much less than the average contribution rate of 4.74% in other states.

According to Gates, Wisconsin’s enviable state of affairs was the result of long-term planning, an unusually public-minded political climate, and the cooperation of many stakeholders.

This favorable backdrop enabled Wisconsin legislators to come up with the steady contributions needed to top up the pension plan and become one of the first states to formally create a shared-risk model that would spread out the risk of financial losses across taxpayers and retirees, an idea that Gates concedes he had helped create.

Avoiding Temptation

Wisconsin’s continued success is notable because higher funding levels didn’t tempt state lawmakers to skimp on pension payments to make more space in the budget for their own policies.

Some states, like California, have failed to add to their public pension funds, often relying on unrealistic returns in financial markets to make up for the shortfall, and to artificially inflate their funding status.

The Society of Actuaries said in a February 2019 report that most municipal and state pension plans did not receive enough contributions from local government budgets to reduce their level of unfunded liabilities.

In that regard, Wisconsin’s discipline in making its steady contributions, even during Wall Street’s lean years, has played a key role in its success.

“For the most part, pension underfunding isn’t about investment losses. It’s mostly about the state making payments, as scheduled, into the pension fund,” said Matt Fabian, a municipal bond analyst at Municipal Market Analytics.

There have been momentary hiccups. In 1987, Republican Gov. Tommy Thompson’s administration and Democratic lawmakers raided the public pension fund, only for the Wisconsin Supreme Court to order them to return the funds a decade later.

Yet Wisconsin’s pension fund has mostly managed to stick to its payment schedule.

“It’s one of the feathers in Wisconsin’s cap,” said Bob Lang, director of Wisconsin’s nonpartisan Legislative Fiscal Bureau, who worked with Gates when he was still running the Employee Trust Funds.

It’s an approach Gates follows in his own personal finances. His pension payments amounted to less than half of his annual income, as he had put a large chunk of his salary into tax-deferred annuities, individual retirement accounts, and deferred compensation programs. That still left him a third of his income to donate to charity.

Gary (right) with his brother Bill.

Gates said he’s still not sure why Wisconsin was able to stay on task. But he suspects it has something to do with the strong spirit of public service undergirding Wisconsin politics since the late 19th century, pushing lawmakers in the state house to think long-term.

“Regardless of political persuasion, regardless of who’s been in the governor’s office for many, many decades, Wisconsin’s legislatures and governors have made sure the commitments to contributions to the pension fund were met,” said Lang.

Moreover, the Badger State, as Wisconsin is called, has a long history of partnerships between governments and academics as part of the so-called Wisconsin Idea, which ensured local politicians were well-advised of the benefits of staying on schedule.

“There just happened to be a bunch of people committed to the idea. I happened to be a beneficiary. The concept of long-term funding was all there,” Gates said.

It’s perhaps why Gates, someone who showed the sharp attention to detail needed to help come up with the risk-sharing mechanism, doesn’t think technical solutions are as important as the willingness of policy makers to pursue the long-term good.

“The orientation today is much more about what can I do tomorrow, so I benefit right now. The people at the time had a more long-term orientation. How do you develop that, I’m not sure,” he said.

At least for Gates, his wish was to leave things better than others had left them.

Trailblazer of the shared-risk model

At the center of Wisconsin’s pension system is a shared-risk model, a hybrid between defined-benefit plans that leave state taxpayers saddled with the risk of investment losses and defined-contribution plans like a 401(k) that make workers ultimately responsible for their retirement benefits.

One of the first to introduce the idea, Wisconsin pushed toward a risk-sharing public retirement system in the 1970s, when the state’s existing mishmash of pension plans was consolidated into a single entity now known as the Department of Employee Trust Funds.

Ahead of the merger, Gates, then the deputy head of Wisconsin’s pension plan, and Max Sullivan, its former head, came up with the risk-sharing mechanism, according to the Milwaukee Journal Sentinel.

This is how it works: On top of a minimum payment based on an employee’s years of service and the salary reached at retirement, the pension plan gives an additional dividend to individuals that automatically waxes and wanes every year based on how its assets perform.

Much like a “Swiss watch that winds itself”, the Employee Trust Funds corrected itself whenever it was off-balance, according to Jon Stein, a reporter at the Milwaukee Journal Sentinel.

If the plan’s returns exceed the 5% expected average over a five-year stretch, members of the Employee Trust Funds receive an additional annuity, based on the level of outperformance. But when the pension fund’s returns fall below this floor over the five years, the state government has to make up for the shortfall.
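
As a rough illustration of how such a self-correcting dividend can work, here is a small sketch. The 5% assumed return and the five-year smoothing come from the description above; the exact dividend formula is an assumption for illustration, not Wisconsin’s actual rule.

```python
# Simplified sketch of the self-adjusting dividend. The 5% assumed return and
# the five-year smoothing window come from the article; the dividend formula
# itself is illustrative, not Wisconsin's actual rule.

ASSUMED_RETURN = 0.05   # the expected average annual return (the "floor")
WINDOW = 5              # performance is judged over a five-year stretch

def annual_payment(base_annuity: float, recent_returns: list) -> float:
    """Base annuity plus a dividend that waxes and wanes with smoothed returns."""
    window = recent_returns[-WINDOW:]
    avg_return = sum(window) / len(window)
    outperformance = avg_return - ASSUMED_RETURN
    # The dividend grows with outperformance and shrinks toward zero in lean
    # years, so the base benefit itself never has to be cut by lawmakers.
    dividend = max(0.0, base_annuity * outperformance)
    return base_annuity + dividend

# Hypothetical retiree with a $30,000 base annuity:
print(annual_payment(30_000, [0.08, 0.07, 0.09, 0.06, 0.10]))  # strong stretch: dividend paid
print(annual_payment(30_000, [0.02, 0.01, 0.04, 0.03, 0.02]))  # weak stretch: dividend falls away
```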

Yet Wisconsin taxpayers haven’t had to worry about taking up the slack even during the financial crisis.

Wisconsin’s pension fund didn’t fall below this 5% average return. In the five years after the 2007-2009 recession, the annuities edged ever closer to the minimum floor, only for the bull market in equities to drag average investment returns back higher.

In other words, the pension plan would gradually cut back on these dividends unbidden, without the input of lawmakers or state officials, instead of keeping up the same level of pension payments during bad years for financial markets and jeopardizing the retirement system’s health.

Shared-risk features have become increasingly common across state pension funds. Since 2008, more than half of the states in the U.S. have moved to implement some sort of risk-sharing element in their pension plans, up from a handful before the financial crisis, according to the National Association of State Retirement Administrators, or NASRA.

Though shared-risk plans can come in different flavors, they all try to achieve the same goal: ensuring the financial losses that befall a public pension plan are spread out among taxpayers and retirees.

Take Maryland. As long as the pension fund’s returns exceed the expected rate, retirees receive an additional fixed 2.5% annuity to offset inflation. But when the fund underperforms, this annuity is capped at 1%.
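
In code, a Maryland-style adjustment reduces to a simple two-branch rule; the 2.5% and 1% figures come from the paragraph above, and everything else here is illustrative.

```python
# Toy version of the Maryland-style rule above: a 2.5% cost-of-living adjustment
# when the fund beats its expected return, capped at 1% when it falls short.
# The percentages mirror the text; the function itself is only illustrative.

def maryland_style_cola(fund_return: float, expected_return: float) -> float:
    """Annual cost-of-living adjustment applied to retirees' benefits."""
    return 0.025 if fund_return >= expected_return else 0.01

print(maryland_style_cola(0.09, 0.07))  # good year -> 0.025 (full 2.5%)
print(maryland_style_cola(0.03, 0.07))  # bad year  -> 0.01  (capped at 1%)
```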

“The fundamental purpose behind these risk sharing plan features is to create some relief valve or some self-adjusting mechanism to financial and actuarial events,” said Keith Brainard, research director for NASRA.

This means instead of discussing how a pension plan deals with a funding deficit after the market slumps, when the political will to maintain contributions or cut retirement benefits is weak, states can rely on a course of action that has been laid out far in advance.

Brainard said there hasn’t been enough time to assess how much funding levels improve when a pension plan introduces a shared-risk model, a relatively new feature of the public pension landscape.

Still, he says it’s not an accident that two of the U.S.’s only fully-funded state pension systems — South Dakota and Wisconsin — have risk-sharing features.

That’s something Gates helped pioneer with Wisconsin’s pension fund.

Before Gates retired, he said he wanted this inscription on his tombstone — “He kept the trust.”


JEAN MARIUS | THE MAN WHO INVENTED THE MODERN UMBRELLA

Today we're taking you to France in the early 1700s to meet a master craftsman who was to change the history of the umbrella forever. It was the reign of Louis XIV; umbrellas existed, but they were heavy and awkward and therefore not in general use.

Monsieur Jean Marius, a master purse maker from the barrière de Saint-Honoré in Paris, noted that on rainy days the fashionable wigs of many female customers visiting his shop would be ruined. He realised that umbrellas were the answer, but that in their current form fashionable ladies would not dare to be seen carrying them when not in use.

After many attempts, in 1709 Marius created a pocket parasol. It weighed less than one kilogram and had folding ribs, so it could be folded up and stored in a sheath like a modern umbrella. It also had a jointed shaft that could be dismantled into three sections, making it small enough to be discreetly carried.

Critically, as a purse maker he understood that to gain acceptance his umbrella needed to be elegant as well as practical, so he introduced beautiful colours to complement a lady's outfit, along with fancy edging material.

Louis XIV, an avid wearer of elaborate wigs, immediately appreciated the invention and awarded Marius a royal privilege. This meant that every such umbrella made in France for the next five years had to carry his trademark.

In a letter of 18 June 1712, the Princess Palatine mentioned Marius's invention, the 'expeditious parasol-umbrella that can be carried everywhere, in case you are caught in the rain while out walking'. She enthused about it to her aristocratic friends, and soon every sophisticated Parisian was seen carrying a chic parapluie.

So next time you feel that first spot of rain and reach for your trusty compact umbrella, spare a thought for the man whose persistence is about to save you from getting wet.


An Inside Job

Pressures both internal and external, however, would soon put an end to the order’s expansion into the upper echelons of Bavarian power. Weishaupt and Knigge increasingly fought over the aims and procedures of the order, a conflict that, in the end, forced Knigge to leave the society. At the same time, another ex-member, Joseph Utzschneider, wrote a letter to the Grand Duchess of Bavaria, supposedly lifting the lid on this most secret of societies.

The revelations were a mix of truth and lies. According to Utzschneider, the Illuminati believed that suicide was legitimate, that its enemies should be poisoned, and that religion was an absurdity. He also suggested that the Illuminati were conspiring against Bavaria on behalf of Austria. Having been warned by his wife, the Duke-Elector of Bavaria issued an edict in June 1784 banning the creation of any kind of society not previously authorized by law.

The Illuminati initially thought that this general prohibition would not directly affect them. But just under a year later, in March 1785, the Bavarian sovereign passed a second edict, which expressly banned the order. In the course of carrying out arrests of members, Bavarian police found highly compromising documents, including a defense of suicide and atheism, a plan to create a female branch of the order, invisible ink recipes, and medical instructions for carrying out abortions. The evidence was used as the basis for accusing the order of conspiring against religion and the state. In August 1787, the duke-elector issued a third edict in which he confirmed that the order was prohibited, and imposed the death penalty for membership.

Weishaupt lost his post at the University of Ingolstadt and was banished. He lived the rest of his life in Gotha in Saxony where he taught philosophy at the University of Göttingen. The Bavarian state considered the Illuminati dismantled.

Their legacy, however, has endured and fuels many conspiracy theories. Weishaupt was accused—falsely—of helping to plot the French Revolution. The Illuminati have been fingered in recent events, such as the assassination of John F. Kennedy. Weishaupt’s ideas have also influenced the realms of popular fiction, such as Dan Brown’s Angels & Demons and Foucault’s Pendulum by Italian novelist Umberto Eco. Although his group was disbanded, Weishaupt’s lasting contribution may be the idea that secret societies linger behind the scenes, pulling the levers of power.

The Ascent to Illumination

First Class
Each novice was initiated in humanitarian philosophy until he became a minerval. He then received the order’s statutes and could attend meetings.

1. Initiate
2. Novice
3. Minerval
4. Illuminatus Minor

Second Class
The various degrees in this class were inspired by Freemasonry. The illuminatus major supervised recruitment, and the illuminatus dirigens presided over the minervals’ meetings.

5. Apprentice
6. Fellow
7. Master
8. Illuminatus Major
9. Illuminatus Dirigens

Third Class
The highest degree of philosophical illumination. Its members were priests who instructed lower-degree members. The lower orders of this class were themselves under the authority of a king.


MEET THE MAN WHO MADE MIAMI MODERN

When Norman Giller became an architect in 1944, he never meant to make history. He never expected to see his buildings commemorated in an exhibition, never thought his designs would be described by a name that sums up not just a style of architecture but a uniquely South Floridian vision of life.

It has come to be known as Miami Modern, and its extroverted extravagance symbolizes an entire period. There were others who designed bold, sweeping buildings for a bold, sweeping age, but Giller is the last major figure of an era of architectural exuberance that has few parallels in our history. He is the grand master of South Florida architecture, and at long last he is getting his due.

"He was a big part of creating the style of Miami Modern," says Randall Robinson, a planner with the Miami Beach Community Development Corporation and a leading preservationist of South Florida architecture. "He was enormously important."

On Tuesday, Robinson will lead "A Conversation with Norman Giller" at the Seymour, a converted Art Deco apartment building on South Beach. Photographs of some of his buildings, along with others in the Miami Modern movement, are on display at the Seymour through Dec. 16. On Nov. 12, Giller will be honored with a gala luncheon at the Eden Roc Hotel -- a splendid example of Miami Modern by Morris Lapidus -- in Miami Beach.

Giller doesn't wear a cape, like Frank Lloyd Wright; he doesn't pretend to change the world through architecture, like Mies van der Rohe; and he doesn't dwell theatrically on perceived slights, like Lapidus, his friend and longtime Miami Beach rival. All Giller has done is design one building after another, and from time to time the lines on his blueprints became magic.

"Architecture is more than drawing a pretty picture," he says. "Architecture is an art and a science and a business."

It also reflects the aspirations of its time. When Miami Modern -- MiMo for short -- flourished, from 1945 until the early 1970s, South Florida was a place of whimsy, effervescence and hope.

"Modern and fun at the same time" is how Robinson, who coined the term, defines MiMo. "Modern architecture is supposed to be quite spare and serious. But here in South Florida it had to have an element of fun about it."

Buildings often had flamboyant shapes, with strong horizontal lines, wide overhangs and cantilevered walkways -- which Giller invented, by the way. They borrowed from the cars and coffee tables of the time, with wild boomerang-shaped awnings, curving walls, a lot of glass, aluminum and poured concrete and a gravity-defying, space-age exuberance. Architects reshaped the square glass boxes of Mies, which dominated mid-century architecture everywhere else, yet many of the most noteworthy buildings were small and utilitarian.

With the Ocean Palm, shabby but still standing on Collins Avenue in Sunny Isles, Giller designed the world's first two-story motel in 1950. He was a pioneer in using air conditioning, PVC pipe and other technological advances. He designed two of the most majestic hotels of South Florida luxe, the Carillon in Miami Beach and the original Diplomat in Hollywood. In the late 1950s, Giller had the 10th-largest architectural firm in the United States. He had six offices in Central and South America, traveled throughout Canada and Europe and worked from Pensacola to Key West in Florida. By his own count, he designed 11,000 buildings in all, worth more than $100 million.

Norman Giller has been in Miami Beach since 1929, but he still retains the soft, unhurried drawl of his native Jacksonville. He and his family lived at the southern tip of South Beach because, as he says calmly, "Prior to 1935, Jewish people could not live north of Fifth Street."

He is now 83 years old and still goes to work every day at Giller and Giller on the top floor of the Giller Building on Arthur Godfrey Road, 36 blocks north of Fifth Street. His son Ira is the current president of the firm and has been guided by his father's example his whole life.

"We would drive up Collins Avenue," Ira recalls, "and one of the sports for three young kids was to pick out which buildings Daddy designed."

They have worked side by side for 30 years.

"I've described him as a Renaissance man," says Ira. "He's a very multifaceted architect, but his interests go beyond architecture."

Giller is the founding president of a national bank (now part of Jefferson Bank); he has written two books, one about his career, the other a genealogical study of his family; he helped integrate the Boy Scouts in Miami; he was a founder of the Sanford L. Ziff Jewish Museum in Miami Beach, of which he is chairman of the board; and his company renovated an old synagogue to house the museum.

Before joining the Navy during World War II, Giller worked for the Army Corps of Engineers, where he laid out 75 airports in Florida, including Miami International. He designed military bases with curvilinear roads and camouflaged buildings. A base in Boca Raton became the campus of Florida Atlantic University, for which Giller later designed several buildings. After the war, he came home to Miami Beach and rode the long postwar boom.

"It was a good time," he says. "We were building everything -- apartment houses, private homes, stores."

And, of course, there were the hotels. After the long privations of the Depression and World War II, South Florida and Las Vegas emerged as the nation's playgrounds of unabashed self-indulgence.

"Would you care to guess how many large hotels were built in the 1950s?" asks Don Worth, vice chair of the Urban Arts Committee, which is organizing the Giller celebration. "They're common around here, but everywhere else outside of Las Vegas and Miami Beach, only eight large hotels were built in the United States."

Every year through the 1950s, a new hotel went up along Miami Beach, each more glamorous and ostentatious than the last. Lapidus was building the Fontainebleau, Eden Roc and Americana, while Giller was designing the Singapore, the Carillon and, in distant, undeveloped Hollywood, the Diplomat.

"The ones he's most proud of," says Ira Giller, "are the Diplomat and the Carillon. The Diplomat was the Fontainebleau of Broward County."

Three years ago, the 18-story Diplomat was torn down to make way for a taller, more expensive replacement.

"I was invited to watch my building being destroyed," says Giller. "I said, 'No, thank you.'"

When the Carillon opened in 1957, it was the second largest hotel on Miami Beach, next to Lapidus' masterpiece, the Fontainebleau. Occupying a full block on Collins Avenue between 68th and 69th streets, it presented a different face from every side. It had walls of glass and aluminum, with a ballroom topped by a parapet of concrete folded like an accordion.

Vacant for years, it is still standing, even though its current owner has done nothing to restore it and has hinted that it might soon be demolished. Because it lies outside the official Art Deco district, the Carillon is not subject to the strict historic preservation laws of Miami Beach.

"Its life definitely hangs in the balance," says preservation expert Randall Robinson. "The Carillon is the high point of Miami Beach modern architecture. It is to Miami Modern what the Chrysler building is to New York Deco."

Demolition is a cold fact of architecture that Giller stoically accepts.

"It does bother you," he says, though he refuses to romanticize his own work. But he knows better than anyone that under the concrete and steel, it's the touch of the human mind and heart that give a building its soul.


Meet the Man Who Invented the Instructions for the Internet


Steve Crocker was there when the internet was born. The date was Oct. 29, 1969, and the place was the University of California, Los Angeles. Crocker was among a small group of UCLA researchers who sent the first message between the first two nodes of the ARPAnet, the U.S. Department of Defense–funded network that eventually morphed into the modern internet.

Crocker's biggest contribution to the project was the creation of the Request for Comments, or RFC. Shared among the various research institutions building the ARPAnet, these were documents that sought to describe how this massive network would work, and they were essential to its evolution -- so essential, they're still used today.

Like the RFCs, Crocker is still a vital part of the modern internet. He's the chairman of the board of ICANN, the organization that operates the internet's domain name system, following in the footsteps of his old high school and UCLA buddy Vint Cerf. And like Cerf, Crocker is part of the inaugural class inducted into the Internet Society's (ISOC) Hall of Fame.

This week, he spoke with Wired about the first internet transmission, the creation of the RFCs, and their place in history. 'RFC' is now included in the Oxford English Dictionary. And so is Steve Crocker.

Wired: Some say the internet was born on Oct. 29, 1969, when the first message was sent between UCLA and the Stanford Research Institute (SRI). But others say it actually arrived a few weeks earlier, when UCLA set up its ARPAnet machines. You were there. Which is it?

Steve Crocker: October. The very first attempt to get some communication between our machine, a Sigma 7, and [Douglas] Engelbart's machine, an SDS-940, at SRI.

We tried to log in [to the SRI machine]. We had a very simple terminal protocol so that you could act like you were a terminal at our end and log in to their machine. But the software had a small bug in it. We sent the 'l' and the 'o,' but the 'g' caused a crash.

Their system had the sophistication that if you started typing a command and you got to the point where there was no other possibility, it would finish the command for you. So when you typed 'l-o-g,' it would respond with the full word: 'l-o-g-i-n.' But the software that we had ginned up wasn't expecting more than one character to ever come back. The 'l' was typed, and we got an 'l' back. The 'o' was typed, and we got an 'o' back. But the 'g' was typed, and it wasn't expecting the 'g-i-n.' A simple problem. Easily fixed.
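
For readers who think in code, here is a toy reenactment of the mismatch Crocker describes. The function names and logic are invented for illustration and bear no resemblance to the actual 1969 software: one side echoes more characters than the other side is prepared to receive.

```python
# Toy reenactment of the bug described above: the UCLA side assumed exactly one
# echoed character would come back per character sent, while the SRI system
# auto-completed "log" into "login" and sent several characters at once.
# Everything here (names, logic) is invented for illustration.

def sri_host(received_so_far: str) -> str:
    """Echo each character; once 'log' has arrived, auto-complete to 'login'."""
    if received_so_far.endswith("log"):
        return "gin"              # the 'g' echo plus the auto-completed "in"
    return received_so_far[-1]    # normal one-character echo

def ucla_terminal(command: str) -> None:
    sent = ""
    for ch in command:
        sent += ch
        echo = sri_host(sent)
        # The real software "wasn't expecting more than one character to ever come back."
        if len(echo) != 1:
            raise RuntimeError(f"crash: expected 1 character back, got {echo!r}")
        print(f"sent {ch!r}, received {echo!r}")

ucla_terminal("login")   # the 'l' and the 'o' echo fine; the 'g' triggers the crash
```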

Wired: And the internet was born?

Crocker: Some say that this was a single network and therefore not 'the internet.' The ARPAnet was all one kind of router, and it didn't interconnect with other networks. Some people say that the internet was created when multiple networks were connected to each other -- that the IP [internet protocol] and TCP [transmission control protocol] work on top of that were instrumental in creating the internet.

The people who worked at that layer, particularly Vint Cerf and Bob Kahn [the inventors of IP and TCP], tend to make a careful distinction between the ARPAnet and the later expansion into multiple networks, and they mark the birth of the internet from that later point.

But, conversely, the basic design of protocol layers and documentation and much of the upper structure was done as part of the ARPAnet and continued without much modification as the internet came into being. So, from the user point of view, Telnet, FTP, and e-mail and so forth were all born early on, on the ARPAnet, and from that point of view, the expansion to the internet was close to seamless. You can trace the birth of the internet back to the ARPAnet.

Wired: It was before that first ARPAnet transmission that you started the Requests for Comments. They helped make that transmission possible?

Crocker: The people at ARPA [the Department of Defense's Advanced Research Projects Agency, later called DARPA] had a formal contract with Bolt, Beranek, and Newman [or BBN, a Boston-based government contractor] for the creation of the routers, and they had a formal contract with AT&T for the leased lines that would carry the bits between the routers, across the country. But they had no formal plan, or formal paperwork, for the nodes that would be connected to the network.

What they had instead was a captive set of research operations that they were already funding. The first four [nodes on the ARPAnet: UCLA, SRI, University of California, Santa Barbara, and the University of Utah] and all of the other places that would play a part in those early days were places that were already doing research with ARPA money.

These were pre-existing projects of one sort or another. Graphics. Artificial intelligence. Machine architectures. Big database machines. All the key topics of the day. Douglas Engelbart's work at SRI was focused on human-machine interaction. He had an early version of a mouse and hypertext working in his laboratory, for example.

So, the heads of each of these projects were busy with their own agendas, and here comes this network -- which was kind of foisted on them, in a way. Not unwillingly, but not with any kind of formality either. So, basically, they delegated the attention to this project down to the next level. In the case of the university projects, that meant graduate students, and in the case of SRI, that meant staff members below the principal investigator level.

Somebody called a meeting in August of 1968, and a few of us came from each of these places ... on the order of a dozen or fewer people. Vint and I drove up from L.A. to Santa Barbara, where the meeting was held, and met our counterparts. And the main thing that happened was that we realized we were asking the same questions and that we had some commonality in our technical backgrounds and our sense of what should be done -- but there wasn't a lot of definition to it.

So we made one of the more important decisions, which was to go visit each other's laboratories and to keep talking to each other. And we understood the irony that this network was supposed to reduce travel and the first thing we did was increase travel.

"Late one night, I couldn't sleep, and the only place I could work without waking people up was in the bathroom. It was 3 a.m., and I scribbled down some rules for these notes"

Over the next several months, from August 1968 to spring 1969, we had a series of meetings where we visited each other's labs, and we also had kind of freeform discussions on what we might do with this network -- how it might develop. We didn't have a detailed specification of how the IMPs [interface message processors] were going to be connected to the hosts.

When we started, BBN hadn't actually been selected. I think they were selected and got started on the first of January 1969. Some of us went out to meet them in Boston in the middle of February 1969, in the middle of a large snowstorm. But they didn't publish a detailed specification of how you connect a host to an IMP until later that spring. So [the researchers] had this time from when we first met each other to the time we had a detailed spec in which we could speculate and focus on the larger issues without having to narrow down into 'this bit has to go here' and 'this wire has to go there,' and we started to sketch out some key ideas.

There was no senior leadership. There were no professors. There was no adult in the room, as it were. We were all more or less in our mid-20s and self-organized. Out of that emerged ... a strong sense that we couldn't nail down everything. We had to be very ginger about what we specified and leave others to build on top of it. So we tried to focus on an architecture that had very thin layers that you could build on top of -- or go around.


A few tips

Benna praises the 401(k)’s ability to turn spenders into savers. “It turns spenders into savers by making saving the first priority. And most of us, including me at the time, would never have accumulated what you do with a 401(k), you know, if you had to do it on your own every paycheck.”
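
A back-of-the-envelope sketch, using entirely hypothetical pay, match, and return figures, shows why taking the savings off the top of every paycheck adds up the way Benna describes.

```python
# Back-of-the-envelope sketch of per-paycheck saving with an employer match and
# compound growth. Salary, rates, and return below are hypothetical, and the
# compounding is deliberately simplistic.

def projected_balance(annual_pay: float, save_rate: float, match_rate: float,
                      annual_return: float, years: int, paychecks_per_year: int = 26) -> float:
    """Grow a 401(k)-style balance from contributions taken off the top of each paycheck."""
    per_check = annual_pay / paychecks_per_year * (save_rate + match_rate)
    per_period_return = annual_return / paychecks_per_year
    balance = 0.0
    for _ in range(years * paychecks_per_year):
        balance = balance * (1 + per_period_return) + per_check
    return balance

# Saving 6% of a $50,000 salary with a 3% employer match, assuming 6% annual growth:
print(f"After 30 years: ${projected_balance(50_000, 0.06, 0.03, 0.06, 30):,.0f}")
```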

Yet roughly 50% of the working population in the U.S. lacks access to an employer-based retirement plan. The Center for Retirement Research says, “Only about half of workers, at any moment in time, participate in either a defined benefit plan or a 401(k) plan.”

It’s one of the reasons Benna advocates for new laws that would require employees be automatically enrolled in retirement savings plans. “That would ratchet up the participation level for those who don’t have plans,” he predicted.

Benna is also calling for a new mandate to require that all employers offer “some form of payroll deduction savings arrangement to help their employees save for retirement.”

Benna says small businesses, those with 100 or fewer employees, could create low-fee retirement plans.

“It's a matter of small businesses realizing that there are great opportunities other than 401(k),” he said. “You know, [for] many of the small employers, 401(k) is not the right answer, and they legally have other alternatives.”

One of them is the SIMPLE IRA, which allows employees and employers to contribute, through payroll deductions, to traditional IRAs. “It is ideally suited as a startup retirement savings plan for small employers not currently sponsoring a retirement plan,” according to the IRS.


Dennis Ritchie

Dennis Ritchie was born in Bronxville, New York. His father was Alistair E. Ritchie, a longtime Bell Labs scientist and co-author of The Design of Switching Circuits [7] on switching circuit theory. [8] As a child, Dennis moved with his family to Summit, New Jersey, where he graduated from Summit High School. [9] He graduated from Harvard University with degrees in physics and applied mathematics. [8]

In 1967, Ritchie began working at the Bell Labs Computing Sciences Research Center, and in 1968, he defended his PhD thesis on "Computational Complexity and Program Structure" at Harvard under the supervision of Patrick C. Fischer. However, Ritchie never officially received his PhD degree, as he did not submit a bound copy of his dissertation to the Harvard library, a requirement for the degree. [10] [11] In 2020, the Computer History Museum worked with Ritchie's family and Fischer's family and found a copy of the lost dissertation. [11]

During the 1960s, Ritchie and Ken Thompson worked on the Multics operating system at Bell Labs. Thompson then found an old PDP-7 machine and developed his own application programs and operating system from scratch, aided by Ritchie and others. In 1970, Brian Kernighan suggested the name "Unix", a pun on the name "Multics". [12] To supplement assembly language with a system-level programming language, Thompson created B. Later, B was replaced by C, created by Ritchie, who continued to contribute to the development of Unix and C for many years. [13]

During the 1970s, Ritchie collaborated with James Reeds and Robert Morris on a ciphertext-only attack on the M-209 US cipher machine that could solve messages of at least 2000–2500 letters. [14] Ritchie relates that, after discussions with the NSA, the authors decided not to publish it, as they were told that the principle was applicable to machines still in use by foreign governments. [14]

Ritchie was also involved with the development of the Plan 9 and Inferno operating systems, and the programming language Limbo.

As part of an AT&T restructuring in the mid-1990s, Ritchie was transferred to Lucent Technologies, where he retired in 2007 as head of System Software Research Department. [15]

Ritchie is best known as the creator of the C programming language, a key developer of the Unix operating system, and co-author of the book The C Programming Language; he was the 'R' in K&R (a common reference to the book's authors, Kernighan and Ritchie). Ritchie worked together with Ken Thompson, who is credited with writing the original version of Unix; one of Ritchie's most important contributions to Unix was its porting to different machines and platforms. [16] They were so influential on Research Unix that Doug McIlroy later wrote, "The names of Ritchie and Thompson may safely be assumed to be attached to almost everything not otherwise attributed." [17]

Ritchie liked to emphasize that he was just one member of a group. He suggested that many of the improvements he introduced simply "looked like a good thing to do," and that anyone else in the same place at the same time might have done the same thing.

Nowadays, the C language is widely used in application, operating system, and embedded system development, and its influence is seen in most modern programming languages. C fundamentally changed the way computer programs were written and made it practical for the same program to run on different machines. Much modern software is written in one of C's more evolved dialects: Apple has used Objective-C in macOS (derived from NeXTSTEP), Microsoft uses C#, and Java is used by Android. Ritchie and Thompson used C to write Unix, which has been influential in establishing computing concepts and principles that have since been widely adopted.

In an interview from 1999, Ritchie clarified that he saw Linux and BSD operating systems as a continuation of the basis of the Unix operating system, and as derivatives of Unix: [18]

I think the Linux phenomenon is quite delightful, because it draws so strongly on the basis that Unix provided. Linux seems to be among the healthiest of the direct Unix derivatives, though there are also the various BSD systems as well as the more official offerings from the workstation and mainframe manufacturers.

In the same interview, he stated that he viewed both Unix and Linux as "the continuation of ideas that were started by Ken and me and many others, many years ago." [18]

In 1983, Ritchie and Thompson received the Turing Award "for their development of generic operating systems theory and specifically for the implementation of the UNIX operating system". [19] Ritchie's Turing Award lecture was titled "Reflections on Software Research". [20] In 1990, both Ritchie and Thompson received the IEEE Richard W. Hamming Medal from the Institute of Electrical and Electronics Engineers (IEEE), "for the origination of the UNIX operating system and the C programming language". [21]

In 1997, both Ritchie and Thompson were made Fellows of the Computer History Museum, "for co-creation of the UNIX operating system, and for development of the C programming language." [22]

On April 21, 1999, Thompson and Ritchie jointly received the National Medal of Technology of 1998 from President Bill Clinton for co-inventing the UNIX operating system and the C programming language which, according to the citation for the medal, "led to enormous advances in computer hardware, software, and networking systems and stimulated growth of an entire industry, thereby enhancing American leadership in the Information Age". [23] [24]

In 2005, the Industrial Research Institute awarded Ritchie its Achievement Award in recognition of his contribution to science and technology, and to society generally, with his development of the Unix operating system. [25]

In 2011, Ritchie, along with Thompson, was awarded the Japan Prize for Information and Communications for his work in the development of the Unix operating system. [26]

Ritchie was found dead on October 12, 2011, at the age of 70 at his home in Berkeley Heights, New Jersey, where he lived alone. [1] First news of his death came from his former colleague, Rob Pike. [2] [3] The cause and exact time of death have not been disclosed. [27] He had been in frail health for several years following treatment for prostate cancer and heart disease. [1] [2] [28] [29] News of Ritchie's death was largely overshadowed by the media coverage of the death of Apple co-founder Steve Jobs, which occurred the week before. [30]

Following Ritchie's death, computer historian Paul E. Ceruzzi stated: [31]

Ritchie was under the radar. His name was not a household name at all, but ... if you had a microscope and could look in a computer, you'd see his work everywhere inside.

In an interview shortly after Ritchie's death, longtime colleague Brian Kernighan said Ritchie never expected C to be so significant. [32] Kernighan told The New York Times, "The tools that Dennis built—and their direct descendants—run pretty much everything today." [33] Kernighan reminded readers of how important a role C and Unix had played in the development of later high-profile projects, such as the iPhone. [34] [35] Other testimonials to his influence followed. [36] [37] [38] [39]

Reflecting upon his death, a commentator compared the relative importance of Steve Jobs and Ritchie, concluding that "[Ritchie's] work played a key role in spawning the technological revolution of the last forty years—including technology on which Apple went on to build its fortune." [40] Another commentator said, "Ritchie, on the other hand, invented and co-invented two key software technologies which make up the DNA of effectively every single computer software product we use directly or even indirectly in the modern age. It sounds like a wild claim, but it really is true." [41] Another said, "many in computer science and related fields knew of Ritchie’s importance to the growth and development of, well, everything to do with computing. " [42]

The Fedora 16 Linux distribution, which was released about a month after he died, was dedicated to his memory. [43] FreeBSD 9.0, released January 12, 2012, was also dedicated to his memory. [44]

Asteroid 294727 Dennisritchie, discovered by astronomers Tom Glinos and David H. Levy in 2008, was named in his memory. [45] The official naming citation was published by the Minor Planet Center on 7 February 2012 (M.P.C. 78272). [46]

Ritchie engaged in conversation in a chalet in the mountains surrounding Salt Lake City at the 1984 Usenix conference.

At the same Usenix 1984 conference, Dennis Ritchie is visible in the middle, wearing a striped sweater, behind Steven Bellovin wearing a baseball cap.

Ritchie was the author of or a contributor to about 50 academic papers, books and textbooks, which have had over 15,000 citations. [48]


September 30, 2016

Meet Henry T. Sampson -- The Man Who Created the First Cell Phone Back in 1971

What do Bill Gates, Steve Jobs, and Henry T. Sampson all have in common? They are all geniuses who pushed technology into the next century. We all know that Bill Gates founded Microsoft, which became the world's largest PC software company, and that Steve Jobs was an information technology entrepreneur and inventor who co-founded Apple Inc. and served as its chairman and CEO. But who is Henry T. Sampson?
Confirmed on Wikipedia

Although many people are not aware of it, Henry T. Sampson is an African American inventor, best known for creating the world's very first cell phone. Information about him on Wikipedia states: "On July 6, 1971, he was awarded a patent, with George H. Miley, for a gamma-electrical cell, a device that produces a high voltage from radiation sources, primarily gamma radiation, with proposed goals of generating auxiliary power from the shielding of a nuclear reactor."

On April 3, 1973, using the patented technology they created, Motorola engineer Marty Cooper placed the first public call from a cell phone, a real handheld portable cell phone.

Sampson, a native of Jackson, Mississippi, received a bachelor's degree in science from Purdue University in 1956 and graduated with an MS degree in engineering from the University of California, Los Angeles, in 1961.

He was also the first African American student to earn a Ph.D. in Nuclear Engineering in the United States, from the University of Illinois Urbana-Champaign in 1967.

He made cell phones possible

Because of Sampson's invention and patent of the gamma-electrical cell, portable cell phones were possible, using radio waves to transmit and receive audio signals. It literally changed the world, and the way in which we communicate in our professional and personal lives.

So, when you think of communication technology giants, make sure you also mention Dr. Henry T. Sampson!


What Is A 'Millennial' Anyway? Meet The Man Who Coined The Phrase

When Neil Howe and William Strauss coined the term Millennial in 1991, they weren’t sure it would stick.

The historians introduced the phrase in their book “Generations,” which charts American history through a series of cohort biographies. The pair demonstrated a predictable cycle in which generational personalities form in opposition to their immediate predecessors but share significant traits with groups they may never meet. People born between around 1980 and 2000, for example, share many traits with the group born from around 1900 to the mid-1920s. Howe and Strauss called this the G.I. generation, but it's more commonly known as the Greatest Generation.

When the book came out, generational study was not in vogue. As a result, the group we now call Gen X didn’t have a name, even as the oldest of the people born after the Baby Boom were about to turn 30. Howe and Strauss called them 13ers because they were the 13th generation since Benjamin Franklin. Clearly society hadn't given much thought to what it would call the group that came next.

More than two decades later, Howe explains that they chose Millennial because their research made it clear this generation, just eight years old at the time, would be drastically different from the one before and therefore needed a distinct name. Plus, the oldest of them would graduate high school in 2000, a date that loomed large in the '90s.

To be fair, Howe believes every generation is distinct from the one before. (“If every generation were just an exaggerated version of the generation that came before it, civilization would have gone off a cliff thousands of years ago.”) But kids of the day were being raised with so much structure and protection compared to the generations that immediately preceded them -- both Gen X and their mostly Baby Boomer parents -- that they were destined to leave a very different mark.

"When you look at the future generationally you begin to see how the future unfolds in non-linear ways," says Howe. "When we saw Millennials as kids being raised so differently, we could already make an easy prediction. We had seen this dark to bright contrast in child upbringing before many times in American history, so we already foresaw that by the time you got to 2000 you would see huge changes in people in their late teens and early 20s." They predicted the crime rate would go down, families would be closer and these 20-somethings would be more risk averse. All of this turned out to be correct.

Judging by a FORBES article published in 1997, the term didn’t take hold immediately, even as the sense that this group was different became popular. “Good-bye to body-piercing, green hair, grunge music and the deliberately uncouth look. Hello to kids who look up to their parents and think bowling is fun,” wrote Dyan Machan. “Whatever the post-Generation-X kids end up being called, it looks like they are going to be a lot different from the generation that precedes them.”

Among scholars, the term began to take off in 1998, with its use in books peaking in 2000. (Google Books data ends in 2008.)

Colloquial use seems to have come later. Google Trends data, which begins in 2004, shows near zero interest in the term as recently as 2005. Searches for Millennial/Millennials grew slowly from there, picking up speed around 2013 and skyrocketing this year.

Just because Millennial has become widely used doesn't mean everyone has accepted it. Many people, or at least a handful of very loud people, hate being dubbed Millennial. They see it as derogatory. Some will stand down upon learning that, technically, Millennial is a term to describe the group of people born between around 1980 and 2000 (the end year is still being determined, and the start varies a year or two depending on who you ask). To others, the term has been too maligned with insults like narcissistic, entitled and lazy to be accepted as neutral.

"One person's narcissism is another person's healthy self esteem," observes Howe when I ask him why some people bristle at the term. Millennials haven't had it that bad when it comes to inter-generational scorn, he argues. We like our parents. Our parents like us. Sure some older workers didn't initially love having us around the office but that's normal. No one wants to be supplanted.

He also points to German demographer Wilhelm Pinder's century-old argument that every generation has three types: the directive, the directed and the suppressed. For this generation, Facebook founder Mark Zuckerberg is a clear member of the first group. Most others fall in line with the trends he leads and therefore fall into the directed camp. Finally, the suppressed fight against the generational persona.

Back in 1991, Howe and Strauss, the latter of whom died in 2007, explained it this way:

"In this book, we describe what we call the 'peer personality' of your generation. You may share many of these attributes, some of them, or almost none of them. Every generation includes all kinds of people. Yet, [...] you and your peers share the same 'age location' in history, and your generation's collective mind-set cannot help but influence you--whether you agree with it or spend a lifetime battling against it."

