The New Millennium Research Council released a study today:
Though it is widely understood that broadband technologies that allow rapid and ‘always on’ connections to the Internet will provide significant benefits to the U.S. economy, this report is the first to estimate the economic benefits to the nation due to cost savings and output expansion resulting from the use of broadband technologies for an important specific sub-group of the U.S. population: the roughly 70 million Americans who are over 65 or under that age but have disabilities. Three types of benefits from broadband deployment and use are addressed: lower medical costs; lower costs of institutionalized living; and additional output generated by more seniors and individuals with disabilities in the labor force. Considered together, these three benefits are estimated to accumulate to at least $927 billion in cost savings and output gains in 2005 dollars (with future benefits discounted for the ‘time value of money’) over the 25-year period, 2005 to 2030. This amount is equivalent to half of what the United States currently spends annually for medical care for all its citizens ($1.8 trillion). As large as these benefits may appear, they are in line with previous estimates for the benefits of broadband for the population as a whole. Policies designed to accelerate the use of broadband for these populations, however, could significantly add to the benefits, by cumulative amounts ranging from $532 billion to $847 billion (depending on the wages earned by the additional working seniors). The policy benefits are as substantial as what the federal government is likely to spend on homeland security over the next 25 years. Total cumulative benefits, under the right set of policies, could exceed what the United States currently spends annually for health care for all its citizens.
Clearly, with so much at stake, policymakers have strong reasons to consider measures to accelerate the deployment and use of broadband technologies for America’s seniors and individuals with disabilities.
when you retire, your second life will be online. i had heard many a commenter mention their time constraints when faced with World of Warcraft or second life. is it unreasonable to expect a bimodal distribution on these platforms in the future? the young and the old certainly have the time. if these systems are able to attract older segments of the population, things will get interesting. actually, they already do.
if we leverage these enormous resources, ideally by making things like the mechanical turk or wikipedia fun for a large part of them, we’ll easily be able to handle pensions and health care for a rapidly aging population, and still have funds left over for many more charity and nonprofit projects than today.
i always believed that a major reason for the bursting of the first bubble was that the internet experience of the average person is riddled with viruses, spyware and spam. it’s hard to overestimate how much this destroyed the trust and interest in all things internet. so maybe part of the appeal of these online worlds is their relative lack of annoyances (surely not for long..). what is needed, therefore, is a massive, probably grassroots, effort to clean up the world’s computers, re-establish a safe browsing experience, and get these people back online. the rest will follow.
will the short and stupid end of the tail destroy the wonderful ontological ecology at del.icio.us? we will shortly know how much conceptual overlap there is between joe sixpack and the digerati. /. was once great, too..
ok, after a couple days of robots.txt love, i now have much less crap in my logs. a good opportunity to see which bots are well-written. based on what i am seeing with /robots.txt, i am sure glad i blocked most of these festering piles of dung from my site.
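for reference, shutting a misbehaving crawler out of the whole site takes only a couple of lines in /robots.txt (the bot name here is just a stand-in for whatever is fouling up your logs):

```
# /robots.txt — block one crawler entirely
User-agent: BadBot
Disallow: /

# everyone else may crawl everything
User-agent: *
Disallow:
```

of course this only works for bots polite enough to honor robots.txt in the first place; the rest need to be blocked at the server level.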
not using conditional get while requesting /robots.txt
Only kinjabot, OnetSzukaj/5.0 and Seekbot/1.0 get this right. All other bots, including google and yahoo, do not. lame.
requesting /robots.txt too often
The biggest offender is VoilaBot, checking /robots.txt every 5 minutes, every day. you gotta be kidding me. google and yahoo are not much better. you’d think they’d have figured out a way by now to communicate the state of /robots.txt across different crawlers. Other bots fare better by virtue of being less desperate.
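a conditional get is not hard to implement, either. a minimal sketch in python of what a polite robots.txt fetcher could do (the url and date are made up):

```python
import urllib.request
import urllib.error

def build_conditional_request(url, last_modified=None):
    """Build a request asking the server to skip the body if unchanged."""
    req = urllib.request.Request(url)
    if last_modified:
        # send back the Last-Modified value from the previous fetch
        req.add_header("If-Modified-Since", last_modified)
    return req

def fetch_robots(url, last_modified=None):
    """Return (body, last_modified); body is None on 304 Not Modified."""
    req = build_conditional_request(url, last_modified)
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.read(), resp.headers.get("Last-Modified")
    except urllib.error.HTTPError as e:
        if e.code == 304:  # unchanged since last fetch, keep the cached copy
            return None, last_modified
        raise
```

a 304 response carries no body, so the crawler saves the transfer and the site saves the bandwidth. that’s the whole trick the big crawlers are skipping.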
update: problems like this are economic opportunities.
i have recently started to use last.fm more frequently, again. i have had an account there since 2003, but had forgotten about it. in the meantime, they have built out an awesome service that works really well. it was a nice surprise to see that they upgraded early adopters to their version of a pro user, for free. this allows me to have my own radio station without lifting a finger, and other nice benefits.
coupled with their sane data policy, this is a clear winner. really useful and relevant, unlike the overhyped silliness that is “podcasting”.
these days, mouse drivers for logitech are 35MB. WTF?
The displacement of population is the crisis that New Orleans faces. It is also a national crisis, because the largest port in the United States cannot function without a city around it. The physical and business processes of a port cannot occur in a ghost town, and right now, that is what New Orleans is. It is not about the facilities, and it is not about the oil. It is about the loss of a city’s population and the paralysis of the largest port in the United States.
george friedman has the best analysis by far. as usual, the MSM offer no insight beyond gory pictures.
meanwhile, alan has set up a comprehensive wiki.
john robb thinks a recession in the us is almost a certainty based on gasoline futures hitting $2.80. where is the basket of SUV manufacturers to short when you need it?
Elected representatives on committees that established policy at the highest level were motivated by base self-interest, expediency, and petty rivalries. They were not only ignorant, but uninterested in educating themselves. Given a choice between saving public money and spending it, they preferred to spend it. Allowed the option of destroying a city or leaving it unscathed, they opted to destroy it. Forced to choose between maximizing human suffering on innocent civilians or minimizing it, they chose to maximize it.
a must-read piece on sam cohen, the inventor of the neutron bomb, which he concluded, quite legitimately, was the most moral weapon ever developed. if history education were designed to prevent the eternal rehashing of mistakes, this is what would be taught. we get to obsess over times and places, instead of explaining the (lack of) thinking behind events that shaped the world. my history education was fairly short on recent developments, and i had to learn about game theory and nuclear deterrence on my own. considering how much they shaped the world we live in, i wish there were more emphasis on them. one way to do that might be to start from the present and work backwards. this would make sure you don’t run out of time just as you get to the present (happened in my high school, for sure), and would put the weight on what is probably most important today. on the other hand, one might argue that in order to understand the present, you need to be more mature, and therefore you are first presented with all these tales about ages past, until you grow up enough to hear the juicy stuff. another option might be to work with the arcs of history (page 4) that philip bobbitt lays out in his excellent the shield of achilles.
spurred on by dani, i gave trac a try for a project i am working on.
Trac is an enhanced wiki and issue tracking system for software development projects. Trac uses a minimalistic approach to web-based software project management. Our mission: to help developers write great software while staying out of the way. Trac should impose as little as possible on a team’s established development process and policies.
It provides an interface to Subversion, an integrated Wiki and convenient report facilities.
Trac allows wiki markup in issue descriptions and commit messages, creating links and seamless references between bugs, tasks, changesets, files and wiki pages. A timeline shows all project events in order, making getting an overview of the project and tracking progress very easy.
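to give a rough flavor of that cross-linking, a commit message or ticket comment can reference everything else in one breath (the identifiers here are made up):

```
Fixed the parser crash reported in #42; see changeset [1337]
and wiki:ParserNotes for the gory details.
```

trac renders `#42` as a link to the ticket, `[1337]` as a link to the changeset, and `wiki:ParserNotes` as a link to the wiki page, so the whole project history ends up hyperlinked for free.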
i’m in love. trac beats the crap out of bugzilla (no UI to speak of), RT (UI?), jira (likes to crash your servlet engine, not free), collabnet (slooow, not free), sourceforge (very poor integration, 1997-era UI), basecamp (useless for projects with both suits and coders) and a couple others i have tested (and forgotten about over the years). while some of the competition is stronger in certain areas, none are as well-rounded and tightly integrated between bug tracking, wiki and scm, or have such a pleasant UI. trac is the kind of application that makes me want to pick up python for real to play around with it (sorry, but plone never had the same effect on me). trac will go far.
this is the second time in less than 6 months that i have had to migrate my work environment from one laptop to another. i bought a new T42, probably one of the best deals out there. the 7200 RPM drive and the 1400 x 1050 screen make all the difference. just for comparison, this packs the same screen real estate you get with the 17 inch powerbook into 15 inches, you get 7200 RPM instead of 5400 RPM, and it sets you back $1800 instead of $2700. no wonder apple is changing their hardware platform 😉
as always, moving is a huge pain. this time, i have to deal with my 3GB, 250K-file eclipse workspace, with countless settings across applications, logins, etc. fortunately, there is freesshd for windows, which makes scp an option and allows me to sidestep all the SMB nonsense.