Archive for the ‘Notes’ Category

How it’s all going to end.

Sunday, December 9th, 2012

There are many problems in this world having to do with resource consumption and overpopulation and disease but we keep plodding along. The lucky among us have it pretty good, and the unlucky among us don’t have it so good.

Some people make out well because they never catch a disease our current medicine can’t deal with, or that some operation can’t take care of. Others are unlucky and end up debilitated in some way; they either have to deal with it, have somebody expend resources to keep them alive, or they die.

But one way or another, everybody dies eventually. All the effort that goes into medicine just keeps more people alive that much longer. And that seems to be the goal: to live a long time. Or not suffer. My personal goal is to not suffer, or to minimize the suffering while I’m here, but I know suffering will come, and then I will eventually die, and the next generation will be here to suffer the same situation, the way it has been going for thousands of years.

Now let’s take a step back and look at it from the other end. That was the small, local, my-lifetime view of life. The larger-picture view is that eventually the Earth is going to go hurtling into the sun, or the sun will give out, or it will go nova and kill our solar system. Or I suppose the Earth could go flying off into space, or get hit by another planet and get knocked out of orbit.

One way or another, the comfy situation our planet resides in that enables life to exist will eventually come to an end.

I don’t think it’s likely that humans will be on this planet when that happens, because we will have died off for one of various other reasons before then. A meteor could hit. Some horrible disease could wipe out all the humans. Or we could have a nuclear war.

The cause may be man-made, natural, or somewhere in between, but the fact is, eventually there will be no humans on this planet.

There are some optimists who think we’ll escape the planet before the worst comes. I doubt it. The marvels of technology never cease to amaze me, but even if there were a magic physics trick we haven’t figured out yet that allows for instant interstellar travel, there’s still the problem of finding somewhere safe to live, and then moving lots of people and materials and other resources there so it can be colonized in a comfortable manner.

And all that takes a lot of energy one way or another. It’s hard to imagine any space-bending physics trick so energy efficient that we could pull it off with the energy available to us from digging oil out of the ground, or from solar or wind or wave power.

I’m having a hard time imagining that such a thing can exist, let alone that humans could figure it out. If it did exist, and it was easy and not massively resource intensive, it seems to me we would have been visited by aliens by now.

So we have this: lots of people here on Earth now, each of whom will die when they get old, or catch some disease, or get killed by some accident or shooting or something.

And we have the future, where there are no people on the planet because they all died when the sun ate the solar system, and nobody managed to escape because we never figured out how, or where to go.

Sometime between now and then there is going to be a lot of unpleasantness for a lot of people, who will die miserably long before old age would have killed them. And it will happen in a relatively short time frame. Before they die, they will suffer, because there will be lots of other desperate people in a similar situation looking for food and water and shelter, and those people are not going to be nice and ask you for yours. And some of them are going to watch their friends and family die before they do.

If you think the zombie apocalypse can’t happen, take note of the fights that broke out on the gas lines after Hurricane Sandy. And that was when there was still law and order and food and clean water, and people just wanted gas for their cars so they could get to work.

Hurricane Sandy was a wake-up call for very few people. But it showed how small a problem can cause our existing daily-life process to fail. The people who lost their homes in the hurricane have a newfound respect for how fragile our modern ecosystem is. But the problem was contained to a relatively small portion of the planet, and the economics allowed the rest of the world to continue plodding on, the storm having had no real effect on them.

Eight weeks later, most of the infrastructure has been repaired enough to make most people happy, except for those whose homes were demolished, and there’s plenty of time until the next disaster for them to find somewhere to live and start forming a new daily-life ritual.

Ever since the first farmer traded food he grew for a service, like having his shoe repaired, or a product, like a better tiller, people have become so interdependent that most people’s jobs consist of pushing papers around, having nothing to do with collecting food, water, and shelter, which is all we really need.

So now we require electricity and fuel to keep our food cold and heat our homes, so we can quickly eat and go to work, where we will need clean clothes and a car that also consumes fuel to get there. The list of interdependencies we have created in our modern world is beyond comprehension, but somehow, most of the time, it all works. That’s likely because it grew slowly, as people found new crap to sell other people for their hard-earned money.

So we come to the end. At some point something bad is going to happen. Something that disrupts the existing flow of money and resources to the point where there won’t be enough outside resources or manpower or expertise left to compensate and bring things back to the status quo we are living in now.

When that tipping point is reached, things are going to get very ugly very quickly for a lot of people. It may be centered on one country or one section of a country, or it may happen to the whole planet at once. It may spread slowly (more frequent storms/tsunamis/earthquakes) or happen quickly (a very quickly spreading contagious disease).

Assuming it’s not a planet-wrecking event like a huge meteor strike, the poor countries that live on subsistence farming will probably fare the best. People who live in island cities like Hong Kong and New York City will fare the worst.

If all of the bridges into New York City were made impassable, and the electricity didn’t work to run refrigerators, it would be impossible to fly or ship in enough fresh food fast enough to keep people fed. They would get very hungry and start acting out of character to get their basic necessities met. That probably includes being violent. And they probably wouldn’t dress nicely and go to work and keep paying their cable bills.

I’ll leave the rest as an exercise for your imagination, but while we’re living in our comfortable status quo, keep in mind that it is going to happen, one way or another, sooner or later. It probably won’t happen in your lifetime, but it will happen to one of your descendants.

Upgrades

Saturday, May 12th, 2012

When was the last time you installed a software upgrade, of any kind, on your computer, and it went faster when it was finished?

I only thought of this because the last two projects I worked on at my job were solely for the purpose of providing a better customer experience by improving response time.

How many software companies EVER do this?

Everybody just assumes you’ll upgrade hardware so you never have to worry about speed, so software gets slower and laggier.

Everybody touts Google, but they do the same thing. Sure, it’s lots faster than anything else to begin with, but it gets slower and more bloated just like everything else. It’s always fun to go to the Wayback Machine and view source on old versions of Google’s home page compared to viewing source on it now.

But what does it matter, most software is web based now, right?

Well, look at Google searches, then look at eBay searches, then go actually pay for something with PayPal. I guess PayPal is about your money, so it’s important that it go slow.

Whatever.

Video of the future is annoying.

Friday, March 23rd, 2012

You can interpret that however you like.

I just tried to watch this video: http://www.buzzfeed.com/catesish/leaked-battlestar-galactica-blood-and-chrome-tr

And I had to give up. Instead I started counting edits, as in scene cuts.

I counted 164 of them. I’m probably off by a few, because I don’t play video games and thus can’t keep up with that high a frequency of context shifts.

The video is 96 seconds long and has 164 cuts spread evenly throughout. That’s about 1.7 cuts per second.

Does anybody actually enjoy, or even glean any content from, a video where you have less than a second to focus on and soak up each image before it switches to another one?

Who thinks this is a good idea? What marketing genius said, “Max Headroom’s Blipverts were a brilliant idea! Let’s DO IT!”

The makers

Wednesday, February 15th, 2012

In the long term, our society will either have to change drastically or collapse. As technology progresses, more can be accomplished by fewer people in less time. This means less work to do for an ever-growing population.

It is true that new ‘things to do’ will come up, but as we apply technological progress to them, again fewer people will be needed to do them, and there will be more people on the planet. This is called progress. (Likewise, by the way, if we don’t get off the planet before we use this one up, we’re doomed too, and that seems more likely to happen first.)

The reason I bring this up is the makers. You may or may not have heard about this wonderful technology called 3D printing. The basic idea is that instead of molding plastic or cutting wood, you print the thing you want, or the part you need, on a 3D printer. Sounds like death to manufacturers to me. Until now these things have been big and expensive, but recently there have been printers that cost less than $2000. There are plenty of limits, don’t you worry. At the moment they can only make things out of certain kinds of plastic, at limited size (I think they said a breadbox is the largest thing you can make right now), and it’s really slow.

But that will end quickly. They’re working on high-temperature plastics, and even metals, for printing. So the problem quickly becomes that you no longer need to buy anything; you only have to buy the plans and print it yourself, with your expensive personal printer. Except that the printers at this point can almost make all the parts they need to reproduce themselves. (Humans are what 3D printers use to reproduce themselves.)

So the only commodities left on the planet are fuels for energy and fuels for printers. Everybody else is out of a job.

Now of course there will be plenty of work in assembling parts, and there’s endless office work to do. People can push meaningless paper around forever, so that will occupy some people. But at some point the printers will be able to print robots that can do the parts assembly. Right now they can make all the parts of the printer, but they can’t make the circuitry or the programming, so this is still a long way off; that’s still really high-tech stuff. But you can see a black cloud forming.

Now forget all that; here’s the real problem: the people who make the plans that tell the printer what to print are going to be the powermasters for a little while. They hold the keys to the kingdom. Except that, like music and movies, actual physical STUFF has now been turned into digital media that can easily be copied and pirated. There are the free-software people, who will spend endless hours designing plans for a camshaft for a 1968 Corvette, for free, which will rob Chevy of any future sales of Corvette camshafts.

Of course there are the business people who (will) expect to make a business out of selling plans for parts for everything you would ever otherwise buy and end product of. for. of. Never end a sentence with a preposition. But as soon as one person buys the plans, he can netflix it over to his best buddy, and then quickly there’s no market for plans either.

So how will anything ever get done if there’s no value in any work or any products? Nobody will be able to work for money, and nobody will have any money to spend on materials. Remember, the only thing you have to buy anymore is ink for your printer; everything else is free or stealable. So things will have to change. One way or another, something is going to happen. Probably not in my lifetime, but it will happen; you can’t unmake these printers. You can make them illegal, but that will just foster a black market or a regime change.

But like I say, we’ll probably burn up the planet first.

Who knows, maybe land will become scarce, since you can’t print that. Buy land, they ain’t makin’ any more.

A solution to the wikipedia problem.

Saturday, December 17th, 2011

I just came up with a solution to the Wikipedia problem. Every year Wikipedia goes on about the millions of dollars they need to keep running. Wikipedia is a volunteer effort, just like the local volunteer fire department. The labor is free, but the equipment and resources are what cost money. Most people don’t have fire equipment to donate to the local fire department, but they do have computers that are idle most of the time.

Wikipedia is the perfect system to run as a worldwide distributed application. People volunteer content, and they can volunteer CPU and disk too. How big could all the Wikipedia data possibly be? It’s mostly text. You know there’ll be some geeks out there more than happy to host copies of the entire thing, and everybody else who contributes disk and CPU (just by running a little application on their PC) would host caches of sections of the whole database. Not outrageous to imagine, and given the state of peer-to-peer systems nowadays, not that hard to do. If Wikipedia started building that system and transferring all their current data to it, they’d never have to ask for money again.
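To make the caching idea concrete, here’s a minimal sketch (the node names are made up for illustration) of how each volunteer PC could be assigned a slice of the database by hashing page titles:

```python
import hashlib

def assign_node(page_title, node_ids):
    # Hash the title and pick a node deterministically; a toy stand-in
    # for a real distributed hash table.
    digest = hashlib.sha256(page_title.encode("utf-8")).digest()
    return node_ids[int.from_bytes(digest[:8], "big") % len(node_ids)]

volunteers = ["node-a", "node-b", "node-c", "node-d"]  # made-up node names
for title in ["Hurricane Sandy", "3D printing", "IPv6"]:
    print(title, "->", assign_node(title, volunteers))
```

Every client hashes the same way, so anyone can compute which volunteer holds a given page without asking a central server.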

 

— later comments —

 

Light O’Matic  –  Well, my first thought is that they would have to have some way of protecting the content from just anyone being able to make their own version of it. For example, JavaScript which does a checksum of the page against a per-page hash that is either fetched from a trusted server, or calculated cryptographically with a master key from a trusted server. Second thought was that they’d have to either make the whole wiki editing system work distributed, or they’d have to keep editing centralized. Then I realized there are actually a lot of systems out there already that at least partly solve these problems, and maybe one of them totally solves it…
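The per-page hash idea from that comment can be sketched in a few lines; the page text here is made up, and in practice the trusted hash would come from a server Wikipedia controls:

```python
import hashlib

def page_checksum(content):
    # The 'per-page hash' that would be fetched from a trusted server.
    return hashlib.sha256(content.encode("utf-8")).hexdigest()

def verify_page(content, trusted_hash):
    # Reject a cached copy whose hash doesn't match the trusted one.
    return page_checksum(content) == trusted_hash

original = "Wikipedia is a volunteer effort."
trusted = page_checksum(original)  # published by the trusted server

print(verify_page(original, trusted))                    # True
print(verify_page(original + " (vandalized)", trusted))  # False
```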

Stu M  –  Well, firstly, realize that there wouldn’t be much point to putting up fake copies of your section of the database, because… you can just edit the real thing. The effect is the same. But yeah, you could make it easier with trusted servers. What happens now? There are people who scour the change-history list and just go and edit and validate and remove and stop flamewars. The same thing would happen, but the changes would have to propagate around instead of all being in one place. Not trivial, but I think in the case of Wikipedia it’s a lot easier than, say, bank records.

Light O’Matic  –  They could distribute it with git… But maybe it would be simpler to just distribute reads and keep writes centralized. More of a caching scenario. The problem with people being able to modify their copies of pages is that, I am assuming, any given page can be served from a lot of different places, so if one or some of them have tainted versions, it might take a while to even notice. Then you’d have to have a system to do something about removing that bad data. Whereas now, if you edit a page, everyone sees it; it’s very clear what happened. If I can serve any data I want and pretend it’s from Wikipedia, I could serve a worm or virus in otherwise totally legit-looking pages. So, there has to be protection.

Stu M  –  I suppose you could go with the ‘signed by one of the trusted authorities’ type of thing, which would mean certificate-like data included with all changes, but the trusted part would come from a top-down delegated authority, so the root ‘certificate’ would be signed by Mr. Wikipedia himself, and everybody in the chain would be trusted by him or the guy in the chain above him.
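That delegation chain can be sketched like so. This toy uses HMAC as a stand-in for real public-key signatures (a real system would need asymmetric signatures, since an HMAC verifier holding the keys could forge them), and the keys and change text are made up:

```python
import hashlib
import hmac

def sign(key, message):
    # Toy "signature": HMAC-SHA256 of the message under a key.
    return hmac.new(key, message, hashlib.sha256).hexdigest()

root_key = b"mr-wikipedia-root-key"  # made-up keys for illustration
editor_key = b"trusted-editor-key"

# The root delegates trust by signing the editor's key,
delegation_sig = sign(root_key, editor_key)
# and the editor signs an individual change.
change = b"fixed a typo in the IPv6 article"
change_sig = sign(editor_key, change)

# A node that trusts only the root can check the whole chain.
chain_ok = (hmac.compare_digest(delegation_sig, sign(root_key, editor_key))
            and hmac.compare_digest(change_sig, sign(editor_key, change)))
print(chain_ok)  # True
```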

 

I have invented the fastest computer in the world.

Thursday, December 1st, 2011

The super zippy multi core crazy fast microprocessor in your computer spends well over 99% of its lifetime doing absolutely nothing.

On the rare occasion when you can manage to keep it a little busy, you might hear the fan in your PC or laptop spin a little faster, but by and large your processor is idle most of the time.

What a waste. Most of the time the computer is waiting for you to read a web page or your email while it sits there and hums and waits for you to click the next button.

The problem, though, is that when you DO click something, you want it to respond quickly. So you have this incredible amount of processing capacity at your fingertips, so that it can dance like crazy for you once every few minutes for a few fractions of a second and sit there useless the rest of the time.

But I have a solution. “What’s the problem?” you’re probably asking yourself…

I have designed a processor that takes all that idle processing capacity and stores it up, then blasts through it when you want the computer to do something. This way you can actually buy a lower-capacity processor that performs much better than the current top-of-the-line screamer. So it can be had for a lot less money, and it can be expanded to store more idle capacity for far less than the cost of a new processor or new computer.

If your processor fills up its capacity cache, you can sell the excess to big-company server farms, which are always in want of more capacity, or even “push” it over to your iPhone or Android machine. The market for this cache trade will be astronomical in size as more and more systems come online and Intel and AMD become less capable of enacting Moore’s law.

You read it here first.

 

Linux vs Windows

Friday, September 2nd, 2011

It’s starting to sound to me like the cost of a Windows license is cheaper than the cost of a lawyer to figure out whether any and all software you’re going to be writing for/against/with will conflict with the zillions of Linux-related licenses.

I never thought of it before, but it sounds like the free-software people are shooting themselves in the foot by having so many different incompatible licenses. Actually, I don’t know if they’re incompatible or not, but I’m certainly not going to pay a lawyer to find out.

Now, that’s just a cost-of-business kind of thing. I fully support anybody who wants to write any software and put as many or as few licenses on it as they like, having to do with static linking or non-distribution or sale, etc. But you gotta figure the end user (a software development company) is going to take a short, soft look at “buy a Windows license, or figure out what we can and can’t easily use in the free-software world,” and they’re going to see that the Windows license is an easier deal.

I tell ya, I’m a Unix guy through and through, but at this point, after hearing about all these different licenses, I’d lean towards going with Windows.

Print going away.

Monday, August 22nd, 2011

Is it just me, or does anybody else also think that any publication that’s online-only isn’t as serious as something in print?

I’m sorry, but there’s so much free shit and other pay shit on the net, why would I take your piece of shit any more seriously than a list of links posted on Facebook?

What makes it a magazine, and not just a bunch of pages that link to each other on a website?

It just seems lame and pathetic. Not cohesive at all.

If it was ‘an experience’ of some kind, rather than just linking from one article to another, so easily pulled away by an errant ad placed here and there, I might be more inclined. But every magazine on the web is like every other magazine on the web: just a bunch of free-floating content to be found by Google.

Which I guess makes the point: a print magazine is physically cohesive. You can’t accidentally look at an ad and end up reading a different magazine. You can’t find an interesting phrase, easily look it up in the search bar, and get drawn away by the Wikipedia article on the subject. A magazine is a lot more than just a paper collection of articles. It’s a grouped pile of related information that is logically and physically tied together.

That exclusivity of grouping and physical attachment is what makes a magazine attractive over jumping from link to search box to link to search box.

Magazines that go online-only do so because they can’t afford to print paper, given that most of their readers are giving up paper for randomly flitting about… well, let’s face it… Facebook. And it’s a dying art form, and get used to it, yada yada yada.

But I bet you won’t see The Economist or The New York Times going online-only until well after my generation is dead.

The end of swap.

Tuesday, August 9th, 2011

The other day I wanted to check out the new GNOME 3 desktop for Linux that everybody has been saying sucks so bad. So I fired up a fourth VirtualBox VM on my machine and installed it. Asking for another gig of memory for the VM, I finally used up all 8 GB of RAM on my machine, and the most interesting thing happened…

It started using swap. I’ve had this machine for a year or two now, I think, and I got 8 GB because I found swap annoying, and now I have proof. The problem is that disk is getting bigger and bigger, and programs are getting bigger and bigger, and memory is getting bigger and bigger, but the speed at which you can swap memory into and out of disk hasn’t really changed much, certainly not in line with the memory and disk sizes. So what ended up happening was the machine would just freeze and the disk would spin for 15 seconds or so while a gig or two was swapped in or out of memory.

This made me realize that I think we’ve finally seen the end of swap. There’s no point. Memory is so cheap, you might as well just buy more and keep everything in it. Now, I realize that using huge memory-sucking VMs is pretty much the worst-case scenario, and there are probably lots of small things that can be swapped out to disk due to lack of use, but when you start opening Firefox (2 GB resident at the moment) and Chrome (another gig or two resident) you run into the same swap problem; the VM just makes it worse faster.
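A quick back-of-the-envelope check of that freeze. The ~100 MB/s sustained throughput figure is my assumption for a spinning disk of that era; the sizes are from above:

```python
def swap_seconds(bytes_to_move, disk_bytes_per_sec=100 * 1024**2):
    # Time to push this much memory through the disk at a fixed rate.
    return bytes_to_move / disk_bytes_per_sec

GIB = 1024**3
print(round(swap_seconds(GIB)))      # 10 -- about ten seconds per gig
print(round(swap_seconds(2 * GIB)))  # 20 -- for a two-gig swap storm
```

So a one-to-two-gig swap storm stalling the machine for 10 to 20 seconds lines up with those 15-second freezes.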

Anyway, along that stream: since I’ve decided never to use swap again, SSDs become a lot more interesting, because you don’t have to worry about burning them out with swap… So I found this.

http://www.newegg.com/Product/Product.aspx?Item=N82E16820227515

Another reason ipv6 is stupid

Monday, May 30th, 2011

I recently heard a talk about the demise of the internet as a result of the exhaustion of IPv4 addresses.

I always figured, ‘aahhhh, what a load of crap, just NAT the shit out of everything.’ But the speaker pointed out that you run into the problem of port exhaustion on the internet-facing machine. Okay, point taken, I concede: the NAT-forever thing won’t work. Although it certainly could last a long, long time if they bothered to organize a little better, but I’ll let that go; we really are running out of addresses.

Still, the sky is far from falling. I have one really simple thought that makes all of IPv6 really pointless and a terribly complicated exercise in wasting everybody’s time.

IPv4 has its share of problems, but the biggest one is that we’re running out of addresses. Or rather, in February, the IANA actually handed out the last batch. That’s it, no more.

IPv6 was designed starting 15 years or so ago, and nobody lifted a finger to implement it, because nothing seemed broken. But in all that time, like C++ and everything else, they had grand plans, and they added features. IPv6 was going to streamline all sorts of byte-wasting excessive packet size, it was going to enable IPsec at the IP layer (or something like that, I forget the details), and they were going to add this useful feature and that useful feature, and so on and so forth, for all 15 years that everybody was ignoring them and not implementing it.

But fast-forward to now, and it turns out the only problem we ACTUALLY have to solve is that we’re running out of addresses.

IPv6 offers 128-bit source and destination addresses, and the current rollout of IPv6, as it is being adopted, is doing pretty much nothing other than solving the problem of running out of addresses. All that IPsec and all that other grand-vision feature stuff is gone. People are implementing IPv6 because they need more addresses, and that’s it.

IPv6 was supposed to be many things to many people, but as it turned out, we only really needed the bigger address space.

Well, if you look at the IPv4 header, there have got to be 3-5 bytes of shit that nobody ever uses for anything (like the fragmentation stuff) that just go to waste and could have been repurposed for an extra byte or two of source and destination addresses. It may not get you 128 bits of address, but it would push out the address-exhaustion problem a few centuries. It would have taken one guy maybe two days to hack it into the Linux kernel (and you could even swipe a bit from the version field to say whether or not this is a new-address-style packet, so it could be backward compatible). Microsoft would wait two years, then add support and say they invented it and are responsible for saving the world from the collapse of the internet.
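To show which bytes I mean, here’s a sketch that packs a 20-byte IPv4 header and pulls out the fragmentation fields: the identification word plus the flags/fragment-offset word, four bytes total. All the field values are made up for illustration, and the checksum is left zero:

```python
import struct

# Build a minimal 20-byte IPv4 header (network byte order).
header = struct.pack(
    "!BBHHHBBH4s4s",
    (4 << 4) | 5,        # version=4, IHL=5 (20-byte header)
    0,                   # DSCP/ECN
    20,                  # total length
    0x1234,              # identification (fragmentation bookkeeping)
    0x4000,              # flags + fragment offset (DF bit set)
    64,                  # TTL
    6,                   # protocol = TCP
    0,                   # checksum (left zero in this sketch)
    bytes([192, 168, 0, 1]),  # source address
    bytes([10, 0, 0, 1]),     # destination address
)

ver_ihl, _, _, ident, flags_frag, *_ = struct.unpack("!BBHHHBBH4s4s", header)
print(ver_ihl >> 4)   # 4 (IPv4)
print(hex(ident))     # 0x1234
print(hex(flags_frag))  # 0x4000
```

Those two 16-bit words are the kind of rarely used space that could, in principle, have carried extra address bits instead.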

But no. Instead, everybody and their mother had to implement IPv6, which does nothing but add address space.

You almost can’t blame all those fucking morons. If they had just set out to solve the problem that needed solving, they could have implemented the hacked-IPv4 solution YEARS ago, and there would never have been a problem; people would have had plenty of time to implement it before we started using the ‘extra’ address space.

But no, they had to design the next great thing, which was going to solve all the problems of networking in one fell swoop. And because they’re fucking morons, they’re too dim to see that every other fucking process in the world falls apart the exact same way, and therefore they could not have predicted what actually happened: that IPv6 would be pared down to its one useful feature.

No, you can’t blame them, because they’re too fucking dumb.