Some thoughts on Earth 2100 by ABC

Just watched Earth 2100 by ABC, a film made in 2009.

I have to say, the world they envision in the last part is somewhat of a fiction, attainable only by a nation with scientists as its only citizens.

The Copenhagen summit was a failure, I would say. It appears that despite the concerns expressed by the scientific community, the rest of the world is still living for its own interests. Without eliminating a certain selfish aspect of humanity, we, as a species, cannot be saved.

Humans are bound to follow the known, predictable worst path. It is caused by sin, human nature, or whatever name you give it. Only through creative, wild, bold actions are we able to proceed without destroying ourselves.

Just as we creatively managed to harvest nuclear energy, which was first used as a WMD, similarly creative actions are required now. Maybe moving into space is, in the end, the only feasible solution we have left.


An interesting line from Bjarne Stroustrup

The major cause of complaints is C++'s undoubted success. As someone remarked: There are only two kinds of programming languages: those people always bitch about and those nobody uses.

—Bjarne Stroustrup, comp.lang.c++

Copied from Wikipedia

The gap

There is usually a gap between hardware and software: between the computational power provided by chips, boards, and buses, and the computational power delivered by the OS, plug-ins, and environments. Focusing on the gap between the OS and the hardware, the industry is in a pretty awful situation.

On the OS side, providers are trying their best to make the differences between platforms unnoticeable. Just as Apple did for the iPhone by moving the Mac OS X platform onto everyday smartphones, Microsoft is moving Windows 7 to Windows Phone 7 as well. This is probably what most developers would like to see for handhelds, as all sorts of APIs and technologies used in the daily creation of modern software can then be ported to those devices, sometimes providing a better user experience.

On the other hand, hardware for desktops and laptops has changed dramatically, while handheld hardware has stayed the same.

The move from single-core CPUs to the multi-core systems we have now has created a gap in which hardware is leading. Most of our daily software is designed for single-core systems; only a handful of programs use even part of the multi-core capacity. Parallel computation is no longer just for GPUs, servers, and scientific computing, and it has been widely noticed. Universities are starting to make it available to undergraduates even while it is still a graduate study topic, and unlike distributed systems, professors tend to focus on parallel techniques on a single machine. We have recognized that software is behind hardware in these environments, and it is necessary to close the gap as soon as possible.

Looking at handheld devices, where even smartphones feel like relics, hardware has not changed much. Motorola puts a 1 GHz CPU at the top of the spec list for its new phone, yet that was the threshold for PCs back in the Pentium III era. The same goes for memory: a while after PCs moved to gigabytes, smartphones finally caught up and got gigabyte-sized onboard memory. Although most of that memory is actually used for storage, it is still great progress; but by the time it arrived, PCs were already in the terabyte era. Handheld devices are struggling to keep pace and catch up. This means the gap is reversed compared to desktops and laptops: on handheld devices, it is software that is ahead of hardware development.

This awkward situation makes the OS providers' job even harder and more important. While making modern PC technology portable to handheld systems, it is even more important to create an OS platform that keeps the hardware problem outside the developers' scope of concern. In the meantime, new techniques are needed to make both improvements possible with smaller memory, smaller storage, and a slower processing unit.

The combination of parallel computation and functional programming may provide a solution. Since most of the work is done by compilers, and variables are immutable by nature, this combination may offer a stable improvement that lets handheld systems catch up. Of course, we still need larger storage and a low-heat multi-core CPU for these devices.
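To make the idea concrete, here's a minimal Python sketch (the function, data, and numbers are mine, purely for illustration): because a pure function over immutable data touches no shared state, a runtime or compiler is free to farm the calls out across workers without locks.

```python
from concurrent.futures import ThreadPoolExecutor

def brightness(pixel):
    # Pure function: the result depends only on its argument, no shared state.
    r, g, b = pixel
    return (r + g + b) // 3

pixels = [(255, 0, 0), (0, 255, 0), (10, 20, 30)]  # immutable tuples

# Because brightness is pure and its inputs are immutable, the calls can be
# evaluated in any order, or in parallel, and the answer is the same.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(brightness, pixels))

print(results)  # [85, 85, 20]
```

The same guarantee is what lets a functional-language compiler parallelize aggressively on its own, without the programmer managing threads.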

An OO view of time travel

This is just a snippet; please contact the author (me) for further discussion:

The Object:

According to physics, everything in this world is a unique stream of energy and matter. All energy and matter can be viewed as constructed from a combination of wave and particle. Assume we view this pattern as an Object, named P.

For each P, there are fields wave and particle that ensure its uniqueness. Each P also has a real-valued entropy, which has been shown (as far as I have learned) to be non-static, meaning entropy changes as a function of time. Let's call it En(t).

In conclusion, each P has the following structure:

Object P
Wave     w;
Matter   m;
Entropy  En(t);
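For illustration only, the structure above could be sketched in Python like this, with the entropy field carried as a function of t (the concrete values are made up):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)  # immutable, matching the idea of a unique pattern
class P:
    """One object: a wave/matter pair plus a time-dependent entropy En(t)."""
    w: float                      # wave component
    m: float                      # matter component
    En: Callable[[float], float]  # entropy as a function of time t

# A toy P whose entropy grows linearly with time.
p = P(w=1.0, m=2.0, En=lambda t: 0.5 * t)
print(p.En(4.0))  # 2.0
```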

The Space:

The space we are discussing can, mathematically, be viewed as a vector space. Elements of the space are vectors from one state of P to another. Let Vij denote the vector from state i of P to state j of P. All components of the vector are functions of time t; let's denote them w(t), m(t), and En(t)(t) = En(t^2). Note that if we arrange these functions in a matrix whose rows and columns are numbered by state, we obtain a matrix representation of the state machine for P. Let's denote this space S and the state machine matrix Mstate. In S, we define + as vector addition and multiplication as vector multiplication.

Let's also define an adjacency matrix over all objects P at time t. Let aij of the matrix be 1 if Pi has a connection to Pj. Like two people: if person i was talking to person j at some time t, then aij is 1, since Pi and Pj are connected at time t. Naturally, this matrix is also a function of time t, denoted adjM(t).
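A toy sketch of adjM(t) in Python, with connections given as (i, j, t) triples of my own invention (who talked to whom, and when):

```python
def adjM(connections, n, t):
    """Build the n-by-n adjacency matrix at time t.

    connections: set of (i, j, when) triples meaning P_i talked to P_j
    at time `when`.
    """
    A = [[0] * n for _ in range(n)]
    for (i, j, when) in connections:
        if when == t:
            A[i][j] = A[j][i] = 1  # talking is a symmetric relation
    return A

# Three objects; P_0 talks to P_1 at t=1, P_1 talks to P_2 at t=2.
conns = {(0, 1, 1), (1, 2, 2)}
print(adjM(conns, 3, 1))  # [[0, 1, 0], [1, 0, 0], [0, 0, 0]]
```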

The transformation:

Assume time travel is possible; that is, it is possible to calculate both Mstate and adjM(t) for some given t that is not the current time t-current. The computation involves:
i)   For past times t < t-current: since a table lookup consumes too much memory, a reconstruction of adjM(t) from adjM(t-current), and a reconstruction of Mstate from the current Mstate matrix. Constructing a previous Mstate matrix means recalculating w(t), m(t), and En(t^2) from w(t-current), m(t-current), and En(t-current^2) for some object P.
ii)  For future times t > t-current: predict adjM(t) from adjM(t-current), and predict all entities related to P at time t from time t-current.

As prediction with no given pattern is well studied and NP-hard, we cannot "calculate" the future. The question remains whether it is possible to "calculate" the past.

It is natural to think of using dynamic programming and back-tracing from time t-current to time t. Under the assumption that all the choices we make are efficiency-driven, it is possible to derive the optimal solution at t-current from some previous time slice t. This may grant the possibility of "reverse engineering" the dynamic programming algorithm and back-tracing to the state at time t.

As a matter of fact, that assumption does not always hold: most events in this world happen in a greedy-algorithm fashion. Any choice we make is based on the information we were given, so the optimum for time t+1 is just picking the best we can see at time t. Thus the back-trace also has to be greedy. And, by nature, finding the optimal choice and back-tracing to the previous time slice are equivalent. So if, for some t-current, we can construct the states of all time slices from t0 to t-current using a greedy algorithm, regardless of correctly matching discrete events, we may very well have a traceable path to the past for some time t within the range [t0, t-current].
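A tiny Python sketch of the greedy idea (the choice lists are invented): pick the best visible option at each time slice and keep the trace, so the back-trace is just the same path walked backwards.

```python
def greedy_path(options_per_step):
    """Greedily pick the best visible option at each time slice,
    keeping the trace so the path can later be walked backwards."""
    trace = []
    for options in options_per_step:  # one list of choices per time slice
        trace.append(max(options))    # greedy: best we can see right now
    return trace

# Choices available at t=0, t=1, t=2.
steps = [[1, 3, 2], [5, 4], [7, 6, 8]]
forward = greedy_path(steps)
print(forward)        # [3, 5, 8]
print(forward[::-1])  # back-trace: [8, 5, 3]
```

The catch, of course, is that this only works if the trace was recorded; reconstructing it after the fact is the hard part.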

Since we have rejected dynamic programming, let's focus on using a reverse greedy algorithm to solve the problem. Suppose all possible choices at every time from t0 to tn are shown as nodes. Define an edge between two nodes ni and nj if and only if nj is choosable from state ni and ni can be back-traced from nj. Under this construction we have an n-partite graph: all nodes at time ti are naturally grouped, and no node has an edge to any other node in its own group. Define the weight of an edge to be the benefit gained by choosing it; then we have a longest-path problem from t0 to t-current between nodes n0 and n-current, which are both singletons and assumed known. This is an NP-hard problem and thus not calculable.
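To show what the brute-force version looks like, here is an illustrative Python sketch that enumerates every path through a tiny layered graph; the node names and weights are invented, and the running time is exponential in the number of layers, which is why this does not scale.

```python
from itertools import product

def longest_path_weight(layers, weight):
    """Brute-force the heaviest path through a layered (n-partite) graph.

    layers: list of node groups, one group per time slice.
    weight: dict mapping (u, v) edges to their benefit; absent = no edge.
    """
    best = None
    for path in product(*layers):  # every way to pick one node per layer
        total, ok = 0, True
        for u, v in zip(path, path[1:]):
            if (u, v) not in weight:
                ok = False  # this path uses a non-existent edge
                break
            total += weight[(u, v)]
        if ok and (best is None or total > best):
            best = total
    return best

layers = [["n0"], ["a", "b"], ["n2"]]
w = {("n0", "a"): 1, ("n0", "b"): 4, ("a", "n2"): 2, ("b", "n2"): 1}
print(longest_path_weight(layers, w))  # 5, via n0 -> b -> n2
```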

Thus, since we may have a polynomial-time reduction from the longest-path problem to time-slice reconstruction, the reconstruction problem is NP-hard. As a result, the equivalent problem of back-tracing time is NP-hard.

In conclusion, time travel to both the past and the future is NP-hard to compute. Thus, time travel is not possible for now.

The previous arguments are subject to the following limitations:

a) All events discussed here are restricted to human activities. We do not know how Mother Nature works, and we do not know how to represent her states… This sucks, but it means we don't talk about it here.

b) All discussion is restricted by modern computational capacity. If some genius figures out a way to make it not NP-hard, or even makes NP-hard problems solvable… Heck!!! I wanna see him!!!

c) All discussion is restricted to the author's knowledge. After all, I've only finished undergrad…

d) Please be nice; this was written at 5 in the morning…


So, Surrogates, huh? We just plug in and that's it? We just CAN give up our bodies and live THE life we want, huh? I have to say, at least on this I'm with the "good will" doctor.

Science advances one way or another. We always find a way around the protocols, the laws, the rules and even, as it has been said, God's will. The Renaissance brought technology back into our lives, but it also took away the fear of misusing it. The Church was commanded by a group of non-tech fanatics who were against all technology; later we may be misled by some high-tech fanatic who wants nothing in technology's way. Both are wrong.

Yes, the Father of Surrogacy is a murderer. But if he is TOTALLY wrong, why did Tom hit the NO button? Technology is an addiction in a lot of cases. The Internet can be an addiction for teenagers, so why not for adults? What does it mean to be human? Maybe God, who may or may not be there, is right about one thing: being alive is being in pain and suffering. That's just the way it is.

I'm always saying: be a nut, but be a proper one. Being such a nut is not easy. Learning to give up is not something anyone can do; it needs to be reminded time after time. This is what government is for. This is what lawmakers' jobs are.

Maybe, after all, it's up to us, the people who create these technologies. Redemption is not given by others, but by the creators.

Section 8 and Flashpoint: DR

Seems that I’ll never get it.

Just been through Section 8, Corde's story. Really weak story. I mean, there's practically nothing in it. If it is meant to be a multiplayer game, then don't give me a storyline. If you are giving one, give a better one. At least like Hellgate: London.

Just started Flashpoint: DR. Well, a conflict with China is realistic in some ways, but why not Russia? That's more likely, isn't it? The system is too complicated, and as a shooter it's bad: you can't really see people. Being realistic does not have to mean being unreasonable. How many people playing games are trained marksmen? Besides, units seen through the binoculars look funny, like something from a 90s game. What's the point of being fancy then?

I just don't get it. Games sucking up hard disk space, pushing graphics to the "next level". HL2 didn't use DX10 and it looks pretty good to me!!! We play for fun, not for fancy. A dress may be fancy, but not a program. I blame the commerce people for this.

I’ll never get it. I get it.

Erlang, well…

Finally got one of the problems right. A spark on the road led me to the solution for the extra parentheses.

Hell… I don't even know what I was doing half a year ago. My first encounter with Erlang was not so pretty. Ugly, if I may call it that.

Referential transparency was not a big fan of me; it was certainly the other way around, too. I still remember dealing with Scheme in 311… Ah~~~ 19 out of 50 on the first midterm. That was brutal. Luckily, I finally got those things right by the end of the second third of the term. Enough for me to catch up and… level my grades enough to pass the course. Guess I learned something after all.

Brutal, huh. Seems it is a big fan of me now. Certainly NOT the other way around this time.