Advertisement saturation
Friday, December 31, 2004
I've seen the future of some computing applications - And I don't like it.
What is it with advertising, spam, Trojans and viruses?
Advertising and SPAM are somewhat related, as someone is paying to send these emails to millions of people. Who is paying for such services?
I filter the SPAM email I get; however, once in a while I take a look at the content of some messages, and the advertising companies behind them are sometimes legitimate enterprises. I'm getting fed up - I have 13,000+ SPAM messages (since I started filtering, around 1 year ago) vs. 1,300+ legit messages spanning 4 years of email.
I'm quite sure that if someone were able to invent something to stop SPAM altogether, this person would become famous. Perhaps, not rich (or richer than Bill Gates), as advertising has been a part of our modern civilization and whether we like it or not, we are consumers after all, and we must be targeted by someone selling something. I just wish they didn't use SPAM.
I remember a SPAM free internet (around 1992 and before) where there were only a few connected and the flow of information and ideas ran wild. Once the "internets" were popularized, even my dog got an identity and most importantly an email address. I'm realistic, though, and accept the popularization of the internet as a necessary evil. It was necessary in order to generate money from it and make it a mature business - Of what, we don't quite know yet - But, a business it is. I'm sure such popularization has paid my bills since the beginning of my
professional career.
Now, back to computing applications - Microsoft is willing to bet a big chunk of their cash reserves that the future of computing applications will be leased applications. For example, you will be able to "rent" Microsoft Word to write a paper, over the Internet, from anywhere in the world, and you'll probably pay on a per-character basis. If you want to use the spell checker, you'll be charged an extra quarter of a dollar. I think it makes sense. Why own a computer (and thus a licensed copy of Word) when every coffee shop and restaurant will have a cheap "internet" appliance at their patrons' disposal? The architecture behind such a system is going to be somewhat complex, and they will have extremely interesting problems to solve - I'm glad I'll get to work on some of those systems.
But, what about the applications that are not made by Microsoft? I.e. No cash reserves to sustain the development and enhancement of such applications.
We'll be back to the go-go 90s once more, and we'll have applications that are sustained by advertising - I'm cool with a web site offering advertising in the form of a banner ad, as my brain has learnt to ignore them quite well, thank you very much. But what about an application that flashes advertising while you are reading something? You probably wouldn't like it so much; however, such an application is filling a gap, and hence we must put up with its advertising.
I think it's clever to put the ad where it is; however, I don't have to like it. And because I'm not willing to buy a reader, I will have to put up with it. After all, I am saying that this paradigm (I hate that word) will be the future of computing applications. Perhaps they have seen the future as well: make money through advertising, or make the user buy a licensed copy of the software with no ads.
It's actually by value...
Wednesday, December 22, 2004
I found this blog entry today:
Why Java is better than C - Which points to this article:
http://www.jazillian.com/reasons.html.
The article has good points, however, the following statement has something wrong with it:
Java has no "varargs"
There's no need for this "varargs" stuff. If you have a set of fields that you tend to pass around together, put them together into a class and pass an instance of the class around. If you have a function that works on a list of objects (and the list can be any length) then just pass a List object.
This very topic comes up in discussions among Java developers of all sizes and colors. It is also a minor detail that gets overlooked, and it wasn't very clear to me the first time I really thought about it: Java doesn't pass instances of objects around, as the article claims.
It's the old question of "by value or by reference?" It gets a bit tricky, as it's very intuitive to say "by reference."
However, it is not 100% accurate (a nice way of saying wrong): Java passes parameters by value ONLY - but it's the value of the reference that gets copied.
So there are no instances of objects being passed around, as the article suggests. Nor are references to the objects themselves passed around - what gets copied is the value of the "pointer" pointing to the object - I know, I know...Java doesn't have any pointers...But, it does...A pointer is a pointer, regardless of how much you abstract its meaning and implementation...
So, if anyone asks if Java's parameters are passed by value or by reference, the correct answer is: by value.
It is kind of a trick question, me thinks - The trick is to explain why it is by value.
JavaWorld had an article/Q&A about this particular topic (A long time ago). If you are still not convinced, try a couple of detailed examples.
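Here is a minimal sketch of the kind of example I mean (the class and method names are mine, purely for illustration): reassigning a parameter inside a method has no effect on the caller, because the method only received a copy of the reference; mutating the object that the copy points to, on the other hand, is visible everywhere.

public class PassByValueDemo {

    static class Box {
        String label;
        Box(String label) { this.label = label; }
    }

    // Reassigning the parameter only changes the method's local copy of the reference.
    static void reassign(Box b) {
        b = new Box("reassigned");
    }

    // Mutating through the reference changes the one object the caller also sees.
    static void mutate(Box b) {
        b.label = "mutated";
    }

    public static void main(String[] args) {
        Box box = new Box("original");

        reassign(box);
        System.out.println(box.label); // still prints "original"

        mutate(box);
        System.out.println(box.label); // prints "mutated"
    }
}

If Java passed references themselves, the first println would print "reassigned"; it doesn't, and that is exactly the "by value" part.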
Software Engineering and dogs
Friday, December 17, 2004
Everyone and their dog has written about Software Engineering. I don't have a dog, but I've been known to stop and smell the proverbial virtual flower from time to time, and thus I started writing this entry about Software Engineering and some of my very personal views on the field.
The Software Engineering Institute defines Software Engineering as:
Software engineering covers the development of software systems. Software engineers focus on applying systematic, disciplined, and quantifiable approaches to the development, operation, and maintenance of software.
The Scientific Method can be defined as:
The scientific method is the process by which scientists, collectively and over time, endeavor to construct an accurate (that is, reliable, consistent and non-arbitrary) representation of the world.
If we were to intersect the two, you would think we'd end up with the secret recipe for building 100% quality software.
The fact of the matter is that nothing is 100% free of defects. Thus, we have to deal with the statistical concoction of "rate of failure." In manufacturing settings, we can use this particular number to measure the failure rate of a given process. We've been doing it for a few years now, to a certain degree of predictability and satisfaction.
I started writing this entry a couple of weeks back, and while working on the draft, Microsoft started talking about
Software Factories. The Microsoft marketing machine at work: they almost make it sound (and perhaps wish) that Software System building should have the same predictable failure rates as any other manufacturing process. However, that is not the case: Software Engineering cannot be considered a manufacturing process. There is no such "piece work" that can be handed down or automated as in a factory line. No matter how many times Microsoft says it, it won't be true - Well, at least not yet - Perhaps in the future, but, not now.
There are some who have said (including myself) that Software is not built, but grown. Software is "grown" iteratively and follows an optimization pattern. I.e. At each pass, we build on top of what's already been coded, and each iteration yields a better module than the one that existed before.
I'm sure you've already thought about software in this way. For example, given a set of business rules, with proper analysis already completed, you'd start building an arbitrary Software component in the following manner:
while (constraints hold) {
    1. Architectural design
    2. Detailed design
    3. Coding + unit testing
    4. QA

    Check your constraints: if the Software is good enough for our purposes,
    OR there's no more money left to continue development,
    OR management pulls the plug on the project,
    then the constraints no longer hold and the loop ends.
}
At the end of this while loop, you hope to end up with a Software module that meets the original requirements and solves the business rule given to the Software Engineers in a reasonable amount of time and at a reasonable cost - Note that constraints are many, and arbitrary. Sometimes projects are just canceled for obscure reasons.
Each iteration in our while loop, above, is a "meta" iteration containing inner iterations that allow the Software to grow. The steps can be used in different orders, with whatever life cycle you prefer: iterative, waterfall, spiral, etc., etc.
I can also say here that the quality of each iteration depends on various criteria: the quality of the Software Engineers, the quality of the development environment, the personal problems of each developer, management incompetence, etc., etc. Everything affects the growing process.
So, I come to the conclusion of my entry.
The fields of Software Engineering and Computer Science (academic and professional) have existed for less than a century. We are still in the early stages of development, compared to other fields. The advancement of technology is tightly related to Moore's Law - meaning that our task of Engineering Software Systems will only grow in complexity - there is no slowdown in the growth of information and technology.
In addition to Moore's Law, we have other theoretical matters to consider. The field of Mathematics has a "Fundamental Theorem" for everything. We have fundamental theorems of Algebra, Calculus, Arithmetic, and some others I can't recall. Not surprisingly, Computer Science's fundamental theorems are based on the same Mathematical Sciences. And we also have limitations in the computing paradigm. Does P = NP? If you know, there is a million dollars waiting for you.
What interests me the most, though, are the practical limitations of our current Software Engineering processes, our testing methodologies, and the quality of the software we build (or grow).
A few years back,
Fred Brooks wrote in his book
The Mythical Man Month, that the problems we encounter have to do with the complexity of the systems themselves. I.e. Software is inherently complex, and this complexity is one of the main characteristics that we cannot take away or even abstract in order to make a suitable repeatable model (similar to the Scientific Method).
In other words, and to greatly simplify his essay, we suffer from combinatorial explosion. For example, if we consider a computing model based on a Turing machine implemented in a digital/binary architecture, n binary cells can be combined in 2^n different ways, and each state of the machine yields a totally different result. Fully testing anything this complex is humanly impossible - also, it is very unlikely that every combination of the system is actually encountered in the life of any Software System.
Note: 2^n = 2 multiplied by itself n times, so the number of states doubles with every cell you add. Mind boggling - it means that we will never be able to tame the beast - we have no silver bullet.
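A rough back-of-the-envelope sketch (the numbers are mine, just to make the point concrete) of how quickly that state space outruns any testing budget:

import java.math.BigInteger;

public class StateSpace {
    public static void main(String[] args) {
        // Number of distinct configurations of n binary cells: 2^n.
        int[] sizes = { 8, 32, 64, 128 };
        for (int n : sizes) {
            BigInteger combinations = BigInteger.valueOf(2).pow(n);
            System.out.println(n + " bits -> " + combinations + " combinations");
        }
        // Even at a mere 64 bits, enumerating every combination at a
        // billion checks per second would take roughly 585 years.
    }
}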
What can we do?
For the moment, all we can do is to invent new ways (and terms) to fake control over our processes. We can't just sit idle hoping that eventually we will write perfect code. A la: "1000 monkeys, at 1000 typewriters, will eventually code the perfect Java JVM."
So, many companies have emerged claiming their trademarked processes yield quantifiable results: ISO 9000, 6 Sigma, SW-CMM (Capability Maturity Model), to name a few - Most of these processes have been adapted to Software Engineering, and are based on a proven track record of success in pure manufacturing.
There is no doubt that attempting to follow any of these processes yields better software. After all, any intelligent action we take to better our building methodologies will definitely improve the resulting systems.
We must note that all the companies offering such services are for-profit companies, so there is an incentive for their methods to become the quality measurement standard of all Software Engineering. So, as with anything else in life, there are many attempted solutions to one problem, and it's up to the implementer to pick the best approach for the case at hand. And as always, there are no guarantees any process will work, and it is very costly to implement any one of them - a price that many companies are willing to pay in order to stay competitive.
BTW, I don't know much about the ISO process. However, I'm familiar with SW-CMM. And the only thing I know about the 6 Sigma process is where the name comes from: it's based on Normal Distribution theory. I.e. a company under the 6 Sigma umbrella "guarantees" (more of a "claim," really) that its process is controlled tightly enough that the specification limits sit six standard deviations away from the mean of the (assumed normal) distribution of its output, so only the output in the far tails comes out defective. However minimal the number of defective "whatevers" is, they will always exist - the usual figure quoted is about 3.4 per 1,000,000. So, no assurance of a defect-free process, just a really cool sounding name: 6 Sigma.
Applying any statistical measurements to Software Engineering is quite tricky - How do you measure productivity? How do you estimate progress?
Lines of Code are not a very good indication of productivity, so how can you measure the "rate of failure" of any particular system? I'm sure this very topic keeps many CS departments and PhD advisors busy reading and reviewing dissertations all across the top Universities of the world.
At some time in the future, one of those dissertations will have a good solution, or at least a good suggestion that will lead us in the right direction and get us closer to inventing the silver bullet for our software beast - "If you had 1000 PhD students typing at 1000 typewriters, they will eventually...blah, blah..."
Another smile
Monday, December 13, 2004
There is so much written about the Mona Lisa, and yet there are still so many questions about who the actual sitter is: was she a mistress of someone with money? Was it really the Mona Lisa?
I did a little bit of research while drawing my version of the original, and I found a rather interesting theory about the sitter's identity - a particular writer, Rizah Kulenovic, argues that the Mona Lisa is actually a portrait of Leonardo's mother. I thought that it was a very interesting hypothesis.
Does it really matter at this point?
Frozen man
Thursday, December 09, 2004
A 4 year old child discovering Michelangelo's David for the first time:
Gabriel: Daddy [me], why is he frozen?
Daddy: He's not frozen; "he" is a statue made out of marble.
Gabriel: Who made him?
Daddy: Michelangelo.
Gabriel (with a thinking face): Michaelangelo? The ninja turtle?
Daddy (laughing): He has the same name, but he's not a turtle; he was an artist of the Renaissance...
Rainbowmania
Tuesday, December 07, 2004
Everything one should know about
rainbows.
Sfumato smile
Over the weekend, my brother asked me why the Mona Lisa is so famous. The only 5 second answer I could give him was that she's famous for being famous.
There is a long answer that art critics and art historians have, justifying the painting's place as one of the most recognized and priceless pieces of wood in our civilization, for example: sfumato, the art of perspective, the idea that the Mona is actually a self-portrait of Leonardo Da Vinci, etc., etc.
Leonardo Da Vinci was one of the masters of the High Renaissance period: he was a painter, a physicist, a musician, a sculptor, an architect, an inventor, a scientist, among other things.
I can't help but wonder: if he had invented the internet, would he have had the time to excel at so many things?
I've been studying and reading about the Mona Lisa in my spare time, and I have come to the conclusion that anything with long hair and a "mystic" smile resembles the actual Mona Lisa. So, I present to thee the imaginary dude that looks like the Mona Lisa.
My charcoal study on newsprint took a whole 20 minutes to complete - the actual Mona Lisa is an oil painting on a wood panel, and it took Leo around 4 years to complete it. I truly doubt my drawing will be stolen by a contemporary conqueror, the way Napoleon took the original to hang in his bedroom back when he had his mind set on being ruler of the earth.
Ketchup, Java boy...
Monday, December 06, 2004
Once upon a midday weary, coding away trying to figure out which version of the Java API has a certain method I needed, I started wondering: how am I affected by new releases of Java?
It is common knowledge that to master anything Java, you must spend a great deal of time reading, researching, and coding new examples with whatever API you are trying to learn, as the number of Java technologies (APIs, frameworks, etc.) you must be familiar with grows daily.
In the big technology game we call the IT industry, we (developers) are but pawns, playing alongside what is, in my opinion, the most important group affected by the pace of technological advances: the business group - in other words, the companies requiring Java Engineers. This segment of the demographics is quite interesting in the way it keeps itself occupied with the new technologies.
In the beginning of Java, Sun did something similar to what Microsoft did while establishing their empire: cater to the developer. Aside from the illegal business practices and all the other horror stories you hear about, Microsoft, way back when, released usable development tools: the Visual Studio suite.
Sun didn't have to push a buggy operating system to the masses. All they had to do was to unleash a new OO programming language to developers (who are eager to try anything new), and with a bit of luck, industry timing, and a lot of smarts from Sun's employees, Java took root among the early adopters.
Most Java projects, I'm quite certain (I was part of a couple in 1998), started with one Engineer playing around with the new language, and he/she saw that it was good and brought up his/her results in some status meeting in some cold boardroom. If someone with guts and enough clout was present in such a meeting, he/she escalated the findings to the next level and proposed the new solution, got a bonus at the end of the year for such great vision and innovative thinking, and Java became the standard for development tasks.
The Java industry grew out of the trenches, with the aid of viral marketing and word of mouth. The Java language spread all across the corporate world like unwanted spyware, as the new hammer for all programming problems (after all, everything looks like a nail to a hammer).
Sun, at this point, realized that they had stumbled onto something revolutionary and decided to throw money at it to grow it as a business and, most importantly, as a Sun brand. Sun's plans worked only in one respect: Sun == Java, Java == Sun; therefore, Java is a Sun brand.
The only problem Sun has, and has had in the past, is that they don't know how to make money off it.
It's a great brand, yet they can't market their technology to generate direct profit from it. Just the fact that I work with Java technologies at minimal cost (0; nada; zilch) means they don't generate any direct revenue from the sale of their APIs. The minute they start charging for Java, their developer network will drop. The cost of learning would be too great. Thus, the viral marketing scheme that helped them build the Java empire would crumble.
Note: some would argue that selling a server to host a Java application is a sale that was generated because of Java. In the purest sense of the words "direct profit," I don't consider it direct profit; however, in the end, profit is profit regardless of how it was generated. Also note that I'm not debating that Java started a whole new industry and that profits have been made because of it - including my salary for the last few years - my point is: Sun can't generate "direct profit" from Java.
Fast-forwarding back to the present, Java has been, is, and will be an established Architecture/Technology/OO Language in the IT industry, and as with any mature technology, improvements must come. In Sunspeak, this means that a new version of the language comes out every year or so. That leaves us with half a dozen different versions of Java floating around in the corporate world: Java 1.0, 1.1, 1.2, 1.3, 1.4, 1.5, and the soon-to-be-released 1.6.
Each version has incremental changes, and each version also carries some legacy support for backward compatibility. However, not every feature of the language is backward compatible (this fact is what led me to write this entry). And this is not good for Sun or the corporate world: there are too many choices of, essentially, the same (yet different) thing.
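A concrete example of the kind of incompatibility I mean (exactly when each keyword kicks in depends on the compiler's -source setting, so take this as a sketch rather than gospel): assert became a reserved word in 1.4 and enum in 1.5, so code like the following, which compiled happily under 1.3, has to be edited before it can be rebuilt with a newer JDK.

// Legal identifiers under J2SE 1.3; rejected once "assert" (1.4) and
// "enum" (1.5) became keywords. The "fix" is a mechanical rename, but
// someone still has to go in, edit, rebuild, and retest.
public class LegacyIdentifiers {
    public static void main(String[] args) {
        int assert = 1; // a syntax error from 1.4 on
        int enum = 2;   // a syntax error from 1.5 on
        System.out.println(assert + enum);
    }
}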
This evolution of the language leaves the corporate world wondering which technology to support and which version to write their applications with.
If a corporation has an existing Java application (Likely J2EE), there are 2 aspects that make the change of versions expensive:
- Application server costs
- Application development costs
Application server vendors (IBM, BEA, Sun, et al) probably like the idea of releasing a new server version every time there is a new Java version. Each new server release requires a new licensing agreement.
The corporate world, however, doesn't buy into the new technology as easily as the vendors would like. The main reason for this is the second item above: application development cost grows as the complexity of the development tools grows. It also means that the corporate world needs to spend time and money training the existing staff on the new versions of Java, or spend the time migrating an existing application to the newer coding syntax.
Either way, there is a cost in lost productivity while Engineers learn new technologies or migrate to new versions of Java - and if so, which one?
With the cost of migrating to the new technology comes decreased revenue, and hence profits decrease - shareholders don't like the words "decreased" and "revenue/profit" in the same sentence, unless the context of the conversation gravitates around competitors' revenues/profits.
So, we have a couple of interesting phenomena with so many released versions of Java: the first one is that Sun doesn't make any money every time it releases a new version of Java; and secondly, businesses using Java are not migrating to the newer versions.
I already talked about the "no profit" for Sun, and as for the business world not migrating, what is actually happening inside corporations' R&D departments is that some are developing to Java specifications that are 2 or 3 versions old.
It's a really expensive proposition for a Software Engineering shop to migrate to "Tiger" when most of your architecture is designed around J2SE 1.2. Some are using version 1.3; however, you don't see many 1.4 apps, and there will be fewer with 1.5 or 1.6.
Note: You may argue that if a system is well architected, there is no need to worry about the implementation details. In theory, this is correct. However, to design a large system, you must know what platform it will run under. So, the choice of technology comes into play early on in the design process.
Back to my original question, now that I have explored the impact of new Java versions on the corporate world: what does this tiny bit of knowledge mean to Java Developers?
There are many aspects to explore, however, I will only talk about one: profits and revenue for the Java Software Engineer.
I can make my argument with the aid of marketing and economics: there are early adopters of anything that is sold out there. For us developers to be able to sell our services, we must stay on top of new technologies. I.e. if you are able to secure a contract or employment using 1.5 technology now, you are probably going to be handsomely remunerated for your efforts. Now, as the market gets diluted and more and more businesses start using 1.5 (or 1.6) standards, the supply of developers increases, and your service will not be so much a luxury but more of a commodity (more developers out there == less pay for you). Of course, all that knowledge gained while developing with the new technologies is greatly valued, so you are still differentiable from the rest of Java developers and very much employable for the foreseeable future. I.e. someone will have to lead the new developers.
The moral of the story: early adopters of new development technologies will benefit greatly from their efforts. You must note, though, that the number of early adopters in the corporate world is limited - they are out there, but you have to look a bit harder to find them.
Having come this far, another question popped into my mind: should Sun stop releasing new versions of Java so quickly? And should companies try to catch up to Sun's release schedule and migrate "old" code to newer versions of Java?
In the business world, it seems that the one with the most money wins the tug of war. And there are more companies out there than there are Suns. So, I predict that companies will be a lot slower to adopt the new versions, though I highly doubt Sun will slow down their release schedule, for the following reason: if Sun did slow down, it would look like Java is dead in the water, and businesses would have no incentive to migrate to a dying technology.
So, a business will migrate to newer versions of Java (maybe one version at a time), slowly and at its convenience. I.e. whenever there is enough money and enough time to do it. Apparently, abundant money and abundant time (man-hours) are in short supply.
In the meantime, I, as a Java Software Engineer, will ride the wave of the past, enjoy my present, and look forward to the future of Java. Thus, if you have an interest in securing my help via contract work, I parley in most Java versions - even the much-anticipated Tiger version - which, by the time you read this, will probably be "old" news anyway :)
BTW, anything involving Software Engineering and Computer Science, interests me. Even if it's not Java related.
Obscure (or not so obscure) reference to the title: in Pulp Fiction, Mia Wallace tells a "not so funny" joke to Vincent Vega: Three tomatoes are walking down the street, a poppa tomato, a momma tomato, and a little baby tomato. The baby tomato is lagging behind the poppa and momma tomato. The poppa tomato gets mad, goes over to the momma tomato and stamps on him -- (STAMPS on the ground) -- and says: catch up.
I also need to attribute the first line of my introduction to Edgar Allan Poe's The Raven.