Sunday, July 02, 2006

Advanced Wire Work with Motor Curve Generation

Hi folks,

It's been a while, and I know I've kept the four of you waiting, but hey, the early bird catches the worm and doesn't always have time to update the blog. Today's post is something new: as the title states, we're going to talk about wire work in movies.


I love advanced special FX, and this is a topic I've long studied, especially with the advent of the Americanization of Chinese martial arts movies such as "Crouching Tiger" and "Kung Fu Hustle."

Watching these movies is more a hobby than entertainment for me because I have to figure out what I see. Most of today's movies use a mix of blue screen and wirework to simulate "additional environmental collisions" and the like..... but I digress; back to the topic.

Most movies have trouble with one particular thing in the use of wires for fights, climbing, and the like.

A motor curve needs to be worked out for each individual performer, or an active lookup algorithm keyed to the various physical maneuvers, so that differences in "active feedback" on the cable erase the appearance of "floating."
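To make that concrete, here's a minimal sketch (names and numbers are mine, purely hypothetical) of generating such a curve: the winch pays out cable so the performer feels only a programmed fraction of gravity, which reads as a powerful jump instead of a float.

using System;

class MotorCurve
{
    const double G = 9.81;   // m/s^2

    // assist = fraction of gravity the wire cancels (0 = free fall, 1 = full float).
    // Returns the winch pay-out speeds the motor must hit at each sample time.
    static double[] WinchVelocityCurve(double launchSpeed, double assist,
                                       double duration, int samples)
    {
        double[] v = new double[samples];
        double effG = G * (1.0 - assist);   // gravity the performer appears to feel
        for (int i = 0; i < samples; i++)
        {
            double t = duration * i / (samples - 1);
            v[i] = launchSpeed - effG * t;  // vertical speed the cable must track
        }
        return v;
    }

    static void Main()
    {
        // A 3 m/s launch with 40% assist over 0.8 seconds, 9 samples.
        foreach (double v in WinchVelocityCurve(3.0, 0.4, 0.8, 9))
            Console.WriteLine("{0:F2} m/s", v);
    }
}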



I'm thinking about moving this kind of thing to a website because I hate not having active image references on Blogger - I have to link to another site. Maybe one of you three can tell me how to access images with "root HTML."

I’m working on a new website and I may try to squeeze in some features that allow for image storage.




This technique is slightly harder to implement, though, because it requires a "track" to run the engine on. With two cranes and two pulleys connected to either end of an I-beam, tracks can be made for "linear" action - meaning a secondary beam structure would need to be added to allow for perpendicular movement.

But by adding a second "cam" to the X-axis mechanism, it would then be possible to add a third "rotational axis," allowing characters to move beyond contact, where contact is the zero point in relative coordinates.
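For the curious, here's a toy version of that coordinate math, with contact as the origin - my own hypothetical names, not anything from a real rig controller:

using System;

// Hypothetical rig coordinates: X along the primary I-beam, Y along the
// secondary (perpendicular) beam, Theta from the rotational "cam" axis,
// Radius from hang point to performer. Contact is the origin.
struct RigPose
{
    public double X, Y, Theta, Radius;

    public void ToWorld(out double wx, out double wy)
    {
        wx = X + Radius * Math.Cos(Theta);
        wy = Y + Radius * Math.Sin(Theta);
    }
}

class Demo
{
    static void Main()
    {
        RigPose pose = new RigPose { X = 2.0, Y = 0.5, Theta = Math.PI / 4, Radius = 1.2 };
        double wx, wy;
        pose.ToWorld(out wx, out wy);
        Console.WriteLine("world: ({0:F2}, {1:F2})", wx, wy);
    }
}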

I guess studies like these led to advances in “blue screen positioning of CGI objects” but that would make actors into “voices.”



Don’t know if that would be a bad thing.



Anyway, such a device could be scaled to fit small spaces by using an I-beam framework instead of relying on cranes for support, though cranes would help with "aerial maneuvers" like those in the last Matrix.


Blue screening using this technique would definitely have helped with the Spider-Man 2 climax on the subway. I was left flat by their mixture of live action and CGI.



I need to get a good 3D app like MicroStation but that's a bit much. Guess I'll have to cough up $600 for one of the lower-priced ones - too bad I can't get the student rate anymore.




As I'm writing this I seem to have "visualized" a more compact version that would enable large groups in one scene on a coordinated group of "motor wire assemblies." Hey, that could be patented - anyone got a couple grand burning a hole in their pocket?

The basis of the mechanism is that you have to do tests of "height displacement" using a multi-axis solid bar that registers these differences. Certain movements would use a single bar (only height changes, or rotation around the perpendicular axis of connection) and certain movements would use two bars connected for the parallel-axis range.

The second mechanism allows for additional points of connection along the body for flight scenes. Since this condition allows - necessitates, really - more connection points, the motors can be made smaller and connected in pairs. With a second level of resistance it would be possible to balance the "shoulder and ankle" connections, moving the ankle connection closer to the body and the shoulder connection farther out, enabling close quarters between two "flyers" while preserving the same type of "free movement."

Anyway, that’s the gist of today’s topic. Hopefully, someone will see it…..

Sunday, March 26, 2006

X64 definitely DOA and beyond......

I have to rant right now at every dev house, driver maker and plug-in manufacturer in existence, not to mention the KING OF ALL FUCK-UPS, MacroShaft. I waited a LONNGGG time before I made the jump with my brand new AMD X2 4400+ with 2GB RAM - NUMA on my mind - in Oct 2005.

I should have waited until hell froze over. From IE32 crashing every 10 minutes AND TAKING DOWN EXPLORER, to IE64 having no plugins or toolbars, all the way TO MS THEMSELVES NOT HAVING SPECIAL FEATURES ON THEIR $100 Wireless Desktop (at least Logitech did), it was 5 months of a nightmare, like being back in Redmond - shudders horribly. And that was almost bearable, except that my dev efforts at home became severely hampered by Virtual Server R2 not working on X64 Pro as advertised, which meant I couldn't test Client\Server SOA. Damnit Damnit Damnit.

Others are feeling the pain also, as the "push to mainstream 64 bit computing" is, how should I say it, faltering even now. I have always said - after I left, of course, since there was no point in talking while an employee - that MS should be paying out grants for small companies to get things done and maybe get a penny or two per client. After all, you can't get more than 90% of the market. MS's business is, and should be, a solid stable release every year or so, with new APIs and faster kernel methods with more abstraction.

This is all over the news as Vista Home - or LongerHorn as it's been called - has yet again been delayed for the consumer space. As a Windows workhorse during the period through "The Fall," I can say it comes down to a lack of adequate testing of consumer-space products like XP Home. I have no doubt that a lot of the functionality needs work to overcome issues that should have been caught much earlier in the cycle. I mean, I can't remember seeing a "Home Network" setup anywhere with XP Home. Even things like System Restore weren't tested with Home - well hey, I tried....

And through the worst of Vista's woes, the guys in charge decide that automation is more important than, and even supersedes, the tried and true methods of Dev\SDET\STE in equal measures, dependent on component complexity. What a mess that thing will be in the hands of millions without it. And I don't think this too-little-too-late change will do much to make up for the man-hours spent on actually fully implementing Aero into Explorer.....duhhhh, sounds familiar. Hopefully the reported level of lockdown will prevent a repeat of the "I Love You" times of the late 90s and Vista will actually run well on the 100s of 1000s of Dell machines that will be sold with it - hey, Dell barely pays for anything, but maybe they should start demanding REAL GRAPHICS from Intel.

As I have totally digressed from my rant about X64, I can say that I am typing this on XP SP2, wondering if I can get my $150 back for MS' attempt at moving to 64-bit. The good news is that my Firefox problems have been solved too, because I switched back in time for IE 7 Beta 2, which is pretty good at not holding 100s of MBs of RAM while I cruise through "tab threads." Of course, that's partly because IE 7 doesn't have an X64 beta. I'm even able to run other services in the same amount of space as the redundant 32\64 services on X64 with WoW - my systray has 15 items.

Did I say - in my best immigrant voice - Son of the bitch.

Sunday, March 19, 2006

Data Modeling In Windows

Howdy boys and girls, it's time for some code stuff now. This is an article - without the graphics - that I posted on MSD2D.COM a few months ago. Enjoy.





.Net Framework Series 2.0
Data Modeling in Windows
By Christian Howell

Data modeling: the words bring chills to some and visions of 14-hour days to others. But with the tools available today, such as UML (Unified Modeling Language), XML (eXtensible Markup Language) and CIM (Common Information Model), modeling your data becomes a matter of taking the customer requirements and matching class structures and interfaces to the necessary data types. Unlike the old days, where a brute-force model would work, today's software needs a more structured approach. With the world a few processor generations from "the gang of four" and Managed .Net as the "Center of the Windows Universe," abstracted components are the new keyword for flexible, extensible and secure code.

Properly modeling a consistent UI\Program flow is in and of itself an "evolutionary process," so the term data modeling means different things to different people. This makes standardizing modeling methodologies even more difficult. There is also the difference between modeling an existing feature set in a new way and modeling a feature set designed from scratch. In this text the term means "developing features such that they can be accessed from multiple sources, from the native UI to collaborative services to testing harnesses." The old paradigm was to gather what needed to be done, just write functions that did it, and perhaps hand off UI duties to someone else. Today, with feature sets and customer requirements for collaboration and interoperability growing exponentially, an object-oriented approach is needed not only to limit the amount of code necessary to implement feature sets and make them accessible between app domains, but to make the UI easy to use and update.

.Net was designed with these issues in mind and does an excellent job of abstracting data objects and unifying Windows programs under a memory-managed platform, especially in the 2.0 version of the Framework. Of course it can't handle every case without extension, so an effort is needed to break complex objects such as Network Streams into much smaller sub-objects. Such a model might include: port, machine name\IP, permissions, headers, and data streams. Rather than trying to determine all of the ways you can use a Network Stream, you can create XML Schema-based scripts to combine the different sub-objects into platform- or application-specific descriptions. For example, only one feature needs to access the port and machine name, while another can process the permissions. Another feature would then decode the headers and stream to determine further processing requirements. Yet another could encrypt returning data streams to add an extra layer of protection. Depending upon the data, security and speed\concurrency needs, any of the accepted patterns, such as State or Strategy, can be used to extend the initial program flow. .Net provides native encryption (System.Security.Cryptography) and compression (System.IO.Compression) for text and binary streams, plus serialization through the BinaryFormatter, so custom strategies are rarely needed for those services.
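To sketch what I mean (the class names here are my own, not anything in the Framework):

using System.IO;

// Illustrative sub-objects for a "Network Stream" - one small class per
// concern, combined by a descriptor instead of one monolithic object.
class Endpoint          { public string MachineName; public int Port; }
class AccessPermissions { public string[] AllowedPrincipals; }
class HeaderBlock       { public string ContentType; public int Length; }

class NetworkStreamDescriptor
{
    public Endpoint Where;            // one feature needs only this
    public AccessPermissions Perms;   // another processes only permissions
    public HeaderBlock Headers;       // a third decodes headers...
    public Stream Payload;            // ...and the data stream itself
}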





In this model the 3rd-party external client (EC) can be any module on any machine in a domain, or even on the Internet. The server is contacted by the client with a port number and machine name. Since Managed code enables programmers to use declarative security and Windows authentication, much of the security overhead can be encapsulated in the calling thread of the client app domain. This abstraction also means that the client has to have access to the pre-compiled server code. The client can be extended to contain the interface for any 3rd-party clients that need access to the server code. The server code is the middle tier of the abstraction and is needed by any 3rd-party client. This also allows for a client\server interface between the entire Network Stream object as described above and any 3rd-party tools. As long as the public portion of the request\response feature of the client remains consistent, it is possible for the 3rd-party client to be extended to do more internal processing without requiring new feature requests.

Because of the encapsulation in the server feature, permissions need to be correctly applied to the object space before the connection is even attempted. This type of abstraction also allows each property of the sub-objects to be independent of the others, so you need only create one Permission object, one Validation object, and one Connection object for the application space. Since the data stream can be any type of .Net stream, this model allows the developer to use one object (class) for most data types, since they can be copied to a stream with customizable headers. By creating a header template lookup, several different complex object types can be returned and decoded by the internal client response feature. The 3rd-party client is then totally separated from the internal logic of the model. Only the public features in the internal client are exposed, and since the data types are known by the 3rd-party developer, the objects can be extended for application- or platform-specific needs. This is especially useful for the ever-increasing number of Internet applications. By combining header templates with overloaded method types, all types of database info can be encapsulated between the server and connection spaces while also allowing for local\remote file access between app domains and physical networks.
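A minimal sketch of that header template lookup might look like this (all names are illustrative):

using System.Collections.Generic;
using System.IO;

// The response feature keeps a lookup from header template to decoder, so
// several complex return types flow through one code path.
delegate object Decoder(Stream body);

class ResponseFeature
{
    readonly Dictionary<string, Decoder> decoders = new Dictionary<string, Decoder>();

    public void Register(string headerTemplate, Decoder decoder)
    {
        decoders[headerTemplate] = decoder;
    }

    public object Decode(string headerTemplate, Stream body)
    {
        return decoders[headerTemplate](body);   // caller casts to the known type
    }
}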

With the .Net paradigm, each of the sub-objects in the model becomes a class under the same namespace (DataAccess). Encapsulation means that each of these classes can contain smaller objects which handle a part of the processing. This layered (n-tiered) approach means that different clients can access different parts of the model without having access to any other. That is the function of the request feature within the client (DataAccess.Client) space. It sits between the server (DataAccess.Server) and the 3rd-party interface. By merely providing multiple overloads for request types, it is possible to control access to any data stream through any connection. The request feature also works in conjunction with the response feature to encode and decode as necessary, while verifying thread identities for large numbers of concurrent users.
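Stripped to its skeleton (and with the internals stubbed out), the request feature might look like:

using System.IO;

namespace DataAccess.Client
{
    // Public overloads are the only entry points; each funnels into one
    // internal path that the server validates. Names are illustrative.
    public class RequestFeature
    {
        public Stream Request(string fileKey)
        {
            return Forward("file", fileKey);
        }

        public Stream Request(string table, string query)
        {
            return Forward("db", table + ":" + query);
        }

        Stream Forward(string kind, string payload)
        {
            // attach thread identity, then hand off to DataAccess.Server...
            return Stream.Null;   // stubbed
        }
    }
}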

The server feature formats and forwards requests to the connection feature after validating input parameters from the request feature. This ensures that the client request feature has no access to permissions, meaning that the server is isolated from any 3rd-party requests. Since all of the methods in the server feature are internal, all calls to the connection feature must be routed first through the public request feature and then be approved by the server. By using a strong name object for each request, high levels of concurrency can be achieved while maintaining data integrity for each request. The server uses a queue to manage requests and responses. This queue contains request-specific information for thread coordination with the client request\response feature.
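Roughly, and with the "strong name" reduced to a plain string ID for brevity:

using System.Collections.Generic;

namespace DataAccess.Server
{
    // One record per request; the unique name keeps concurrent callers'
    // data separate. Everything here is a sketch.
    class PendingRequest
    {
        public string RequestId;       // "strong name" reduced to a string
        public string CallerIdentity;  // for thread coordination with the client
        public byte[] Payload;
    }

    class RequestQueue
    {
        readonly Queue<PendingRequest> queue = new Queue<PendingRequest>();
        readonly object gate = new object();

        public void Enqueue(PendingRequest r)
        {
            lock (gate) queue.Enqueue(r);
        }

        public PendingRequest TryDequeue()
        {
            lock (gate) return queue.Count > 0 ? queue.Dequeue() : null;
        }
    }
}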


The connection feature is the final segment of code and is responsible for processing the headers in the request and retrieving the data stream from storage, or creating a new entry. The data can then be encrypted for return. By defining your requests with text scripts, it is possible to have requests come from multiple sources, including the Internet, for easy transfer. In this feature the emphasis is placed on speed rather than security; this allows optimization of the module without affecting the security of the response feature. This feature handles any external storage interfaces, such as SQL databases or XML files, by simply overloading access methods based on the header processing, and the model can be easily extended or adapted to handle different types of application models. This feature is the most complicated, since it has to be coordinated with the design of the storage medium. In the case of databases, the developer needs to work well with queries and stored procedures, while a file-system access application needs to handle NTFS well, and some apps need to deal with both while handling transaction concurrency.
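A sketch of the overload-by-header idea (again, hypothetical names, stubbed internals):

using System.IO;

namespace DataAccess.Connection
{
    class RequestHeader { public string Source; public string Path; }

    // Header processing picks the storage overload; security was already
    // enforced upstream, so this layer optimizes for speed.
    class ConnectionFeature
    {
        public Stream Retrieve(RequestHeader header)
        {
            switch (header.Source)
            {
                case "file": return File.OpenRead(header.Path);
                case "sql":  return RunQuery(header.Path);
                default:     throw new InvalidDataException(header.Source);
            }
        }

        Stream RunQuery(string query)
        {
            // run the stored procedure and copy the results to a stream...
            return Stream.Null;   // stubbed
        }
    }
}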




The key to this type of model is that "most" usable patterns have already been discovered and can be extended as necessary. Most of these patterns are based on the common File, Edit, View, Tools, Options, Help environment (the standard Windows Menu\UI paradigm). Of course, it is never a good idea to try to write initially to a pattern, since differences in application features and requirements mean that a State pattern may be more efficient than a Strategy or Factory pattern for two apps that perform similar functions. When modeling data for consumption and display, the key is to remember that any data can be described using a combination of native .Net types, and that the description of the data is ALWAYS more important than the features that use it. In many cases personal or financial information is consumed and must be protected by the interface. By ensuring first and foremost that the data remains consistent throughout the process, refactoring then becomes useful for optimization. The feature set will then expand as testing of current features continues. This is known as an evolutionary design cycle. It means, in essence, that you should always keep your code simple and always design your features with testing in mind. Some people call this method "designing to the interface and not the implementation." Another way of saying it is that the user doesn't need to know the details, only the data. For any object space, overloading the public entry points enables different types and amounts of data to be processed by the same internal server. By keeping with the abstracted-component methodology, you will avoid creating complex methods that don't allow for high levels of granularity in your object space.
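As a tiny illustration of designing to the interface (my own example, not from any particular app):

// Callers see only the contract; State, Strategy or Factory variants can be
// swapped in behind it later without touching the consumer.
public interface IDataSource
{
    byte[] Read(string key);
}

class FileSource : IDataSource
{
    public byte[] Read(string key)
    {
        return System.IO.File.ReadAllBytes(key);
    }
}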


Tools such as NUnit (www.nunit.org) give developers a way to test their features individually or as a live client. Script languages based on XML schemas or UML are much more efficient because they have no code overhead; the same parser that is used for the client scripts can also be extended to include test parameters and environment settings. With tools such as NUnit you need to adhere to a prescribed format, which sometimes increases the amount of code necessary to determine the success of a given test case. It is of course possible to plan for these types of tools through a script\parser interface, but again, the idea of modeling is to limit the amount of code you have to write and maintain. Component-based scripting does this and more: it enables "cut and paste" editing, ease of storage, and no need to recompile to add new requests. Of course, adding features for processing the data in scripts requires new code and schema elements, but this type of model means that new features are separated from existing features, lessening the chance of regression failures. This abstraction also enables you to make tools based on a subset of features, such as setting up initial environments, creating database tables, creating web pages using XML\XSLT, or viewing XML documents. All that is needed is a custom client space. Below are listed the basic data objects necessary in each object space of the model. These are determined by writing out a paragraph or two describing the necessary functionality or the data that needs to be exchanged. The .Net Library 2.0 also contains advances in C# such as anonymous methods, which allow "inline" delegates; iterators, which add the "yield return" and "yield break" statements to reduce the amount of code necessary for base collections; nullable types, which allow value types to assign null to a type instance; and generics, which allow templated base classes for collections of any .Net object type. Look for coverage of these new features, coming soon - and there's a quick sample after the object lists below.

Client Data Objects

HttpWebRequest
HttpWebResponse
WebRequest
WebResponse
XmlDocIn
XmlDocOut
EncryptionKey
EncryptAlgorithm
RequestQueue
MemoryStream
SecurityPrincipal
ThreadPrincipal
RequestType – complex



Server Data Objects

PortNumber
IPAddress
IOPermissions - ACL
WebPermissions - SSL
XMLParser
HeaderLookup
ValidationRegularExpressions
WebService




Connection Data Objects

HeaderBlock
AccessPermissions – ACL\Thread
EncryptionAlgorithm
EncryptionKey
NetworkStream
FileStream
XMLFactory
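Before we move on, here's a quick compile-able taste of the four C# 2.0 features mentioned above (the sample names are mine):

using System;
using System.Collections.Generic;

class Cs2Features
{
    // Generics + iterators: a typed sequence with no hand-written enumerator.
    static IEnumerable<T> Repeat<T>(T item, int count)
    {
        for (int i = 0; i < count; i++)
            yield return item;   // "yield break" would end the sequence early
    }

    static void Main()
    {
        int? maybe = null;       // nullable type: a value type that can be null
        Console.WriteLine(maybe.HasValue);   // False

        // Anonymous method: an "inline" delegate.
        Action<string> log = delegate(string s) { Console.WriteLine(s); };

        foreach (string s in Repeat("ping", 3))
            log(s);
    }
}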

Saturday, March 18, 2006

The History of American Politics and other nonsense

After watching the St. Patrick's Day episode of Bill Maher, I am in the mood to entertain the 4 people who have read this blog. Political races in this country have devolved into a contest to see who can get their head farther up the asses of Corporations. Corporations who have a vested interest in keeping the money away from most people; who have a vested interest in the corralling of cheap labor in our nation's prisons; who have the ability to get oil-loving idiots elected through carefully placed propaganda. The list goes on, but I don't have the time or desire to follow it.


This country claims to be free, but as Bill Maher pointed out, supposedly "less FREE" countries have elected female leaders. The female politician on tonight's show was more like Bush in drag, spouting the party line on every subject, from policing the world while this country swirls around in a quagmire of legal and illegal drugs, to believing that the other guy is worse so we shouldn't believe in them.

The war in Iraq is more like a show of patriotism than a necessary factor in the "war against tyranny." Is it more tyrannical to tell people to shut up and lock them up if they don't or to tell people they are free and fire them from their job for having an opposing viewpoint to the current idiot in office?


To have a politician actually count soldiers - who, by the way, are SUPPOSED to get in line - as good examples of support for ANY WAR is asinine at best and dangerous at worst. Don't get me wrong, I spent time as an AIRBORNE PARATROOPER ASSIGNED TO THE 82nd in NC, so I believe in the sanctity of being shot at for your country. But when it comes down to it, I would rather have the masculine lesbians take my spot in the lineup. I really don't like dodging artillery rounds and roadside bombs, which is why I went with a geek trade.

I can say that I am always amazed by political discussions, even when they are based in humor. Tonight I found out that most people are assholes BECAUSE they are idiots who can't figure out how NOT to be assholes. We talk about how much we are doing for these other countries, countries that don't have a history based on the oppression of a WHOLE RACE of people, while even today, if left to their own devices, corporate structures would REMAIN OLD AND WHITE. Even in Nazi Germany, they didn't have signs that said NO JEWS. Sure, they gassed a bunch of non-Aryans, but they didn't gain power on their belief in the "INALIENABLE RIGHTS OF HUMANS." This is not to say that I would like to live somewhere else, but that is not because of the "freedom" in this country; it's because I was born here and don't want to learn a totally new culture and language.


Comedian Richard Belzer surprised me with his view on political affiliation. He actually realizes that Remocrats, as he called them, are just the same assholes who have gotten the richest country in the world into 9 TRILLION DOLLARS of debt. And then they propagandize about the lack of frugality in the black community. Wow, talk about BULLSHIT. No wonder I think people should only vote on what color toilet paper their cultural/ethnic group should adopt.


Journalist Michele Mitchell chimed in with the fact that the Iraq war is nothing like WWII, where the country that ACTUALLY attacked us was in turn retaliated against. At this point there have been upwards of 25,000 casualties in this "war" with no end in sight. This has seriously depleted our number of active COMBAT troops. Most people don't realize that seriously injured troops don't return to combat duty for weeks, sometimes months. Me personally, I am a firm believer in fighting your own battles. There would be a lot fewer wars if these old greedy MEN - usually white - would just fight each other when they don't agree, but no, they would rather have some person who probably doesn't even understand the purpose get out there and put their life on the line. Or better yet, if they would STOP believing they are ANY BETTER than the others, we could all work together on something more than a platform in orbit that serves little purpose.


That brings up this point of technological superiority. Americans are stuck on this as if the average one of us will ever invent, create or discover anything other than the fact that their children are doing drugs, their toilet is stopped up because of their HORRIBLE DIETS, or that the burning sensation they feel is not love. I personally think that as technology increases, the average American, whether native or immigrant, gets dumber. This extends to the rest of the world, but at least they don't have a few psychotic "scientists" who create horrible weapons to convince people of their MORALITY and righteousness.


That brings us to the MAJOR thing I disagree with Bill about: Intelligent Design. It's been in the news and even commented on by yours truly, but in his mind it's just a fantasy. Sure, I don't expect to see a "Creator" or a picture of him, but then the world turned out to be round and the Earth turned out to NOT be the center of the universe, so technology DOES make things possible. Of course, the fire-and-brimstone preachings of sometimes barely literate theologians, who are so in touch that they think some Creator would tell them to NOT HAVE SEX, don't help. They can't even keep that up. They are like those rats that become gay when kept away from the female of the species - a totally HUMAN IDEA by the way. If left alone, mice would be happily banging their "women" rather than becoming so horny that they would stick their dick in anything.

Anyway this is getting kind of long and I'm going to stop now but "I'll be back."

Sunday, February 26, 2006

Microsoft Cranial Expansion

The natives are out in force at MS as the sky comes back out in Redmond. Tempers are flaring as yours truly tends to incite people by knowing what I am talking about. It is actually amazing the level of reasoning existent - or non-existent - at MS right now. I know that Redmond makes you dizzy but give me a break.

Some believe that MS has better products than everyone else; some believe that MS is better off getting more layers consisting of thousands of "no-birds." Some believe that the antitrust troubles are the fault of others (?). I am always torn between feeling bad for the consumer and feeling bad for the competent cogs, and so I tend to comment when it is obvious the comment is either from a mgr who is the bane of efficiency or a "hanger-on" who feels bad that he really is a "warm body."

At any rate, I will continue to speak my mind regarding the MS that is and the MS that could be, no matter who has a problem. The latest slate of comments is a sign that Vista will ship with at least 100,000 bugs, and I don't think I would use IE 7 if my life depended on it (and Firefox's memory leaks are REALLY annoying).

Especially when disgruntled assholes are telling me they know I couldn't possibly have a computer at work and at home and be trusted enough to not have an Internet blocker. Once again Mini, in his infinite wisdom, suggests that maybe I should use the anonymous option when I post, so that mighty idiots aren't offended when they see "TheKhalif (maybe I shouldn't have posted my pic)." He actually thinks maybe I should talk more about MS here, but that won't happen. This blog is called "Faster Than The Times," which basically disallows too much talk of the "Ever-Lumbering Juggernaut of Redmond."

Besides, nothing will ever change as long as Lord Bill is the Chairman and Sir Steve is at the helm so why should I burn cycles on MY blog talking about what they could, should or might do when I can talk about cool stuff like magnetic ICs and advanced nuclear power sources?

Anyway, as I always say, I wish all the guys in Redmond the best, but since MS isn't going anywhere unless the Cell is used for everything in a new PC with an OS that plays games and supports domain security, I know they are screwed.


The funny thing is that all of these white guys now know what it's like to be treated like a black man - being told that no matter how well you do, there is some problem (can you say stack rank?).

Wednesday, February 15, 2006

The Immaculate Conception

Today we want to talk about something that has long concerned us. It is something that most don't believe in but yet it seems to be a real part of life in America. I can see the 4 people who have read this blog saying what the hell is this fool talking about? What does the immaculate conception have to do with America?

The answer is simple. I can bet you have seen this occur at least 3 times in your life. It occurs every time Hollywood attempts to integrate the movies. I'm sure most of us have seen the film "Phenomenon" with John Travolta, a whole cast of white people, and Forest Whitaker. At least Diana Ross was mentioned, even though it was as some untouchable creation off in the "ether." How is it possible that a black man could miraculously appear in the middle of BumFuck, Wherever USA and not have a parent, sibling, or "culturally-equivalent" compatriot?

Again, the answer is simple: the unending quest by the forces in Hollywood to minimize the importance of minorities in "their" world.

Other examples of the "IC": Star Wars, where both Billy Dee Williams and Samuel L. Jackson portrayed a "racial variety" possessed of the innate ability to spontaneously reproduce, and the recent werewolf flick "Cursed," with Christina Ricci, a whole cast of white teens, and Mya playing the dual role of the first black person to get killed and the only black person in the movie.

This phenomenon seems to occur with greater frequency as more rappers end up in feature films. Take the epic remake "Flight of the Phoenix," where Sticky Fingaz and Tyrese play the roles of the endearing yet obedient and clinging black men who seem to never find a woman of color nor have any mentionable family. At least the rich translator was rich.

Other notable examples include the poignant drama "L.A. Confidential," starring a particularly "insensitive" Kevin Spacey. Yes, I know there were 4 blacks in it, but they all seemed amazingly detached from the reproductive process, as no black women were anywhere to be found. Sure, it was about "upscale callgirls," but hey, what better way to belittle the whole race than to have a black girl with a sad story who attracts Jewish men?

Following closely behind is the campy yet endearing "Resident Evil: Apocalypse," starring Milla Jovovich, a large beast, a cast of white people and an unknown black character actor who dies badly.

Then there's the touching "Ladder 49" with Joaquin Phoenix and John Travolta (I guess he can only work with one black on the set at a time), where Morris Chestnut has an invisible family who is never seen, even while he is recovering from a melted face (compassion, thy name is not Hollywood). Jeff Goldblum stars in the comical romp "Holy Man," where Eddie Murphy apparently lived with a woman who was either a real ho or practiced in the art of asexual reproduction. John's talents are again tapped for "Swordfish," a movie that amazingly has a black male and female - even though they never interact or appear in frame together - one of whom even survives through the end of the film. (Amazingly, Don Cheadle is one of the only black actors to never play a gay or "wimpy" part. He has even so far avoided the interracial role, though he came close in "Rosewood.")

Another new entry is "Paparazzi" (does anyone know how to spell that?) with Cole Hauser (who?), a whole lot of white people, and a cameo by Chris Rock as a pizza guy who complains about police harassment (heaven forbid). "The Edge," a tense thriller starring Anthony Hopkins, reminisces about the last black guy getting eaten by a bear (it hints that the rest died like "Ice Age's" dodo, as Oz's Harold Perrineau nearly removes his own leg trying to make a spear - wow, at least that kills the spearchucker moniker). Clint Eastwood loaned his talents at least twice to the phenomenon, in the films "Heartbreak Ridge," where Mario Van Peebles has token duty, and the Academy Award-winning "Unforgiven," where obviously no other blacks made it through slavery, so Morgan Freeman had to settle for a mute Indian squaw (who appears as the only other race to exist at the time). Speaking of Unforgiven, I guess blacks should be glad that at least we appear in some "mainstream" movies as something other than "cultural comedians," like the ever-present lone Chinese cook or the quintessential Hispanic criminal.

Many other examples of this amazing phenomenon exist but the list is so long that it would take until tomorrow.
So the next time someone questions the immaculate conception, direct them to the Box Office.

Tuesday, February 07, 2006

High Definition Images

Welcome to the next in our series of technology articles. This one is actually one of my favorites, as the dissemination of visual information is key to many simulation routines. Anyway, way back in 1988 I was looking at image reproduction and graphical progression. It is an interesting topic, as it is much easier than logic circuits. You can actually fake a lot of the facets of image repro.

For example, from a distance, perception of images starts to change, and by the same token, proximity can also cause perception aberrations. Some people see better from afar and some see better up close. Also, the amount of absorbed and refracted light can change this perception even more.

When looking at the various methodologies used, one can see that sometimes Technicolor outshines ILM. After looking at the various types of surfaces that can be used to "imprint" digital or analog signals representing images, the most efficient for reuse seems to be Au (gold). Modeling CrO2 tapes and 8mm film, one can see that the adherence factor of the material to its base is a very important factor.

Of course, another major obstacle is the "imprint" method. The CCD method uses a series of transceiver devices to record more than just the amount of light. By using PID tech it becomes possible to record luminance, chrominance, etc. in the same data stream. This enables digital manipulation by allowing each stream to be adjusted, thereby changing or "editing" the picture.
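As a simplified example of per-stream editing - YCbCr separation is real, the class below is just my sketch:

using System;

// An image held as separate luminance (Y) and chrominance (Cb, Cr) streams,
// so "editing" is per-stream adjustment. Names are illustrative.
class YCbCrFrame
{
    public double[] Y, Cb, Cr;   // one entry per pixel, 0..1

    public void ScaleLuminance(double gain)
    {
        for (int i = 0; i < Y.Length; i++)
            Y[i] = Math.Min(1.0, Y[i] * gain);   // brighten without shifting color
    }
}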

Multiple pictures can then be sequenced and displayed in order, which actually creates video. Another breakthrough was the use of colored lenses and high-energy light. By tuning the base frequency of the imprint material it is possible to actually reject any frequency of light or increase either of the underlying streams. Physicists have recently found that intense fields can even deflect light. Using phase-angular field generation it is even possible to "morph" objects merely by adjusting various fields in the vector space.

This enables even greater flexibilty in editing.

Oh well, gotta go.

Vaporware, we don't need no stinking vaporware!

Hello again from IT Central as we present another rousing episode of "The VaporWare Awards." Actually this is from Wired, but what the hell, as long as I don't copy and paste.......

The runners-up this year are varied and kind of expected, with the usual suspects showing up, like Duke Nukem Forever, the perennial favorite from 3DRealms. I can't believe that is still being developed. There was a showing (well, not really) from TiVo, who are still promising cable card HD. Blu-ray and HD-DVD also showed up (or didn't), as 2006 is here with nary a disc in sight.

Next up wasn't the new Legend of Zelda for GameCube or Team Fortress 2 from (almost) Valve. A few other faces weren't seen, but this year's (or last year's) big winners were (trumpets please) Microsoft and Google.

Microsoft has perhaps defined vaporware with the nearly nonexistent Vista and its maybe-not-so-friendly IE7 sandbox buddy. There were no public betas of either product and the buzz has died down to a very quiet whisper.

This is totally opposite Google, who have been given the distinction of never not being in beta for their major 2005 product offerings. Check out this link for full coverage.

2005 Vaporware

Sunday, February 05, 2006

Ionic propulsion

The latest in our faster-than-the-times posts covers ionic propulsion, another of 1989's many breakthroughs. It started as a high-compression turbine chamber, but the simulations I ran tended to show that it is possible to do nearly the same thing with hydrogen ions. By applying poles to the turbines and an intermediate opposing field, the introduction of ions into the vector space allows for "magnetic compression."
By pulsing the intermediate field while also transversely pulsing the opposite turbine poles, high-energy ions can then be propelled from the vector space, applying an external force to the chamber.

By using a large enough ion emittance surface it is then possible to mimic the thrust applied from actually combusting hydrogen or even heavier nitrogen (though on a lower scale). While modeling the propellant force achieved through combustion, I began to look for ways to actually "ignite" these ions. Simulations showed that certain EM frequencies can be used to apply fields to the ion clusters to increase their motion.
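For a sense of scale, the textbook ion-beam numbers (standard physics, independent of my field scheme) work out like this: exhaust velocity is sqrt(2qV/m) and mass flow is I*m/q, so thrust is I*sqrt(2Vm/q). A 1-amp hydrogen beam at 1000 volts yields only a few millinewtons - hence "on a lower scale":

using System;

class IonThrust
{
    static void Main()
    {
        double q = 1.602e-19;   // charge of a hydrogen ion, coulombs
        double m = 1.673e-27;   // mass of a hydrogen ion, kg
        double V = 1000.0;      // accelerating potential, volts
        double I = 1.0;         // beam current, amps

        double v = Math.Sqrt(2.0 * q * V / m);   // exhaust velocity, ~4.4e5 m/s
        double thrust = (I * m / q) * v;         // ~4.6e-3 newtons
        Console.WriteLine("v = {0:E2} m/s, F = {1:E2} N", v, thrust);
    }
}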


At that point, I had actually modeled extraorbital propulsion, which works slightly differently than atmospheric propulsion because of the tremendous forces of gravity between bodies and the lack of volumetric pressure. Probably the hardest thing was to decouple from the ion source. It was a real bitch too. Funny enough, that work led to the big deal.... teleportation. Actually it was a combination of that and laser work, but we probably won't talk too much about that.

Magnetic Bearings

Next in our series of techno-speak, we have (trumpets, please) magnetic bearings. An interesting technique that keeps larger mechanisms from needing more lubricant to coat moving connections. The technique came to me in 1989 (boy, that was a busy year) while I was working on axles for personal transport units. Since I have an aversion to grease, I had no choice but to realize that by compressing two magnets very close together it is possible to suspend the moving portion on a localized magnetic field. The effect would be similar to a mag-lev (magnetic levitation) device. Since the two parts don't actually come into contact, the design virtually eliminates wear on bearings, enabling more efficient designs.

Using electromagnetic sources also allows larger structures to be designed without complex manufacturing processes (try smashing together two 100lb rings that are the same pole).
This comes in handy for "high compression turbine chambers," as propulsion turbines tend to spin at 10s of 1000s of RPMs.

Monday, January 30, 2006

Happy Birthday to me!

Here's to my best friend and role model on his 41st birthday. Sick fuck that he is. Happy B-Day ME!

Sunday, January 22, 2006

Yah, Mule, yah!!

That was the cry from SteveB at MS' latest Town Hall meeting (if only he were as cool as Clinton). He implored the employees to put out more effort, since it appears that MS does better when employees work harder. (Boy, talk about blinders; no wonder I didn't feel THAT bad about leaving.) He of course made no mention of the problems talked about on Mini-Microsoft.

As I thought, or began to think months ago, those rich guys in charge don't care one bit about the rank-and-file. They obviously see lots of value in maintaining the StackRank StatusQuo. I'm not seeing it when you consider MS is losing people like MarkL on the dev side and (not to blow my own horn) me on the test\SDET side. My latest automation harness consisted of a proprietary script language.

Anyway, it seems like the LisaB Listening Tour is a pacification technique, with the highlights given by Mini (I can't bring myself to say Who'da). It's unfortunate, because MS is in the unique position of driving innovation but they seem to be stifling it. Companies who could enter a market fear to do so, because if they do well enough MS will "come after them" and the best they could do is sell. That's not right. There may end up being a problem when Vista releases. There are so many people saying that it won't be worth an upgrade. With the enthusiasts becoming more and more vocal, Vista may just lay a Pyrite egg.

Of course, I loved MS and I don't hope that happens, but with my experiences I won't be sad if some of them don't have several billion dollars to dangle in front of employees and prospective employees. I guess sometime next week we will see if anything is implemented to turn the tide of dissent currently threatening to drown MS' War Chest. I wouldn't bet on anything changing, though. MS obviously has to protect themselves from something, though I don't know what could be so powerful as to cause this kind of panic.

In my opinion, twisted though it may be, the idea is that someone had to make sure that the money stayed in certain hands and away from others.

Friday, January 20, 2006

The Return Of Intelligent Design

It's back in the news again as yet another school district has been sued into compliance with the general rule that you can believe in a "Creation" but you cannot try to use science to understand it. Instead, thoughts of the Creator are limited to magical theorems of Lucky Charms happiness. I find myself offended by the ignorance of people. Now we're being told that teaching Intelligent Design is a violation of Church and State, but that has been violated by the rules of marriage forever. Why do you have to get officially licensed, or live together for a State-set time, before you are able to be recognized by the State? Also, by denying gays the right to marry, we are again violating the policy. There is nothing in the Constitution about marriage being between a man and a woman. Don't think I am for homosexuality, but it serves the purpose.

Anyway, the class in question, in El Tejon, CA, was taught by a preacher's wife and theorized that the inherent complexity of life implies that there is some sentience involved. How can science be worth anything if scientists still have no clue as to how life was actually formed? I think that we should stop living in the 10th Century and start to embrace the idea that evolution itself is a part of Intelligent Design. Psychologists, con men and politicians can manipulate minds, but a mind can't be Created? Men can crack a genome, but there could be no consciousness that could order a double helix? It's amazing that Americans are so conceited when most of us are barely literate enough to properly use a VCR clock. These are the same Americans who are graduating from college without the skills they should have had when graduating from HS. A recent study shows that the average college student has a lot of trouble with complex reading and math tasks. I guess a lot of us weren't designed to be that intelligent.

Looking at Genesis, the implication is that all life forms came from giant whales. Amazingly enough, I never saw the part that said "And God created great whales" until yesterday. I was watching a Bill Maher comedy special and he mentioned how whales and snakes mentioned in the Bible are treated as some kind of mystical wonder, rather than as people who had never seen advanced technology somehow seeing into the future (perhaps through the use of hallucinogens). Anyway, I actually submitted that as a theory in 1992 when I was in Jr. College. The theory has come a long way since then and I can almost "mathematically compose" elementary particles. The big problem with trying to recreate these particles is that before large bodies of matter and anti-matter interacted, different particle interactions were possible. I am also of the opinion that black holes are the product of high-energy particles feeding back on a source after a given equilibrium was achieved. Of course this equilibrium is dependent upon a large enough area.

It is also interesting that only one planet in the solar system will support any life. It seems as though a random occurrence in such a large area would be reproduced if certain physical laws weren't "applied" to the vector space. Of course it is almost impossible to produce some type of picture of an omnipotent being, but according to the way Americans act, an omnipotent being would be an average white guy. I think the point is not to prove or disprove the existence of such a being but to perhaps become such a being. Who knows, but the Bible promotes such an achievement. I think most people's problem is the lack of intelligence that comedians joke about and the media displays. If people were more willing to learn, rather than live off of some "genius in numbers" theory, perhaps there would be less violence and depravity in the name of the Most High.

Whether or not anyone believes in the Creation won't make it go away.

Wednesday, January 18, 2006

Magnetic Circuits

We're back with a new tech post and this one is a doozy. Most people have never heard that term before. (Check the description of the blog.) Those with the slightest bit of technical acumen may realize we're referring to the use of magnetic energy to operate gates and switches in a microprocessor. I have since seen more effort in this area, from MIT and Cambridge I believe, so I guess I am in good company. I first looked at magnetic circuits in 1989 (I think I looked at everything in '89). The idea actually came from work with magnetic bearings (I hate grease; we'll save the bearings for another article).
Anyway, the way they work, at least with the implementation I used, is that rather than doping with n or p Ge or Si ions, you actually dope with magnetic and restrictive magnetic materials. By placing juxtaposed electrical fields on either end of a block, you can then "program" areas to excite "electrons" so that only desired areas of the circuit block will allow the passage of "data." Then you must poll the path to determine which areas (divided by potential) have an increased potential due to flow. These are then your "bits." You can then add buffer areas which have a range equal to the "attached" bit areas. Using a crystal clock that vibrates at high rates, one can transfer up to

"Clock rate" x "bit width" \ "word length" instructions.

Research has shown that by applying varying magnetic and electrical fields to semi-permeable materials, nanotubes (small rows of atoms shaped into a cylinder) can be made to orient in a p or n formation.

Similarly, by providing a linear segmented vector space, it is possible to form complex words in one pulse of the clock rather than many. Since all data is composed of sequences of bits, by abstracting the various color and alpha levels for graphics it is possible to define parts of the stream as "pre-rendered" structures like text. By creating parallel structures with varying "signal" ranges, multiple streams can be processed simultaneously.

By minimizing the length and width per "logic bit" and lowering the total power input, it is possible to maintain a linear relationship between the bit values. This can significantly lower voltage requirements and allow faster clocks, or more bit regions, with similar power input.

More later...

Tuesday, January 17, 2006

Long live the model number!

Today's topic is the death of the long-standing brand name "Pentium." Intel is dumping the moniker after more than 10 years and countless changes. It's interesting that they chose the beginning of 2006 for this move, but considering the pounding the Pentium has been taking at the hands of the Athlon brand, maybe it's not so strange. Intel is busy hyping their new "totally inexplicable" ViiV concept. Even major CTOs don't know what it is. Hopefully it will do something to overcome the hard times of the past few years, or maybe I should say the hot, slow times.

At any rate the "P" is dead and now Intel will be using the model number scheme to confuse everyone even more.
Maybe the name wasn't so bad, even with the floating-point divide (FDIV) errata that would have sunk AMD, the stall at 1GHz for the P3, and the totally useless Willamette P4, which had more stages than Broadway. As AMD leaned on efficiency, the Pentium got MHz faster and Celsius hotter, with the fastest models eating up an average 350W PSU all by themselves.
But it still sold like the proverbial hotcakes (which you could cook on one), just through brand recognition and consumer ignorance.

Now with the new Yonah, and the soon-to-be-released Conroe, checking in at 65nm, the oven has been turned down enough to attract Steve and Co., whose new MacIntelTosh has sucked up the initial run of these new Intels. Apple stated last year that Intel was closer to their long-term goals, which I guess is synonymous with "Motorola was melting the iMac, AMD just goes too fast, and besides, they may not be able to provide us with the 2-4 million boxes we sell every year."

We will see how well they do in a few months as I'm sure that every PC reviewer on the web is itching to get their hands on them. Preliminary tests show that the Yonah is up to 65% faster than an equally clocked Dothan (Pentium M), but there is no direct comparison between the G5 and Yonah. Hopefully, this will be Intel's year to actually look like the major CPU producer with their performance and not just their marketing.

Saturday, January 14, 2006

Differential Adaptive Light Masking and Integral Transform Extrusion

We have had enough posts about MS and the computer industry; now it's time to talk about some fun stuff. What kind of title is that, you wonder? Well, I just got finished watching Revenge of the Sith for the 5th time on DVD, and it reminds me of my heady days as a "probably never gonna get paid for it" FX designer.

Back before NBA Live, I thought the best thing to do would be what is now called "skinning," or the process of applying an outer texture combined with inverse kinematics derived from a filmed subject. This is then placed onto a "body" matrix consisting of the vertexes of the human body (fingers, knees, elbows, necks, etc.).
Nowadays, movie or pre-rendered graphics can almost create lifelike faces. Almost. The biggest problem facing FX designers is how to model skin textures. Even though skin appears smooth, it consists of millions of small irregularly shaped "patches" placed between millions of nonlinear vertexes. As light shines on skin at different camera radii, the amount of light absorption and reflection changes across the billions of data points in these huge irregular matrices.

This problem is compounded by the flexible nature of skin, which causes lines and wrinkles when certain movements are performed. How will it be possible to then show the difference between a freckle 180 degrees from camera and a freckle 90 degrees from camera? The answer is Adaptive Light Masking. Rather than trying to force the textures to mimic the extremely complex bump mapping of a freckle, light sources can be differentiated over a complex surface using simple ray tracing. In this way, it is possible to show the difference in shadow between a young person whose skin is tighter and an older person whose face is showing the effects of weather and exposure to the sun.
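A toy version of the masking idea - standard Lambert shading with a made-up per-texel detail mask standing in for the freckle's bump map (every name here is mine):

using System;

class SkinShade
{
    // Standard Lambert term, darkened by a per-texel "detail mask" standing
    // in for freckles and wrinkles instead of true micro-geometry.
    static double Shade(double[] n, double[] l, double albedo, double detailMask)
    {
        double ndotl = Math.Max(0.0, n[0] * l[0] + n[1] * l[1] + n[2] * l[2]);
        return albedo * ndotl * (1.0 - detailMask);   // the mask absorbs extra light
    }

    static void Main()
    {
        double[] normal = { 0, 0, 1 };
        double[] light  = { 0, 0.6, 0.8 };   // unit vector toward the light
        Console.WriteLine(Shade(normal, light, 0.85, 0.25));   // a freckled texel
    }
}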

Hairs are even more susceptible to the complexity of the skin model. Shorter hairs behave differently, shadow-wise, than longer hairs, which can be used to cover impending absorption differences. Even complex alpha blending becomes too "plastic" in close-up views, as is evidenced by certain fight scenes in the Matrix where it is clear that "Neo" is a CGI construct. ILM has done a fabulous job, though, with their blend of live action and CGI. It seems they have found one of the secrets of FX: "fake it; as long as it looks like what it should, it doesn't matter how you do it." Take for example the creature Obi-Wan rode; it was much improved from the one Anakin rode in Attack of the Clones. It looked as though a "mechanical bull" was used and then skinned onto a CGI extension. Or on Kashyyyk, when the light didn't move along the helmet as the clone trooper turned his head. There were some points where they didn't use integrals for the legs of the troopers and they seemed slightly cylindrical. Even the complexity of Yoda's face had to be tempered by the use of a rougher texture. This is necessary because it is very difficult to create light that perfectly filters like sunlight. Many FX shops are now using overly muted tones and heavy "greenish-brown" filters to even the distribution between CGI-generated light and sunlight or "actual light."
I have been looking at using colored lenses to filter light and so far the concept is proving sound. Maybe I'll make some money at it yet.

Revenge of the Nerds?

It looks like it at Mini's place, as the employees, anonymous though they may be, are coming out in force to "rag on the rich." It almost seems like an uprising, or the prelude to a coup.

Damn partners


It gets really heated as the eternal "Dev vs. Test" argument comes back from the dead. It was exacerbated this time by the author's question to MSers (of which he was once one) about the initiative to change over from STEs to SDETs. Several current SDETs chimed in with remarks such as:

Metaphor:
As an SDET at Microsoft my job seemed to be driving over a bridge that is being built from both ends and in the middle across a pool of quicksand during a hurricane.

Telling people they're building in quicksand will get you blacklisted or looked at like you're crazy. Everyone complains about the rain, but nobody talks about how the wind keeps blowing everything over. What they want from you is trendlines on graphs showing that the concrete and the cabling is secure and a lot of entered bugs about how the tensile strengths of the cabling, places where the paint is mismatched, or the concrete is chipped. You alternate between snickering and sobbing when the middle of the bridge sinks another foot. Every once in a while you watch them lower both ends of the bridge to match the middle, and you yell uselessly that you think the ends are pointing in different directions.



and:


This is another sad story. Some moron from high-up, decided to get rid of all the STEs in the hopes of replacing their manual work with test automation. Predictably this initiative has failed. Big Time!

Now we have a test team WITHOUT any testers. Yeah! Don't blink your eyes. You read it right. In the absence of STEs/Testers, we (the SDETs) find ourselves devoting more and more time to Non-Automation related work - manual testing, bug verification, bug closing, lab installations yada yada.

To answer your question - my responsibilities are not clearly sketched. Am I a dev who writes test automation code? Am I a manual tester? I do not know. Go ask my manager.




Scathing indictments indeed, as yet another initiative is shot down from the inside. How can people be passionate if they are being abused? What amazing event could have occurred to force mgmt to screw everybody? Interesting questions, but no answer is likely, as the billions of dollars generated have created billions of obstacles for the countless future employees who may have different social\cultural viewpoints than the current mgmt. Even reasonable arguments presented by those who disagreed with my viewpoint bolstered the opinion that SDETs are not STEs are not SDEs:


That's a bit much- you have nothing to perform QA on without code being written.

That being said...people are nuts if they think writing automation code equates to being a good tester.

http://www.joelonsoftware.com/articles/fog0000000067.html



Looking at this link shows the type of problem that plagues MS right now:


My first real software job was at Microsoft; a company that is not exactly famous for its high quality code, but which does nonetheless hire a large number of software testers. So I had sort of assumed that every software operation had testers.

Many do. But a surprising number do not have testers. In fact, a lot of software teams don't even believe in testing.

You would think that after all the Quality mania of the 80s, with all kinds of meaningless international "quality" certifications like ISO-9000 and buzzwords like "six-sigma", managers today would understand that having high quality products makes good business sense. In fact, they do. Most have managed to get this through their heads. But they still come up with lots of reasons not to have software testers, all of which are wrong.

I hope I can explain to you why these ideas are wrong. If you're in a hurry, skip the rest of this article, and go out and hire one full-time tester for every two full-time programmers on your team.

Here are the most common boo-hoo excuses I've heard for not hiring testers:




Wow, no wonder the initiative isn't working. Many devs and others feel that test is just an annoyance and that testers don't need the certifications that production people have:


Amazing. I thought that kind of hubris was limited to devs.

Test (or QC, rather) does not determine the original quality of the design or code nor does it fix defects found. Test also does not make the final go/no-go decision (or should not, unless we want responsibility for any quality issues that we had no part in causing in the first place!).

All we really can do is evaluate the product, report quality issues and suffer quietly in endless test pass cycles at the hands of the failures of the PM and dev teams. (At least, until we can transform test from QC to QA+QC and shoulder part of the responsibility for the creation of the product ourselves.)





Of course, this didn't go on for long before mighty "Who daPunk" himself chimed in and requested that the "Test vs. Dev" argument not be reawakened. I for one have done both and know whereof I speak. I even tried to do both at the same time but found that it IS NOT POSSIBLE;


The comments are sort of getting into a back-and-forth unproductive state. Just when I was thinking about a new post of dev vs. pm vs. test.

TheKhalif: now that you have your own blog back and running, I recommend you follow-up there regarding your thoughts and continue the conversation in your comment stream. Feel free to post the URL here should you decide to do so.

Or if you link directly to this particular post, it should automagically show up at the bottom.




Et voila. I wonder if he'll let my next post go through. Probably not, but one individual seems to place some importance on my comments at Mini's;


Making devs do MCSE exams is about the dumbest thing I've heard. All it would achieve is slipping ship dates. It's like making a Boeing wheel engineer pass a 747 flight exam. The pilot doesn't know (or need to know) the inner workings of the wheel, nor does the engineer need to know how to fly the plane.

My increasing feeling is that TheKhalif is to Mini what Christopher Coulter is to Scoble.
It's all about the signal to noise ratio... Some day we'll get an RSS feed enabling you to blank out the usual suspect comments. But right now, I'd give my lunch for a feed on blogger to blank out comments I've read before. Deja vu is a very common experience here...




God, I love this job. And to think I almost missed all of this by going into Mechanical Engineering. In 5 years I have gone from a guy who kept getting Runtime Error -5 to being able to create script languages and actually determine when a State pattern should be used vs. a Strategy vs. a Builder.
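Since I brought patterns up, here's a quick sketch of the State vs. Strategy distinction (Builder is a different animal - it assembles a complex object step by step instead of swapping behavior). Fair warning: these are made-up class names for illustration, not code from any real project. The short version: with Strategy, the caller picks the algorithm and the object never changes it; with State, the object swaps its own behavior as it transitions.

# Illustrative sketch only - hypothetical names throughout.

# Strategy: the caller injects an algorithm; the object never swaps it.
class FlatRate:
    def fee(self, amount):
        return 2.50

class PercentRate:
    def fee(self, amount):
        return amount * 0.03

class Checkout:
    def __init__(self, fee_strategy):
        self.fee_strategy = fee_strategy  # chosen once, from the outside

    def total(self, amount):
        return amount + self.fee_strategy.fee(amount)

# State: the object swaps its own behavior as it transitions.
class Closed:
    def send(self, conn, data):
        raise RuntimeError("not connected")
    def open(self, conn):
        conn.state = Open()  # the transition happens inside the object

class Open:
    def send(self, conn, data):
        print("sent " + data)
    def open(self, conn):
        pass  # already open

class Connection:
    def __init__(self):
        self.state = Closed()
    def open(self):
        self.state.open(self)
    def send(self, data):
        self.state.send(self, data)

print(Checkout(PercentRate()).total(100.00))  # 103.0
c = Connection()
c.open()
c.send("hello")  # only works after the state transition

Strategy keeps the policy decision at the call site; State hides the transitions inside the object. Knowing which one you're looking at is exactly the kind of judgment that takes a few years of Runtime Error -5 to develop.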

Anyway, I don't really expect a lot of movement on this blog, it is just turning out to be at least a little fun. A couple of posts ago I even got my first troll. He proceeded to tell me how much my content sucked and the like, but at least he couldn't say I didn't know what I was talking about.

As one of my favorite lines goes, "It's Good to be the King!"

I don't expect the Test vs. Dev thread to go away for a while, and it will be interesting to see if anyone picks up on how big a problem such a state really is. I mean, as I said before, "In the end, everyone's on the same team."

Et tu, VooDoo?

This is the new saying as 2006 starts. Yes, VooDooPC has answered the FX clarion call, and the new FX60 from AMD has finally gotten the attention the line has deserved since 2003\4. On his blog, the founder is extolling the virtues of the AMD architecture and is talking loud about a new AMD-based VooDoo laptop that will be the fastest thing out, hands down...

...and shortly VoodooPC will be launching the world's fastest notebook - no question, uncontested, no one will touch it. Guess what processor it's running?

Et tu, VooDoo?

Several Intel supporters visited his blog and made some scathing comments, which were quickly dismissed as the inherent superiority of the FX60 has taken the performance world AND the multitasking world by storm. Clocked at 2.6GHz and packed with two cores of DirectConnect goodness, the FX60 is nearly on par with the FX57, its single-core brother clocked 200MHz faster. Of course, when it comes to multitasking the 57 is blown away by the 60, as the X2 4400 and above are starting to show their usefulness in real-world scenarios where people actually have Flash-ridden Firefox tabs running in the background while they churn out Doom 3 frames or build their latest ASP.Net module.
I personally can't believe that VooDoo has finally started to take AMD seriously. Now all AMD needs is a Tier1 who builds for application developers who need more RAM and less graphics support. The 6150\430 from NVidia has finally provided a "more than good enough" onboard graphics solution with support for 4+ GB of RAM, so devs can now actually develop for platforms using the SAME AMOUNT OF RAM, and SATAII actually provides enough HD bandwidth to mimic a real-world web server. After all, most companies are still using non-EMT64 Xeons. One FX60 with 8 GB RAM could "Virtually" replace 5-10 of these machines.
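For the skeptical, here's the back-of-the-envelope math behind that 5-10 number. Every figure below is my own assumption (host overhead, RAM per virtual web server), not a benchmark:

# Rough VM consolidation math - every number here is an assumption.
total_ram_gb = 8.0
host_overhead_gb = 1.0  # hypervisor + host OS reserve (a guess)
usable = total_ram_gb - host_overhead_gb

for per_vm_gb in (0.75, 1.0, 1.25):  # assumed RAM per virtual web server
    print(str(per_vm_gb) + " GB/VM -> " + str(int(usable // per_vm_gb)) + " VMs")
# prints 9, 7, and 5 VMs - right around that 5-10 range

CPU and disk contention would knock that down in practice, but for the light-duty web servers most of those old Xeons are running, RAM tends to be the binding constraint.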
Dell really needs to get with the program. Recent Virtual Server tests (www.anandtech.com) show that Dual Core AMD chips are able to keep their performance up (relatively) through 8 virtual sessions using VMWare ESX. The 955 is doing well, but with Pacifica and DDR2 coming soon, Dell will be drowning in the deep end with Tier1s like Monarch building 4-proc monsters with AMD. Even AlienWare has now picked up on the "Good News" and is offering a full suite of FX and Opteron boxes. George Lucas realizes their potential, as the "beautifully-FX'd" Revenge Of the Sith was created with Opteron farms.
2006 is indeed looking like a good year for AMD, even without Intel's anti-trust issues and nuclear-powered Dimensions. It will be even better if ANY major brand (HP, Lenovo, Gateway, or, heaven forbid, even Dell) begins a real push to get AMD to the masses where they belong.
As a disclaimer, I can say that I have owned several Intel-based systems, back when Via was the only "real" chipset choice for Athlons and AthlonXPs. Intel has simply dropped the ball and, in my opinion, used despicable tactics to keep AMD64 out of the boxes of major corporate manufacturers.

I guess now the other shoe has dropped.

Tuesday, January 10, 2006

Apple\Intel vs AMD\Microsoft?

Is this yet another knock-down, drag-out fight in the PC space? Time will tell as Apple intros new Core Duo PowerBooks. The new boxes use the latest Intel mobile chip, except it's now packing double the cores and doesn't quite form a nuclear heater. Of course, Intel had to "slow their roll" 50%, with the fastest Prescott clocking 3.8GHz and the new Yonah scaling a "measly" 2.0GHz.
On the other side of the coin, the software and hardware seem to switch places, with XP barely holding a candle to Tiger, though the mainstream user is now moving more towards the "cooler, faster" AMD, as evidenced by the numerous times retail AMD sales (minus Dell) outdistanced Intel's P4 monopoly.
So the stage is set. Will Apple finally get 20% of the market? Will Dell finally listen to their customers and offer Opteron\Athlon systems? Who knows, but the thing that is certain is that Intel still has a ways to go before they threaten the superiority of the mighty Opteron and its FX\X2 brethren.



Apple's move definitely surprised everyone, although one can wonder why they jumped into the frying pan when the counter was just as close. Motorola's latest dual core attempt needs a refrigeration unit to keep it from heat-treating the molded plastic in the iMac and friends, but Intel is doing slightly better, as even their 65nm process has merely allowed them to say AMD is not better at load than Intel is at idle. The HP 585 owns the database space (TPC-H), the FX57 owns the gaming space, and the Turion is making inroads and can only get better. Is this another case of Intel paying for the privilege, or does Steve actually believe that AM2 will not give AMD the performance crown back in the 5% of cases where the new Intel 955 has captured it?
If I knew that, maybe more people would pay attention. At any rate, it will be interesting to see how competitive OS X is now with the latest MS offerings, such as X64 Pro. If Apple hopes to gain market share with this initiative, more than "fan-boys" had better be impressed with the performance, especially in cases where there is no native binary for the application. That shouldn't be a problem with Intel's deep pockets, but timing is exceedingly important, since 65nm should take AMD up to at least 3.6GHz even with the current SanDiego\Venice architecture and should happen by Q3, just in time for Vista (coincidence?).
When all is said and done, this move by Apple will benefit customers, while Intel will see very little additional income initially, with Apple barely accounting for 10% of the world market. Even if Apple manages to grow market share by 50% or more, it will do little to Microsoft's domination. Even Apple's OpenGL runs better on AMD chips, so it begs the question: what market segment is Apple hoping to gain? PhotoShop is not that big a seller.

Thursday, January 05, 2006

Google vs. Microsoft vs. Microsoft

Yes, it's the battle of the behemoths as Google and Microsoft and Microsoft go at it. You may wonder why Microsoft was mentioned twice, and the reason is that right now MS seems to be its own worst enemy.
I really don't understand why Microsoft thinks they have to compete with Google, though. They are in two totally different market segments. As someone recently posted on MiniMicrosoft, this competition is like the electric company competing with Starbucks. Microsoft should be happy to provide the infrastructure for companies like Google and Netscape. All the competition seems to be doing is causing lawsuits.
Is Microsoft incapable of "live and let live?" Apparently so, judging by the amount of negativity put forth by Microsoft's own employees. Some people think that the ride is over, though, and all that Mini got was notoriety. With over 150 comments in the Google\Microsoft post, I think it's obvious that MS is not anyone's favorite anymore. It's a shame too, because MS has some brilliant developers and enough cash to do anything they want. Unfortunately, they choose to attack anyone else who is making money.
You would think that they would have learned from the near-debacle that was DOJ v. MS and decided to play nice. It's not like ANYONE will catch up with them in the consumer OS space, and since MS gets paid even when Linux is installed, there will never be an OS that can unseat the MONEY MACHINE known as Windows\Office. Even if someone creates a "perfect OS," it still HAS to support DirectX for games and the current software base. IMPOSSIBLE.

But anyway I've ranted enough. Check out MiniMicrosoft for more Google vs. Microsoft vs. Microsoft.

Workable Confusion Pt.2

We rejoin our beleaguered monopoly as it attempts to head off its first threat to the IE Monster since the destruction of Netscape. Yes, we're talking about FireFox, "the little engine that could..." actually get noticeable share from IE. The author is a FF user, and even though I have to reboot my machine every few days to kill a memory leak, IE is just no fun anymore. ActiveX plugins are a pain and buggy, you can't change themes, there are no extensions, etc.

MS really dropped the ball when they stopped innovating in the browser space. At this point, I don't think IE7 or Vista IE7 will stop the bleeding. It's just a shame that FF can't be sold(?); they would actually make money off of it. But then I guess MS made sure that a browser can NEVER be sold again in large quantities. And it only cost a couple of billion dollars, what a deal. (Wait, I thought competition was good!) Anyway, with FF breathing market-share fire and the latest releases not being received with open arms, the troubles are not yet done.

Throughout all of this, the 64-bit ball has nearly been dropped, as my spanking new X2 4400+ has very few drivers, no IE plugins, and one supported MP3 player. Maybe the problem is that the lord of the manor keeps changing the driver model and devs are struggling to keep up. And there is yet another change coming with Vista for Web Services and graphics. Unfortunately, there is nothing like the buzz about Win95. Even folks like Paul Thurrott are not exactly singing its praises. Sure, there are some nice new features, but are these features going to be worth a full upgrade? Time will tell, but one thing for certain is that MS will still make $1B EVERY month.

Aaahh, the beauty of the "Greatest Deal Ever."

Wednesday, January 04, 2006

Workable Confusion

Interesting title for an even more interesting phenomenon. What does this describe, you ask? Well, the coverage ranges at this point from Sexual Identity In America to American Business After The Bubble Burst. Maybe you'll agree with what Workable Confusion has come to mean today: (trumpets please)
Microsoft After XP. Yes, kiddies, this is another Microsoft post. Of course, as an ex-"Softie" I know first hand how good "The Greatest Deal Ever" actually is, if only for what now seems to be a "select few." I can remember the lazy days of 60 almost-work hours, constant liquid encouragement, even being entertained by a not-quite-crossdressing VP and his biker stud. Ahh, those were the days. When gays were gays and nobody mentioned it. I can see the grunge crew now, testing their way through at least some of their workload, and their doting managers who enjoy the fact that they are smarter than their employees.
Yes, this was before the heady days of the first real competitive threat, good old monopoly-holding Netscape, which in the course of a few years managed to become the de facto standard for web browsers on Windows. But then (trumpets again, please) the king of the castle decided that the Internet wasn't a fad after all and not only created a browser but bundled it with Windows(?). The end came quick for Netscape as the IE monster snared millions of Dell accounts in its fourth incarnation (did the first three REALLY suck or what?). The once-proud Communicator disappeared faster than a Sony Opteron server and was forced to sell out to AOL just to remain in business.
But what's this? Improprieties are found in MS's dealings, and the score is nearly cancelled. Netscape cries foul: we can't compete with free (yes, I spent $39.95 for Navigator). MS stands behind devotion to their customers and the "Freedom To Innovate," but to no avail, as one company and state after another lines up to get their "piece of the pie," totaling tens of billions of dollars and a black eye that may never heal. And if that wasn't enough, there was the horrendous "attack from within" exposed by the ruthlessly open IE and Outlook. Company networks ground to a halt, millions of computers had to be restored (hope you kept that backup program running), and email became a long-lost accommodation.
For two years this continued, until (not the trumpets again) the lord of the manor declared that there should be more security. The stock went from 120 to 30, consumer confidence opened the door for Linux, and a little-known web company was well on its way to becoming a definition in the dictionary. Through this, the beleaguered Windows 2000 had the distinction of being the most secure insecure OS to date, while MS' new eXPerience was being worked on along with the push to 64-bit. Even with all of the turmoil, MS was still buoyed by the "Greatest Deal Ever" and continued to make $1B profit EVERY month.
But now, with the screams of bloody murder about the condition of VS 2005 and the lateness of Vista\Longhorn, along comes Google (not really sure how the markets are intersecting) with its overinflated(?) stock price, deep pockets, and no "big-bad-buggy-baggage." MS has a near-equal in Google, as everything they touch seems to turn to gold. And then, to add insult to injury, an insider named "who'Da Punk" has turned the inside of MS on its ear with his brash commentary and "Mini-Microsoft" mantra. He has blogged on everything from "no more towels" to the bloated bureaucracy to the unfairness of stack ranking. He has gained a large following and even made it to BusinessWeek.
Oh, woe is Microsoft: attacked from within, embattled without, but oh yeah, they still make $1B EVERY month. Time will tell if the "not so revolutionary" Vista will help MS or if its debut will be spoiled by bugs and security holes.

Sounds like Workable Confusion to me.