Yeah. peeps, shit is fucked. So fucked we dropping any use of fucking grammar and shit and vent like a motherfucker. I'm fuckin ODing. I went to the BB King's NYC the other night ( I haven't been out in a while cause shit is so wack) and I decided to play male stripper.
Well, that's what I always do. Chicks be illin with turning down dances unless you come up behind em on the dance floor. Can't do it. Can't even ask at this point.
Anyway, Angie Mar was there hosting and I was hoping my favorite person would show up. I missed her at the other club (FUCKING PIECE OF GODDAMN SHIT FUCKING CRAP ASS. SHIT).
You might say how does showing women you're a slut work but you didn't see women taking pictures with me while I did "freaky dancing" to their friends. I be having chicks jumping on my dick like that and I dress like a GQ cover, NOT A THUG, so obviously you THUGS is wack little bitches that's scared of male sexuality.
But then Sliding Down the Pole?????? 100 dollar bills comin out her asshole??? Ever heard of the dictionary? I hear some black people came up with some of those words.
Anyway, I might start going again, but I don't have time to convince women I don't want to see em kiss if they got hot girlfriends. I would like to hit all of em off though. I almost had that shit worked out at Cheetah until the haters and the dykes got mad at me. (How the fuck is niggas supporting dykes - they in the closet or what?) And last night I saw these bad chicks after wards so I tried to get as many as I could. I was getting close to the leader (there always is one) and this hating ass photographer almost got his camera shoved up his ass.
It was six of em so he could have got in on it, but didn't get nothing. I was trying to get em in a cab and he made em split up. I hate bitch boys. You'd think the THUG CAPITAL OF THE WORLD would have some brothas wit some balls but no. I guess they all dropped off because of the pants.
Then most of the shit that's dropping can't get the club jumping at all. You got songs about strippers while the DJ is asking who got a college degree. It's something wrong with that.
But anyway I had fun shutting everything down though. Funk Flex was there and he was around for some of my crazy club shit a couple years ago.
The funniest shit though is that I'm the oldest person in the club and I have a job as a software developer but I'M STILL THE HOTTEST THING TO HIT NYC FUCKIN EVER.
Do you know what it's like to know that when you go there will really be no game left in hip hop, just a bunch of FUCKING BUMS trying to tat up the most beautiful chicks in the world. And it's cause the white man won't let em do anything else.....unless they do what I do or skip to the Presidency.
GQ MOTHERFUCKER!!!!!
SUAVE MOTHERFUCKER!!!!
I'll still get mine. Maybe I'll get around to finishing my new hip hop album. MY first one was killing SoundClick.com.
I think I'm gonna do some YouTube videos dedicated to the Twerk Team and snatch up chicks nationwide. They can't resist that WINDING. And I'm even better now.
PEACE AND FUCK YOU!!!
Wednesday, December 10, 2008
VASIMR and Me
No, that isn't a crazy movie title. It's a vindication of some of the hard work I've done. Years ago I was working on propulsion technologies, and after designing a vacuum chamber which could (theoretically) produce as much thrust as a commercial airliner engine, I ventured to outer space. Of course, with no air, a vacuum chamber would have no use. I decided initially to set up a complex Emag field to draw air in from the atmosphere. Great, but for extended travel, not so great.
From there I decided to look at creating hydrogen ions in a high-energy field (a hydrogen ion is just a bare proton). Initially I looked at a "static generation field" that would process energy to create particulate matter. This was rather inefficient, so I turned to nuclear energy.
Theoretically, the mechanism in question would "proton charge" radioactive elements such that neutrons are knocked off one at a time after being stripped of the electrons. This had an additional side effect of generating usable energy which could be accumulated in a separate circuit.
Then these particles are exposed to a high-energy magnetic field which directs them through two chambers which increase the heat to create a plasma. Upon release the particles are then at several thousand degrees and simulate the thrust of combusted gas.
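As a rough sanity check on the "several thousand degrees" figure, the mean thermal speed of a hydrogen ion at temperature T is about sqrt(3kT/m), which can stand in for exhaust velocity. This is a back-of-envelope sketch with illustrative temperatures, not figures from any real engine design:

```python
import math

# Boltzmann constant and proton mass (CODATA values).
K_B = 1.380649e-23         # J/K
M_PROTON = 1.67262192e-27  # kg

def thermal_exhaust_velocity(temp_kelvin, particle_mass=M_PROTON):
    """Mean thermal speed sqrt(3kT/m) in m/s, a rough proxy for exhaust velocity."""
    return math.sqrt(3 * K_B * temp_kelvin / particle_mass)

def specific_impulse(v_exhaust, g0=9.80665):
    """Specific impulse in seconds for a given exhaust velocity."""
    return v_exhaust / g0

if __name__ == "__main__":
    # "Several thousand degrees" vs. a hotter plasma core (both assumed numbers).
    for t in (3_000, 50_000):
        v = thermal_exhaust_velocity(t)
        print(f"T = {t:>6} K -> v = {v / 1000:.1f} km/s, Isp = {specific_impulse(v):.0f} s")
```

Even at 3,000 K a hydrogen plasma comes out comfortably faster than typical combusted-gas exhaust (roughly 3-4.5 km/s), which is the appeal of plasma propulsion in the first place.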
Recently a company called Ad Astra has designed a similar system, VASIMR, using contained hydrogen gas. It is being tested by NASA now and should be in service within a few years.
I guess next someone will figure out my teleportation algorithm.
Thursday, December 04, 2008
The President-Elect
Howdy folks,
After a long hiatus Super Genius Guy is back. Back with a new post and a new universal paradigm. As is customary here, we don't pull punches, and this post will be no different.
We're here today to talk about the future. The new future of the world. Yes, something that I knew could happen (I thought about it but am too much of a lush to try...yet).
The United States of America has elected a "brother" President. Yes, you all heard that right. Many movies have come to fruition and a black man is President.
Introducing Barack Hussein Obama, or Barry as he is sometimes known. He ran a near-perfect race and did me proud. Now there is a new wind blowing. No more can ANYONE say that America can't see the content of a man's character. People picked him out of a very tough field from both the Democratic and Republican parties.
Admittedly, I voted for Hillary, but I have a powerful woman fetish, so it wasn't a knock, just my choice. I did vote for him over "The Old Guy and the Temptress," but when Biden did his little move at the end of the acceptance speech, pointing and crouching to someone in the crowd, I knew we had a serious ticket that did connect with the middle class.
I must have listened to his speech ten times, as I saw none of the head movements and pre-choreographed gestures that are a staple of American national politics. I would actually like to help him, mainly because our race is in trouble. We need to erase the last ten years of thugs and illiterate athletes.
It does seem interestingly coincidental that the aforementioned seem to continually catch cases for real "ghetto" shit. The worst of which came just this week, involving an accidental shooting with an illegal firearm. (I'm trying not to laugh, but because I wrote a Web site for NYC 311 agencies, here's a book Bloomberg can throw at him.)
I for one am going to take full advantage of the Future Barack Made and make sure his daughters do get the same opportunities as ANYONE'S sons. As a matter of fact I have my Film Maker's eye on a very special actress who WILL get to be in my movies. She WILL keep her clothes on while doing it. She hasn't removed them yet so...
All you thugs (read: tattooed CLOWNS) out there trying to make hoochies out of MY beautiful black women, you got a fight coming. Well actually, I've BEEN holding it down like that. I had the number 8, number 22 and number 30 hip hop tracks on SoundClick when I dropped my first album in 2003. I turned out every club from NYC to Seattle, and my first script got me a consider as a writer from a production company that worked with Val Kilmer.
And you may say what does that have to do with our new President? EVERYTHING. There is a glimmer of hope for every young black person out there as they can forget about the GARBAGE underachievers tell them about their prospects in America. They have the GREATEST ROLE MODEL POSSIBLE. I repeat THE GREATEST ROLE MODEL POSSIBLE. The President of the United States.
Sir, I salute you. You've done your country proud.
Sunday, July 02, 2006
Advanced Wire Work with Motor Curve Generation
Hi folks,
It's been a while, and I know I’ve kept the four of you waiting, but hey, the early bird catches the worm and doesn’t always have time to update the blog. Today’s post is something new; as the title states, we’re going to talk about wire work in movies.
I love advanced special FX, and this is a topic I’ve long studied, especially with the advent of the Americanization of Chinese martial arts movies such as “Crouching Tiger” and “Kung Fu Hustle.”
Watching these movies is more a hobby than entertainment for me because I have to figure out what I see. Most of today’s movies use a mix of blue screen and wirework to simulate “additional environmental collisions” and the like….. I digress, back to the topic.
Most movies have trouble with one particular thing in the use of wires for fights and climbing, etc.
There needs to be a motor curve worked out for each individual or an active lookup algorithm based on various physical maneuvers, so that differences in “active feedback” on the cable erase the appearance of “floating.”
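The motor-curve idea above can be sketched as a simple lookup table: sample the performer's desired height over time and convert each interval's required cable speed into a winch-drum angular velocity, so the cable pays in and out exactly with the motion instead of lagging behind it (the lag is what reads as "floating"). The drum radius and sample trajectory here are made-up numbers for illustration:

```python
# Precompute a drum-speed curve from a keyframed vertical trajectory.
def motor_curve(heights, dt, drum_radius):
    """Return drum angular velocity (rad/s) for each interval between samples."""
    omegas = []
    for h0, h1 in zip(heights, heights[1:]):
        cable_speed = (h1 - h0) / dt              # m/s the cable must travel
        omegas.append(cable_speed / drum_radius)  # v = omega * r
    return omegas

# Example: lift a performer 2 m in 1 s with an ease-in/ease-out profile,
# sampled every 1/6 s (all illustrative numbers).
keyframes = [0.0, 0.1, 0.5, 1.0, 1.5, 1.9, 2.0]
curve = motor_curve(keyframes, dt=1 / 6, drum_radius=0.15)
```

In practice the "active lookup algorithm" the text describes would index a family of such curves by maneuver and performer, but the per-interval conversion is the core of it.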
I’m thinking about moving this kind of thing to a website because I hate not having active image references on Blogger – I have to link to another site – maybe one of you three can tell me how to access images with “root HTML.”
I’m working on a new website and I may try to squeeze in some features that allow for image storage.
This technique is slightly harder to implement, though, because it requires a “track” to run the engine on. But with two cranes and two pulleys connected to either end of an I-beam, tracks can be made for “linear” action – meaning that a secondary beam structure would need to be added to allow for perpendicular movement.
But by adding a second “cam” to the X-axis mechanism, it would then be possible to add a third “rotational axis” to allow characters to move farther than contact, where contact is the zero point in relative coordinates.
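The relative-coordinate idea reads like a standard frame transform: treat the contact point as the origin of a relative frame, apply the extra rotation there, and translate back into stage coordinates. A minimal 2D sketch (the function name and example numbers are my own illustration):

```python
import math

def relative_to_world(point, contact, angle):
    """Rotate `point` (x, y) about the contact point by `angle` radians,
    returning stage (world) coordinates."""
    # Shift into the contact-relative frame, rotate, shift back.
    px, py = point[0] - contact[0], point[1] - contact[1]
    c, s = math.cos(angle), math.sin(angle)
    return (contact[0] + c * px - s * py,
            contact[1] + s * px + c * py)
```

With contact as the zero point, any motion past the contact moment is just a growing angle (or offset) in the relative frame, which is what lets the rig carry a character through and beyond the hit.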
I guess studies like these led to advances in “blue screen positioning of CGI objects” but that would make actors into “voices.”
Don’t know if that would be a bad thing.
Anyway, such a device could be scaled to fit small spaces by using an I beam framework instead of relying on cranes for support, though cranes would help with “aerial maneuvers” like those in the last Matrix.
Blue screening using this technique would definitely have helped with the SpiderMan 2 climax on the subway. I was left flat by their mixture of live action and CGI.
I need to get a good 3D app like MicroStation but that’s a bit much. Guess I’ll have to cough up $600 for one of the lower priced ones – too bad I can’t get the student rate anymore.
As I’m writing this I seem to have “visualized” a more compact version that would enable large groups in one scene on a coordinated group of “motor wire assemblies” – hey, that could be patented – anyone got a couple grand burning a hole in their pocket?
The basis of the mechanism is that you have to do tests of “height displacement” using a multi-axis solid bar that registers these differences. Certain movements would use a single bar (only height changes or rotation around the perpendicular axis of connection) and certain movements would use two bars connected for the parallel axis range.
The second mechanism allows for additional points of connection along the body for flight scenes. Since this condition allows – necessitates – more connection points, motors can be made smaller and connected in pairs. With a second level of resistance it would be possible to balance the “shoulder and ankle” connections and move the ankle connection closer to the body and the shoulder connection farther out to enable close quarters between two “flyers,” while enabling the same type of “free movement.”
Anyway, that’s the gist of today’s topic. Hopefully, someone will see it…..
Sunday, March 26, 2006
X64 definitely DOA and beyond......
I have to rant right now at every dev house and driver maker and plug-in manufacturer in existence, not to mention the KING OF ALL FUCK-UPS MacroShaft. I waited a LONNGGG time before I made the jump with my brand new AMD X2 4400+ with 2GB RAM - NUMA on my mind - in Oct2005.
I should have waited until hell froze over. From IE32 crashing every 10 minutes AND TAKING DOWN EXPLORER, to IE64 having no plugins and toolbars, all the way TO MS THEMSELVES NOT HAVING SPECIAL FEATURES ON THEIR $100 Wireless Desktop (at least Logitech did), it was 5 months of a nightmare – like being back in Redmond (shudders horribly). And that would have been enough, except that my dev efforts at home became severely hampered by Virtual Server R2 not working on X64 Pro as advertised, which meant I couldn't test Client\Server SOA. Damnit Damnit Damnit.
Others are feeling the pain also, as the "push to mainstream 64 bit computing" is, how should I say it, faltering even now. I have always said - after I left, of course, since there was no need to talk while an employee - that MS should be paying out grants for small companies to get things done and maybe get a penny or two per client. After all, you can't get more than 90% of the market. MS should be shipping a solid, stable release every year or so with new APIs and faster kernel methods with more abstraction.
This is all over the news, as Vista Home - or LongerHorn as it's been called - has been yet again delayed for the consumer space. As a Windows workhorse in the time through "The Fall," I can say that it comes down to a lack of adequate testing of consumer-space products like XP Home. I have no doubt that a lot of the functionality needs to be worked on to overcome issues that should have been caught much earlier in the cycle. I mean, I can't remember seeing a "Home Network" setup anywhere with XP Home. Even things like System Restore weren't tested with Home - well hey, I tried....
And through the worst of Vista's woes, the guys in charge decide that automation is more important than, and even supersedes, the tried and true methods of Dev\SDET\STE in equal measures dependent on component complexity. What a mess that thing will be in the hands of millions without it. And I don't think this too-little-too-late change will do that much to make up for the man-hours spent on actually fully implementing Aero into Explorer.....duhhhh, sounds familiar. Hopefully the reported level of lockdown will prevent the "I Love You" times of the late 90's and actually run well on the 100s of 1000s of Dell machines that will be sold with it - hey, Dell barely pays for anything, but maybe they should start demanding REAL GRAPHICS from Intel.
As I have totally digressed from my rant about X64, I can say that I am typing this with XP SP2, wondering if I can get my $150 back for MS' attempt at moving to 64 bit. The good news is that my FireFox problems have been solved also, because I switched back in time for IE 7 Beta 2, which is pretty good at not keeping 100s of MBs of RAM while I cruise through "tab threads." Of course, that's because IE 7 doesn't have an X64 beta. I'm even able to run other services in the same amount of space as the redundant 32\64 services on X64 with WoW - my systray has 15 items.
Did I say - in my best immigrant voice - Son of the bitch.
Sunday, March 19, 2006
Data Modeling In Windows
Howdy boys and girls, it's time for some code stuff now. This is an article - without the graphics - that I posted on MSD2D.COM a few months ago. Enjoy.
.Net Framework Series 2.0
Data Modeling in Windows
By Christian Howell
Data modeling: the words bring chills to some and visions of 14-hour days to others, but with the tools available today, such as UML (Unified Modeling Language), XML (eXtensible Markup Language) and CIM (Common Information Model), modeling your data becomes a matter of taking the customer requirements and matching class structures and interfaces to the data types that are necessary. Unlike the old days, where a brute-force model would work, today’s software needs a more structured approach. With the world a few processor generations from “the Gang of Four” and Managed .Net as the “Center of the Windows Universe,” abstracted components are the new keyword for flexible, extensible and secure code.
Properly modeling a consistent UI\Program flow is in and of itself an “evolutionary process,” so the term data modeling even means different things to different people. This makes standardizing modeling methodologies even more difficult. There is also the difference between modeling an existing feature set in a new way and modeling a feature set designed from scratch. In this text the term means “developing features such that they can be accessed from multiple sources, from the native UI to collaborative services to testing harnesses.” The old paradigm was to gather what needed to be done and just write functions that did it, perhaps handing off UI duties to someone else. Today, with feature sets and customer requirements for collaboration and interoperability growing exponentially, an object-oriented approach is needed not only to limit the amount of code necessary to implement feature sets and make them accessible between app domains, but to make the UI easy to use and update.
.Net was designed with these issues in mind and does an excellent job of abstracting data objects and unifying Windows programs under a memory-managed platform, especially in the 2.0 version of the Framework. Of course it can’t handle every case without extension, so some effort is needed to break complex objects such as Network Streams into much smaller sub-objects. Such a model might include: port, machine name\IP, permissions, headers, and data streams. Rather than trying to determine all of the ways you can use a Network Stream, you can create XML Schema-based scripts to combine the different sub-objects into platform- or application-specific descriptions. For example, only one feature needs to access the port and machine name, while another can process the permissions. Another feature would then decode the headers and stream to determine further processing requirements. Yet another feature could be used for encryption of returning data streams to add an extra layer of protection. Depending upon the data, security and speed\concurrency needs, any one of the accepted patterns, such as State or Strategy, can be used to extend the initial program flow. .Net provides native encryption and compression algorithms for text streams and binary streams through the BinaryFormatter, so custom strategies are rarely needed for those services.
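As a rough illustration of that decomposition (sketched in Python rather than C# for brevity, with invented class and field names), each sub-object gets its own small type, and separate features handle permissions and decoding rather than one monolithic stream object:

```python
from dataclasses import dataclass, field

@dataclass
class Endpoint:
    machine: str
    port: int

@dataclass
class Permissions:
    allowed_users: set = field(default_factory=set)

    def check(self, user):
        return user in self.allowed_users

@dataclass
class Request:
    """The 'network stream' broken into sub-objects: endpoint, permissions,
    headers, payload."""
    endpoint: Endpoint
    permissions: Permissions
    headers: dict
    payload: bytes

def decode(request, user):
    """One feature validates permissions; another decodes headers/payload."""
    if not request.permissions.check(user):
        raise PermissionError(f"{user} may not access {request.endpoint.machine}")
    kind = request.headers.get("type", "binary")
    return kind, request.payload
```

The point of the split is exactly what the paragraph describes: a feature that only needs the endpoint never touches permissions, and the payload decoder never sees either.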
In this model the 3rd party external client (EC) can then be any module on any machine in a domain or even on the Internet. The server is contacted by the client with a port number and machine name. Since Managed code enables programmers to use declarative security and Windows authentication much of the security overhead can be encapsulated in the calling thread of the client app domain. This abstraction also means that the client has to have access to the pre-compiled server code. The client can be extended to contain the interface for any 3rd party clients that need to have access to the server code. The server code is the middle tier of the abstraction and is needed by any 3rd party client. This also allows for a client\server interface between the entire Network Stream object as described above and any 3rd party tools. As long as the public portion of the request\response feature of the client remains consistent it is possible for the 3rd party client to extend to do more internal processing without requiring new feature requests.
Because of the encapsulation in the server feature, permissions need to be correctly applied to the object space before the connection is even attempted. This type of abstraction also allows each property of the sub-objects to be independent of the others, so you need only create one Permission object, one Validation object, and one Connection object for the application space. Since the data stream can be any type of .Net stream, this model allows the developer to use one object (class) for most data types, since they can be copied to a stream with customizable headers. By creating a header template lookup, several different complex object types can be returned and decoded by the internal client response feature. The 3rd party client is then totally separated from the internal logic of the model. Only the public features in the internal client are exposed, and since the data types are known by the 3rd party developer, the objects can be extended for application- or platform-specific needs. This is especially useful for the ever-increasing number of Internet applications. By combining header templates with overloaded method types, all types of database info can be encapsulated between the server and connection spaces while also allowing for local\remote file access between app domains and physical networks.
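A header template lookup of the kind described might look like the following sketch: the response feature picks a decoder from a table keyed by a header field, so a new object type only requires registering a new template. The template keys and decoders here are illustrative assumptions, not anything from the Framework:

```python
import json

# Template table: header value -> decoder for the returned stream.
TEMPLATES = {
    "json": lambda body: json.loads(body.decode("utf-8")),
    "text": lambda body: body.decode("utf-8"),
    "binary": lambda body: body,  # passthrough for raw streams
}

def decode_response(headers, body):
    """Decode a returned stream by looking up its header template."""
    kind = headers.get("content-type", "binary")
    try:
        decoder = TEMPLATES[kind]
    except KeyError as exc:
        raise ValueError(f"no template registered for {kind!r}") from exc
    return decoder(body)
```

The internal client response feature stays closed while the set of decodable types stays open, which is the separation from 3rd-party logic the paragraph is after.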
With the .Net paradigm, each of the sub-objects in the model becomes a class under the same namespace (DataAccess). Encapsulation means that each of these classes can contain smaller objects which handle a part of the processing. This layered (n-tiered) approach means that different clients can access different parts of the model without having access to any other. That is the function of the request feature within the client (DataAccess.Client) space. It sits between the server (DataAccess.Server) and the 3rd party interface. By merely providing multiple overloads for request types, it is possible to control access to any data stream through any connection. The request feature also works in conjunction with the response feature to encode and decode as necessary while verifying thread identities for large numbers of concurrent users.
The server feature formats and forwards requests to the connection feature after validating input parameters from the request feature. This allows that the client request feature has no access to permissions, meaning that the server is isolated from any 3rd party requests. Since all of the methods in the server feature are internal, all calls to the connection feature must be routed first through the public request feature and then be approved by the server. By using a strong name object for each request, high levels of concurrency can be achieved while maintaining data integrity for each request. The server uses a queue to manage requests and responses. This queue contains request-specific information for thread coordination with the client request\response feature.
The connection feature is the final segment of code and is responsible for processing the headers in the request and retrieving the data stream from storage or creating a new entry. The data can then be encrypted for return. By defining your requests with text scripts it is then possible to have requests come from multiple sources including the Internet for easy transfer. In this feature the emphasis is placed on speed rather than security. This allows optimization of this module without affecting the security of the response feature. This feature handles any external storage interfaces such as SQL databases or XML files by simply overloading access methods based on the header processing. This model can be easily extended or adapted to handle different types of application models. This feature is the most complicated since it has to be coordinated with the design of the storage medium. In the case of databases the developer needs to work well with queries and stored procedures while a file system access application needs to handle NTFS well and some apps need to deal with both while handling transaction concurrency.
The key to this type of model is that “most” usable patterns have already been discovered and can be extended as necessary. Most of these patterns are based on the common File, View Edit, Tools, Options, Help environment (The standard Windows Menu\UI paradigm). Of course, it is never a good idea to try and write initially to a pattern, since the differences in application features and requirements mean that in one case a State pattern may be more efficient than a Strategy or Factory pattern for two apps that perform similar functions. When modeling data for consumption and display the key is to remember that any data can be described using a combination of native .Net types and that the description of the data is ALWAYS more important than the features that use it. In many cases personal or financial information is consumed and must be protected by the interface. By ensuring first and foremost that the data remains consistent throughout the process refactoring will then be useful for optimization. The feature set will then expand as testing of current features continues. This is known as an evolutionary design cycle. It means in essence that you should always keep your code simple and always design your features with testing in mind. Some people consider this method to be “designing to the interface and not the implementation.” Another way of saying this is the user doesn’t need to know the details only the data. For any object space overloading the public entry points enables different types and amounts of data to be processed by the same internal server. By keeping with the abstracted component methodology, you will avoid creating complex methods that don’t allow for high levels of granularity with your object space.
Tools such as NUnit (www.nunit.org) give developers a way to test their features individually or as a live client. Script languages based on XML schemas or UML are much more efficient because they have no code overhead. The same parser that is used for the client scripts can also be extended to include test parameters and environment settings. With tools such as NUnit you need to adhere to the format set aside which sometimes causes increases in the amount of code necessary to determine success of a given test case. It is of course possible to plan for using these types of tools through a script\parser interface but again the idea of modeling is to limit the amount of code you have to write and maintain. Component-based scripting does this and more. It enables “cut and paste” editing, ease of storage, no need to recompile to add new requests. Of course adding features for processing the data in scripts requires new code and schema elements but this type of model means that new features are separated from existing features and lessen the chance of regression failures. This abstraction also enables you to make tools based on a subset of features; such as, setting up initial environments, creating database tables, create web pages using XML\XSLT, or viewing XML documents. All that is needed is a custom client space. Below are listed the basic data objects necessary in each object space of the model. These are determined by either writing out a paragraph or two describing the necessary functionality or the data that needs to be exchanged. The .Net Library 2.0 contains advances in C# such as anonymous methods, which allow “inline” delegates; iterators, which add the “yield return” and “yield break” methods to reduce amount of code necessary for base collections; nullable types, which allows value types to assign null to the type instance; generics, which allow templated base classes for collections of any .Net object type. 
Look for coverage of these new features, coming soon.
Client Data Objects
HttpWebRequest
HttpWebResponse
WebRequest
WebResponse
XmlDocIn
XmlDocOut
EncryptionKey
EncryptAlgorithm
RequestQueue
MemoryStream
SecurityPrincipal
ThreadPrincipal
RequestType – complex
Server Data Objects
PortNumber
IPAddress
IOPermissions - ACL
WebPermissions - SSL
XMLParser
HeaderLookup
ValidationRegularExpressions
WebService
Connection Data Objects
HeaderBlock
AccessPermissions – ACL\Thread
EncryptionAgorithm
EncryptionKey
NetworkStream
FileStream
XMLFactory
.Net Framework Series 2.0
Data Modeling in Windows
By Christian Howell
Data modeling: the words bring chills to some and visions of 14-hour days to others. But with the tools available today, such as UML (Unified Modeling Language), XML (eXtensible Markup Language), and CIM (Common Information Model), modeling your data becomes a matter of taking the customer requirements and matching class structures and interfaces to the necessary data types. Unlike the old days, when a brute-force model would work, today's software needs a more structured approach. With the world a few processor generations removed from "the gang of four" and managed .Net as the "Center of the Windows Universe," abstracted components are the new watchword for flexible, extensible, and secure code.
Properly modeling a consistent UI\program flow is in and of itself an "evolutionary process," so the term data modeling means different things to different people, which makes standardizing modeling methodologies even more difficult. There is also a difference between remodeling an existing feature set and modeling a feature set designed from scratch. In this text the term means "developing features such that they can be accessed from multiple sources, from the native UI to collaborative services to testing harnesses." The old paradigm was to gather what needed to be done, write functions that did it, and perhaps hand off UI duties to someone else. Today, with feature sets and customer requirements for collaboration and interoperability growing exponentially, an object-oriented approach is needed not only to limit the amount of code necessary to implement feature sets and make them accessible between app domains, but also to make the UI easy to use and update.
.Net was designed with these issues in mind and does an excellent job of abstracting data objects and unifying Windows programs under a memory-managed platform, especially in the 2.0 version of the Framework. Of course, it can't handle every case without extension, so an effort is needed to break complex objects such as network streams into much smaller sub-objects. Such a model might include the port, machine name\IP, permissions, headers, and data streams; rather than trying to determine all of the ways you can use a NetworkStream, you can create XML Schema-based scripts to combine the different sub-objects into platform- or application-specific descriptions. For example, only one feature needs to access the port and machine name, while another can process the permissions. Another feature would then decode the headers and stream to determine further processing requirements, and yet another could encrypt returning data streams to add an extra layer of protection. Depending on the data, security, and speed\concurrency needs, any of the accepted patterns, such as State or Strategy, can be used to extend the initial program flow. .Net provides native encryption and compression algorithms for text and binary streams through the BinaryFormatter, so custom strategies are rarely needed for those services.
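The decomposition just described can be sketched in C#. All of the type and member names here (ConnectionInfo, HeaderBlock, and so on) are illustrative assumptions, not Framework types:

```csharp
// A minimal sketch of breaking a "network stream" usage into sub-objects,
// each owning one concern. Names are assumptions for illustration.
using System.IO;
using System.Net;

namespace DataAccess
{
    public class ConnectionInfo
    {
        public int Port;                 // accessed only by the connect feature
        public IPAddress Address;        // machine name\IP resolved up front
    }

    public class AccessPermissions      // processed by a separate feature
    {
        public bool CanRead;
        public bool CanWrite;
    }

    public class HeaderBlock            // decoded to route further processing
    {
        public string TemplateName;
        public int PayloadLength;
    }

    public class DataRequest            // the composed "complex object"
    {
        public ConnectionInfo Connection;
        public AccessPermissions Permissions;
        public HeaderBlock Header;
        public Stream Payload;          // any .Net stream works here
    }
}
```

Because each sub-object is independent, a script or feature can touch only the piece it needs, which is the point of the abstraction.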
In this model the 3rd-party external client (EC) can be any module on any machine in a domain, or even on the Internet. The client contacts the server with a port number and machine name. Since managed code enables programmers to use declarative security and Windows authentication, much of the security overhead can be encapsulated in the calling thread of the client app domain. This abstraction also means that the client must have access to the pre-compiled server code. The client can be extended to contain the interface for any 3rd-party clients that need access to the server code; the server code is the middle tier of the abstraction and is needed by any 3rd-party client. This also allows for a client\server interface between the entire NetworkStream object described above and any 3rd-party tools. As long as the public portion of the client's request\response feature remains consistent, the 3rd-party client can be extended to do more internal processing without requiring new feature requests.
Because of the encapsulation in the server feature, permissions need to be correctly applied to the object space before the connection is even attempted. This type of abstraction also allows each property of the sub-objects to be independent of the others, so you need create only one Permission object, one Validation object, and one Connection object for the application space. Since the data stream can be any type of .Net stream, this model allows the developer to use one class for most data types, since they can be copied to a stream with customizable headers. By creating a header template lookup, several different complex object types can be returned and decoded by the internal client response feature. The 3rd-party client is then totally separated from the internal logic of the model. Only the public features of the internal client are exposed, and since the data types are known to the 3rd-party developer, the objects can be extended for application- or platform-specific needs. This is especially useful for the ever-increasing number of Internet applications. By combining header templates with overloaded method types, all types of database info can be encapsulated between the server and connection spaces while also allowing for local\remote file access between app domains and physical networks.
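One way the header template lookup might look, as a hedged sketch; the delegate and class names are assumptions rather than part of any actual library:

```csharp
// The client response feature maps a template name found in the header
// to a decoder delegate, so several complex object types can come back
// over the same stream. All names here are illustrative.
using System;
using System.Collections.Generic;
using System.IO;

namespace DataAccess.Client
{
    public delegate object HeaderDecoder(Stream payload);

    public class HeaderTemplateLookup
    {
        private readonly Dictionary<string, HeaderDecoder> decoders =
            new Dictionary<string, HeaderDecoder>();

        public void Register(string templateName, HeaderDecoder decoder)
        {
            decoders[templateName] = decoder;
        }

        // Decode an incoming stream using the template named in its header.
        public object Decode(string templateName, Stream payload)
        {
            HeaderDecoder decoder;
            if (!decoders.TryGetValue(templateName, out decoder))
                throw new ArgumentException("Unknown header template: " + templateName);
            return decoder(payload);
        }
    }
}
```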
With the .Net paradigm, each of the sub-objects in the model becomes a class under the same namespace (DataAccess). Encapsulation means that each of these classes can contain smaller objects which handle part of the processing. This layered (n-tiered) approach means that different clients can access different parts of the model without having access to any other. That is the function of the request feature within the client (DataAccess.Client) space: it sits between the server (DataAccess.Server) and the 3rd-party interface. Merely by providing multiple overloads for request types, it is possible to control access to any data stream through any connection. The request feature also works in conjunction with the response feature to encode and decode as necessary while verifying thread identities for large numbers of concurrent users.
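A minimal sketch of the namespace layering, assuming a single assembly where `internal` hides the server from 3rd-party callers (all member names are illustrative):

```csharp
// Public overloaded entry points in the client space; internal methods
// in the server space. A 3rd-party assembly can call only Request.Send.
using System.IO;

namespace DataAccess.Server
{
    // Internal: invisible to 3rd-party assemblies referencing this one.
    internal static class RequestBroker
    {
        internal static Stream Forward(string validatedRequest)
        {
            // ...validate, queue, and hand off to the connection feature...
            return new MemoryStream();
        }
    }
}

namespace DataAccess.Client
{
    using DataAccess.Server;

    // Public: the only surface a 3rd-party client sees.
    public static class Request
    {
        public static Stream Send(string query)
        {
            return RequestBroker.Forward(query);
        }

        // Overloads control which request shapes are even expressible.
        public static Stream Send(string query, byte[] payload)
        {
            return RequestBroker.Forward(query);
        }
    }
}
```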
The server feature formats and forwards requests to the connection feature after validating the input parameters from the request feature. This means that the client request feature has no access to permissions and that the server is isolated from any 3rd-party requests. Since all of the methods in the server feature are internal, all calls to the connection feature must be routed first through the public request feature and then approved by the server. By using a strongly named object for each request, high levels of concurrency can be achieved while maintaining data integrity for each request. The server uses a queue to manage requests and responses; this queue contains request-specific information for thread coordination with the client request\response feature.
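The server-side queue might be sketched as follows, assuming .Net 2.0 idioms (`Queue<T>` plus a `lock`); the `PendingRequest` shape is an assumption:

```csharp
// A minimal thread-safe request queue carrying the request-specific
// information used to coordinate with the client request\response feature.
using System;
using System.Collections.Generic;

namespace DataAccess.Server
{
    internal class PendingRequest
    {
        public readonly Guid RequestId = Guid.NewGuid(); // identifies the request
        public readonly int ClientThreadId;              // for thread coordination
        public string Header;

        public PendingRequest(int clientThreadId, string header)
        {
            ClientThreadId = clientThreadId;
            Header = header;
        }
    }

    internal class RequestQueue
    {
        private readonly Queue<PendingRequest> queue = new Queue<PendingRequest>();
        private readonly object gate = new object();

        internal void Enqueue(PendingRequest request)
        {
            lock (gate) { queue.Enqueue(request); }
        }

        internal PendingRequest Dequeue()
        {
            lock (gate)
            {
                return queue.Count > 0 ? queue.Dequeue() : null;
            }
        }
    }
}
```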
The connection feature is the final segment of code and is responsible for processing the headers in the request and either retrieving the data stream from storage or creating a new entry; the data can then be encrypted for return. By defining your requests with text scripts, requests can come from multiple sources, including the Internet, for easy transfer. In this feature the emphasis is placed on speed rather than security, which allows the module to be optimized without affecting the security of the response feature. The connection feature handles any external storage interface, such as SQL databases or XML files, simply by overloading access methods based on the header processing, and the model can easily be extended or adapted to different types of applications. This feature is also the most complicated, since it has to be coordinated with the design of the storage medium: for databases the developer needs to be comfortable with queries and stored procedures, a file-system application needs to handle NTFS well, and some apps must deal with both while also handling transaction concurrency.
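Header-based dispatch in the connection feature could look roughly like this; the `sql:`\`file:` prefixes and the `Storage` class are illustrative assumptions, with the actual database access stubbed out:

```csharp
// One entry point routes to a database or the file system depending on
// the processed header, matching the "overloaded access methods" idea.
using System;
using System.IO;

namespace DataAccess.Connection
{
    internal static class Storage
    {
        // Chosen by header processing: one path per storage interface.
        internal static Stream Open(string header)
        {
            if (header.StartsWith("sql:"))
                return OpenDatabase(header.Substring(4));   // query or sproc name
            if (header.StartsWith("file:"))
                return OpenFile(header.Substring(5));       // NTFS path
            throw new NotSupportedException("Unknown storage header: " + header);
        }

        private static Stream OpenDatabase(string query)
        {
            // ...execute the query\stored procedure, copy results to a stream...
            return new MemoryStream();
        }

        private static Stream OpenFile(string path)
        {
            // Retrieve the stream from storage or create a new entry.
            return new FileStream(path, FileMode.OpenOrCreate, FileAccess.ReadWrite);
        }
    }
}
```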
The key to this type of model is that “most” usable patterns have already been discovered and can be extended as necessary. Most of these patterns are based on the common File, View, Edit, Tools, Options, Help environment (the standard Windows menu\UI paradigm). Of course, it is never a good idea to write initially to a pattern, since differences in application features and requirements mean that a State pattern may be more efficient than a Strategy or Factory pattern for one of two apps that perform similar functions. When modeling data for consumption and display, the key is to remember that any data can be described using a combination of native .Net types, and that the description of the data is ALWAYS more important than the features that use it. In many cases personal or financial information is consumed and must be protected by the interface. By ensuring first and foremost that the data remains consistent throughout the process, refactoring then becomes useful for optimization, and the feature set expands as testing of current features continues. This is known as an evolutionary design cycle. It means, in essence, that you should always keep your code simple and always design your features with testing in mind. Some people describe this method as “designing to the interface and not the implementation”; another way of saying it is that the user doesn't need to know the details, only the data. For any object space, overloading the public entry points enables different types and amounts of data to be processed by the same internal server. By keeping with the abstracted-component methodology, you will avoid creating complex methods that don't allow for high levels of granularity within your object space.
Tools such as NUnit (www.nunit.org) give developers a way to test their features individually or as a live client. Script languages based on XML schemas or UML are much more efficient because they have no code overhead: the same parser that is used for the client scripts can also be extended to include test parameters and environment settings. With tools such as NUnit you need to adhere to the prescribed format, which sometimes increases the amount of code necessary to determine the success of a given test case. It is of course possible to plan for these tools through a script\parser interface, but again, the idea of modeling is to limit the amount of code you have to write and maintain. Component-based scripting does this and more: it enables “cut and paste” editing and easy storage, with no need to recompile to add new requests. Of course, adding features for processing the data in scripts requires new code and schema elements, but in this type of model new features are separated from existing ones, which lessens the chance of regression failures. This abstraction also enables you to build tools based on a subset of features, such as setting up initial environments, creating database tables, creating web pages using XML\XSLT, or viewing XML documents; all that is needed is a custom client space. Below are listed the basic data objects necessary in each object space of the model. These are determined by writing out a paragraph or two describing the necessary functionality or the data that needs to be exchanged. The .Net Library 2.0 contains advances in C# such as anonymous methods, which allow “inline” delegates; iterators, which add the “yield return” and “yield break” statements to reduce the amount of code necessary for base collections; nullable types, which allow value types to be assigned null; and generics, which allow templated base classes for collections of any .Net object type.
Look for coverage of these new features, coming soon.
Client Data Objects
HttpWebRequest
HttpWebResponse
WebRequest
WebResponse
XmlDocIn
XmlDocOut
EncryptionKey
EncryptAlgorithm
RequestQueue
MemoryStream
SecurityPrincipal
ThreadPrincipal
RequestType – complex
Server Data Objects
PortNumber
IPAddress
IOPermissions - ACL
WebPermissions - SSL
XMLParser
HeaderLookup
ValidationRegularExpressions
WebService
Connection Data Objects
HeaderBlock
AccessPermissions – ACL\Thread
EncryptionAlgorithm
EncryptionKey
NetworkStream
FileStream
XMLFactory
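The C# 2.0 language additions mentioned earlier (generics, iterators, anonymous methods, and nullable types) can be sketched in a few lines:

```csharp
// A short sketch of the four C# 2.0 features described above.
using System;
using System.Collections.Generic;

public static class CSharp2Features
{
    // Iterator: "yield return" builds an IEnumerable<int> without a
    // hand-written enumerator class; "yield break" ends the sequence.
    public static IEnumerable<int> Evens(int max)
    {
        for (int i = 0; i <= max; i += 2)
        {
            if (i > 100) yield break;
            yield return i;
        }
    }

    public static void Main()
    {
        // Generics: a strongly typed collection, no casting needed.
        List<int> evens = new List<int>(Evens(8));

        // Anonymous method: an "inline" delegate.
        evens.ForEach(delegate(int n) { Console.WriteLine(n); });

        // Nullable type: a value type that can be assigned null.
        int? maybe = null;
        Console.WriteLine(maybe.HasValue);  // prints False
    }
}
```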