Thursday, April 19, 2007

A legitimate alternative to passwords?

Tech Sanity Check has a commentary about yet another multi-factor authentication scheme.

Vidoop has an interesting scheme for multi-factor authentication. The interesting twist in this case is that the scheme has the potential to be based more on how you think than on a fact you know. You might select a sequence of pictures containing boats, airplanes, and cars; the theme could be transportation, the colour blue, or aluminium.

They aren't there yet - it seems their current scheme relies on you thinking the way they do. But it does strike me as an improvement over static images.

Sunday, April 08, 2007

Shared Items

If this works out right, you should now see a shared items feed near the top of this page. What are shared items? Well, many sites produce a summary of their content in a form that can easily be syndicated - that is, consumed by other sites or applications. This is often referred to as a 'feed' or an 'RSS feed'. RSS is one particular format used to implement this feature; Atom is another, although the two often get lumped together and called RSS.

So what can you do with it? There are many ways to consume these feeds. Internet Explorer 7 has such a feature built in. So does my.yahoo. My favourite mechanism is Google Reader.
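For the curious, consuming a feed programmatically is not much work either. Here is a minimal sketch in Java, assuming the open source ROME library (the com.sun.syndication packages) is on the classpath; ROME works out for itself whether the feed is RSS or Atom:

    import java.net.URL;
    import com.sun.syndication.feed.synd.SyndEntry;
    import com.sun.syndication.feed.synd.SyndFeed;
    import com.sun.syndication.io.SyndFeedInput;
    import com.sun.syndication.io.XmlReader;

    public class FeedPeek {
        public static void main(String[] args) throws Exception {
            // Fetch and parse the feed named on the command line.
            URL feedUrl = new URL(args[0]);
            SyndFeed feed = new SyndFeedInput().build(new XmlReader(feedUrl));
            System.out.println(feed.getTitle());
            // List the entries (older ROME versions return a raw List).
            for (Object o : feed.getEntries()) {
                SyndEntry entry = (SyndEntry) o;
                System.out.println(" - " + entry.getTitle() + ": " + entry.getLink());
            }
        }
    }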

If you know somebody who reads a lot of internet content, you may wish you could 'read over their shoulder' every time they say 'Hmm. That's interesting!' Well, that is exactly what Google Reader shared items let us do. If I find something of interest, I can mark it 'shared', and it will then show up on my shared items feed. If you'd rather just look at a web page of the same items, you can do that instead.

There is one thing I would like to be able to do that 'shared items' doesn't allow: make a comment on an item I share. Then again, maybe we don't need that. That's what this blog is for!

Have a great day.

Wednesday, February 07, 2007

Picking the Open Source Winners

This is a big topic for me at the moment. I am very interested in the OpenBRR and Navica scoring methods and the work undertaken by Ohloh and Optaros.

Wednesday, January 31, 2007

Authentication, Authorization, and Context

I was reading James McGovern's blog today and it reminded me of a conversation I had at work yesterday. James' focus is on vendor products - something that is certainly of interest to us, but beyond that we also have to deal with our internal applications.

There are three issues that always pop up when we try to integrate a new software product:
1. How do we authenticate?
2. How do we authorize?
3. What was the user trying to do in the first place?

We are starting to get a good handle on #1 - but I would like to see us authenticate at fewer points and establish a trust network between applications. SPNEGO, SAML, WS-Federation, Liberty, and perhaps OpenID are all promising.

James' blog speaks to the second part - authorization - the next big frontier. What roles can the principal hold (for this application)? The "for this application" part of that sentence is always controversial. This is where XACML fits in.
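To make the idea a little more concrete, here is a rough sketch of what externalized authorization looks like from the application's point of view. The interfaces are invented for illustration - they mirror the XACML subject/resource/action request model, but they are not the API of any particular XACML implementation:

    // Hypothetical types, illustrating the XACML-style request/decision model.
    enum Decision { PERMIT, DENY, NOT_APPLICABLE, INDETERMINATE }

    interface PolicyDecisionPoint {
        // May this subject perform this action on this resource?
        Decision evaluate(String subject, String resource, String action);
    }

    class LoanService {
        private final PolicyDecisionPoint pdp;

        LoanService(PolicyDecisionPoint pdp) { this.pdp = pdp; }

        void approveLoan(String userId, String loanId) {
            // The application asks the decision point instead of hard-coding role checks.
            if (pdp.evaluate(userId, "loan:" + loanId, "approve") != Decision.PERMIT) {
                throw new SecurityException("Not authorized to approve " + loanId);
            }
            // ... proceed with the approval ...
        }
    }

The appeal is that the "for this application" question moves into policy that can be managed outside the code.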

The third piece - what were we doing - is a tongue-in-cheek reminder that the user was actually trying to do a job before security "got in the way". The user likely had some kind of established context that should, ideally, be available to the next application. Although not always the case, it is a frequent requirement. For example, the user may have been working with a customer in the CRM application, and now needs to work on the customer's loan in the credit management application. This customer and loan context needs to be carried forward. There are no good solutions for this that I know of. This is the undiscovered country.
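Just to illustrate the requirement (not a solution), here is the kind of baggage that would have to travel with the user from the CRM application to the credit management application. The class and field names are purely hypothetical:

    // Hypothetical sketch of context that should follow the user between applications.
    public class WorkContext implements java.io.Serializable {
        private final String customerId;  // the customer open in the CRM screen
        private final String loanId;      // the loan the user is about to work on

        public WorkContext(String customerId, String loanId) {
            this.customerId = customerId;
            this.loanId = loanId;
        }

        public String getCustomerId() { return customerId; }
        public String getLoanId()     { return loanId; }
    }

The hard part, of course, is not defining such an object - it is agreeing on how to carry it securely across application boundaries.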

Any thoughts out there?

Friday, January 19, 2007

Will we need a Fortress for our cores?

Two news themes over the last several months strike me as mutually relevant.

First there is the information coming from Intel about future core densities. The Gulftown processor will likely make its debut in 2010 and will contain 32 cores. Intel Research is also working on an 80-core research prototype. As the InformationWeek article discusses, software will need to change to accommodate.

Speaking of software, Java is a significant development language for the business world. What will business applications do with an 80-core engine? Multi-threaded applications - wherein the threads are explicit to the programmer - are not the way to go. To the typical programmer I say, "You can't handle the threads".
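To illustrate the point, here is a sketch of what even a trivial parallel sum looks like in Java 5 when the threading is explicit - the programmer has to size the pool, partition the work, collect the futures, and remember to shut everything down:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class ParallelSum {
        public static void main(String[] args) throws Exception {
            final int[] data = new int[1000000];  // stand-in for real work
            int cores = Runtime.getRuntime().availableProcessors();
            ExecutorService pool = Executors.newFixedThreadPool(cores);
            List<Future<Long>> partials = new ArrayList<Future<Long>>();
            int chunk = data.length / cores;
            for (int c = 0; c < cores; c++) {
                final int from = c * chunk;
                final int to = (c == cores - 1) ? data.length : from + chunk;
                // Each task sums one slice of the array.
                partials.add(pool.submit(new Callable<Long>() {
                    public Long call() {
                        long sum = 0;
                        for (int i = from; i < to; i++) sum += data[i];
                        return sum;
                    }
                }));
            }
            long total = 0;
            for (Future<Long> f : partials) total += f.get();
            pool.shutdown();
            System.out.println(total);
        }
    }

That is a lot of ceremony for a loop, and it only gets worse once the tasks share mutable state.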

So what is the second news theme? Fortress - a research effort coming out of Sun Research. It is targeted at high-performance computing as a replacement for Fortran. Am I suggesting that we rewrite all of our applications in Fortress? No. But Sun is implementing Fortress on the Java Virtual Machine, so there is a good possibility that future Java language features will be able to deliver the same multi-threaded behaviour for which Fortress is striving.

Thursday, January 18, 2007

Bank loses data

Bloomberg.com: Canada: "CIBC's Talvest Mutual Fund Loses Client Data Files"! This cannot happen many more times before companies are forced to put the technology in place to ensure that data leaving their premises is encrypted. That won't be cheap, but it seems that the cost is inevitable.

Historical revisionism

Historical revisionism - what does this have to do with technology? Well, last month I was trying to figure out how to make Sametime display my picture. I found a very informative article about exactly that process. I thank the author for publishing the information.

However I was shocked by how difficult the process is. Perhaps suitable for the corporate world, but still awfully painful - especially when you compare that process to the drag-and-drop simplicity of MSN Messenger.

I made a civil comment on the topic - it seemed that the author and his readers may very well be influential in improving the process. I thought I was being helpful. I kept checking back to see if anybody had responded to my observation. Sure enough - today I discovered that my comment had been deleted!

Now this is not a great travesty - certainly not in the same league as what the Wikipedia article covers. And it is not the first time such a thing has been reported in the online world; New York Times articles and White House pages have suffered the same fate. But it is the first time it has happened to me.

But this is the internet, and this blog is the solution. Now you all know that changing your display picture in Sametime is very difficult - and I won't delete your comment, unless it has something to do with Russian brides ;-)

Monday, March 20, 2006

The future of the ESB

Loek Bakker's weblog: Understanding the future of the ESB: "It's so simple: ultimately, when all the parts have the ability to work together (i.e.: they all can communicate through messages, which will be the bus), the role of the ESB will be like the role of DNS for the Internet: addressing and routing. Nothing more, nothing less."

I suggest that he overstates the role of DNS. It doesn't really do routing. That is the job of routers and switches.

I generally agree that we want the end-points to do the processing, so that the actual communication flow is point-to-point. But that point-to-point connectivity should not be exposed to the application. I want to be able to modify application behaviours and policies as if everything went through a central hub, but I don't want to actually make a network hop and who-knows-how-many context switches in order to perform routings and transformations.

Using DNS as an example, browsers look up (resolve) domain names only occasionally, and then send their traffic straight to the end-point. In some cases the end-point may do some virtualization and workload management at the far end, but all of that is of no consequence to the requestor. This is the way applications should work with an ESB.
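Here is a rough sketch of what that looks like from the application side - resolve a logical name once (or whenever a cached entry goes stale), then talk to the endpoint directly. The registry and service interfaces are invented for illustration; they are not any particular ESB product's API:

    // Hypothetical sketch of the resolve-then-invoke pattern.
    interface QuoteService {
        double quote(String symbol);
    }

    interface ServiceRegistry {
        // Like a DNS resolver: turn a logical name into a concrete endpoint proxy.
        QuoteService lookup(String logicalName);
    }

    class PricingClient {
        private final QuoteService quotes;

        PricingClient(ServiceRegistry registry) {
            // Resolve occasionally (e.g. at startup, or when the cached entry expires)...
            this.quotes = registry.lookup("quote-service");
        }

        double price(String symbol) {
            // ...then every call goes straight to the endpoint; no hub in the message path.
            return quotes.quote(symbol);
        }
    }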

Tuesday, February 21, 2006

Tim Bray On PHP

"Spaghetti SQL wrapped in spaghetti PHP wrapped in spaghetti HTML" Ouch. Tim certainly has an opinion here. I question where this has anything to do with PHP. One could certainly create the same kind of pasta using JSP and is the way many ASP sites look inside. I suggest the difference is not the language, but the focus of the developer.

Friday, February 17, 2006

Dan Bricklin's wikiCalc

Dan was the original author of VisiCalc. He is at it again with wikiCalc, an online spreadsheet. I like the idea; the alpha code is somewhat functional but needs polish. Great work, Dan.
Software Garden Products: wikiCalc Program: "The wikiCalc program is a web authoring tool for pages that include data that is more than just unformatted prose. It combines some of the ease of authoring and multi-person editing of a wiki with the familiar visual formatting and data organizing metaphor of a spreadsheet. "

Thursday, June 02, 2005

Designing Software for Your Market

One thing that bothers me is the way that many IT shops go about designing infrastructure or shared services. The scenario I often see is that rather than defining who/what/where the market is and providing a solution that addresses that need, the approach is more often "How can I convince those people to use what I have or want to build?"

To me the difference is the lack of a product development team (in the marketing sense). Once that is in place, it must work in conjunction with engineering.

It seems that even Microsoft sees the need to improve this model within their own organization. In the course of developing the next release of Exchange, the two departments wrote a book together. Bink.nu: "It encompasses everything from the market outlook to the perceived value of possible features to potential customers. At its essence, the book serves as a contract between marketing and engineering, describing what goes in and what stays out of the software."

This seems like an approach that more organizations should model.

Wednesday, March 09, 2005

Can Google help Firefox?

Steven J. Vaughan-Nichols, in his article Firefox Is Heading Towards Trouble, said, "If the Mozilla Foundation and Firefox friends like Google don't start spending money—right now—to hire more programmers, more project managers and more servers, it won't matter how many ads in the New York Times Firefox supporters take out, Firefox will have already reached its high tide of popularity and we can only wait for the ebb to begin." I think he has a good point. There has been much speculation about Google and the gBrowser. What they are up to is certainly not clear, but it seems a pretty good bet that they are doing something with the Firefox code base. Although this may dilute the market share for pure Firefox, I suspect it would help a lot with code quality, and maybe even with server capacity.

Thursday, January 13, 2005

A good overview about the benefits of workflow

I ran across this short note from George Parapadakis about workflow benefits - "You Know - Of Course What Workflow Is". I thought it was a good summary.

Monday, December 06, 2004

Software Factories

I have been reading a bit about software factories, and also recently attended a presentation by Jack Greenfield (author of Software Factories: Assembling Applications with Patterns, Models, and Tools). Many interesting ideas.

The first thing to keep in mind is that there is a war going on. The combatants are familiar: Microsoft and a few friends on one side, everybody else on the other. As somebody once said, "the first casualty of war is truth".

One of the main principles of the software factory idea from Microsoft is that of "Domain Specific Languages". Frankly, I am having trouble accepting the idea as useful for large-scale development in large development shops. I can see how it might work well with a team of A-Team programmers. But when dealing with 9-to-5 programmers - those with lives away from programming - I think having a dozen DSLs within the organization will be too much to handle. These DSLs will sit on top of the general programming languages we already have, at least for the next 10 years. The average Joe doesn't want to learn that much.

On the other hand, I can see the counter-argument. As Grady Booch points out, "In many cases, the semantics of the UML are pretty close to what you need" (December 3, 2004) - UML is not exactly what you need, just close. There is often an impedance mismatch, which means you lose power, perhaps introducing distortion. The DSL approach says create exactly what you need and use that, so the mismatch is avoided.

The difficulty I have with that is that it may just be moving the problem. Now the programmer has to deal with a dozen different interfaces (DSLs). Is the programmer more likely to make a mistake in the coding? Would there be a "Whoops - that would have been the way to code it using DSL #34, but in DSL #93 it is done this way, because of what we learned when we used to use DSL #67"?

Grady also said, "we do disagree with Microsoft's rejection of the UML in favor of proprietary domain-specific languages". At first I wondered where he pulled the word "proprietary" from. That word to me usually implies 'closed', 'owned', 'not generally available'. I thought these DSLs could be open, public, and generally available. But when I consider the number of them that could exist, I realize that proprietary is not so wrong.

Enough for now. This Software Factories versus MDD (or MDA) battle will be ongoing for years. I don't think either will win, but both will declare victory.

Wednesday, September 22, 2004

Java over-engineered?

I don't know the name of the author of this post, but he (or she) has some interesting observations. About two-thirds of the way down the post there is a list of things that are bad about J2EE in comparison to .NET.
What I find most interesting is that he seems to be comparing typical solutions built on those technologies rather than J2EE and .NET themselves. Yeah, I know - semantics.
Anyhow, he goes on to blame this on "cross-platform compromises" and "textbook OOP".

Hmmm.

In the many commercial-grade J2EE code bases that I am familiar with (high-volume web banking and brokerage sites), I can't say that we have made many cross-platform compromises - very few in fact. But when we do, we usually encapsulate the dependency and put a factory in front of it. This adds layers and makes the code more opaque - that is certainly true. But, as I said, we don't do it very often. Would stored procedures have made the code simpler? I don't think so. In our case, they would add another layer!
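For what it's worth, the pattern looks roughly like this - one factory that knows about the platform difference, and callers that only ever see the interface. The names (and the property check) are purely illustrative:

    // Hypothetical sketch of hiding a platform-specific dependency behind a factory.
    interface HostSecurityService {
        boolean verifyPin(String account, String pin);
    }

    class MainframeSecurityService implements HostSecurityService {
        public boolean verifyPin(String account, String pin) {
            // mainframe-specific call would go here
            return false;
        }
    }

    class LdapSecurityService implements HostSecurityService {
        public boolean verifyPin(String account, String pin) {
            // LDAP bind or compare would go here
            return false;
        }
    }

    class HostSecurityFactory {
        // The one place that knows which platform we are running on.
        static HostSecurityService create() {
            boolean onMainframe = System.getProperty("os.name", "").startsWith("z/OS");
            return onMainframe ? new MainframeSecurityService() : new LdapSecurityService();
        }
    }

It is one more layer, as I said, but the dependency stays in one place.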

What about textbook OOP? Is that a bad thing? Must be. I am wondering if what is observed is a tendency to make things overly abstract, overly decoupled - perhaps even overly reusable. The motivation is good - planning for change. It is often done with a view to saving money in the future, without looking at the ROI. I think that too often we spend two dollars today to save a dollar next year.

.NET does not have any redeeming features that will save it from the same fate (except maybe an army of VB6 programmers who haven't seen an OOP textbook!). Given time it will build its own empires, perhaps with the help of Microsoft! Chris Anderson and Miguel de Icaza exchanged some volleys about the complexity of Avalon - see here and here.

Wednesday, September 01, 2004

PHP versus Java

Here is an interesting tidbit buried in a post about Friendster's switch from Java to PHP. As usual, the debate tends to feed on how you build the application rather than on the fundamentals of the underlying technologies.

Of particular interest was this quote: in the journal ACM SIGMETRICS Performance Evaluation Review, Volume 31, Issue 3, there was an article called "A performance comparison of dynamic Web technologies" in which PHP, Java server technology (including Tomcat), and Perl were benchmarked in a laboratory environment. It was concluded that server-side Java outperformed PHP and Perl by a factor of 8. The abstract also states, "In general, our results show that Java server technologies typically outperform both Perl and PHP for dynamic content generation, though performance under overload conditions can be erratic for some implementations."

Monday, August 30, 2004

Gartner says WSDL is a critical standard

A recent Gartner research report, entitled "Consider WSDL a Critical Standard", reinforced their opinion that WSDL and web services are a fundamental component of a successful SOA. Their bottom line: use WSDL or find yourself alone in an obscure corner of the industry.

More on Rich Clients...

The IBM Workplace technology seems to be gaining some traction. Stephen O'Grady has written up an analysis of what some other vendors are proposing to do on top of it.

Actuate's plans are interesting for two reasons: 1) it is an open-source business intelligence project, and 2) it is built on Eclipse.

Tuesday, July 13, 2004

Rich Client Wars

Much is being written today about the future of the rich client application versus the browser application. One side says you can't get a good user experience out of a browser application, and you can't get sub-second response time from it either.

There are those at the other extreme as well. They say there is nothing the browser can't do and nothing it shouldn't do.

I suggest that truth lies in the middle. I also think that the middle will shift over time - towards the browser side.

This posting at looselycoupled, by Phil Wainewright, talks about how the Yahoo purchase of Oddpost is an indicator of things to come - delivery of rich user interfaces within the browser. His main point is that Microsoft's Avalon is the wrong direction for the company to be taking.

He may be right. Time will tell. I think Microsoft has surely thought about that, and I think they are hedging their bets - make the browser irrelevant. Where you really want to work is in Microsoft Office (with a healthy serving of SharePoint Portal Server and BizTalk Server to help it out). Take a look at their Office Information Bridge. If Microsoft could get us to live in a purely MS-Office world, then the browser becomes irrelevant, as do Google, Yahoo, IBM, Sun, Java and Linux. Apple would still be around - after all, it does run Office, and the Microsoft marketing department needs Macs to publish their material (look at the document properties for their PDFs - you'll often see something like "Mac OS X 10.3.4 Quartz PDFContext")!