Sunday, December 03, 2006
I value the thoughts put forward by the author. Who does not know that overreacting or being short-tempered is bad? Yet at times we lose our heads and end up inflicting the worst on ourselves. The post in some sense also touches on the very old idea of “being positive”. As someone has said, “B+ is not just a blood group”. How true. Whatever situation you are in, contemplate it properly before reacting to it. Showering anger on your family members, friends or colleagues, or stressing yourself out, will ultimately affect your own physical, spiritual and social wellbeing.
Saturday, November 04, 2006
“The evidence, scientific as well as anecdotal, seems overwhelmingly in favor of deliberate practice as the source of great performance.”
The arguments in the article ring quite true. If I look back on my days in school and university and my last few years as a software professional, I have developed certain strong skills in programming and software development. And all the credit goes to diligent practice, lateral thinking, openness to different ideas, and the ability to relate various topics back to core fundamentals.
As the article mentions, “The good news is that your lack of a natural gift is irrelevant - talent has little or nothing to do with greatness.” I quite agree with it. There are some things which one person does better than others, and if he goes on improving them, he achieves emphatic success in that field. Elements like passion and strong interest are vital for any successful individual or business. However, strong passion without practice does not lead anywhere, and everyone knows that. All in all, the article gives insight into the simple formula for being successful.
Friday, November 03, 2006
Most of India still lives with a middle-class mentality. The idea that prevails is to get good grades in high school, get admission to the best engineering school in the course with the best job prospects, and finally settle down with a “job for life”, without giving a thought to what your interests are, whether you really want to do computers, or whether you are really destined to be a theatre artist. All that because there is a lot of competition, a never-ending fight for survival, and no room to introspect and think about what one wants.

All that aside, coming back to the engineering education system in India. I didn't go to the best engineering school, but the one I joined had a decent reputation at the state level. I joined the Information Technology course without knowing a bit about what it meant; I just knew that it had to do with computers, was an upcoming field and had a lot of job prospects. The college I went to had a decent lineup of subjects. I studied courses ranging from Civil Engineering to Engineering Drawing to Simulation and Modeling to Compiler Design. Honestly it was a good mix, and I am realizing some value from all I studied back then. Seeing the other side of the coin, that is, the teaching methods and how stimulating the environment was: teaching was in some ways exam-oriented, or rather aimed at getting things done or just completing the course. That said, the faculty were open to discussion and ready to help one pursue his or her ideas.

What I feel is that a lot depends on the individual rather than the system. The system does play a great role, but it is meant for the masses, evolves over time, and is influenced by many socio-economic factors. The engineering education system in India calls for a revamp; however, we don't see it happening any time soon. Fifty years of legacy and mindset will take time to change.
Sunday, October 29, 2006
Monday, October 09, 2006
So don’t count much on Google Code Search for the solution to any programming problem you have. I think it will take some more time before we have search engines which can take a problem definition like “Write a program in PHP to convert degree Celsius to Fahrenheit” and give you ready-made, or close to ready-made, code. Still, it is a handy utility.
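For the record, the toy problem in that query is a one-liner in most languages. Here is a sketch in Python rather than the PHP the query mentions, since the formula is the same either way:

```python
def celsius_to_fahrenheit(c):
    """Convert degrees Celsius to Fahrenheit: F = C * 9/5 + 32."""
    return c * 9 / 5 + 32

print(celsius_to_fahrenheit(100))  # 212.0
print(celsius_to_fahrenheit(0))    # 32.0
```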
Sunday, October 01, 2006
Why I joined Oracle India Pvt Ltd
I joined Oracle India Pvt Ltd in December 2003 and worked there till January 2005 as part of the internal IT team. After spending close to a year in small startups, this was the first time I worked for a big corporate. Working with a big software company is something all of us in the software field dream of. There is always this curiosity to know how big these companies are, what the culture is like, the systems they use, the processes they have, the methodologies they follow for their product development, how their sales work, how they support their products, the benefits they have for their employees, the campus and, last but not least, the people they have. My reason for joining Oracle India was to gain some insight into how a big corporate works, to be part of its culture and team, and to groom both my technical and soft skills.
My Experience at Oracle India
The first and foremost thing I gained at Oracle India was a strong interest in data warehouse and BI technologies from Oracle. After a comprehensive one-month boot camp, I was put into the Business Intelligence team of Apps IT. Apps IT is the group within Oracle which looks after internal implementations of the Oracle tools and applications that Oracle uses to support its own business functions. The BI team supported a wide range of BI and data warehousing applications used by the global user community; I was mainly part of the Analyzer implementation (both OFA and OSA). Working at Oracle was an enriching experience. There was a great bunch of people around, ranging from those with strong technical skills, to sage advisors, to extremely good project managers, to full-time party animals.
Learnings at Oracle India
There were strong technical learnings. On the job, I learned the Oracle Express and Analyzer product families. I developed strong fundamentals in these tools and in how best to use them for particular business needs. Being part of the BI team, I also got to learn other products like Discoverer, Oracle Warehouse Builder and EPB (Enterprise Planning and Budgeting, the next-generation product that would replace OFA). I had significant exposure to the Oracle database, PL/SQL, SQL and SQL tuning. The best place to learn any Oracle tool or application is Oracle itself, and that’s what I realized when I joined. It’s not just the resources like documents, webinars and product demos to which you have access; the best part is the people around you, who are always there to help. Apart from the BI tools and the Oracle database, I had significant exposure to the Oracle E-Business Suite, thanks to the boot camp, which covered things like Oracle Forms, Oracle Reports and AOL (the foundation technology of Oracle Applications).
Beyond the technical skills, there were loads of learnings on the soft-skills front. Over time, I gained a great amount of confidence in both written and verbal communication. Honing my interpersonal skills was one more advantage of working with Oracle.
Why I left Oracle India
After saying all the above and heaping so much praise on Oracle, it’s very difficult to justify why I left, and that too just after 14 months! My work at Oracle involved supporting and enhancing existing applications as well as rolling out new ones. Though rolling out new applications was only 25% of what I was doing, it was what I enjoyed most; most of my work involved supporting and enhancing existing implementations. I was looking for an opportunity to be part of an end-to-end implementation of a BI or data warehouse solution from scratch, and that was something I was not seeing on the horizon there. Also, the work at Oracle was not that intense; it was too much of a chill pill. I was looking for a more challenging, more customer-facing, more intense profile, with a bit more pressure and a bit more ownership. All this culminated in leaving Oracle India and joining Ness Technologies in Singapore. More on my experience at Ness Technologies some time later.
I am writing this post almost two years after leaving Oracle India. All I have said here could have been influenced by the time I have since spent in Singapore and at Ness Technologies. However, I have tried my best to put down the thoughts as accurately as they were at the time I joined and left Oracle.
Of late I have been thinking about this a lot: how is developing a software product different from building a custom solution? I quite often get into casual discussions about it with some of my friends here in Singapore. I was thinking of putting together a well-structured, elaborate post on the topic, but it seems it will take a while for me to sit down and collate all my learnings, discussions and thoughts. For now, here are some random thoughts on the topic.
First of all, let me put on the table what I mean by a software product and a custom solution. By software product I mean software which is built to be sold off the shelf. Such software is NOT developed for any specific customer but for general use. Of course, it comes with various features and functionality out of the box, plus some points of extension and customization. The product is installed, set up or implemented by the customer to address a specific business or personal need. There are tons of ways to classify software products, but the details are outside the scope of this post. The basic principle of a software product, or for that matter any product, is that it is meant for a community and not for a specific customer. Needless to give examples, but nevertheless here are a few: Microsoft Office, Acrobat Reader, Oracle Database 10g, etc.
The other category is what I call custom solutions, or custom software solutions to be precise. By this I mean software built to address a custom need. Such a solution is targeted at one particular customer, a particular requirement or a specific business process. The software (or the solution) is not built to be sold off the shelf. Even maintaining and enhancing existing enterprise applications falls under custom solutions. A custom solution could be built from scratch, like a web-based budgeting and reporting solution built with Java and HTML, or it could extend the functionality of standard applications by adding the necessary customization.
I do acknowledge that there is no easy way to classify everything happening in the software world into these two categories. Some things sit at the boundary and don’t fall squarely into either one. We keep hearing of frameworks or blueprints which are sold commercially these days and are in turn used to develop custom solutions around them. So where do they fall? Similarly, there are offerings from Google like Gmail or Calendar, or for that matter any web service out there on the Internet. Should we call them software products or custom solutions? Let us keep away from these debates.
Getting back to the main idea of the post: how does developing a software product differ from developing a custom solution?
Developing a software product, in my view, looks like this:
- It is a more organized and planned engagement
- A lot of planning goes in before embarking on the actual development
- A lot of study is done in terms of gathering requirements
- Focus groups and user communities are the foundation for the product
- It involves studying the features and functionality of competing products in the market
- Most of the time, a prototype or proof of concept precedes the actual development
- There is a lot of flexibility in choosing the technology, etc.
- The teams are relatively bigger, with the flexibility for a developer to focus on a particular module or set of modules
- The quality process is more elaborate, with a lot of both internal and external testing
- APIs and maintaining libraries of reusable components are critical
- There is wide adoption of software best practices and good design principles
- Bug fixing and support are more comprehensive
- Licensing and software delivery are important considerations
- Software packaging and installation mechanisms are important components
- Change control is ubiquitous
- System design documents and user manuals are of good quality, and the documents are constantly maintained with each new release of the product
On the other side, developing a custom solution looks like this:
- The planning process is not very elaborate
- The study of requirements is not very comprehensive; at times requirements gathering continues long after the actual development has started
- The scope of the solution is at times not well defined
- Prototypes or proofs of concept are less frequent
- The choice of technology or tools has often already been made by the customer or during the presales exercise
- The teams are relatively smaller, with one person playing multiple roles at times
- APIs and maintaining reusable components are not that critical if the custom solution is relatively small
- Larger solutions have a proper bug-fixing setup; for smaller solutions, bug fixing and patching are very ad hoc
- Licensing is not of grave importance; however, there is a lot to deal with in terms of SLAs (Service Level Agreements) when delivering a custom solution
- Software packaging and installation mechanisms are not significant unless the solution is built for a large user base which needs to install the software on their PCs
- Change control is in place for most medium and large-scale custom solutions
- Design documents are part of any custom solution project; however, at times these documents are not maintained with new releases or with changes to the existing release
All said, the above does not always hold true. Smaller software products developed with Agile methodologies do forgo some of the items mentioned for product development. Similarly, a large custom solution developed under a strict quality process does follow a lot of the items from product development.
Saturday, September 30, 2006
RSS feeds have been one of the great revolutionary things of Web 2.0. I think that after email and IM, RSS is the next great thing to happen to the Internet. Some time back I attended a presentation at a conference about the shift in the way we deal with the Internet: instead of us pulling information out of the Internet, the latest information of interest would be pushed to us. That was 2003. Within a few months of that conference, technologies like RSS and Atom took hold, and they have since become prevalent, even pervasive. RSS is getting applied everywhere.
Talking about the ways RSS is being applied: news sites, blogs, forums and corporate websites are a few examples. Nowadays even the social networking sites like Flickr and MySpace, and email services, have RSS feeds. With RSS, you can create one dashboard which you visit every day to monitor new content or activity across different blogs, forums and news sites, any new photo uploaded to Flickr by your friends, or any new job posted on a particular job portal matching your criteria. In some places RSS feeds are taking over from the old email alerts, which of late have got a bad reputation for spamming your mailbox. RSS is also a new way to market new features or inform users of new launches.
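As a sketch of what an aggregator does with such a feed, here is minimal Python that parses an RSS 2.0 document and pulls out the item titles and links. The feed content below is invented for illustration:

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 feed (hypothetical items, for illustration only)
feed = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>Post one</title><link>http://example.com/1</link></item>
    <item><title>Post two</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

root = ET.fromstring(feed)
# Collect (title, link) pairs, which is what a dashboard like My Yahoo!
# or Google Reader ultimately displays for each subscribed feed
items = [(i.findtext("title"), i.findtext("link"))
         for i in root.iter("item")]
for title, link in items:
    print(title, "->", link)
```

A real aggregator would fetch the feed over HTTP on a schedule and remember which items it has already shown, but the parsing step is this simple.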
Monday, September 18, 2006
- Tabbed mail browsing
- The right-click menu shows custom menu items from Yahoo! and not the default browser context menu
- Split view of message list and the message content. Similar to any desktop mail client
- Compose window is quite similar to Outlook
- One can view the RSS feeds subscribed to in My Yahoo!
Of course, there are some things in Gmail which are quite handy and not yet in Yahoo! Mail beta:
- Attaching files is not as seamless as in Gmail
- Messages are organized into conversations, which is still not the case in Yahoo! Mail
Sunday, September 10, 2006
I have always had the thought of doing something of my own at some point in my career. But for now, I will continue working with the corporates, garner some experience, and then see.
Sunday, September 03, 2006
According to the article, an important thing which helps you decide is the skill set you have at your disposal, along with the specific needs of the application, like whether it should run on both Unix and Windows platforms. Apart from all this, the one thing I would consider is the application frameworks and reusable components or libraries available in the community around each programming world. If some framework or set of libraries fits readily into your application, that is a big advantage.
Scott Berkun is always a good read. He has a unique and intriguing style of writing, so when he shares tips on how to get started with writing and how to write better, it should be great. Whether it is writing something, doing a painting, designing a UI for an application, or sometimes even writing a piece of code, you have to break a similar kind of barrier: how to get started. Once you get going, it becomes easier. For me, writing a piece of code appears easier now than it was three years ago, all because I have been doing it for quite some time. The same goes for writing: once you have been at it for some time, you become more comfortable. Another thing which I felt was a major barrier to writing is the concern for the quality of the content. You read a lot of stuff here and there and feel that you should be writing at the same quality. I overcame this by thinking of it as writing for myself; it does help in getting started. Once you have the confidence, you can expose your writings on blogs, etc.
Friday, September 01, 2006
Monday, August 28, 2006
MOLAP applications are meant to be lightning fast, so a lot of summarization and aggregation is done in the application. Most of the calculations are part of the data load and aggregation batch process; very little is calculated while querying the data. However, there are quite a number of calculations which are made part of a formula (in Oracle Express) or a dynamic calc (in Hyperion Essbase). Often these calculations are moving totals, rankings, year-to-date, rolling quarters, variances, etc. Ideally, these calculations should be formulas or dynamic calcs, because:
- These calculations are very simple and do not involve many data values to operate on
- Sometimes these calculations involve applying a formula with different weights:
Now these weights could change over time as the organization's tactics change, and the formula itself could evolve. So it is better to keep them on the fly.
Even though there are calculations which fit best as on-the-fly calculations, I bet there are many implementations out there computing them in the load scripts and storing the results. So all the disadvantages I mentioned for duplicating relational tables hold for MOLAP too. The extra logic needs to be written, tested and maintained, and any debugging has to check both the base data and the intermediate data. If there is a bug in the loading script, the data needs to be cleaned up, reloaded and re-aggregated. Were it a dynamic calculation, only the formula would need to be changed; no reloading of data is required.
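To make this concrete, here is a small sketch outside any particular OLAP tool; the measure name and monthly numbers are made up. A year-to-date series is cheap to derive on the fly from the stored monthly values, which is exactly the kind of calculation that suits a formula or dynamic calc rather than a second stored measure:

```python
from itertools import accumulate

# Stored monthly sales for one product/market slice (hypothetical numbers)
monthly_sales = {"Jan": 100, "Feb": 120, "Mar": 90, "Apr": 110}

def year_to_date(monthly):
    """Compute YTD on the fly, the way a formula or dynamic calc would,
    instead of loading and storing a second YTD measure."""
    months = list(monthly)
    return dict(zip(months, accumulate(monthly[m] for m in months)))

ytd = year_to_date(monthly_sales)
print(ytd)  # {'Jan': 100, 'Feb': 220, 'Mar': 310, 'Apr': 420}
```

If a bug turns up in such a derivation, only the function (the formula) changes; the stored monthly data never needs to be reloaded or re-aggregated.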
At times there are scenarios where you want to expose a cube with respect to fewer dimensions, not all of them. In OFA (Oracle Financial Analyzer, a financial reporting application using Express as the backend), there are times when an FDI (Financial Data Item, a.k.a. a variable or cube in Express) needs to be exposed with fewer dimensions. For this scenario too, one should avoid creating a stored FDI and replicating the data into it. The better approach is to have the data rolled up in the base FDI using hierarchies, and to create a formula FDI on top of the base FDI, limiting the dimensions which are not to be exposed for analysis to “Total”.
Let’s take an example where a dynamic calc seems like it could give you the aggregated reporting. You have the following data model:
Market (with hierarchy All Market -> Region -> Country)
Years (with members like 2006, 2005 etc. No hierarchy)
Product (with members P1, P2, etc. No hierarchy)
Gross Sales <Year, Product and Market>
Tax Rate <Year, Market>
How could we dynamically calculate Net Sales <Year, Product and Market> from the above?
Net Sales = Gross Sales * (100/ (100 + Tax Rate))
Let’s see what’s wrong with this formula. Tax Rate is something you cannot aggregate over Market: the tax rates of India, Japan, China, Singapore and so on added together would not give you the tax rate for Asia, and neither would their average. So you do not have a tax rate for “Asia”, and hence the formula above would not return Net Sales for “Asia”. What you might need is another stored cube called “Net Sales”, used to store “Gross Sales” less tax, which can then be aggregated to return the numbers at the Asia or All Market level.
This is not to contradict what I mentioned earlier: for this scenario, and many similar ones, there is no direct way to achieve the result using a dynamic calc. In Oracle Express you could write SPL code to do it dynamically, but in a nutshell it would be adding up the net figures of India, Japan and so on and returning the result on the fly. This can slow down performance depending on the number of members you have: with 220 markets, the All Market level calculation would take quite a while to come through.
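A small numeric sketch of the problem, with made-up countries, sales figures and tax rates, shows why the dynamic formula breaks at aggregate levels: aggregating the tax rate first gives a meaningless number, while computing net sales at the leaf level and then summing gives the figure a stored, aggregated cube would hold:

```python
# Hypothetical leaf-level data: gross sales and tax rate (%) per country
data = {
    "India":     {"gross": 1000, "tax_rate": 10},
    "Japan":     {"gross": 2000, "tax_rate": 5},
    "Singapore": {"gross": 500,  "tax_rate": 7},
}

def net(gross, rate):
    # The post's formula: Net Sales = Gross Sales * (100 / (100 + Tax Rate))
    return gross * 100 / (100 + rate)

# Wrong: aggregate gross sales and tax rate first, then apply the formula;
# a summed (or averaged) tax rate for "Asia" is not a real tax rate
gross_asia = sum(d["gross"] for d in data.values())
summed_rate = sum(d["tax_rate"] for d in data.values())
wrong_net = net(gross_asia, summed_rate)

# Right: compute net sales per country, then aggregate the results
right_net = sum(net(d["gross"], d["tax_rate"]) for d in data.values())

print(round(wrong_net, 2), round(right_net, 2))  # the two disagree
```

The second approach is exactly what a stored Net Sales cube gives you after aggregation; doing the same dynamically means summing over every leaf member at query time, which is the performance problem mentioned above.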
To recap this post and the last one: there are various scenarios where one is inclined to replicate data, either with minor transformations or aggregated. One should consider doing so only if an on-the-fly calculation cannot achieve the same results with acceptable performance.
Sunday, August 27, 2006
A few Web 2.0 things I started following in the last few weeks. TechCrunch is one of the best tech news sites I have found in recent times, a sort of one-stop shop for all tech and Web 2.0 related news. The content is concise and complete, and it is always good to browse through the happenings in the tech industry. Digg is another news site. Though I heard of it a few years back, I never used to follow things over there; in the last few weeks I have started reading through the stories on Digg. Like TechCrunch, it too is a one-stop shop for tech news. Most of the time the stories on TechCrunch and Digg overlap, but Digg gives a different taste, and the commentary going on around each story is good to read.
A few more handy Web 2.0 utilities I started using in the last few weeks. Guess what? All are from Google: Calendar, Reader and Writely. Google has been as generous as ever. As I always write, they are showering the best collaboration and personal management utilities on us, all as SaaS and for free. The usability, the UI, the features and the functionality are all unmatched compared to commercial desktop tools. Of course there are quite a number of limitations compared to desktop tools, some because these tools run from the browser. But the best thing all these tools have is the Web 2.0 community features: you can share your calendar, view others’ calendars across the globe, create documents in Writely, share them, co-author a document with multiple people, maintain versions, blog them, publish them as HTML and PDF, and what not. Google Reader is an excellent RSS feed aggregator. Initially I was a big fan of My Yahoo!: it was very easy to subscribe to feeds and collate them in one place, with an easy one-page snapshot of all the feeds and an easy-to-customize interface. However, you cannot read the original stories for these feeds on the same page; you have to open them in a new window. With Google Reader it is damn easy to read through the whole feed, at least for those sites which publish the whole story or article in the feed. TechCrunch is one, and most blogs on Blogger publish the whole article in the feed, so it is quick to scan through the whole story without switching tabs. What I miss in Google Reader is seeing all the feeds on one page, in one view, something like My Yahoo!. Probably there is a way to do it; let me explore in the coming days. For the time being I am using both My Yahoo! and Reader. Of course, one can look for elaborate reviews of these tools on the Internet and the blogosphere.
Saturday, August 26, 2006
Duplicating the data in an application: does it always hold good? I won’t be talking about the whole range of applications but focusing on data warehouse and reporting applications. Some of the definitions and design principles of data warehousing explicitly call for redundancy in the data model. One of the golden rules for designing a typical data warehouse is to keep the data model a simple star schema: a star schema with denormalized, redundant dimension tables. The technique is also referred to as dimensional modeling. There has been a long-running debate on which is the better approach for designing a data warehouse, dimensional modeling or ER (traditional 3NF) modeling; let us keep away from that for now.
So what I want to cover here are:
1. Duplicating fact tables or dimension tables in the same data model with some calculation or minor transformation
I have seen this quite a few times in my previous experience. Duplication has all the inherent disadvantages, like duplicating the effort to write the loading programs that populate these extra tables; the new programs come with their own effort to debug and test. Extra space for these tables is one more cost, though the storage impact is negligible given ever-decreasing storage costs and improvements in RDBMS technology. The thing I hate most about duplication is the maintenance. On every load, if you encounter an issue in a particular report, you have to trace it down from the report to the intermediate fact table, and then to the base fact table and the source. These extra tables will always keep taking from you in terms of troubleshooting and debugging.
2. Summary tables
There is always a tendency to convert a long-running query into a summary table, which can then be used directly by the report instead of querying the base tables. What you end up doing is writing a PL/SQL routine or an ETL mapping to populate it. All right, it does solve the problem: performance improves, and after all, summarizing the data and giving the aggregate picture is one of the important principles underpinning the data warehouse philosophy. However, the problems it brings along are the same as in point 1: you need to develop, test and maintain this extra logic.
So how to remediate this? Possible ways:
- Make the reporting tool use the base fact tables, with the necessary transformations and calculations as part of the report query.
- If the reporting tool does not support some transformation functions which are available in the database, or you would like to keep the report query simple, encapsulate the transformation or calculation logic in a database view. Database views are the handiest feature I used in my last project.
- If the transformation or calculation logic is too complex for plain SQL, use features like table functions. Table functions allow you to encapsulate complex PL/SQL logic in a function which returns a table (a collection of rows). On top of such a table function you can create relational views, which in turn can be used for reporting.
- Materialized views are another way to tackle this kind of situation. The three approaches above are all on-the-fly and can lead to performance issues; materialized views are the option to consider when they do.
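As a sketch of the database-view approach, here is the same idea in SQLite via Python; the schema, table and view names are invented for illustration, and in an Oracle shop this would be a regular (or materialized) view over the base fact table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Base fact table (hypothetical schema): one row per sale
cur.execute("CREATE TABLE sales_fact (region TEXT, gross REAL, tax_rate REAL)")
cur.executemany("INSERT INTO sales_fact VALUES (?, ?, ?)",
                [("APAC", 1000, 10), ("APAC", 2000, 5), ("EMEA", 500, 7)])

# Instead of a duplicated summary table, encapsulate the calculation in a
# view; reports query the view, and there is no extra load logic to maintain
cur.execute("""
    CREATE VIEW net_sales_by_region AS
    SELECT region, SUM(gross * 100.0 / (100 + tax_rate)) AS net_sales
    FROM sales_fact
    GROUP BY region
""")

for region, net in cur.execute(
        "SELECT region, ROUND(net_sales, 2) FROM net_sales_by_region "
        "ORDER BY region"):
    print(region, net)
```

Because the view is computed from the base fact table at query time, a fix to the calculation is a one-line change to the view definition, with no reload of data.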
As mentioned, aggregating and summarizing the data to get a high-level picture, and then being able to drill down to the transactions, is one of the core principles of the data warehouse philosophy. Most of the design principles and technologies are about making this more efficient, and there has been a range of new technology catering to this particular need, directly or indirectly. New products are being churned out by traditional data warehouse vendors like Oracle (the Oracle 10g OLAP option, materialized views), IBM and Microsoft (SQL Server 2005), and there are new offerings from pure-play BI software vendors like Hyperion, BO and Cognos. Above all, there are firms like Hyperoll catering to just this particular thing. The details of these offerings are outside the scope of this post.
All I covered here concerns relational data warehouses. I will cover similar caveats in MOLAP applications in a subsequent post.
Today I stumbled upon a news story about eLitecore’s expansion in the Middle East. Of course that’s good news for eLitecore. They are a great company with a great bunch of products and great people.
eLitecore is a mid-size software development house based in Ahmedabad, India, and the company has a long and vibrant history. I know them fairly well since I worked with them for a couple of months, and reading the news story took me back to the good old days at eLite. A couple of months, huh... was it a consulting assignment? No, I actually joined them on permanent rolls, but just two months down the line I got an offer from Oracle India Pvt. Ltd. Those were the early days of my career, and I probably thought that working for a big brand would add a lot of weight to my resume. And that was it, I quit. That was the end of my spree of working with startups. In 2003 I worked for Rightway Solution and eLitecore, and the learnings were among the best a rookie in the IT field can get. In December 2003 I joined Oracle, and since then I have been with big corporates. So I kind of have a flavour of both startups and biggies and of how different working with each is. Of course that calls for another post; I will probably write on it in the coming weeks, since there are a lot of things to share.
All that aside, let’s come back to eLitecore. eLitecore has its genesis in IceNet, one of the first ISPs in India during the early days of the Internet in the mid-nineties. eLitecore started out developing network management and billing software for IceNet and later went on to become a leading product development and services company. There are quite a few successful product-based companies in India (most companies there are into services), and eLitecore is one of them. Part of eLitecore’s business was services, but product development has been their focus all along. Their primary product lines deal with network management, network security tools and billing software for small and large-scale ISPs. Their flagship product Cyberoam, now branded as a UTM, has a large install base in both domestic and international markets. The product comes with tonnes of features: configuring Internet access policies, bandwidth sharing among applications, security against attacks like phishing and pharming, loads of reporting to get better insight into Internet access patterns, VPN, and the list goes on and on.
Coming to my stint at eLitecore, I would say those were some of the best days of my IT career. I got to learn a lot about developing a product. eLite follows Agile methodologies for product development, and the team size is usually small, so there is a lot on your plate. There were learnings on how to:
- architect products
- create libraries and a reusable code base
- leverage open source
- make the best use of existing libraries out there
- design data models
- design systems for usability
- develop a tool, typically for network access management
Of course I didn't gain mastery in all of the above. The learnings were not that intense for some of these items, but I did get a good starting point on the considerations that go into developing a software product.
After eLitecore, though, I never worked on network management tools or Linux, or even in product development of late. All I am doing now is technical consulting and implementation of BI and DW products. However, the learnings from eLitecore are instrumental in my day-to-day work.
Tuesday, August 22, 2006
One great point the speaker brings up is how India can help reduce the cost of innovation, and of diffusing innovation, through its large pool of skilled labor and its booming manufacturing and other sectors like pharma and services. He mentions how ecosystems can interact by complementing and contributing to each other. An example he gave was that the IT and automotive industries in India are concentrated in three parts of the country: Delhi-Noida, Mumbai-Pune and Chennai. The point he makes is how the automotive industry should leverage the software sector to innovate and find better, more efficient ways of doing things.
The talk also brings up a great thought: making IT and software affordable for the 80% of the world's population that is underprivileged. He gives an example of a village in central Madhya Pradesh where the farmers have started using computers and the Internet to keep themselves updated about the weather, the latest technology and novel farming practices, or the latest commodity prices at the Chicago Trade Exchange, which in turn helps them get better prices for their goods. This thought is close to what Vinod Khosla said in a conversation at the Web 2.0 conference held in September 2005, where he mentioned using the Internet to deliver high-end education to underprivileged people in the remote corners of the world. A striking example he gave was a sort of remote Harvard University where 400,000 people in remote corners of the world listen to a lecture from an eminent scholar.
Overall the keynote bundles great thoughts. A must-listen.
Tuesday, August 08, 2006
For me, I mostly follow Oracle BI tools and, these days, Siebel Analytics and some of the related offerings. I also spent some time with the Hyperion product line in my previous assignment. But it is always good to know some bits and pieces about offerings from other vendors.
Today I ran into this whitepaper describing the SAP BI Accelerator. The whitepaper focuses on how BIA fits into a typical SAP BI implementation. I had hardly followed SAP BI offerings in the past, except for reading some newsgroup discussions on SAP BW versus other data warehouse technologies. However, this article did give me some insight into what SAP NetWeaver BI is and how a typical SAP BI implementation works; a sort of high-level view of what the architecture of SAP BI could be. To me, SAP BI closely resembles Siebel Analytics. Setting up SAP BI consists of defining the data model, which could consist of either relational tables (ODS) or dimensional data models (InfoCubes).
Like any other relational OLAP implementation, SAP BI allows you to create aggregates and define metadata for them, so that queries can be redirected to the aggregates instead of the base tables. However, these aggregates, like those in Siebel Analytics, need to be maintained outside the SAP BI purview. They are also pre-defined and can cater only to a limited set of queries. So here comes the BI Accelerator to the rescue. With BIA set up, you no longer need to set up the aggregates. BIA uses its in-built proprietary technology to aggregate the data and cache it for subsequent hits. The concept behind the BI Accelerator is close to HOLAP, as author Naeem Hashmi from Information Frameworks says:
“For Data Warehouse pros, the concept of BI accelerator is similar to good old HOLAP, although the technology and approach is radically different. Meaning, the content is transformed into proprietary structures in another layer on top of Relational-OLAP implementation. User access layer sends incoming queries to HOLAP for quick access/navigation instead of Relational-OLAP. The only difference here is that BI accelerator uses powerful search engine technology, transparent to traditional data warehouse end users.”
The white paper is a good read and gives some fundamental insights into the BI Accelerator.
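To make the aggregate redirection idea concrete, here is a tiny toy sketch in Ruby. This is purely illustrative (the class and method names are my own, not SAP or Siebel APIs): a pre-built rollup answers matching queries from a small lookup, while anything else falls back to scanning the detail rows.

```ruby
# Toy sketch of aggregate redirection. Detail rows live in a "base table";
# a pre-built aggregate answers matching queries without scanning detail rows.
class AggregateNavigator
  def initialize(base_rows)
    @base_rows = base_rows   # e.g. [{region: "EU", year: 2006, sales: 10}, ...]
    @aggregates = {}         # pre-built rollups, keyed by their grouping columns
  end

  # Pre-compute a rollup for a fixed set of grouping columns, the way an
  # aggregate table (or an InfoCube rollup) is maintained ahead of time.
  def build_aggregate(group_cols)
    @aggregates[group_cols] =
      @base_rows.group_by { |r| r.values_at(*group_cols) }
                .transform_values { |rows| rows.sum { |r| r[:sales] } }
  end

  # A query is redirected to a matching aggregate when one exists;
  # otherwise it falls back to scanning the base rows.
  def total_sales(group_cols, key)
    if (agg = @aggregates[group_cols])
      agg[key] || 0                          # served from the aggregate
    else
      @base_rows.select { |r| r.values_at(*group_cols) == key }
                .sum { |r| r[:sales] }       # full scan of the base table
    end
  end
end
```

The limitation the article points out shows up naturally here: each aggregate serves only the grouping it was built for, which is why a caching layer like BIA, which aggregates on the fly, is attractive.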
Thursday, July 27, 2006
Wednesday, July 19, 2006
The next level for this evolving concept of Application Service Provider and Software OnDemand is naturally SaaS. Over the last couple of years I have heard about lots of software being delivered as a service, especially software related to content management and collaboration: Writely, Google Sheets, Salesforce.com, Basecamp, Sprouit.com, and the list goes on and on.
The article talks about the nuances of SaaS and other related concepts like ASP and OnDemand. The marathon story covers what big players like SAP, Oracle and Microsoft are doing to address SaaS in their next-generation product lines. The article covers how Salesforce.com fits the SaaS model, while the offerings from Oracle OnDemand and other vendors are far from SaaS. The key thing this article brings to the table is the architectural principle of multitenancy, which means a single instance of the software runs on the provider's servers, and all users log onto that same instance. The article goes on to talk about how SaaS cultivates a Web 2.0-like community. All in all a good read that gives loads of insight into the SaaS landscape.
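The multitenancy principle is easy to sketch. In this minimal Ruby illustration (my own names, not from any vendor's product) all tenants share one running instance and one physical store, and isolation comes purely from scoping every row and every query by a tenant id:

```ruby
# Minimal sketch of multitenancy: one instance, shared storage,
# every row and every read scoped by a tenant id.
class SharedStore
  def initialize
    @rows = []   # one physical "table" shared by all tenants
  end

  def insert(tenant_id, record)
    @rows << record.merge(tenant_id: tenant_id)
  end

  # Every read is filtered by tenant, so customers never see each
  # other's data even though they all log onto the same instance.
  def for_tenant(tenant_id)
    @rows.select { |r| r[:tenant_id] == tenant_id }
  end
end
```

This is what makes the model economical for the provider: one codebase and one deployment serve everyone, instead of a dedicated hosted copy per customer as in the classic ASP model.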
Tuesday, July 18, 2006
Web application development has always fascinated me. The first nickel I earned after joining Information Technology Engineering was by developing a web application for my friend's uncle. In the initial days of my IT career I worked with the startup Rightway Solution, designing and developing small and medium-sized web applications. And man, those were exciting days of my career. That was the time of PHP. And of course PHP is a cool technology even now.
So developing web-based applications has always excited me. A few days back I was following some articles on the trends in web development in 2005 and how the web development landscape is going to look in 2006. One of these articles mentioned RoR. Googling on RoR took me to this article from Curt Hibbs. The article gives you a head start on getting going with RoR. RoR is a cool framework built on top of Ruby. And Ruby itself is a cool language: damn powerful and, as mentioned in http://poignantguide.net/ruby/, close to what humans speak.
As it says ...It is coderspeak. It is the language of our thoughts.
Read the following aloud to yourself.
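Here is a small snippet in the spirit of the guide's examples (an illustrative one of my own, since the original snippet didn't make it into this post):

```ruby
# Ruby reads close to how you would say it out loud.
5.times { print "Odelay!" }                 # say "Odelay!" five times

exit unless "restaurant".include? "aura"    # quit unless the word contains "aura"

["toast", "cheese", "wine"].each do |food|
  puts food.capitalize                      # shout out each food, capitalized
end
```

Each line is a nearly grammatical English sentence, which is exactly the point the guide makes.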
I am thinking of spending some time with RoR in the coming days, and maybe doing some hands-on work over the weekend on using this technology to develop fairly complex web applications, leveraging the framework for a typical custom-built reporting and budgeting application.
Stumbled upon this very good blog on Oracle Applications BI modules. Nilesh Jethwa, the author of the blog, covers various topics related to Oracle Applications in general and Oracle Apps BI modules in particular. There are posts on EPB, DBI and the like, for which there is hardly any material available outside Oracle's own site. The author also runs a parallel blog at ITToolBox on this url. There are posts covering the base tables of Oracle Applications you should use to extract different data, such as GL balances, or the tables in AR that could be used to extract sales data. On the other side, there are posts like this one on What is ERP?, which covers heavyweight IT topics in layman's terms.
Like Nilesh, I have worked for Oracle Corporation. I have been working on Oracle Business Intelligence Applications (OFA, EPB and Sales Analyzer) and other BI tools (OWB, Discoverer, Hyperion Essbase etc.) for quite some time now, so it will be interesting to follow this blog. Maybe I will end up drawing some inspiration to put some good posts in this space. I have put together a list of topics on which I intend to write something. However, the list keeps getting bigger and bigger, and I hardly manage to find the time to write on any of them.
Friday, April 28, 2006
Thursday, April 20, 2006
Tuesday, April 04, 2006
Stumbled upon this post, which talks about where the next generation of user profile management by Internet services giants like Google, Yahoo, Microsoft and others would lead us. The post refers to this original article, which appeared on MSN. Seeing the pace at which things are moving in the Internet arena, and the Web 2.0 revolution, the Internet experience can only get more personalized and become an intimate part of our day-to-day lives.
A few more new things I came to know about today were digg.com and www.technorati.com. Digg is a very good aggregator of technology news, blogs and RSS, while Technorati is a search engine for the blogosphere. Both are great web services.
www.feedburner.com was another thing I stumbled upon a few days back. FeedBurner is a great utility for feed management.
Every now and then I get to know about new kinds of Internet services, mostly around collaboration and social networking. It sometimes gets difficult to cope with it all. But anyway, it is the flow and one has to flow with it. No doubt, this new breed of services is really useful and has taken the Internet experience to the next level.
Friday, January 27, 2006
The discussion covered fundamental aspects of OWB and Informatica specifically, and of ETL in general. Below is an interesting snippet, though the whole discussion is worth reading.
What was really of interest to me was how well the tool works in harmony with the other pieces of the data warehouse (i.e. relational sources and targets as well as external files). And for me, here is a critical point: the Informatica approach to data cleansing and movement is to act upon individual rows which are pulled from a source (often the database), scrubbed, and then placed into the target.
As the core processes of data movement and cleansing continue to be integrated into the database engine, I believe third-party vendors like Informatica will find themselves more and more on the outside looking in. Tools such as Oracle Warehouse Builder are basically frameworks around the database engine core ETL functionality and enhance user productivity by providing the glue to make all of the technological pieces work together. Is it on par with the slickest tools out there? No, not yet.... but "Paris" looms on the horizon.
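The contrast the snippet draws can be sketched in a few lines of Ruby. This is a toy illustration of the two styles only (my own function names, and a trivial "scrub" standing in for real cleansing logic), not how either tool is implemented:

```ruby
# Informatica-style: each row is pulled out of the source, scrubbed
# inside the tool, then written to the target, one row at a time.
def row_at_a_time_etl(source, target)
  source.each do |row|
    clean = row.strip.upcase    # scrub happens outside the database engine
    target << clean             # one write per row
  end
  target
end

# In-engine style (the OWB approach): one set-based operation, the moral
# equivalent of INSERT ... SELECT UPPER(TRIM(col)) running inside the engine.
def set_based_etl(source)
  source.map { |row| row.strip.upcase }
end
```

Both produce the same result, but the set-based form keeps the data inside the engine and lets the database optimize the whole operation at once, which is the point the poster is making about where third-party row-based tools may lose ground.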
Thursday, January 26, 2006
Thursday, January 12, 2006
Wednesday, January 11, 2006
Scott Berkun is one more person I have been following for about a year now. His site hosts lots of discussions on project management, usability engineering and the like. The series of essays Scott has written is worth reading for anyone in the software field. The PM Clinic and UX Clinic are also good reads.
Of late I have also started following yet another BI community at http://www.b-eye-network.com. There is an interesting set of articles and news related to BI, and specific vertical channels like Retail and Life Sciences have a good bunch of articles worth reading.