""


SAP HANA Sandbox Implementation for In-house Tests

SAP HANA gets you moving at the speed of light

It is early days, but Clariba is once again at the forefront with our initial SAP HANA installations. Indeed, SAP HANA is taking its place on the starting grid of the BI stack for new deployments, which is very good news for all of us in the SAP world. However, even though everyone is (theoretically) aware of what SAP HANA is and what its main benefits are, not many consultants know what the real options are for having it in-house, and therefore cannot start running their first tests and getting their hands dirty with the tool. This matters because we rarely know when a real SAP HANA project may appear in our portfolio, and we want to be ready for it. The main purpose of this blog entry is to shed some light on this topic.

As mentioned earlier, almost all of our Clariba blog readers will already have heard about SAP HANA. For those who are still catching up, let me pinpoint its primary concepts. SAP HANA is a database server built on SAP's in-memory computing technology. By giving business processes and applications instantaneous access to data, it enables real-time business applications and analytics across the entire business, from shop floor to boardroom.

Back to the point: today we have two real options for getting our own SAP HANA environment up and running in the office. Depending on our budget and requirements, we can go for the cheapest option, applying for a virtual instance of SAP HANA in the cloud, or the most expensive one, acquiring an official hardware appliance from one of SAP's leading hardware partners.

The first and cheapest option is a SAP HANA instance hosted in the cloud. This is an on-demand application environment, and SAP offers a range of services around it. SAP currently offers the SAP HANA database environment on Amazon Elastic Compute Cloud (Amazon EC2). This is a complete, fully working instance that can be accessed through SAP HANA Studio with very good performance (depending on requirements, the instance can be created with more or fewer hardware resources).

In this context, two prerequisites have to be met before creating our SAP HANA instance. The first is to ensure you are a member of the SAP Community Network (SCN) - you can register here. This process is completely free of charge. The second is registering with Amazon Web Services (AWS). You will be asked for your credit card because, although registering is free, the charges start as soon as the SAP HANA instance is created - and Amazon keeps charging even while the instance is stopped. Costs can be estimated in advance using Amazon's own pricing calculator, which can be found here. Just to give an idea, the smallest instance costs about $50/month at 25 hours of usage per week. We definitely recommend terminating the instance once testing is completed. The entire process to create the AWS instance can be found here.
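The $50/month figure above can be sanity-checked with a quick back-of-the-envelope calculation. Note that the hourly rate below is a hypothetical placeholder, not an official AWS price - always check the pricing calculator for current figures:

```python
# Rough cost estimate for an on-demand cloud instance billed per hour.
# HOURLY_RATE_USD is an assumed placeholder, not an official AWS price.
HOURLY_RATE_USD = 0.46   # hypothetical on-demand rate
HOURS_PER_WEEK = 25      # planned sandbox usage from the example above
WEEKS_PER_MONTH = 52 / 12

def monthly_cost(rate_usd, hours_per_week):
    """Estimated monthly bill for pay-per-hour usage."""
    return rate_usd * hours_per_week * WEEKS_PER_MONTH

# At ~$0.46/h and 25 h/week this lands close to the $50/month quoted above.
print(f"Estimated monthly cost: ${monthly_cost(HOURLY_RATE_USD, HOURS_PER_WEEK):.2f}")
```

Remember that stopped (but not terminated) instances keep accruing storage charges, so the real bill will be somewhat higher than this usage-only estimate.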

The second option is by far the best but also the most expensive: purchasing a brand new SAP HANA rack database server. SAP currently partners with Intel, IBM, Fujitsu, Dell, Cisco and HP, so you can contact them directly and check actual prices. They offer a good variety of models to meet all customers' needs. Only with this sort of installation will you get the best SAP HANA performance. Just to give you an idea of the remarkable speed of one of these machines: our consultants loaded more than 250 million records into our sandbox system and achieved an average response time below 5 seconds for any query we ran against the database. Not bad at all, is it?

Finally, I would like to mention that you might find other options in forums for installing SAP HANA on a laptop that involve altering installation code. This approach is not supported by SAP, and Clariba therefore strongly discourages you from using it. The two options listed above will give you much better performance.

In conclusion, SAP HANA has become a reality in our BI world. As time goes by, our SAP customer datasets are exponentially growing and will eventually exceed the abilities of their commonly used tools. Transition towards SAP HANA will be a must for them, but also for us. Ensure you are ready for that by getting to know the tool. You know what options you have now, so go for it.

If you have any tips or questions, please leave a comment below.

Make way for SAP HANA

Last month I had the chance to attend the SAP HANA Training Bootcamp in Dubai (UAE). After plenty of anticipation and whitepapers, this was my first opportunity to get my hands on a real SAP HANA machine - and my expectations were met.

Let’s start with the impressive hardware improvements that have led us to the In-memory computing revolution.


It’s not difficult to understand that the slowest part of a database access is reading data from disk. Database vendors have addressed this problem with storage optimizations, faster hardware and other techniques, but ultimately the database management system still needs to read data from disk. And here the great idea came into play… What if we could store all the data in memory?

Not so long ago, when our server processors were 32-bit, we could only address 4 GB of memory, so that was the maximum amount of memory a server could have. Nowadays, with 64-bit architectures, we are able to access up to 2 TB.
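The 4 GB ceiling falls straight out of the pointer width, as a quick calculation shows (the 2 TB figure, by contrast, is a practical hardware limit of the era, not an architectural one):

```python
# Addressable memory for a given pointer width, in bytes.
def addressable_bytes(bits):
    return 2 ** bits

GIB = 2 ** 30  # gibibyte
TIB = 2 ** 40  # tebibyte

# A 32-bit address space tops out at exactly 4 GiB.
print(addressable_bytes(32) / GIB)   # 4.0

# A 64-bit address space is astronomically larger (16 million TiB);
# the 2 TB quoted above is what servers of the time could physically hold.
print(addressable_bytes(64) / TIB)
```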

In addition to the huge amount of main memory we can have in the server, and thanks to massively parallel architectures (currently up to 64 CPU cores per server), we can apply a number of optimizations that reduce data size and increase access performance. These are:

  • Data compression
  • No need for aggregate tables, which reduces data storage requirements
  • Table partitioning, which increases data access speed

All of this gives us near-immediate query response times, even for queries involving billions of rows - and believe me, this is not just a marketing stunt… I have seen it ;)


So far we have seen why it is interesting to use in-memory computing, and why now. But how do we manage the SAP HANA appliance?

The core tool for SAP HANA appliance administration is SAP HANA Studio. Using this tool we are able to:

  • Model our information; these models are used to create multiple views of the transactional data
  • Preview data from both physical tables and the previously mentioned information models
  • Import and export data
  • Configure data provisioning (initial table loads and replication)
  • Manage system security

I hope you have enjoyed this overview of SAP HANA appliance. Very soon I will come back with a post on SAP HANA connectivity.

If you have any questions or opinions about SAP HANA appliance, please leave a comment below.

SAP Data Warehousing Solutions: an October 2011 Review

Regardless of software or hardware vendors, the Data Warehousing market is an area of growth, expected to grow at a compound annual growth rate (CAGR) of about 10.1% through 2015 according to the latest study carried out by IDC. SAP, thanks to its acquisition of Sybase, is seen as a leader in Data Warehousing by Gartner and Forrester. This article focuses on the solutions SAP provides for organizations looking for enterprise-oriented data warehouses or more agile, high-performance data foundations, and offers a glimpse of their future roadmap.

Enterprise Data Warehouse

  • SAP NetWeaver Business Warehouse (BW): nowadays a very stable product with a large and constantly growing installed base (more than 12,000 customers, more than 15,000 productive systems); what is more, it was not affected by the economic downturn in 2009. Looking at its future evolution, it is intended to run on SAP HANA as the underlying in-memory database platform in 2012 (BW 7.3x SP5), which effectively gives it a built-in Business Warehouse Accelerator.

High Performance Analytic Data Foundation

  • SAP HANA: a flexible, data-source-agnostic, in-memory appliance that analyzes high volumes of transactional data in real time. It includes tools for data modeling, data and lifecycle management, security and operations, and it combines SAP software components optimized for hardware provided by partners. The benefits are the following: smarter business decisions supported by increased visibility into very large volumes of business information; faster reaction to business events through real-time analysis and reporting of operational data; a new application foundation for a new category of applications; streamlined IT landscapes; and, finally, reduced total cost of ownership (TCO). Looking further ahead, there will be a transition from SAP BusinessObjects BI 4.0 running on SAP HANA 1.0 to the whole SAP Business Suite running on SAP HANA 2.0 in 2013.

 

  • Sybase IQ: a market-leading, high-performance, columnar analytics server and data warehouse specifically designed for high-speed data analytics, enabling fast execution of complex queries against large datasets, with the advantage of lower maintenance costs compared to row-based systems (by reducing the need for aggregates and indexes). It is a mature and proven solution with about 1,900 customers and more than 3,300 unique deployments. SAP will continue to support and invest in this acceleration technology, especially for non-SAP applications and data (including the possibility of placing SAP BusinessObjects BI on top).

 

  • SAP BusinessObjects RapidMarts are preconfigured, jumpstart data marts designed to accelerate BI. They come pre-packaged by subject areas and sub-areas specific to SAP modules (Finance, Manufacturing, Operations, HR, etc.) as well as for non-SAP applications (JD Edwards, PeopleSoft, Lawson, Oracle EBS, etc.). The key elements included are the following:
  1. ETL Mappings: Source-to-target mappings and data transformation for relevant source tables (initial and incremental data movements).
  2. Data Marts: set of target RDBMS objects and schemas based on best practices for dimensional data modeling (Oracle, DB2, SQL Server, Teradata).
  3. BI Content: preconfigured universes based on best practices and sample reports displaying the wealth of data available.
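The initial-versus-incremental data movement mentioned in point 1 can be sketched as a simple watermark-based delta load. Table and field names here are purely illustrative, not actual RapidMart content:

```python
# Minimal sketch of an incremental (delta) ETL load, the kind of
# source-to-target movement a RapidMart package pre-configures.
# Row keys, field names and timestamps are illustrative assumptions.
def incremental_load(source_rows, target, last_load_ts):
    """Upsert only the rows changed since the previous load; return the
    new watermark to use on the next run."""
    new_ts = last_load_ts
    for row in source_rows:
        if row["changed_at"] > last_load_ts:
            target[row["id"]] = row                 # upsert by key
            new_ts = max(new_ts, row["changed_at"])
    return new_ts

source = [
    {"id": 1, "changed_at": 10, "amount": 100},
    {"id": 2, "changed_at": 25, "amount": 250},
]
target = {}
watermark = incremental_load(source, target, last_load_ts=15)
print(sorted(target))   # [2] -- only the row changed after the watermark
print(watermark)        # 25
```

An initial load is simply the same routine run with the watermark at zero, so every source row qualifies.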

If you need any further information on the solutions presented here, don’t hesitate to post a reply or contact Clariba.