""


Active Directory SSO checklist for SAP BusinessObjects

Let’s face it, an SSO implementation is not an easy task. Even though the steps are clear to us as BI consultants, there is always the chance that something is missing or that we need to apply changes to configurations due to differences between the real environment and the manual. It is also very common not to detect such differences, as there are multiple types of environments and multiple configurations that may differ from the guides.

The purpose of this post is not to provide another guide for AD+SSO implementation, but rather to offer a checklist that you can go through when the implementation is not successful and the SSO with AD is not working as desired. This list can also be helpful when performing the task, as it is highly recommended to test all the steps during the procedure.

Steps 1 to 4 are common validations which will allow you to fix errors that can be difficult to detect. They are related to the Active Directory server tasks and likely need to be double-checked, as they are usually performed by other people (i.e. the AD maintenance team). Best practice suggests that you should plan to check every single task, especially the ones that are not performed by you or your team.

  1. Test the service account for Kerberos delegation -> verify that the password of the account is set to “Password never expires”.
  2. Encryption to use for the account -> RC4 is used when DES is not selected. For SAP BusinessObjects implementations on XI 3.x, RC4 is preferred since it comes with JDK 1.5. On earlier versions (e.g. XI R2 with Java SDK 1.4.2), RC4 may not work without updating the JDK to 1.5.
  3. Verify the creation of the default SPN -> run “setspn -l” on the service account and check the output; it should look something like the example after this list.
  4. Verify that delegation is enabled on the Vintela SSO account -> look for the checkbox “Trust this user for delegation to any service (Kerberos only)”.
  5. On the BusinessObjects server, double-check the web.xml and server.xml files -> review the lines added or modified and, if possible, redo them while keeping a copy of the originals. Some of the validations are: a) server.xml -> increase the default HTTP header size; it is normally set to 16384, but if your AD contains users that are members of many groups (50 or more), you may need to increase this value. b) web.xml -> change the default authentication to secWinAD when using SSO; remember that siteminder must be set to false and vintela to true. Remove the comments from the auth filter, set idm.realm to your default REALM (must be in capital letters), and set idm.princ to the default SPN. These last steps are illustrated in the sketch after this list.
  6. Verify the Vintela filter has been successfully loaded -> to do that, stop the Tomcat service, remove all logs in the Tomcat folder, and restart the service. Then search the stdout file for the credentials obtained. If the credentials are obtained, the Vintela filter is loading successfully. If they are not, you can run kinit and check the output, as in the example after this list.
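For step 3, the setspn output should look roughly like the following sketch (the service account name, domain and SPN are placeholders for illustration):

  C:\> setspn -l svc-bobjsso
  Registered ServicePrincipalNames for CN=svc-bobjsso,OU=Services,DC=mydomain,DC=com:
      BOBJCentralMS/bosrv.mydomain.com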
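For step 5, here is a minimal sketch of the two files, assuming an XI 3.x deployment on Tomcat; the parameter names follow the Vintela filter convention, and every value (port, header size, realm, SPN) is a placeholder to adapt to your environment:

  <!-- server.xml: increase the HTTP header size on the connector -->
  <Connector port="8080" maxHttpHeaderSize="32768" ... />

  <!-- web.xml: default authentication and Vintela SSO parameters -->
  <context-param>
      <param-name>authentication.default</param-name>
      <param-value>secWinAD</param-value>
  </context-param>
  <context-param>
      <param-name>siteminder.enabled</param-name>
      <param-value>false</param-value>
  </context-param>
  <context-param>
      <param-name>vintela.enabled</param-name>
      <param-value>true</param-value>
  </context-param>
  <context-param>
      <param-name>idm.realm</param-name>
      <param-value>MYDOMAIN.COM</param-value>
  </context-param>
  <context-param>
      <param-name>idm.princ</param-name>
      <param-value>BOBJCentralMS/bosrv.mydomain.com</param-value>
  </context-param>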
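For step 6, the JDK’s kinit can confirm that a Kerberos ticket can be obtained for the service account (account, realm and cache path are placeholders):

  C:\> kinit svc-bobjsso@MYDOMAIN.COM
  Password for svc-bobjsso@MYDOMAIN.COM:
  New ticket is stored in cache file C:\Documents and Settings\admin\krb5cc_admin

If a ticket is returned, the Kerberos side works and the problem is likely in the web tier configuration.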

If you have solved your problems by following the points in this post, congratulations! If not, don’t give up: keep searching in different forums, trace Tomcat (there are several trace configurations you can add on the console), scan packets from the clients related to SSO issues, or ask us for guidance. In the worst case scenario you may need to redo the implementation from scratch. Whatever the case may be, we’re sure that in the end you will succeed!

SAP BusinessObjects and Microsoft SharePoint Integration Capabilities

Nowadays many companies are making a considerable effort to guarantee the constant sharing of information, knowledge and expertise across their organizations. It comes as no surprise that Information Technology plays a key role in responding to this challenge.

A very popular IT solution for organizing enterprise information is the use of company portals. From an IT professional perspective, portals combine enterprise data and applications that reside on the company intranet into a tight infrastructure. From a business perspective, portals provide users with easy access to data and applications through personalized views.

Business Intelligence is also an important part of this discussion for companies that want to use their intranet portal to deliver reports and dashboards to end users. For example, one of our customers has recently asked us to investigate the integration capabilities of Business Objects XI R3 with the leading portal-solution on the market, Microsoft SharePoint.

In this post I will introduce the topic of integration capabilities and benefits. Due to its complexity, this topic also promises to be an interesting focus for future blog articles and continuous technical investigation.

Why Integrate BusinessObjects with SharePoint?

Integrating BusinessObjects with SharePoint allows users to view and manage information within a single portal interface. This integration grants access to all business intelligence resources, such as interactive Xcelsius dashboards, Crystal Reports, Web Intelligence documents, and any Microsoft Office documents stored in the CMS repository. Therefore, users can navigate BusinessObjects personal/favorites folders, public folders, the inbox, as well as personal and corporate categories within SharePoint.

The clear advantages for business users are:

  • A unique and easy point of access to any kind of BI content (but not exclusively BI), compared to InfoView, which is a powerful tool but still represents an additional and restricted interface.
  • A simplified and more efficient use of software licenses, eliminating redundant licenses for those users that are typically viewers of BI content with no need to edit reports. In this case SharePoint allows them to view and refresh Crystal or WebI reports from within the portal.

From a System Administrator point of view, integration with SharePoint offers the possibility of reducing duplication in the backend infrastructure and consolidating BI tools with other technologies in one enterprise portal with a single security model.

While the benefits of such an implementation are clear, I found that there is still some uncertainty surrounding the available options for deployment. Taking into account new and old releases, several different versions of BusinessObjects and SharePoint are being used within organizations today. In the past few years SAP has released multiple versions of the “SharePoint Portal Integration Kit” for BusinessObjects, and both XI R2 and XI R3 can be matched with SharePoint 2003 or 2007.

Both BusinessObjects Enterprise and Edge customers are entitled to download the integration kit from the SAP developers’ portal. Moreover, although it’s not specifically advertised by SAP, there is evidence of successful implementations of BusinessObjects XI 3.1 with Java InfoView interfaces on www.forumtopics.com/busobj. This suggests that even companies that are not using a .NET interface will be able to exploit the integration capabilities of BusinessObjects with the portal.

However, a .NET interface seems to be the ideal condition for achieving full integration with the Microsoft platform. In fact, the integration kits for Microsoft SharePoint 2007 and Microsoft SharePoint 2003 have gained acceptance in the marketplace but have limited capabilities. As a result, SAP recently released an additional product, with the intention of providing all of the .NET InfoView capabilities from within the SharePoint interface. The product has been released with the name of "Integration Option for Microsoft SharePoint software 1.0".

The following table, from an official SAP source, clearly shows how superior this option is compared to the previous integration kits:

Further information on the available SharePoint integration kits can be found on www.sdn.sap.com. Integration kits are free of charge and all direct zip file downloads are available here.

On a final note, it is worth speculating on what is to come in the near future. Regrettably, the current versions of the Portal Integration Kit (PIK) and the Integration Option for Microsoft SharePoint (IOMS) do not provide an option for integration with SharePoint 2007 64bit. All existing products integrate with 32bit versions of SharePoint. But not to worry; with the recent launch of SharePoint 2010, available in 64bit mode only, SAP is catching up with an ad hoc version of the PIK. According to rumors on the SAP developers’ forums, the 64bit integration kit will be released by mid-2011, following the launch of BusinessObjects XI R4 later this year.

Will this be the chance to improve the integration capabilities with Java InfoView interfaces? We can’t say just yet. But stay tuned for updates on this topic and for more technical insights in future blog articles.

Doing BI Right: Why you need a proper business intelligence methodology

We have often faced situations where people thought that merely having a Business Intelligence tool and qualified consultants was enough to guarantee a successful BI implementation. However, without a good BI methodology it can be difficult to meet deadlines, satisfy all the users and stay within budget, even with the best of intentions.

In this post I will briefly talk about some aspects of a methodology that we have implemented with some of our customers.

Distinction between power and normal business users

Accuracy of the reports is key to people who want to ensure that they are making the right decisions. Therefore it is important to always have a group of power users who can work on difficult reports and are able to understand how the data is modeled. These people usually work on predefined reports as well as difficult ad-hoc analysis.

Normal business users usually work on personal and ad-hoc reporting. They want to get their questions answered very quickly, but for that they need to have very good and simple universes. For example, most of these types of users are not comfortable working with universes that have contexts.

Implementation of a good business gathering scenario

From our experience, gathering business requirements properly leads to the correct delivery of complex analysis to the business. We have had the best results when the requirements gathering process has been:

  • centralized: the business should always think of a single point of access for business requirements gathering. If this is not centralized, the process can be hard to define.
  • recurring: it should also recur regularly, as a proper requirements gathering process is never finished. We have usually set up recurring meetings (weekly or twice per week) where people from the reporting team meet their business sponsors and agree on the next actions to take.

Implementation of a good lifecycle and version control tool

When working with large enterprise customers (with many developers) it is always good practice to implement a version control tool as well as a workflow in order to promote content from development environments to production.

With version control tools the developers can share, lock and control their versions so everything is kept under control. This is especially important in large environments.

It is also important to have a list of criteria that the reports should meet before they are promoted to production. This way, we make sure that whatever is in production has been properly tested and confirmed (the criteria can refer to query performance, layout format, etc.).

There are many third party applications that offer version control as well as lifecycle management functionality.

Distinction between personal and public areas

BusinessObjects already makes the distinction between personal and public folders, and this goes hand in hand with the previous point. We have always implemented the lifecycle processes under the public area, so it basically becomes a read-only area in production.

By doing this we achieve the following:

  • Users can be confident about everything under the public folders, as that content met the proper criteria before being promoted to production
  • Public folders stay clean and tidy

If you are about to undertake a new BI project, especially one in a large customer environment, I hope the tips above will be useful to your team as you build your own best practice BI methodology. If you have any ideas to add or any feedback about my suggestions, please feel free to leave a comment below.

Improving the Performance of Xcelsius files with OpenOffice

During the past few months my coworkers and I have been working with Xcelsius on a regular basis to develop dashboards for our customers. Sometimes we face problems when generating a swf file from an xlf file in Xcelsius and we don’t know why. Other times, Xcelsius crashes during the generation of the swf file. Even when the swf generates correctly, we occasionally see dashboard performance issues (i.e. when we execute the Xcelsius swf file, the load time is very slow). However, we have found a trick that can be used to resolve these issues.

In this post I will explain the results of two tests that we did to reduce xlf file sizes followed by the steps you can follow to achieve these results.

The main idea of this trick is to reduce the size of the xlf file using OpenOffice software. Let me start by showing you the test results:

For the purpose of this test, we created an Xcelsius file called TestCompression.xlf

First we exported the xls file from the Xcelsius file by selecting Data -> Export:

We then saved the xls file. As you can see in the screenshot below, this generated a 2,264 KB xls file, so our objective was to decrease this file size.

Next we opened the xls file with Microsoft Excel and without modifying anything we saved it using another name. We repeated the same steps but this time with OpenOffice. In the image below you can see the decrease in size of the xls file. The size difference between the original xls file and the OpenOffice xls file is quite significant.

Finally we imported the new xls file into Xcelsius by selecting Data -> Import

In the screenshot below you can see that we decreased the xlf file size using the OpenOffice xls, but the change wasn’t very significant. TestCompression-OpenOffice.xlf is 1,117 KB, compared to the original TestCompression.xlf which was 1,236 KB.

As a result, we decided to test with another xlf file, which included hard coded data, to see if the compression would be more significant. For the second test, we achieved the following results after completing the same steps as outlined above.

In this screenshot we can see a significant decrease in the file size of the OpenOffice xlf with hard coded data. The original file TestCompression2.xlf file was 1,241 KB and the final TestCompression2-OpenOffice.xlf file was less than half the size (577 KB).

As a result of these two tests, we observed the following:

  • Each time we modify an Excel Sheet inside Xcelsius, the size of the xls file increases.
  • When the original xls is very large, the decrease in size is more substantial when we use OpenOffice.
  • If we have hard coded data in the Excel file, we notice a greater size decrease than if we have QaaWs (Query as a Web Service) or Live Office Connections in the Excel sheet.

From now on, each time we attempt to generate a swf and we have made modifications (to data or Excel formulas) inside the Xcelsius Excel Sheet, we follow these best practice steps:

  1. Export the xls file from Xcelsius
  2. Open the xls file with OpenOffice
  3. Save it as an xls file with a new name
  4. Import it back into Xcelsius

In terms of speed, we notice changes in the swf loading process especially if most of our data is hard coded.

Finally, find below a summary of the results obtained:

  File                                      Original xlf size   After OpenOffice round trip
  TestCompression.xlf                       1,236 KB            1,117 KB
  TestCompression2.xlf (hard coded data)    1,241 KB            577 KB

If you have experienced a similar situation with your Xcelsius files, I would be interested to hear how you have managed to reduce the file size. Also if you have any suggestions or feedback about my methods in this post, feel free to leave a comment below.

Weekly Reports: An inside look at the week format results in Oracle Database and SQL Server

Weekly reports are usually requested by customers who want to follow their activity results by weeks running from Monday to Sunday. The most common way to collect weekly data is by grouping date ranges by their Week No. of the Year. 

As you will see in this post, when I started investigating this topic I found some interesting information about the week format in both Oracle Database and SQL Server, which I hope will be useful to others using these tools.

Oracle Database

Throughout my many years working with Oracle I assumed that the ‘ww’ mask returns the Week No. of Year according to the standard week (running from Monday to Sunday). After doing some queries I was surprised to discover that days belonging to the same week can actually have a different Week No depending on the day of the week that the year started.

For example, Week 1 of 2010 started on a Friday, therefore every Week No in 2010 will run from Friday to Thursday:
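A minimal sketch of such a query, built on DUAL so it runs anywhere (January 1st, 2010 fell on a Friday):

  SELECT TO_CHAR(d, 'DY DD-MON-YYYY') AS day_of_week,
         TO_CHAR(d, 'WW')             AS week_no
    FROM (SELECT DATE '2010-01-01' + LEVEL - 1 AS d
            FROM dual
         CONNECT BY LEVEL <= 10);

  -- 01-JAN (FRI) through 07-JAN (THU) return week 01;
  -- 08-JAN (FRI) starts week 02, i.e. weeks run Friday to Thursday.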

After some research I found the following documentation for Oracle DB that provides an additional explanation on this subject:

http://download.oracle.com/docs/cd/B10500_01/server.920/a96529/ch7.htm#5186

By applying this new knowledge to the previous query I was able to compare the two methods:
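The documentation points at the ‘IW’ mask, which returns the ISO week (Monday to Sunday). A sketch comparing the two masks side by side:

  SELECT TO_CHAR(d, 'DY DD-MON-YYYY') AS day_of_week,
         TO_CHAR(d, 'WW')             AS ww_week,   -- counted from January 1st
         TO_CHAR(d, 'IW')             AS iso_week   -- ISO week, Monday to Sunday
    FROM (SELECT DATE '2010-01-01' + LEVEL - 1 AS d
            FROM dual
         CONNECT BY LEVEL <= 10);

  -- Note: 01-JAN-2010 (a Friday) still belongs to ISO week 53 of 2009.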

SQL Server

For those of you who are SQL Server programmers, I have also done some investigation on this subject. SQL Server 2008 supports the ISO week (the ISO_WEEK datepart) as an argument of its DATEPART function. Prior versions use the regular ‘ww’ or ‘wk’ mask based on January 1st. The first day of a week is defined by the DATEFIRST session setting, which by default sets Sunday as the first day of the week.

You can use a user defined function to calculate the ISO week number in versions prior to SQL Server 2008, as follows:
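A sketch adapted from the widely circulated ISOweek example in SQL Server Books Online (it assumes the default DATEFIRST setting):

  CREATE FUNCTION dbo.ISOweek (@DATE datetime)
  RETURNS int
  AS
  BEGIN
      DECLARE @ISOweek int;
      -- Week number relative to the week containing January 4th,
      -- which by definition always falls in ISO week 1
      SET @ISOweek = DATEPART(wk, @DATE) + 1
          - DATEPART(wk, CAST(DATEPART(yy, @DATE) AS char(4)) + '0104');
      -- Special case: Jan 1-3 may belong to the last week of the previous year
      IF (@ISOweek = 0)
          SET @ISOweek = dbo.ISOweek(CAST(DATEPART(yy, @DATE) - 1 AS char(4))
              + '12' + CAST(24 + DATEPART(dd, @DATE) AS char(2))) + 1;
      -- Special case: Dec 29-31 may belong to week 1 of the next year
      IF ((DATEPART(mm, @DATE) = 12)
          AND ((DATEPART(dd, @DATE) - DATEPART(dw, @DATE)) >= 28))
          SET @ISOweek = 1;
      RETURN (@ISOweek);
  END;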

After implementing the function above, you can run the following query to compare the common week mask and ISO week:
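A sketch of such a comparison (string dates in YYYYMMDD format keep it language neutral; with the default DATEFIRST of 7, January 1st, 2010 is day 6, a Friday):

  SELECT DATENAME(dw, d)  AS day_name,
         DATEPART(dw, d)  AS day_no,       -- depends on DATEFIRST
         DATEPART(wk, d)  AS common_week,  -- counted from January 1st
         dbo.ISOweek(d)   AS iso_week
    FROM (SELECT CAST('20100101' AS datetime) AS d
          UNION ALL
          SELECT CAST('20100104' AS datetime)) AS t;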

Notice that Friday is numbered as the 6th day of the week. The first day of a week depends on the DATEFIRST session parameter, which by default is set to 7 (weeks start on Sunday and end on Saturday).

  • To see the current setting of DATEFIRST, use the @@DATEFIRST function.
  • SET DATEFIRST takes effect at execute or run time, not at parse time.
  • To set Monday as the first day of the week execute:
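  SET DATEFIRST 1;     -- weeks now run Monday (1) to Sunday (7)
  SELECT @@DATEFIRST;  -- verify: returns 1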

To finish off, I would like to list some advantages and disadvantages about using ISO week numbering:

Advantages:

  • All weeks have an integral number of days (i.e. there are no partial weeks).
  • All years have an integral number of weeks.
  • The date directly tells the weekday.
  • All weeks of the year start on a Monday and end on a Sunday.
  • When used by itself without using the concept of month, all the weeks in a year are the same except that some years have week 53 at the end.
  • The weeks are the same as used with the Gregorian calendar. Dates represented as yyyy-Www-d or yyyyWwwd can be sorted as strings.

Disadvantages:

  • Not all parts of the world have a work week that begins with Monday. For example, in some countries, the work week may begin on Saturday or Sunday.
  • In the link below you can find an extended list of countries and their week numbering rules: http://www.pjh2.de/datetime/weeknumber/wnc.php?l=en
  • Many organizations use their own calendars which do not follow the ISO standard (e.g. fiscal calendars or academic calendars).

In summary, things are not always as they seem. As my professors always told me, in the business of computer science we can never afford to make assumptions and should always read the documentation and check sources.

I would be interested to know if anyone else has additional information or comments to share on this topic. Please feel free to leave a comment below.

Real Time Dashboards – Lessons Learned

There are some scenarios in which a fast pace is required for data monitoring. This is where real time dashboards come in. A real time dashboard can display information with a high degree of frequency. This information is used by the audience to make quick decisions to correct the behavior of certain foreseeable trends. Real time dashboards can also set off alerts based on business thresholds allowing the audience to react quickly based on up-to-the-minute information.

Real time business intelligence is relevant to multiple industries and departments. Dashboards can be used to monitor call center performance, IT system usage, data security, dynamic pricing, and inventory optimization, and even support fraud detection and risk management efforts. Based on our experience building a real time dashboard to monitor call center data, this article will highlight tips and lessons learned for anyone who is undertaking a similar project.

ETL

Scheduling Frequency

In reality the dashboard is not real time. It is a batch job with a very short refresh/update frequency. In our case it was 1 minute, which is also the limit for Data Services batch jobs. It is important to know these limitations and also to know how fast your entire ETL batch job can run.

If batch job total duration > batch job run frequency, you will create overlapping processes, which can immediately cause two issues:

  1. Overload your server or generate refresh slowdowns

  2. Create inconsistencies in your data if your ETL does not correctly block and queue write processes to the DB tables.

Run Optimization

Given the constraint presented above (batch job total duration < batch job run frequency), you must search for the optimal settings to run your batch job. In Data Services there are some settings that can easily speed up your job execution, but there is a delicate balance you must consider. One example is the memory method used to run the batch job, which can be selected from:

  • Pageable

  • In Memory


Also whenever you have a table comparison process, the performance can be optimized by running the comparison as a separate process.


The In Memory method runs faster when your server has enough RAM resources; however, if your server does not have enough free RAM, it will overload and will not be able to keep up with newly spawning processes, running lower and lower on physical memory until it causes a complete server crash.

Tomcat

Memory Management

Tomcat, under certain circumstances, does not perform well with memory management and garbage collection. When executing several QaaWS every minute, the memory it uses can build up very quickly. Any Tomcat service on a 32bit Windows environment will have a limitation of 1.2 GB of allocatable memory. Tomcat becomes unstable when it reaches that limit while new requests keep coming in at a constant rate.

There are several Tomcat and JVM memory tweaks that can be done to optimize this.

One example of these tweaks is the memory limits that can be set when Tomcat starts; these can be configured through the Windows registry or runtime modifiers.
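As a sketch, on a Tomcat 5.5 Windows service these limits can be raised through the service configuration utility (tomcat5w.exe, Java tab) or via JAVA_OPTS when starting from the command line; the values below are examples, not recommendations:

  rem Example JVM memory settings for Tomcat
  set JAVA_OPTS=-Xms256m -Xmx1024m -XX:MaxPermSize=256m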

Cache Optim Flag

The QaaWS application that comes bundled with BusinessObjects caches data requests by design. When the same request is made within a short period of time, the cached data is returned instead of running the query again. To avoid this and get fresh data every time, you need to disable this functionality in the dsws application properties.


To disable it, you need to set the qaaws.cache_dpresult.optim flag to false.
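As a sketch, the property line looks like this (in our deployments the file sits under the dswsbobje web application’s WEB-INF/classes folder; verify the location in your environment):

  # dsws.properties - disable QaaWS data-provider result caching
  qaaws.cache_dpresult.optim=false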


Restart Script

In order to keep the Tomcat service from memory overloads, it is good practice to schedule an overnight restart that will force garbage collection. The script can be very basic or contain additional cleanup tasks.

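A minimal sketch of such a script for a Windows service (the service name is a placeholder; schedule it overnight with the Windows Task Scheduler):

  rem restart_tomcat.bat - nightly restart to release Tomcat memory
  net stop "Apache Tomcat 5.5"
  rem ... optional cleanup tasks, e.g. purging old log files ...
  net start "Apache Tomcat 5.5"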

HTML Container

Placing the flash file in an HTML container will allow you to execute some actions before and during the flash file execution. You can run JavaScript, pass mandatory flash variables (e.g. suppressing Tomcat error messages when running QaaWS), etc.

The most basic HTML container for an Xcelsius flash file will look as follows:

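A minimal sketch (the swf name, dimensions and the flash variable are placeholders; the variable itself is covered in the No Error Flash Variable section below):

  <html>
    <head><title>Real time dashboard</title></head>
    <body>
      <object classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000"
              width="800" height="600">
        <param name="movie" value="dashboard.swf" />
        <param name="FlashVars" value="noErrorPopup=true" />
        <!-- embed tag kept for non-IE browsers of that era -->
        <embed src="dashboard.swf" width="800" height="600"
               flashvars="noErrorPopup=true" />
      </object>
    </body>
  </html>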

Auto Refresh JavaScript

As mentioned before, an HTML container will allow you to run JavaScript in the browser window executing your Xcelsius flash file. JavaScript can serve many purposes here; one of them is the browser page auto-refresh function. With the refresh you can wipe outdated errors off the screen.
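A sketch of the simplest variant, a timed full-page reload placed in the container (the 15-minute interval is an example):

  <script type="text/javascript">
    // reload the page periodically to clear stale error messages
    setTimeout(function () { window.location.reload(); }, 15 * 60 * 1000);
  </script>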

No Error Flash Variable

The no error flash variable is a new functionality in Xcelsius SP4. It allows you to turn off the pop-up error windows in cases where the error is not directly related to Xcelsius logic, e.g. Tomcat outages, scheduled system restarts, etc.

To add this functionality to any Xcelsius dashboard you need to add the flash variable input to your xlf file first.


Finally, you also need to pass the variable to the SWF, as indicated in the HTML container example above.

I hope these lessons learned are helpful to anyone working on a real time dashboard. Feel free to leave a comment or question below if you need more information on any of the tips provided in this post.

Working with multi-sections in SAP BusinessObjects Query & Analysis tools

Sections are traditional features in the SAP BusinessObjects Query & Analysis (QA) tools. They provide a list of items and their associated partial report, including table(s), graph(s) or both. This is very powerful, as you may imagine. As an example, a mobile phone bill is created from sections by subscriber, associating every subscriber with his/her own bill.

This blog article will describe a particular requirement together with a challenge that we faced in a real scenario and how it was resolved.

The initial requirement

In this scenario we were working with a telco company that had a system containing its billing data. From this, the requirement was to develop a table with every subscriber’s call details, as well as a graph with the total consumption over the last 12 months. Moreover, there were three other conditions:

  • The subscribers are often grouped by company, so the total bill by company must be generated within the same report
  • The resulting pdf must show Company / Subscribers, in this order, and allow users to navigate across the pdf
  • At the request of the customer, the report must be done in the traditional Desktop Intelligence tool

The challenge

Starting with a mockup in WebI, we built the query, dragged and dropped 2 sections (1 by company, 1 by subscriber), drafted the table and the graph, adjusted the width of the sections manually so each one fell onto one sheet, refreshed, exported to pdf and “voilà”, the draft seemed to meet our specifications.


PDF document saved from WebI, with navigation to sections and subsections (each one on a different page)

When we tried with DeskI, we followed the same steps, setting “start on a new page” and “avoid page break in block” to active, but we still ran into the issue of a blank page. This was because the “start on a new page” setting was not applied automatically between the section and the first item of the second subsection, so we had to create it ourselves. As our bill layout was quite large (it took up the whole A4 page), the report ran onto the following page (that is to say, the end of the section), so an empty page was generated just after.


In DeskI the distance between the section and the 1st item of the 2nd subsection needs to be adjusted manually

Our solution

The solution in DeskI was based on incremental actions. We have highlighted the advantages (+) and disadvantages (-) of each to give more insight into our situation:

1. Remove the “Start on a new page” option for the subsection.
   (+) The blank page disappears.
   (-) 1st sub-item navigation issue: the two lines from section/subsection appear on different pages, so PDF links do not work properly, as the button for the 1st item in the subsection stays on the section page.

2. Add a cell at the end of the first section table and graph.
   (+) The two lines from section/subsection always appear on the same page, so PDF links work properly.
   (-) In the specific case that a company does not have a subscriber, there is an empty page between companies, as the 1st section ends up on the next page.

3. Hide the auxiliary cell in cases when a company does not have a subscriber.
   (+) Create a condition based on a count of subscribers inside the company, and hide the cells if that count is zero (a sketch follows below).
   (-) None.
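As a sketch, the hide condition from step 3 can be written as a formula on the auxiliary cell (exact syntax and object names depend on your document; <Subscriber> stands for the report object holding the subscriber):

  =Count(<Subscriber>) = 0

applied through the cell’s option to hide it when the formula evaluates to true.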

Conclusion

What we learned from this scenario is that traditional tools like Desktop Intelligence really do provide great flexibility, but you have to have experts on hand who constantly push them to the limit. On the other hand, when possible we should tap into the power of Web Intelligence, which combines the Query & Analysis strength of its predecessor with a functional capability tailored to the needs of the current web-based world.

Feedback and questions are always welcome. If you have similar challenges, we would be happy to share our insight.

Connecting SAP BW and BusinessObjects Data Integrator for data extraction

In follow-up to my blog article of July 7, I would like to share some insight on connecting SAP BW and SAP BusinessObjects Data Integrator for the purposes of data extraction.

The problem that I encountered was that I could not connect my BODS to SAP BW. The connection was correctly created in the Management Console of Data Integrator, but the startup always failed.

After what seemed like hundreds of tests and commands from the cmd, I found the solution: the services file, located in the same path as the hosts file (windows/system32/drivers/etc), requires a small change:

  1. You need to add the following line: SAPGWxx  33xx/tcp, where xx is the system number of your SAP connection (see the example after this list).
  2. Then I also configured the sapjco3.jar that is stored in Tomcat (you will find it with an easy search in the folder) in the Tomcat CLASSPATH, as per the previous topic posted on July 7.
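For example, with SAP system number 00 the services entry would be (a sketch; adjust both occurrences of xx to your system number):

  # windows/system32/drivers/etc/services
  sapgw00    3300/tcp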

To start the service I used a command from the cmd in the location Drive:\Business Objects\Business Objects Data Services\bin:

  RfcSvr -aRFC_ProgramID -g/H/<ip or name of the SAP application server>/S/33xx -xsapgwxx

RfcSvr is the .exe file that starts the DI processes. If you want to know more details regarding this command, the best way is to do a quick search in Google.

After following the steps above, everything should work fine. At this point, you can use BW cubes as your data source in Data Integrator.

If you have any questions or feedback to add to this quick solution, please feel free to leave a comment below.

Connection between SAP BW and Universe Designer / BusinessObjects InfoView

In this blog post I will explain some tips that I learned while integrating SAP BusinessObjects Universe Designer/Infoview with SAP BW.

For the most part, the steps should be simple and quite standard (unless you face some unexpected issues). First of all you need to install the platform and the integration kit for SAP. At the end of this process you will see that you can create universes on top of BW cubes or BW queries. You can easily publish the universe and retrieve your data in a report.

Now in theory, after configuring the user in the CMC (BO CMC --> Authentications --> SAP) a user should be able to log in to SAP BusinessObjects InfoView using his/her SAP credentials…

But in reality BusinessObjects will fail while importing the roles of the SAP user. Why? Because you need the SAP Java Connector (sapjco), which doesn’t come with the “out of the box” integration kit.

All you need to do is download the files from SAP (or from the bottom of this blog post) and make them available to your system. Here is a step-by-step guide:

  1. Create a folder called Sharedlib in your tomcat55 parent folder
  2. Copy the sapjco.jar and the .dll files there
  3. Copy the .dll files into Windows/system32
  4. Go to the Tomcat configuration and add the complete path of the sapjco.jar file to the CLASSPATH string (see the example after this list), then restart Tomcat
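As a sketch, the entry appended to the existing CLASSPATH (separated by a semicolon) would look like the following; the install path is an example from our environment, yours may differ:

  C:\Program Files\Business Objects\Tomcat55\Sharedlib\sapjco.jar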

Notes: Do not confuse the sapjco.jar with other versions in the folder, and download the 32bit version even if your machine is 64bit.

Now try to import your user’s roles in BO CMC --> Authentications --> SAP again and you will see that everything works as expected.

If you have any questions or feedback about this solution, please let me know by leaving a comment below.

> Download this Shared file (contains sapjco.jar and the .dll files)

SAP Xcelsius Challenge for Community Poll Results – Best Practices

I entered the SAP Xcelsius Challenge for Community Poll Results to flex my Xcelsius skills and also offer some fresh options for dashboarding to the SAP SDN community. I’m pleased to report that my submission received an honorable mention in the competition. Thanks to everyone who voted!

In this post I will explain the tools, goals, content and structure that I decided to work with for this challenge and my best-practice steps for developing an Xcelsius dashboard.

1. Tool selection

I started by defining the high-level goal and selecting the tool that fits best. SAP Crystal Dashboard Design (formerly known as Xcelsius) is a great tool and platform – obviously for dashboarding – but it’s not for all purposes. SAP BusinessObjects offers a wide range of tools that can solve specific scenarios; these are well integrated and often a combination of two or more is the best solution.

2. Purpose & Detailed Goals

Why am I building a dashboard? What insight am I interested in?

Besides displaying the poll results, my goal was to allow performance analysis of continents, regions and countries participating in the Winter Olympics, with a focus on the games in Vancouver.

In my opinion, asking and answering the above questions is very important and often overlooked in favour of defining KPIs first. Without setting a clear goal, I wouldn’t have been able to decide which KPI was more or less useful than the others. With that defined, I proceeded to answer the next questions.

3. Content

How can I achieve the above goal? What are the best performance indicators? How can I display these KPIs?

Possible KPIs

  • Number of medals

  • Number of gold medals

  • Weighted number of medals (gold * 3 + silver * 2 + bronze)

  • Number of medals / population

  • Number of medals won by country / Total number of medals

Possible display options for KPIs

  • Trends – Which indicator’s trend would be best to see?

  • Comparisons – What would be the dimensions to compare? 

I chose the most common KPI: Number of medals. But as you will see, this single KPI can be displayed in a number of ways. A less traditional way – in terms of the Olympics – is grouping by geographical dimension. This gives a unique view, not to mention that it allows me to showcase my DrillChart add-on.

4. Structure

How can I best organize my content? 

At this point, I decided to summarize what information I had and try to find a place for my content on the screen:

  • The mandatory poll results – vertical bar chart.

  • Number of medals by region – horizontal bar chart

The Poll chart has all the sports listed, so it gives an opportunity to use it as a selector too. I thought it would be good to connect these two charts and allow the user to analyze the number of medals by sport as well. Although it might not be as clear as a horizontal navigation bar spanning the header of the dashboard, I opted to use it – with a clear caption – to save some screen real estate.

The order of selection would be: Sport → Region, so I put the Poll Results to the top left side, where the viewer generally starts scanning the screen. I used the two colors of the Vancouver games – green and blue – to make a clear distinction between the poll and the medal analysis.

Olympics Dashboard 1

The two main charts consume about half of the screen, and I still had a lot to show:

  • Trend lines

  • Distribution of medals – Gold, Silver, Bronze

  • Historical aggregations

  • Comparison to the previous game

All the above information is dependent on the user selections (Sport – Continent / Region) and gives more insight into the data. I used micro-charts under the main medal chart to show the details.

Olympics Dashboard 2

5. Implementation

The final part is the actual development. Luckily this is straightforward – and much faster – when the functional and layout design are well defined, although it is always an iterative process with some modifications.

I was happy to practice a little bit on this example, and would be happy to hear your feedback! And if you are looking for some creative dashboarding expertise, Clariba has a wealth of experience and very talented consultants in this area. Feel free to contact us at info@clariba.com.