Well, I recently got together with a friend of mine, and the conversation turned to SAP HANA. Up to now, my perception of HANA was that it was mostly hype. It seemed like just the latest buzzword SAP invented, on the theory that if you talk about something enough, people will buy it. While that approach hasn't worked for me yet, maybe I need to try it 🙂
Until now, all I knew was that HANA was built around RAM instead of HDD space. From everything I'd heard, you buy a bunch of new hardware and everything runs faster. Well, after talking to my friend, I feel like I was right and wrong at the same time 🙂 From what I learned, HANA is essentially going to replace the database that SAP runs on. So instead of running on Oracle, soon you'll be running on HANA. The idea is that the table structures, views, etc. have been redesigned to be optimized within the database itself. Since the hardware is built to operate with far more memory, the need for simplified structures or even aggregate views goes away, and the database can simply handle bigger tables.
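To make that last idea a bit more concrete, here's a minimal sketch in plain Python (purely illustrative, not HANA code; the table, fields, and data are all made up) of the difference between maintaining a pre-aggregated summary table and just totaling the raw line items in memory on demand, which is the kind of thing an in-memory database makes practical:

```python
from collections import defaultdict

# Raw "line item" rows, the way a classic ERP table might hold them.
# (Hypothetical data, just to illustrate the idea.)
line_items = [
    {"material": "M-100", "plant": "0001", "qty": 5},
    {"material": "M-100", "plant": "0001", "qty": 3},
    {"material": "M-200", "plant": "0002", "qty": 7},
]

# Disk-era approach: keep a separate aggregate table and update it on
# every posting, because re-reading all the raw rows is too slow.
aggregate_table = defaultdict(int)
for row in line_items:
    aggregate_table[(row["material"], row["plant"])] += row["qty"]

# In-memory approach: no aggregate table at all; just scan the raw
# rows whenever a total is needed, since everything is already in RAM.
def total_qty(material, plant):
    return sum(r["qty"] for r in line_items
               if r["material"] == material and r["plant"] == plant)

print(aggregate_table[("M-100", "0001")])  # 8, from the maintained aggregate
print(total_qty("M-100", "0001"))          # 8, computed on the fly
```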
While this may be true, it still seems to me that this technology is geared toward new customers or new installations. It seems unlikely to be embraced by existing installations, at least until they are forced to adopt it. Of course, I'd love to hear your opinions on this.
Just in case you’re interested in another funny opinion, check out this blog:
http://scn.sap.com/community/hana-in-memory/blog/2014/09/10/y-u-no-love-hana
As always, thanks for reading, and don't forget to check out our SAP Service Management Products at my other company, JaveLLin Solutions.

Mike
Totally agree on the mandate principle. Hardware costs are only going up – most companies just upgraded to virtualization less than 5 years ago and put in a SAN for their disk storage... they need to wait for that capital to depreciate out before moving on. My guess is ECC 6.0 EhP 7 or 8 will be the last one offered without HANA, though. Based on the literature all over SAP's Palo Alto office when I was there in April, my bet is that's the direction they'll push when they force out the next series of NetWeaver upgrades and just make it mandatory to purchase support.
My gut tells me there just aren't that many processes that involve both *big* data and *fast* data. Think about it from a service perspective – the only process that's both that fast and that big is call center processing. From a call center, you could use the technology to forecast system outages for utility providers based on call density and volume in a particular area, and route field service techs in advance. The problem is that utilities already do this ahead of severe weather, and have been doing so without HANA for decades... but I guess if my power comes back on a few hours faster after a thunderstorm, that would be nice.
I totally buy the need for the analytics in retail as well as utilities, but most manufacturing organizations don't want (or need) to act faster than daily. The rule of 7 applies – 7 hours, 7 days, 7 weeks. That's what their forecasting should be focused on. Any faster than that and the great MRP "beer game" will quickly take hold as second-guessing drives decisions.
The only exception would be industries tied to commodities trading – take electrical or fiber optic cable manufacturing, for instance... if the price of the raw material goes up a penny a pound, you can go from profit to loss in seconds. Having the power to take massive amounts of data and build forecasting models faster than someone else is power... SAP has a goldmine with HANA – they just shouldn't be pushing it as part of an ERP system.
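To put a rough number on that penny-a-pound point, here's a minimal sketch in Python – all of the figures (`copper_lbs_per_unit`, the selling price, the other costs) are hypothetical, chosen only to show how a one-cent move in a commodity price can wipe out a unit margin:

```python
# Hypothetical numbers, purely to illustrate margin sensitivity --
# not real cable-industry figures.
copper_lbs_per_unit = 500      # pounds of copper in one cable spool
price_per_unit = 2600.00       # selling price per spool
other_costs_per_unit = 550.00  # labor, overhead, etc. per spool

def margin(copper_price_per_lb):
    """Profit per spool at a given copper price ($/lb)."""
    return (price_per_unit - other_costs_per_unit
            - copper_lbs_per_unit * copper_price_per_lb)

print(f"{margin(4.09):.2f}")  # 5.00 -- a small profit per spool
print(f"{margin(4.10):.2f}")  # 0.00 -- one cent per pound erases it
```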