My experience

What I like about Splunk’s annual conference is that it’s not only a great opportunity to learn more about the technology I use on a day-to-day basis, hear how others use it and find out about new features, but also a chance to visit new places. This year’s conference took place in Las Vegas, Nevada, which I visited for the first time!

This is the third Splunk conference I have attended with my colleagues at iDelta. Perhaps the first thing that caught my attention this year was the hoodie wall. I can proudly say that I have three of the hoodies on this wall.

My .conf highlight: source=*Pavilion

This year I decided to spend more time at the Pavilion, meeting other Splunkers and learning more about Splunk premium products such as Splunk Business Flow and Splunk Investigate.

I also found out that I am great at pinball. I almost beat the record! Not bad for a first try.

Overall, I found the Pavilion really great this year, combining the perfect amount of fun and information, and leaving you inspired and energised.

My favorite session at .conf19

Having worked in Financial Services for the last three years, I was very interested in attending Duncan Ash and Haider Al-Seaidy’s session “40 Ways to Use Splunk in Financial Services” and learning how Splunk customers around the world are using Splunk in creative ways to solve a range of challenges, from Trading Strategies to Market Abuse, and from Customer On-boarding to Customer Churn.

In particular, I found the following points interesting:

  • The discussion around trading operations and the importance of taking into consideration the thousands of moving parts, along with the many processes, systems and external sources of data. The challenge here is that if something blinks for even one millisecond and causes an issue, it is difficult to notice without the proper tools in place to inform you of such outages. On top of that, there are pressures on Financial Services organisations, for instance from regulators, who may require reports on such outages.

    With Splunk we can ingest data from all of these components and monitor and alert on them in real time. What I found particularly interesting is that the Data Stream Processor (DSP) was used in the demo to show how high-volume data can be continuously collected. One of the key benefits of using DSP is that it processes data in motion, getting data into Splunk and letting us act on it within a very short period of time. (A rough sketch of the kind of alerting search this supports follows this list.)

  • The presentation of an ATM solution, the challenges faced in securing ATMs, and how Splunk can help. The emphasis here is on re-using the data to showcase value across various teams, something that we always try to practise when delivering customer projects. The idea is to ingest the data once and re-use it across various use cases.
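
To give a flavour of what that real-time monitoring could look like, here is a minimal SPL sketch of an alert-style search over trading infrastructure logs. The index, sourcetype and field names (trading, order_gateway, component, status) are my own assumptions for illustration, not anything shown in the demo.

    index=trading sourcetype=order_gateway (status=ERROR OR status=TIMEOUT)
    | bin _time span=1m
    | stats count AS failures BY _time, component
    | where failures > 10

Scheduled as an alert, or fed by a DSP pipeline writing into the index, a search along these lines can flag a misbehaving component within minutes rather than waiting for someone to notice the outage.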

Same Data, Multiple Times the Value

There are a lot of data sources available on ATMs that can support use cases such as security, operations, fraud, customer experience and cash management. In this example the universal forwarder (UF) was used to ingest the data. Some of the key benefits of the UF are that it runs on both Windows and Linux, it can buffer data, it centralises de-centralised data for the purpose of analytics, and it can throttle the bandwidth it uses.
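
As a simple illustration of the ingest-once, re-use-many-times idea, here are two hypothetical searches over the same ATM events, one for an operations view and one for a cash management view. The index, sourcetype and field names (atm, atm:eventlog, event_code, atm_id, model, region) are assumptions of mine, not the actual names used in the session.

Operations view, which ATM models are reporting the most hardware faults:

    index=atm sourcetype="atm:eventlog" event_code=hardware_fault
    | stats count AS faults BY atm_id, model
    | sort -faults

Cash management view, the same events filtered for low-cash warnings:

    index=atm sourcetype="atm:eventlog" event_code=cash_low
    | stats latest(_time) AS last_warning BY atm_id, region

The data is collected once by the UF; each team simply asks it a different question.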

https://conf.splunk.com/files/2019/recordings/BA1321.mp4

During the talk, several dashboards were shown to explore some of the use cases mentioned above. For instance, one dashboard allows us to analyse the health of the ATM estate by looking at the number of incidents recorded, broken down further by ATM model number, geographic location and network, allowing us to drill down and identify the exact ATM that may have issues.
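
A panel like that could be driven by a search along these lines; again, the index, sourcetype and field names (atm, atm:incidents, model, region, network, atm_id) are placeholders I have assumed rather than anything taken from the actual dashboards.

    index=atm sourcetype="atm:incidents"
    | stats count AS incidents BY model, region, network, atm_id
    | sort -incidents

Dashboard drilldowns can then pass the clicked model, region or network back into the search to narrow the results down to a single machine.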

Another example dashboard was around ATM usage; in this case we are not looking at an individual ATM, but instead comparing usage across ATMs.
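
For the usage comparison, a minimal sketch of the kind of search that could sit behind such a panel, using the same assumed atm index and a hypothetical atm:transactions sourcetype, might be:

    index=atm sourcetype="atm:transactions"
    | timechart span=1d limit=10 count BY atm_id

Plotting daily transaction counts per ATM side by side makes unusually busy or quiet machines stand out without ever inspecting a single ATM in isolation.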

The example dashboards showcased how to obtain more value from Splunk by re-using the data from various sources to provide analytics that are relevant across the teams in an organisation.

Summary

The main reason I chose this talk is that it closely aligns with the type of work I do on a day-to-day basis. I see a lot of value in learning how other Financial Services organisations, Splunkers and Splunk partners tackle some of the very complex use cases in the Financial Services domain. I believe that this kind of conversation and knowledge exchange is a great way to improve my skills.

Posted by: Aina Puncule

Aina Puncule has been working as a Technical Consultant at iDelta for three years. She is skilled in the elicitation of requirements and the design, implementation and delivery of technical solutions. Aina has great communication skills and loves working with customers. She currently holds the Splunk Certified Consultant certification and a First Class BSc (Hons) degree in Business Information Systems.