Using the NetApp Unified Manager REST API to send metrics to Elastic Stack

Matt Houghton
4 min read · Oct 19, 2018

We use a number of NetApp storage appliances on which we run our Oracle database estate. We use SMO (SnapManager for Oracle) to provide quick clones of databases based on snapshots at the storage layer. We are also big users and fans of the Elastic Stack and use it for all our logging and metrics.

NetApp provide their Unified Manager software, which gives us lots of metrics and graphs, but for us it’s much better to have all our logging and metrics in one place in Elastic. Being able to see an overall picture of what’s happening across our entire architecture, and to use X-Pack features such as alerting and machine learning to warn us of abnormalities we can resolve before they become issues for our customers, is powerful.

This post takes you through how to use the NetApp Unified Manager APIs to capture detailed metrics for your NetApp appliances and send that data to the Elastic Stack.

I’m using Python to call the APIs with the following packages.
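
The original package list isn’t embedded here, so this is a hedged guess at the minimum: requests for the HTTPS calls and the official Elasticsearch client for indexing, with json from the standard library.

# Assumed packages for this approach -- the original list isn't shown,
# so this is a guess at the minimum needed.
# pip install requests elasticsearch
import json

import requests
from elasticsearch import Elasticsearch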

NetApp Documentation

The NetApp documentation is available via the Unified Manager console. Click the question mark in the top right corner and you will see a link to API Documentation.

Authentication

In order to access the APIs you need to authenticate. Our awesome storage guys had already tied authentication to Active Directory, so I had no work to do :-)
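
For anyone setting this up from scratch, the calls later in this post assume HTTP basic authentication over HTTPS against Unified Manager. The hostname, credentials and certificate bundle below are placeholders, not values from our environment.

# Minimal sketch: a requests session that authenticates to Unified Manager
# with basic auth. With AD-backed authentication these are simply your
# directory credentials. All values below are placeholders.
import requests

UM_HOST = "https://um.example.com"             # your Unified Manager server

session = requests.Session()
session.auth = ("svc_monitoring", "********")  # AD or local UM account
session.verify = "/path/to/ca-bundle.pem"      # or True if the cert chain is trusted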

API Access

In order to get to the low-level metrics I just called the APIs one after the other. During development I found that an outer JSON blob would be returned, and that would contain an inner JSON blob with the values I was interested in.

The code below shows how this works in principle.
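
The original snippet isn’t embedded here, so this is a sketch of the principle under a couple of assumptions: the function name is mine, and the /rest base path and the Accept header value should be checked against the API Documentation in your own Unified Manager console.

# Sketch of calling one API and pulling the inner blob out of the outer one.
# The /rest base path and the Accept header are assumptions -- confirm both
# against the API Documentation link in the Unified Manager console.
def get_metrics(session, um_host, api_name, namespace_value, limit=1000):
    response = session.get(
        f"{um_host}/rest/{api_name}",
        headers={"Accept": "application/json"},  # ask for JSON rather than CSV
        params={"limit": limit},                 # the default payload is small
    )
    response.raise_for_status()
    dictdump = response.json()                   # the outer JSON blob
    # The inner blob sits under '_embedded', keyed by the namespace value
    return dictdump["_embedded"][namespace_value]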

Replace ‘api-name’ and ‘namespace-value’ (the name of the inner blob) as shown in the list below; a loop over all of these pairs is sketched after the list.

'aggregates', 'netapp:aggregateInventoryList'
'aggregates/capacity-utilization', 'netapp:aggregateCapacityAndUtilizationList'

'clusters', 'netapp:clusterInventoryList'
'clusters/storage-summary', 'netapp:storageSummaryList'

'events', 'netapp:eventDtoList'

'lifs', 'netapp:lifInventoryList'

'luns', 'netapp:lunInventoryList'

'nodes', 'netapp:nodeInventoryList'

'ports', 'netapp:portInventoryList'

'svms', 'netapp:svmInventoryList'

'volumes', 'netapp:volumeInventoryList'
'volumes/capacity-utilization', 'netapp:volumeCapacityAndUtilizationList'
'volumes/relationships-transfer-rate', 'netapp:volumeTransferRateList'
'volumes/relationships-transfer-status', 'netapp:volumeTransferStatusList'
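
To collect everything in one pass, the pairs above can be held as a simple list of tuples and looped over with the function sketched earlier. This is an illustration rather than the original script.

# All of the api-name / namespace-value pairs from the list above.
ENDPOINTS = [
    ("aggregates", "netapp:aggregateInventoryList"),
    ("aggregates/capacity-utilization", "netapp:aggregateCapacityAndUtilizationList"),
    ("clusters", "netapp:clusterInventoryList"),
    ("clusters/storage-summary", "netapp:storageSummaryList"),
    ("events", "netapp:eventDtoList"),
    ("lifs", "netapp:lifInventoryList"),
    ("luns", "netapp:lunInventoryList"),
    ("nodes", "netapp:nodeInventoryList"),
    ("ports", "netapp:portInventoryList"),
    ("svms", "netapp:svmInventoryList"),
    ("volumes", "netapp:volumeInventoryList"),
    ("volumes/capacity-utilization", "netapp:volumeCapacityAndUtilizationList"),
    ("volumes/relationships-transfer-rate", "netapp:volumeTransferRateList"),
    ("volumes/relationships-transfer-status", "netapp:volumeTransferStatusList"),
]

for api_name, namespace_value in ENDPOINTS:
    for item in get_metrics(session, UM_HOST, api_name, namespace_value):
        # push each item to Elasticsearch -- see the indexing function further down
        print(item)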

Note there may be a few APIs that I’ve not captured in the list above; they were for features we were not utilising, so no data was returned when I called them.

If you want to debug this yourself then have a look at what is brought back into the variable dictdump. It will contain another dictionary, ‘_embedded’, which in turn contains a list whose name starts with ‘netapp:’. It is that list which holds the items (JSON payloads as more dictionaries) containing the metrics you will want to iterate over and push to Elastic.
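
A quick way to see that shape while debugging (the ‘volumes’ namespace in the comment is just one example of what you might see):

# Pretty-print the outer blob to inspect its structure while debugging.
print(json.dumps(dictdump, indent=2)[:2000])
# Expect something along the lines of:
#   dictdump["_embedded"]["netapp:volumeInventoryList"]  -> list of metric dicts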

A couple of other things to point out. The header in the request came from the documentation and ensures the response is in JSON; when I first called the API using curl I got a CSV response by default. The last thing is the limit parameter: I increased this to suit my needs, as by default the API responds with quite a small payload.

Elastic Stack Setup

I added an index for the data to be logged into. Using some of the data type detection capabilities of Elasticsearch worked well enough for my needs to get started with, but I will be doing some refinement over time.
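
A minimal version of that setup, assuming a single Elasticsearch node and an index name of netapp-um (both placeholders), relying on dynamic mapping to detect field types:

# Connect to Elasticsearch and create the index if it doesn't exist yet.
# Dynamic mapping detects field types on first index; refine the mapping
# later if the detected types aren't quite right.
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://elastic.example.com:9200"])   # placeholder host

if not es.indices.exists(index="netapp-um"):
    es.indices.create(index="netapp-um")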

Indexing The Data

Next I added a function to take the JSON payload from the code above and index it into Elasticsearch.
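
The original function isn’t embedded here; a minimal sketch is below. The doc_type argument is only needed on older client and cluster versions, and if you are pushing a lot of documents the client’s bulk helpers would cut down the number of round trips.

# Index a single metrics document, letting dynamic mapping handle field types.
def index_to_elastic(es, index_name, doc):
    es.index(index=index_name, doc_type="_doc", body=doc)

# Usage inside the collection loop shown earlier:
#     index_to_elastic(es, "netapp-um", item)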

Example Data

Here is a sample document from Elasticsearch.
