This repository has been archived by the owner on Mar 22, 2018. It is now read-only.

undefined method `[]' for nil:NilClass #1

Open
brian-olson opened this issue Nov 7, 2014 · 11 comments

@brian-olson

I'm having some issues getting the collector to run successfully. The code appears to start correctly: it establishes the logstash connection, and netstat shows the listening port is open. Running tcpdump in a second terminal, I can also see that no packets are being sent from the sflow listener to logstash. The error below repeats indefinitely, at a rate that seems tied to the expected flows, which leads me to believe the collector is receiving packets but is unable to process them. I've gone back and ensured that eventmachine, yaml, and bindata are installed. Any other thoughts on what the issue may be? Thanks!

OS: CentOS 6.5
Ruby: ruby 2.1.4p265 (2014-10-27 revision 48166) [x86_64-linux]
Logstash: logstash-1.4.2

TERMINAL 1
[root@HOST sflow]# bundle exec ./bin/sflow.rb
Connecting to Logstash: localhost:6543
Server listening.
2014-11-07 09:50:23 -0500
nil
undefined method `[]' for nil:NilClass
/DIR/logstash-1.4.2/sflow/lib/sflow/parsers/parsers.rb:7:in `parse_packet'
/DIR/logstash-1.4.2/sflow/lib/sflow/collector.rb:13:in `block in receive_data'
/usr/local/lib/ruby/gems/2.1.0/gems/eventmachine-1.0.3/lib/eventmachine.rb:1037:in `call'
/usr/local/lib/ruby/gems/2.1.0/gems/eventmachine-1.0.3/lib/eventmachine.rb:1037:in `block in spawn_threadpool'
2014-11-07 09:50:23 -0500

TERMINAL 2
[root@HOST sflow]# netstat -anup
Active Internet connections (servers and established)
Proto Recv-Q Send-Q Local Address Foreign Address State PID/Program name
udp 0 0 0.0.0.0:6543 0.0.0.0:* 32242/java
udp 0 0 127.0.0.1:57626 127.0.0.1:6543 ESTABLISHED 493/ruby
udp 0 0 0.0.0.0:6343 0.0.0.0:* 493/ruby

@blook
Contributor

blook commented Nov 7, 2014

Hi caine256,
first of all, thank you for trying the sflow parser and for reporting this issue. My guess is that the parser is looking up the name of the switch that is sending the sample.

In config.yaml you can map each switch IP address to a proper hostname:

switch:
  1.2.3.4: "myswitch_hostname"
  1.2.3.5: "my2ndswitch_hostname"

Please try setting the correct IPs and hostnames for all the switches you are collecting sflow samples from.

If this fixes the issue for you I should fix the code :)
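The failure mode described here can be sketched in a few lines of Ruby. This is a hypothetical illustration, not the parser's actual code; the method name and config shape are assumptions:

```ruby
# Hypothetical sketch of the lookup described above. The hash mirrors
# config.yaml's "switch:" section.
config = { 'switch' => { '1.2.3.4' => 'myswitch_hostname' } }

def switch_name(config, agent_ip)
  names = config['switch']          # nil if the section is missing
  name  = names && names[agent_ip]  # nil for an unmapped agent IP
  # Calling [] on such a nil is exactly what produces
  # "undefined method `[]' for nil:NilClass" in the trace above.
  raise "no name for #{agent_ip}" if name.nil?
  name
end

puts switch_name(config, '1.2.3.4')  # => myswitch_hostname
```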

Regards,

Sebastian

@b0e

b0e commented Nov 26, 2014

Hey @caine256,

does it work if you specify your switch names?

Regards,
b0e

@TheSeraph

I'm also having problems with this on an Ubuntu system.

HAHAHAAH@MyServer:~/sflow$ bundle exec ./bin/sflow.rb
Connecting to Logstash: localhost:6543
Getting switch interface names 2015-03-09 13:18:30 -0400
no name for 192.168.1.8
/usr/lib/ruby/1.9.1/resolv.rb:128:in `getname'
/home/HAHAHAAH/sflow/lib/sflow/snmp/iface_names.rb:8:in `block in initialize'
/home/HAHAHAAH/sflow/lib/sflow/snmp/iface_names.rb:7:in `each_key'
/home/HAHAHAAH/sflow/lib/sflow/snmp/iface_names.rb:7:in `each'
/home/HAHAHAAH/sflow/lib/sflow/snmp/iface_names.rb:7:in `initialize'
/home/HAHAHAAH/sflow/lib/sflow/collector.rb:54:in `new'
/home/HAHAHAAH/sflow/lib/sflow/collector.rb:54:in `start_collector'
./bin/sflow.rb:7:in `<main>'
/home/HAHAHAAH/sflow/lib/sflow/collector.rb:63:in `rescue in start_collector': unable to start sflow collector (RuntimeError)
from /home/HAHAHAAH/sflow/lib/sflow/collector.rb:42:in `start_collector'
from ./bin/sflow.rb:7:in `<main>'

However, I did configure a switch with the right IP and the right name. I also tried using the FQDN.

@etfeet

etfeet commented Apr 17, 2015

I'm also having trouble getting the sflow collector working.

I'm receiving data from the collector, but only the name of the host the collector is running on:

type, host, _type, _index, _id, @version, @timestamp, sflow_agent_address

sflow 127.0.0.1 sflow sflow-2015.04.17 AUzEsWZnhNNs4DpbGImT 1 2015-04-17T00:05:02.262Z 0 KM_C5G

The collector is receiving data from the switch, but it's getting mangled in transit to logstash. Also, the collector's console output suggests it isn't able to process the sflow data it's receiving. I'm using SNMP v3 to get the interface names, could that be related?

When the collector receives sflow traffic from the switch, it shows the following:

root@logstash:/tmp/sflow# bundle exec ./bin/sflow.rb
Connecting to Logstash: logstash.local:6543
Getting switch interface names . done.
Server listening.
2015-04-16 17:14:39 -0700
{"agent_address"=>"MY_SWITCH", "sampling_rate"=>2048, "i_iface_value"=>48, "o_iface_value"=>0, "ipv4_src"=>"172.16.52.10", "ipv4_dst"=>"74.125.224.35", "frame_length"=>68, "frame_length_multiplied"=>139264, "tcp_src_port"=>49170, "tcp_dst_port"=>443}
undefined method `[]' for nil:NilClass
/home/sysadmin/sflow/lib/sflow/snmp/iface_names.rb:33:in `mapswitchportname'
/home/sysadmin/sflow/lib/sflow/storage/storage.rb:24:in `send_udpjson'
/home/sysadmin/sflow/lib/sflow/collector.rb:26:in `block in receive_data'
/var/lib/gems/1.9.1/gems/eventmachine-1.0.3/lib/eventmachine.rb:951:in `call'
/var/lib/gems/1.9.1/gems/eventmachine-1.0.3/lib/eventmachine.rb:951:in `run_deferred_callbacks'
/var/lib/gems/1.9.1/gems/eventmachine-1.0.3/lib/eventmachine.rb:187:in `run_machine'
/var/lib/gems/1.9.1/gems/eventmachine-1.0.3/lib/eventmachine.rb:187:in `run'
/home/sysadmin/sflow/lib/sflow/collector.rb:56:in `start_collector'
./bin/sflow.rb:7:in `<main>'

Pressing Ctrl-C to cancel shows the following:

/var/lib/gems/1.9.1/gems/eventmachine-1.0.3/lib/eventmachine.rb:187:in `run_machine'
/var/lib/gems/1.9.1/gems/eventmachine-1.0.3/lib/eventmachine.rb:187:in `run'
/home/sysadmin/sflow/lib/sflow/collector.rb:56:in `start_collector'
./bin/sflow.rb:7:in `<main>'
/home/sysadmin/sflow/lib/sflow/collector.rb:63:in `rescue in start_collector': unable to start sflow collector (RuntimeError)
from /home/sysadmin/sflow/lib/sflow/collector.rb:42:in `start_collector'
from ./bin/sflow.rb:7:in `<main>'

root@logstash:/home/sysadmin/sflow# ruby --version
ruby 1.9.3p484 (2013-11-22 revision 43786) [x86_64-linux]

Any ideas on how to fix this? Any help would be greatly appreciated.

@TheSeraph

Actually, I found a different way around my whole sflow + ELK setup. I can't take any credit for this, but if you want I could search for the person who suggested it to me. Basically, the workaround is a simple wrapper script that executes sflowtool and pipes its output into logstash: a logstash config file executes the wrapper, and the wrapper executes sflowtool.

Here's how it went (on an Ubuntu 14.04 system)

First, obtain sflowtool for Linux and build it (you'll need the build-essential package if you don't have it already):

cd $HOME
wget http://www.inmon.com/bin/sflowtool-3.22.tar.gz
tar -xvzf sflowtool-3.22.tar.gz
cd sflowtool-3.22
sudo ./configure
sudo make
sudo make install

Now that we have sflowtool installed, we can create the wrapper script that logstash will call.

cd /var/
sudo mkdir scripts
sudo nano /var/scripts/sflowtool-wrapper.sh

And here's the actual sflowtool-wrapper.sh script. Pay attention to the /usr/local/bin/sflowtool path, as it may be different on your version of Linux.

#!/bin/bash
#
# Wrapper script for sflowtool when used as a pipe input in logstash.
# It ensures that sflowtool is not already running (with the same
# arguments) before starting a fresh instance.

ARGS="$@"
SFLOWTOOL_PID=$(/bin/ps -ef | /bin/grep "/usr/local/bin/sflowtool $ARGS" | /bin/grep -v "grep" | /bin/awk '{ print $2 }')

# Quote the variable so the test works even when it is empty.
if [ -n "$SFLOWTOOL_PID" ]; then
        kill -s 9 $SFLOWTOOL_PID
fi
/usr/local/bin/sflowtool "$@"

Now chmod the file so it's executable. I tried chmod 755, but it had problems executing from logstash, so 775 worked for me. It's a fairly secured system, so I'm not worried: sudo chmod 775 /var/scripts/sflowtool-wrapper.sh

Finally, in logstash config, I start with

input {
  pipe {
     type => "sflow"
     command => "/var/scripts/sflowtool-wrapper.sh -l -p 6343"
  }
}

And if you'd like, here's my full logstash config file for sflow.

###<----------------- INPUTS ----------------->###
input {
  pipe {
     type => "sflow"
     command => "/var/scripts/sflowtool-wrapper.sh -l -p 6343"
  }
}
###<----------------- FILTERS ----------------->###
filter {
  if [type] == "sflow" {
    grok {
        match => [ "message", "%{WORD:Sample_Type},%{IP:Flow_Reporter},%{NUMBER:Input_Port},%{NUMBER:Output_Port},%{GREEDYDATA:Source_MAC},%{GREEDYDATA:Dest_MAC},%{GREEDYDATA},%{NUMBER:Source_Vlan},%{NUMBER:Dest_Vlan},%{IP:Flow_Source},%{IP:Flow_Destination},%{NUMBER:Flow_Protocol},%{WORD},%{NUMBER:IPTTL},%{NUMBER:Flow_Source_Port},%{NUMBER:Flow_Destination_Port},%{WORD},%{NUMBER:Packet_Size},%{NUMBER:IP_Size},%{NUMBER:Mean_Skip_Count}" ]
        match => [ "message", "%{WORD:Sample_Type},%{IP:Flow_Reporter},%{NUMBER:int_ifIndex},%{NUMBER:int_ifType},%{NUMBER:hyper_ifSpeed},%{NUMBER:int_ifDirection},%{NUMBER:int_ifStatus},%{NUMBER:hyper_ifInOctets},%{NUMBER:int_ifInUcastPkts},%{NUMBER:int_ifInMulticastPkts},%{NUMBER:int_ifInBroadcastPkts},%{NUMBER:int_ifInDiscards},%{NUMBER:int_ifInErrors},%{NUMBER:int_ifInUnknownProtos},%{NUMBER:hyper_ifOutOctets},%{NUMBER:int_ifOutUcastPkts},%{NUMBER:int_ifOutMulticastPkts},%{NUMBER:int_ifOutBroadcastPkts},%{NUMBER:int_ifOutDiscards},%{NUMBER:int_ifOutErrors},%{NUMBER:int_ifPromiscuousMode}" ]
    }
  }

  geoip {
        source => "[Flow_Source]"
        target => "Flow_src_geoip"
        add_field => [ "[Flow_src_geoip][coordinates]", "%{[Flow_src_geoip][longitude]}" ]
        add_field => [ "[Flow_src_geoip][coordinates]", "%{[Flow_src_geoip][latitude]}"  ]
        add_tag => [ "sflow" ]
  }
  geoip {
        source => "[Flow_Destination]"
        target => "Flow_dst_geoip"
        add_field => [ "[Flow_dst_geoip][coordinates]", "%{[Flow_dst_geoip][longitude]}" ]
        add_field => [ "[Flow_dst_geoip][coordinates]", "%{[Flow_dst_geoip][latitude]}"  ]
        add_tag => [ "sflow" ]
  }

  mutate {
    convert => [ "FW_geoip.area_code", "integer" ]
    convert => [ "FW_geoip.dma_code", "integer" ]
    convert => [ "FW_geoip.latitude", "float" ]
    convert => [ "FW_geoip.longitude", "float" ]
    convert => [ "Flow_Source_Port", "integer" ]
    convert => [ "Flow_Destination_Port", "integer" ] 

    convert => [ "Packet_Size", "integer" ]
    convert => [ "IP_Size", "integer" ] 
    convert => [ "Mean_Skip_Count", "integer" ] 
    convert => [ "hyper_ifSpeed", "integer" ] 
    convert => [ "hyper_ifInOctets", "integer" ] 
    convert => [ "int_ifInUcastPkts", "integer" ]
    convert => [ "int_ifInMulticastPkts", "integer" ]
    convert => [ "int_ifInBroadcastPkts", "integer" ]
    convert => [ "int_ifInDiscards", "integer" ]
    convert => [ "int_ifInErrors", "integer" ]
    convert => [ "hyper_ifOutOctets", "integer" ]
    convert => [ "int_ifOutUcastPkts", "integer" ]
    convert => [ "int_ifOutMulticastPkts", "integer" ]
    convert => [ "int_ifOutBroadcastPkts", "integer" ]
    convert => [ "int_ifOutDiscards", "integer" ]
    convert => [ "int_ifOutErrors", "integer" ] 
  }
}

###<----------------- OUTPUT ----------------->###
output {
    stdout { }
    elasticsearch {
      protocol => "node"
      node_name => "sflow-logstash"
      cluster => "elasticsearch"
      host => "localhost"
    }
}
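For anyone debugging the grok patterns: the first match is essentially a positional split of sflowtool's comma-separated FLOW lines. Here is a minimal Ruby sketch of the same extraction, using the sample FLOW line posted later in this thread; names for the fields the grok pattern leaves unnamed (Eth_Type, TOS, TCP_Flags) are hypothetical:

```ruby
# Sketch: positional split of an sflowtool FLOW line, mirroring the
# grok captures in the config above. Eth_Type, TOS, and TCP_Flags are
# hypothetical names for the pattern's unnamed captures.
KEYS = %w[
  Sample_Type Flow_Reporter Input_Port Output_Port Source_MAC Dest_MAC
  Eth_Type Source_Vlan Dest_Vlan Flow_Source Flow_Destination
  Flow_Protocol TOS IPTTL Flow_Source_Port Flow_Destination_Port
  TCP_Flags Packet_Size IP_Size Mean_Skip_Count
].freeze

def parse_flow(line)
  KEYS.zip(line.chomp.split(',')).to_h
end

sample = 'FLOW,172.16.78.254,48,0,00141cd484a1,20b399ab5093,0x0800,' \
         '5,0,10.20.0.31,172.16.78.64,6,0x00,126,445,49482,0x10,' \
         '1522,1500,2048'

flow = parse_flow(sample)
puts flow['Flow_Source']       # => 10.20.0.31
puts flow['Flow_Destination']  # => 172.16.78.64
puts flow['Packet_Size']       # => 1522
```

Running sflowtool by hand and feeding a captured line through something like this is a quick way to check whether the grok field order matches your sflowtool version's output.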

@etfeet

etfeet commented Apr 17, 2015

Thank you so much. This is exactly what I was looking for!

@TheSeraph

Glad to have helped!

@etfeet

etfeet commented Apr 17, 2015

So I've been looking into the sflow fields and I'm running into a problem. I'm trying to get the following information and graph it in kibana.

I'm able to get the following information just fine from the flow fields:
agent (flow source IP), srcIP, TCPSrcPort, dstIP, TCPDstPort

However, I also want the bytes and packets for each flow, and sflow only exposes byte/packet counters on the interface counter samples, not on the flow samples.

In netflow I would collect the following fields:

IPV4_SRC_ADDR, L4_SRC_PORT, IPV4_DST_ADDR, L4_DST_PORT, IN_BYTES, IN_PKTS

@message from elasticsearch:

FLOW,172.16.78.254,48,0,00141cd484a1,20b399ab5093,0x0800,5,0,10.20.0.31,172.16.78.64,6,0x00,126,445,49482,0x10,1522,1500,2048

CNTR,172.16.78.254,48,6,100000000,1,3,189641622348,400597923,20605593,22994,0,0,0,119063213814,391638219,1163290,58060,0,0,2

Do you have any idea how I could get something similar to netflow using sflow?
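One hedged suggestion: sflow flow records are sampled, so an approximate per-flow byte count can be derived by scaling each sampled frame by the sampling rate. That is what the collector's frame_length_multiplied field earlier in this thread computes (68 * 2048 = 139264); with sflowtool's CSV output, the same estimate comes from Packet_Size times Mean_Skip_Count:

```ruby
# Sketch: estimating a netflow-style IN_BYTES from a sampled sflow
# frame by scaling the frame length by the sampling rate, as the
# collector's frame_length_multiplied field does.
def estimated_bytes(frame_length, sampling_rate)
  frame_length * sampling_rate
end

puts estimated_bytes(68, 2048)    # => 139264 (matches the collector output above)
puts estimated_bytes(1522, 2048)  # => 3117056
```

Summing these estimates per src/dst address and port in logstash (or at query time in kibana) approximates netflow's IN_BYTES; IN_PKTS can be approximated the same way, as the count of samples times the sampling rate.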

@TheSeraph

I'm pretty sure I have the same issue, to be fair, but I haven't gotten around to investigating it. You could play with the wrapper script to pass extra arguments to sflowtool and see if you can get that raw information. If logstash is stopped, you should be able to call sflowtool manually and it will dump its captures to your screen; that should also help you debug the raw information you're receiving. Check out http://www.inmon.com/technology/sflowTools.php for more information on sflowtool arguments.

@TheSeraph

This might be somewhat informative also: http://blog.sflow.com/2011/12/sflowtool.html

@papaalpha

Could anyone assist with this one?

When I attempt to run the command below, as outlined in the readme, I get an error complaining about snmpwalk. I don't see any reference to needing it.
I have no programming experience, so I'm not sure why it's generating the snmp error data below.

[root@computer sflow]# bundle exec ./bin/sflow.rb

Connecting to Logstash: computer:6343
Getting switch interface names 2015-06-09 11:02:43 +0100
No such file or directory - snmpwalk
/opt/sflow/lib/sflow/snmp/iface_names.rb:11:in ``'
/opt/sflow/lib/sflow/snmp/iface_names.rb:11:in `block in initialize'
/opt/sflow/lib/sflow/snmp/iface_names.rb:7:in `each_key'
/opt/sflow/lib/sflow/snmp/iface_names.rb:7:in `each'
/opt/sflow/lib/sflow/snmp/iface_names.rb:7:in `initialize'
/opt/sflow/lib/sflow/collector.rb:54:in `new'
/opt/sflow/lib/sflow/collector.rb:54:in `start_collector'
./bin/sflow.rb:7:in `<main>'
/opt/sflow/lib/sflow/collector.rb:63:in `rescue in start_collector': unable to start sflow collector (RuntimeError)
from /opt/sflow/lib/sflow/collector.rb:42:in `start_collector'
from ./bin/sflow.rb:7:in `<main>'

I'm using CentOS7, having installed the ELK stack (version 1.5) from elastic.co

EDIT - Apologies. I installed the snmp utils package, and that removed the snmp error above.
