Sunday, December 06, 2015

I'm so over email!

Companies today still use email as their de facto means of communication, which is quite sad. The problems with email are:

  • Asynchronous - in our world of real-time everything, it's surprising that email has lasted so long.
  • No built-in feedback: receipts don't work. Because it's async, people resort to phoning or asking others whether they received their mail. This is crazy. You never ask people if they received your WhatsApp message, because it gives you instant feedback with the ticks.
  • Takes too long: people know that email is send-and-wait, so they build their workflows around that. You have to wait for a reply!
  • Far too much of it - it's too easy to CC as many people as possible. It's not focused enough.
  • The protocol is far too antiquated. Emails can get lost and never delivered.
  • When a mail trail gets too long, it's difficult to follow who said what. And the sad part is that the signatures and headers take up most of the space.
Email is so clunky. To me it feels like how people communicated before email - it's almost as bad as sending physical mail. It's too slow, it might not even get there, and the answer may no longer be relevant when it does arrive.

At my workplace, we actively use Instant Messaging. No, not Lync! We use Telegram. It's real-time and quick. It's perfect for most situations, especially:
  • when you're actively working with someone on a problem. Because we have domain experts, if you're raising or supporting an issue, you need real-time communication with others, even if they're based on the next floor or in another office somewhere else. Telegram makes it seem like the person is next to you, it's that quick.
  • Rooms or groups, based on topic: easy to send info to a group of people. No need to CC!
  • Apps for PC/Mac, mobile and web
  • Easy to send files
  • Parses and loads web links/URLs
  • Can reply to or quote previous or specific messages
It just feels better than email. 

Having said that, IM or Telegram is not perfect. There are definitely situations where you can't respond now and need to come back to something. Because IM flows downwards like a list, that item is quickly lost. I often force myself not to read a particular Telegram group because the last discussion had a point I needed to come back to. So now I have to forcefully disconnect from the discussion, because otherwise I will lose the action point.

Email is not dead yet, but it definitely needs to evolve. It needs to stay relevant.

A few years ago Google came out with Wave. I thought it was perfect. It added real-time capabilities to email. Looking back, it seems to be the perfect mix of the strengths of email and IM. Sadly, they cared more about the technical prowess of the protocol than about usability.

We tried Slack a while ago, but it seems to suffer from the same weaknesses as IM and Telegram. There's just no way to mark a message as unread. Also, you need to keep on inviting people, which hampers its uptake in a large organisation. If there were a way to use it in a greenfields company, where you're not choosing between email and IM, then Slack might work.






Tuesday, May 05, 2015

The new age of learning

Assumption: You are a working individual. Your company offers training courses, and because there is a dedicated budget for it, and because it's part of your KPIs, you go on a course every year. Usually it's a vendor-related course, or maybe a TOGAF course here and there. But it's boring, apart from the free lunch coupons. How will that keep your skills relevant for the next 20 years? Can you afford to go (back) to university to (re)do your CS or Engineering degree?

I have recently been blown away by the plethora of online training. This is not just CBT Nuggets (which was very good, by the way - it helped me get my CCNA). I am talking about online learning that is two-way, customised and focused on your specific area. It's as good as, or maybe even better than, a good university degree. MOOCs, and their variants, are changing the way we learn.

"Well known MOOC providers include courses from: EDX, Coursera, Udacity and Khan Academy with content supplied by some of the leading universities and technology companies around the world such as MIT, Harvard, Berkerley, Stanford, Google, AT&T and Facebook"
 
I recently took a MOOC from Coursera on Web Application Architectures. It was a high-quality course. I watched the videos, completed the quizzes after each video, and then completed the assignments. Each contributed to the final mark. To make sure that I would dedicate myself over the 6 weeks, I paid for the Verified Certificate. It was worth it.

The content from MOOCs covers all learning subjects. I believe some content even tops the university-offered equivalents because of the real-life applications, e.g. these micro-degrees on Big Data: https://www.coursera.org/course/datasci and https://courses.edx.org/courses/MITx/15.071x_2/1T2015/info

Regarding programming, I have come across some really cool places to learn how to code:

Learning web development - my rails journey

Even though I have not been a typical software developer in my career, I have used my coding skills (google, copy, paste) to create PoCs at work. Recently it has just been limited to pulling in a WSDL in Java to test how a SOAP API works. I tried my hand at Android development a few years back. But the one place I never really dabbled in was the web. I did help to maintain a few JSP pages a few years ago, but I barely knew what I was doing. And with the rise of responsive websites, I have always regretted not knowing anything about HTML, CSS, JavaScript and web frameworks.

In my pursuit of relevance, I recently took a MooC course about Web Application Architectures: https://class.coursera.org/webapplications-003/quiz

It explains MVC, HTTP and a bunch of related concepts, using Ruby on Rails as the tool. I don't regret learning Rails, and I have since started expanding my learning with these resources:
I highly recommend railstutorial - it is extremely well written, and takes you through the concepts while developing a few apps.

Not related to Rails, but it contains a nice story on how frameworks win over flat coding: http://symfony.com/doc/current/book/from_flat_php_to_symfony2.html

What's next? According to http://www.quora.com/What-should-a-fullstack-developer-know-in-2015, I need to know these:

  • HTML
  • CSS
  • Javascript

Thursday, April 02, 2015

Building the right products

I've been doing 'product development' in various guises for a while now. Recently, the word 'product' even appeared in my job title. It basically involves helping companies develop and build new services, offerings and products that they sell to their customers. It requires understanding the market: what customers want, what they need, what they will pay, and what competitors offer. My specific responsibility in this development cycle is the technical architecture of the product.
All in all, product development comes down to two things:

  1. Build the correct products
  2. Build products correctly

The second one is where I/we normally focus. It includes the Technical Architecture, and focuses on building systems that are scalable, extendable, flexible and performant. We know how to do these things.

But what about the first one: making sure you are actually building what the market and customers want and need? This usually involves a business case that shows the need for the product and what its returns will be. But this step (from my experience) is mostly superficial - it's glossed over, and does not contain enough market and customer research. Usually this is because the product was promised by someone high up, or it's a me-too product, and come what may, it will be developed and launched.

I always hear people complain about the lack of customer and market research. In some cases, if the company is really trying to be innovative, there might not be any research around. But in most cases, there should be data that can be gathered to shape the new product. I believe only a few market leaders (e.g. Apple) exist that can invent new products and have the market follow. I recently read about MicroStrategy, which was ranked amongst the top 50 companies for its ability to read the market. This leads to really cutting-edge product development, and involves quite a large amount of risk, I would imagine.

When companies are building new products, they have a budget, and they want the returns to be maximised. So they try to choose a product that will lead to the greatest return on the investment. It might not be the snazziest or flashiest product, but as long as it can promise good returns, it will be a good investment. It's like a small shop owner: when deciding what additional merchandise to stock, he will choose something that is likely to sell well and that is within his budget.

So the question is how companies gather sufficient data and requirements to build new products:
  • Where do requirements for products come from?
  • How do companies know which products to launch?
These questions might be answered in different ways, sometimes employing several techniques at the same time, in different departments:

  • Research and Development departments - these might consider risky and future-focused technologies and products
  • Incubation divisions - these might be used to venture into new markets and to build completely new capabilities that the company does not have.

I imagine that even with these there is quite a high hit-and-miss ratio. And because of changing markets and economics, things that made sense last year suddenly don't make any sense this year. Think of Mxit not being able to deal with the coming of smartphones. Heck - at least Duke Nukem Forever got released.

I am still quite struck by an experience from a few years ago. Two large mobile operators, at roughly the same time, launched a feature-phone IM application. They both chose the same vendor. Less than 3 years later, both products were decommissioned as failures. That means both operators got the research wrong, both went down the same path, and both failed. Clever people in both companies wrote business cases that got approved by even more clever people. And they all got it wrong. Wow... product development is hard!


Thursday, February 19, 2015

Big Data relevance

I never really understood the Big Data hype. It's data, but lots of it, so you need special ways to deal with it. Big deal!

But this article about how Twitter processes large amounts of data really got me excited. This post too explains all the moving parts in big data, which I had no idea about before this (I had heard about Hadoop once or twice).

What made it more relevant for me was this article about how big data and this fancy-named software is actually used.  This is the best part:

The data science team embeds itself with the product team and they work together to either prove out product managers’ hunches or build products around data scientists’ findings. In 2013, Garten said, developers should expect infrastructure that lets them prototype applications and test ideas in near real time. And even business managers need to see analytics as close to real time as possible so they can monitor how new applications are performing.
Now, for the last few weeks I have been prattling on about relevance: maybe even re-inventing yourself and learning new skills. So perhaps Big Data is a good thing to learn and get into. And it's not just whitepapers and vapourware - the industry has a real need for these skills. The question is how long a person with typical current skills, SQL in this case, will remain relevant in the tech industry as these new Big Data systems become more prevalent. Will the typical RDBMS and SQL become the next COBOL very soon?

There is real-world training as well, from Udacity (Cloudera), Coursera and EDX, to help a person build these skills while still in his old job.




Tuesday, February 03, 2015

Full Stack

I've spoken about Relevance in IT before - specifically adapting to fundamental changes in the industry. Some guy (+Anban) mentioned that one of the changes he sees is being full stack: being able to take a requirement not only through the cycle of elicitation, design and documentation (which nobody will read), but also through development and implementation. DevOps, if you must.

So I think there is a need to change with the times and learn how to do web development. These new frameworks from Google and Facebook, and perhaps training like this nanodegree, can really help you make that change and keep relevant.

Tuesday, January 27, 2015

Documentation

I like writing documents. Because of the way I was indoctrinated early on in my career, I learnt that documents with fancy names meant that you were more mature in your process. Documents are meant to transfer information between different people, departments and companies. They're meant to solidify and get agreement on specifications. There are different templates, formats, audiences... the whole deal.

So early on I relished writing documents. I was told that it was important to document the requirements for a system, then document the design of a system after it was built, then document the ...... you get the message.
I then became a consultant, and documents were the measure of progress, and sometimes even the actual thing that we sold. I have worked on projects where for 3 months a team of us created a single power point deck.
In the IT industry, documentation is the lingua franca, the currency of the day. Its what we trade in. "I will send you the Technical Specification once you send me the Business Requirements". So we spend our day writing reams of documents, and project meetings ticking off the documents we have written. But there is one big problem with this whole process:

Nobody reads documents!

There, I said it! Nobody ever reads them, until the author has left the company and the next guy needs to figure out what has been done. (This ignores the people who review documents just to nitpick the format, section placement, etc.) But nobody reads documents to actually transfer knowledge and get agreement on a project that's active. Because it's simply a broken way to transfer knowledge - you lose context, and the formats and templates force you to write in Enterprise, so you lose the gist of what was intended.

Then there are the technicalities - how do I provide feedback on a document that I actually have read: comments in Word, a separate email referring to section numbers, etc.? How do I provide comments or feedback on wiki articles?

C'mon, surely we need a better way to do this?

Thursday, January 15, 2015

Relevance

IT was an exciting field to get into. It was young and, even for the layman, fairly easy to get into. You could easily build up skills and make a name for yourself. Even without a degree, there were many courses, and guys (mostly) got jobs. Our generation replaced the 'mainframe' guys - we used our Java to mock their COBOL, as we showed them Linux running on our laptops and blew their minds.

But after 10 years, I can see the 'gotchas'. There is no "science" to IT. It changes far too fast to mature. And that's the double-edged sword of IT - a low barrier to entry as opposed to medicine or engineering, but because it changes so fast, we quickly become outdated. Not because we don't have the desire and capability to learn and up-skill, but because we will miss the next big mind-shift, and we won't be fast enough to catch up. The next generation will take our combined learnings as innate, obvious and basic, and will define new architectures that we just won't 'GET'.

They say that Computer Science is about learning fundamentals that can be applied in other areas. Degrees (Engineering, Computer Science, etc.) are more about learning how to learn than about learning any particular language or system. It's about the basic building blocks of operating systems, databases and the like. Graduates should be able to re-learn and re-invent themselves as systems and tools change. But I argue that in 20 years or less, most of those building blocks (data structures, CPUs, networks) will be so vastly changed that they won't apply any more. They will be so far abstracted away by new languages and frameworks that you won't even have to think about DNS and IP any more; it will just be assumed. So all those things that we are so particular about when designing systems, like High Availability, Modularity and so forth, will be 'built in' to the new building blocks of the future, rendering our past experience and knowledge null and void.

The people of a few hundred years ago had to worry primarily about food and heat. Those were the basic building blocks of their lives. Nowadays, food and heat are just assumed to be there. We can get them with no effort - they are included in everything we know and see. So if a person from the past appeared today, his complete life of experience built around learning how to hunt and how to make fire would be completely useless. In the same way, as the building blocks of IT change, at some point they will render past generations (us) irrelevant.

I think that's the key thought: relevance! Will the skills that we have built now still be relevant in 20 years' time? For a doctor or engineer, as he ages, he just gets better at what he does. For guys in IT, change and time are the killers.

It would be interesting to track how a doctor who qualified ten years ago would continue to earn over the next fifty years, compared to an IT-skilled person.



Monday, January 05, 2015

Learning and Training

I've just about completed 10 years in the corporate world. Since leaving campus, opportunities for formal studying and training have been difficult to come by. You have to have a plan, be dedicated, and make the time, while being sure to be fair and to balance life, family and work.

As I look back, I realise I should have studied more. Here is a quick look at what I have done so far:


  • 2005 - 2007: At ECN, I did no formal training. I suppose I was just happy to be finally done with exams after 15 years or so.
  • 2008 - 2010: Early in 2008, just after I joined, I decided I wanted to do more studying. I found out about an MSc Eng in Telecoms that Wits was offering. Some of the guys had completed it while still at Wits, before coming to work. I planned to complete it over 3 years. The first half of the year was for the course work and lectures, and the second half of the year for research. I did two courses in 2008 and two courses in 2009. Each course lasted 6 weeks, with lectures of 3 hours, two days a week. I used to wake up at 3am to study, come in to work at 7am, leave for lectures at 8:30, come back at lunch, and stay till late. A 3-hour exam at the end of each course! When I told my manager that I was going to make up for all the lost work, he said "I don't care what hours you do, just finish the work". Awesome!!! After 2 years of studying most weekends, I gave up. Best decision I ever made, with absolutely no regrets. Family is more important than this.
    • During the same time at MTN, I did a one day Time Management course offered off-site. I still remember some of the things I learned, and have been practicing on it. 
    • I also did a one-week F5 BigIP Load-Balancer support engineer course
    • Ericsson GSM one-week course
    • SmartTrust DP support engineer one week course
    • I was scheduled for an Oracle DB one-week course, but my resignation meant I could not make it
  • 2011 - 2013: Two major training events at Accenture. Core Consulting School in Chicago in March 2011 - this is one of the mandatory schools. They teach the Accenture methodology, primarily for project management and client interaction. A very good learning experience - about 300 similarly skilled people from all over the world for a week.
    • The other major one was Technology Architecture school in London. Very awesome learning experience
    • TMForum training - one week
    • I also self-studied towards Cisco CCNA1 in May 2011, and CCNA2 during July 2011. In December 2011 and January 2012, I self-studied CCDA
  • 2013 - 2015: During December 2013 at IS I self-studied SIP School SSCA
    • Formal CCNA SP 1 training for a week, and the exam the next week.
    • During December 2014 / January 2015, I self-studied EMC Information Storage and Management. Exam coming soon, I hope
    • I also did an online VCE Associate course, with exam, which took about 6 hours, in December
I hope to do a VMware course and exam this year, as well as complete CCNA SP 2.

Huawei USB dongle


I always forget the colours of the indicator light on 3G dongles - which colour means it's connected, and what the signal strength is. Hopefully this will help:


http://mybroadband.co.za/vb/showthread.php/526134-Balance-checking-of-MTN-prepaid-SIM-in-remote-device

http://bigcowpi.blogspot.com/2013/03/pi-3g-router-query-signal-quality-and.html

  • Green, blinking twice every 3s: The USB stick is powered on.
  • Green, blinking once every 3s: The USB stick is registering with a 2G network.
  • Blue, blinking once every 3s: The USB stick is registering with a 3G/3G+ network.
  • Green, solid: The USB stick is connected to a 2G network.
  • Blue, solid: The USB stick is connected to a 3G network.
  • Cyan, solid: The USB stick is connected to a 3G+ network.

Pi: detailed setup


Some notes I took when I set up the Pi. I have since imaged the SD card, so I won't have to go through this again.


General


  1. sudo raspi-config
    1. Change hostname
    2. Change user pi password
    3. Change timezone

Setup darkice

  1. sudo apt-get install lame libtwolame0 - installs lame, which is needed for mp3 support
  2. install the darkice binary package already compiled with mp3 support
  3. sudo cp /usr/share/doc/darkice/examples/darkice.cfg /etc/ - to configure darkice (a sample config is sketched after this list)
  4. edit crontab for time script
  5. add record and stream scripts to rc.local for starting up at boot
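For reference, a minimal darkice.cfg sketch for streaming mp3 to an Icecast server. The server, password and mount point values here are placeholders, and the input settings would need to match the USB sound card:

# /etc/darkice.cfg (sketch)
[general]
# 0 = stream forever
duration      = 0
bufferSecs    = 5
reconnect     = yes

[input]
# the USB sound card (card 0 after the alsa-base change in the next section)
device        = hw:0,0
sampleRate    = 44100
bitsPerSample = 16
channel       = 1

[icecast2-0]
bitrateMode   = cbr
format        = mp3
bitrate       = 128
server        = icecast.example.com
port          = 8000
password      = hackme
mountPoint    = pi.mp3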

Setup USB sound card

  1. modify alsa-base so that the USB sound card becomes the first (0) device (see the sketch after this list)
  2. reboot
  3. check that the sound card is picked up correctly by running the following commands:
  4. cat /proc/asound/cards
  5. amixer 
  6. arecord -l
  7. aplay -l
  8.  Set sound levels: alsamixer 
  9. arecord -f S16_LE -D hw:0,0 -r 48000 test.wav
  10. aplay test.wav
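On the Raspbian image I used, making the USB card the first device meant editing /etc/modprobe.d/alsa-base.conf; the exact file and default options may differ on other releases, so treat this as a sketch:

# /etc/modprobe.d/alsa-base.conf
# the stock line was something like: options snd-usb-audio index=-2
options snd-usb-audio index=0
# optionally push the on-board audio to card 1
options snd_bcm2835 index=1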

Create a separate FAT32 partition - this is useful to store the recordings on a separate partition so that / does not fill up, and making it FAT32 means it can also be mounted on a Windows machine. A possible fstab entry to mount it at boot is sketched after the steps.

  1. sudo apt-get install dosfstools
  2. sudo cfdisk /dev/mmcblk0 - using the keyboard, create a new primary partition in the free space and change the type to 0B (W95 FAT32). Select Write to save the changes, then select Quit.
  3. reboot
  4. sudo cfdisk /dev/mmcblk0p3
  5. sudo mkfs.msdos /dev/mmcblk0p3
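To have the new partition mounted automatically at boot, an /etc/fstab line along these lines should work. The mount point /mnt/recordings is just an example, and the uid/gid options are there so the pi user can write to the FAT32 filesystem:

# /etc/fstab
/dev/mmcblk0p3  /mnt/recordings  vfat  defaults,uid=pi,gid=pi  0  0

Create the mount point first with: sudo mkdir -p /mnt/recordings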

Setup 3G USB dongle

  1. sudo apt-get install sg3-utils
  2. sudo /usr/bin/sg_raw /dev/sr0 11 06 20 00 00 00 00 00 01 00
  3. echo 'SUBSYSTEMS=="usb", ATTRS{modalias}=="usb:v12D1p1F01*", SYMLINK+="hwcdrom", RUN+="/usr/bin/sg_raw /dev/hwcdrom 11 06 20 00 00 00 00 00 01 00"' > /etc/udev/rules.d/10-HuaweiFlashCard.rules
  4. modify /etc/network/interfaces (a possible stanza is sketched below)
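A possible /etc/network/interfaces stanza for the dongle, assuming the PPP details (APN, chat script) live in a peers file - the provider name "mtn" here is hypothetical and must match a file under /etc/ppp/peers/:

# /etc/network/interfaces (sketch)
auto ppp0
iface ppp0 inet ppp
    provider mtn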

Reverse SSH tunnel
  1. ssh-keygen -t rsa
  2. copy key to server
  3. insert the reverse ssh command into /etc/rc.local (an example is sketched below)
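As a sketch, the sort of line that could go into /etc/rc.local (before the final "exit 0"). The remote user, host and port 2222 are placeholders, and the key path assumes the key generated above for the pi user:

ssh -f -N -i /home/pi/.ssh/id_rsa -o ExitOnForwardFailure=yes -o ServerAliveInterval=60 -R 2222:localhost:22 pi@myserver.example.com

From the server, the Pi is then reachable with: ssh -p 2222 pi@localhost
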
syslog


webmin




Pi: USB sound card nightmare


As a follow-on to the previous post about the Raspberry Pi:

I bought two USB sound cards from Matrix Warehouse. They were these two C-Media based chipsets:

7.1: http://www.amazon.com/Channel-External-Sound-Audio-Adapter/dp/B007HISGRW/ref=sr_1_3?ie=UTF8&qid=1382967519&sr=8-3&keywords=usb+sound+card
5.1: http://www.amazon.com/Virtual-5-1-surround-External-Sound-Card/dp/B000N35A0Y/ref=pd_bxgy_pc_img_y

The 7.1 was only giving static. It turned out that the mic I was using was broken, but I learned quite a bit about the Linux sound toolset.

These posts were very helpful:
http://computers.tutsplus.com/articles/using-a-usb-audio-device-with-a-raspberry-pi--mac-55876
http://www.raspberrypi.org/forums/viewtopic.php?p=314611#p314611
http://www.linuxcircle.com/2013/05/08/raspberry-pi-microphone-setup-with-usb-sound-card/

The 7.1:

dmesg:
[    3.303196] usb 1-1.5: new full-speed USB device number 4 using dwc_otg
[    3.434770] usb 1-1.5: New USB device found, idVendor=0d8c, idProduct=013c
[    3.449598] usb 1-1.5: New USB device strings: Mfr=1, Product=2, SerialNumber=0
[    3.473047] usb 1-1.5: Product: USB PnP Sound Device
[    3.479635] usb 1-1.5: Manufacturer: C-Media Electronics Inc.      
[    3.513315] input: C-Media Electronics Inc.       USB PnP Sound Device as /devices/platform/bcm2708_usb/usb1/1-1/1-1.5/1-1.5:1.3/input/input0
[    3.551191] hid-generic 0003:0D8C:013C.0001: input,hidraw0: USB HID v1.00 Device [C-Media Electronics Inc.       USB PnP Sound Device] on usb-bcm2708_usb-1.5/input3
[    3.997860] udevd[156]: starting version 175
[    6.413474] bcm2708-i2s bcm2708-i2s.0: Failed to create debugfs directory

[    6.848620] usbcore: registered new interface driver snd-usb-audio

pi@pi-roshnee-lm ~/recordings $ cat /proc/asound/cards 
 0 [Device         ]: USB-Audio - USB PnP Sound Device
                      C-Media Electronics Inc. USB PnP Sound Device at usb-bcm2708_usb-1.5, full spee
 1 [ALSA           ]: bcm2835 - bcm2835 ALSA
                      bcm2835 ALSA
pi@pi-roshnee-lm ~/recordings $ lsusb 
Bus 001 Device 002: ID 0424:9514 Standard Microsystems Corp. 
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 003: ID 0424:ec00 Standard Microsystems Corp. 
Bus 001 Device 004: ID 0d8c:013c C-Media Electronics, Inc. CM108 Audio Controller

pi@pi-roshnee-lm ~/recordings $ amixer 
Simple mixer control 'Speaker',0
  Capabilities: pvolume pswitch pswitch-joined penum
  Playback channels: Front Left - Front Right
  Limits: Playback 0 - 151
  Mono:
  Front Left: Playback 136 [90%] [-2.88dB] [on]
  Front Right: Playback 136 [90%] [-2.88dB] [on]
Simple mixer control 'Mic',0
  Capabilities: pvolume pvolume-joined cvolume cvolume-joined pswitch pswitch-joined cswitch cswitch-joined penum
  Playback channels: Mono
  Capture channels: Mono
  Limits: Playback 0 - 127 Capture 0 - 16
  Mono: Playback 98 [77%] [18.37dB] [off] Capture 14 [88%] [20.83dB] [on]
Simple mixer control 'Auto Gain Control',0
  Capabilities: pswitch pswitch-joined penum
  Playback channels: Mono
  Mono: Playback [on]

pi@pi-roshnee-lm ~ $ aplay -l
**** List of PLAYBACK Hardware Devices ****
card 0: Device [USB PnP Sound Device], device 0: USB Audio [USB Audio]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 1: ALSA [bcm2835 ALSA], device 0: bcm2835 ALSA [bcm2835 ALSA]
  Subdevices: 8/8
  Subdevice #0: subdevice #0
  Subdevice #1: subdevice #1
  Subdevice #2: subdevice #2
  Subdevice #3: subdevice #3
  Subdevice #4: subdevice #4
  Subdevice #5: subdevice #5
  Subdevice #6: subdevice #6
  Subdevice #7: subdevice #7
card 1: ALSA [bcm2835 ALSA], device 1: bcm2835 ALSA [bcm2835 IEC958/HDMI]
  Subdevices: 1/1
  Subdevice #0: subdevice #0

pi@pi-roshnee-lm ~ $ arecord -l
**** List of CAPTURE Hardware Devices ****
card 0: Device [USB PnP Sound Device], device 0: USB Audio [USB Audio]
  Subdevices: 1/1
  Subdevice #0: subdevice #0

pi@pi-roshnee-lm ~/pi_scripts $ arecord -f S16_LE -c1 -r44100 -t wav test.wav
Recording WAVE 'test.wav' : Signed 16 bit Little Endian, Rate 44100 Hz, Mono
^CAborted by signal Interrupt...
pi@pi-roshnee-lm ~/pi_scripts $ aplay test.wav 
Playing WAVE 'test.wav' : Signed 16 bit Little Endian, Rate 44100 Hz, Mono


pi@pi-roshnee-lm ~/recordings $ arecord -D plughw:0 -r 48000 test.wav
Recording WAVE 'test.wav' : Unsigned 8 bit, Rate 48000 Hz, Mono
^CAborted by signal Interrupt...
pi@pi-roshnee-lm ~/recordings $ arecord -D plughw:0 -r 16000 test.wav
Recording WAVE 'test.wav' : Unsigned 8 bit, Rate 16000 Hz, Mono
^CAborted by signal Interrupt...
pi@pi-roshnee-lm ~/recordings $ arecord -f S16_LE -D plughw:0,0 -r 48000 test.wav
Recording WAVE 'test.wav' : Signed 16 bit Little Endian, Rate 48000 Hz, Mono
^CAborted by signal Interrupt...
pi@pi-roshnee-lm ~/recordings $ 


pi@pi-roshnee-lm ~/recordings $ cat /boot/cmdline.txt
dwc_otg.lpm_enable=0 dwc_otg.speed=1 console=ttyAMA0,115200 console=tty1 root=/dev/mmcblk0p2 rootfstype=ext4 elevator=deadline rootwait
pi@pi-roshnee-lm ~/recordings $ 

Bandwidth allocation sizing


When trying to calculate a bandwidth allocation for a particular protocol, you need to understand the size of the actual payload/data, then calculate what the lower-layer headers add as overhead. You also need to know what a given payload would measure on different data links (Layer 2), e.g. Ethernet, Ethernet 802.1Q (VLAN), HDLC, PPP, Frame Relay, or fibre.

The typical example is a G.729 voice packet. The codec itself produces 8 kbit/s, which at the usual 20 ms packetisation works out to a 20-byte payload per packet; then we add the overheads of the lower layers:

Layer                Overhead (bytes/packet)   Running total (bytes)
G.729 payload        20                        20
Layer 5: RTP         12                        32
Layer 4: UDP         8                         40
Layer 3: IP          20                        60
Layer 2: Ethernet    38                        98
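So, assuming the usual 20 ms packetisation (50 packets per second in each direction), the per-call bandwidth on Ethernet works out roughly as:

98 bytes/packet x 50 packets/s x 8 bits/byte = 39,200 bit/s, i.e. about 39.2 kbps per direction

(The 38 bytes of Ethernet overhead here includes the preamble, frame header, FCS and inter-frame gap; counting only the 18-byte header and FCS gives the often-quoted 31.2 kbps figure.)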


Some links I came across:
http://www.techrepublic.com/blog/linux-and-open-source/use-wireshark-to-inspect-packets-on-your-network/
http://www.cisco.com/en/US/tech/tk652/tk698/technologies_tech_note09186a0080094ae2.shtml
http://sd.wareonearth.com/~phil/net/overhead/
http://aconaway.com/2011/01/10/network-protocol-overhead/


http://networkengineering.stackexchange.com/questions/2793/bandwidth-calculation-for-protocol-over-different-data-physical-links
http://networkengineering.stackexchange.com/questions/2789/wireshark-protocol-hierarchy-explanation