Internet of Things tools and frameworks

The Eclipse Foundation has a sub-project dedicated to Internet of Things initiatives.

The source of this post is the Eclipse IoT website.

At the moment, 12 initiatives are being undertaken across three areas:

  • Services and Frameworks
  • Protocols
  • Tools


Services and Frameworks


  1. Kura is a set of Java and OSGi services that are most commonly required for IoT gateways, including I/O services, Data Services, Cloud Services, Networking, etc.
  2. Mihini, written in the Lua scripting language, provides low-level connectivity management to ensure that a reliable network connection is available, acts as an abstraction layer for underlying hardware and enables smart business data transmission between devices and servers, including the ability to consolidate data locally and use bandwidth-efficient communication protocols.





Eclipse IoT currently has the following industry services:

  • The Eclipse SmartHome project is a framework for building smart home solutions.
  • Eclipse SCADA is a way to connect different industrial devices to a common communication system, and to post-process and visualize the data for operating personnel.


IoT/M2M Protocols


MQTT

Message Queuing Telemetry Transport (MQTT) is a protocol designed to connect physical-world devices and networks with the applications and middleware used in IT and web development, making it an ideal connectivity protocol for IoT and M2M.
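As a quick illustration, the Eclipse Mosquitto command-line clients can exercise an MQTT publish/subscribe round trip. The broker address and topic name below are assumptions for this sketch, and a broker must already be running:

```shell
# Subscribe to a topic in the background
# (assumes a broker is reachable on localhost:1883)
mosquitto_sub -h localhost -t "sensors/temperature" &

# Publish a sample reading to the same topic;
# the subscriber above receives and prints it
mosquitto_pub -h localhost -t "sensors/temperature" -m "21.5"
```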

OMA-DM

OMA-DM is a standard communication protocol widely used in the telecommunications industry to monitor and synchronize the state of communications devices such as mobile phones or the kind of radio modules that can be found in M2M solutions.

There is a nice introduction to OMA-DM on the developerWorks website.


IoT Tools

A set of tools for developers to carry out IoT development:

  • Embedded Development
  • Simulation
  • Server Development

Lua Development Tools provides an IDE for the Lua programming language, a lightweight scripting language commonly embedded in other applications.

Details

http://www.eclipse.org/community/eclipse_newsletter/2014/february/article1.php

Hadoop 2.3 Centralized Cache feature compared to Spark RDD

Hadoop 2.3 has two new features.
  • Support for Heterogeneous Storage Hierarchy in HDFS (HDFS-2832)
  • In-memory Cache for data resident in HDFS via Datanodes (HDFS-4949)
This post is about the centralized cache management feature in HDFS.

It lets you specify, at the start of a job, that a particular folder should be cached in memory. Applications like Hive and Impala can then read the cached data directly from memory. This complements the existing Short-Circuit Reads (SCR) feature, which lets SCR-aware applications read directly from disk, bypassing the DataNode.

Sample command

$hdfs cacheadmin -addDirective -path <path> -pool <pool-name> [-force] [-replication <replication>] [-ttl <time-to-live>]
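A fuller session might look like the following. The pool and path names are hypothetical, and the commands require a running HDFS cluster:

```shell
# Create a cache pool, then pin a directory into it
hdfs cacheadmin -addPool sales-pool
hdfs cacheadmin -addDirective -path /data/sales -pool sales-pool -replication 2

# Verify which directives are active
hdfs cacheadmin -listDirectives
```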


General flow of execution



Compared with this implementation, Spark's RDD model is still superior, as it maintains lineage for the transformations applied to in-memory data. This means Spark can write intermediate data to RAM and work faster.

The current HDFS cache management feature only boosts read performance.

So I guess a few more improvements are needed before Hadoop can beat Spark on performance.

I am very excited to see how downstream systems like Pig, Hive, and Impala will use this feature to speed up their processing. I am sure things will keep getting better in Hadoop over the next few releases.

Test SSL LDAPS connection

A useful code sample for checking an SSL LDAPS connection:

http://java.ittoolbox.com/groups/technical-functional/java-l/ldap-test-connection-4747814
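If the linked sample is unavailable, a quick sanity check of the LDAPS endpoint can be done with OpenSSL. The hostname below is a placeholder:

```shell
# Connect to the LDAPS port and dump the server certificate chain;
# a successful TLS handshake rules out basic SSL problems
openssl s_client -connect ldap.example.com:636 -showcerts
```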

Debugging SSL connections in Java


Examples
  • To view all debugging messages:
    java -Djavax.net.debug=all MyApp
  • To view the hexadecimal dumps of each handshake message, you can type the following, where the colons are optional:
    java -Djavax.net.debug=ssl:handshake:data MyApp
  • To view the hexadecimal dumps of each handshake message, and to print trust manager tracing, you can type the following, where the commas are optional:
    java -Djavax.net.debug=SSL,handshake,data,trustmanager MyApp


Wolfram language tutorial

It’s amazing to see how various things are coming together to make the Internet of Things a reality.

The efforts of the Semantic Web community, RFID-based tracking, and the work of Xively on connected devices are all awesome.

Today’s post is about the work done by the Wolfram Language project.

I saw the short demo they released and highly recommend it if you have not seen it.

Their Connected Devices Project is where everything is headed in the future.

Quoting

“Connected devices are central to our long-term strategy of injecting sophisticated computation and knowledge into everything. With the Wolfram Language we now have a way to describe and compute about things in the world. Connected devices are what we need to measure and interface with those things.”

The API has a free version.



The language has a wide range of prebuilt functions and algorithms based on real-world usage.

With various APIs (e.g. Xively, Wolfram) trying to solve the problem of connected devices, I expect we will see bridges between these systems emerge to help cross-pollination happen.

The Wolfram Language SDK is available prebuilt for the Raspberry Pi (http://www.raspberrypi.org/) to try out. It also works on various other platforms.

How to get started with Wolfram language

The tutorials and language manual are available on the official website.

It will be familiar to people from a Mathematica background.

It is just a preliminary release; the general release announcement is pending.

The same language is the engine behind Wolfram|Alpha. Sign up for a trial account on their website.


Copy and paste the syntax you learn there and try out various things.

I am sure you will enjoy it.

I wish the Wolfram team would make this open source, for faster innovation and wider community acceptance.


How to fix Limited connectivity Windows 8

 

I wasted 2-3 hours on this.

At last, the following worked for me.

Open Command prompt as Administrator

1) ipconfig /flushdns

2) ipconfig /release

3) ipconfig /renew

Restart your Wi-Fi or LAN connection.

Things should work.

Setup Scala to use Maven in Eclipse

 

Eclipse needs separate plugins for Maven and Scala.

1)

Download Scala M2E Plugin

Update site

http://alchim31.free.fr/m2e-scala/update-site

2)

Create a new Maven project

Choose the archetype as:

Group id: net.alchim31.maven
Artifact id: scala-archetype-simple

3)


From now on you can use your normal Maven commands.
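Equivalently, the same archetype can be generated from the command line. The groupId and artifactId of the new project below are placeholders:

```shell
# Generate a Scala project from the alchim31 archetype
# (the archetype version is resolved automatically)
mvn archetype:generate \
  -DarchetypeGroupId=net.alchim31.maven \
  -DarchetypeArtifactId=scala-archetype-simple \
  -DgroupId=com.example \
  -DartifactId=my-scala-app \
  -DinteractiveMode=false
```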


Note:


In the pom.xml


I tried with scala.version set to 2.10.0 and the build failed; lowering it to 2.8.0 made it pass. Keep this in mind in case you get a similar error.

Initial job has not accepted any resources; check your cluster UI

 

522640 [Timer-0] WARN org.apache.spark.scheduler.cluster.YarnClientClusterScheduler - Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory

Check that the Spark worker has been started and is visible on the master page:

http://10.0.0.11:8080/

If not, then start the worker.

Example

./bin/spark-class org.apache.spark.deploy.worker.Worker spark://precise32:7077

configure: error: missing required library GL

configure: error: missing required library GL
ERROR: configuration failed for package ‘rgl’

rgl: 3D visualization device system (OpenGL)

SystemRequirements:
OpenGL, GLU Library, zlib (optional), libpng (>=1.2.9, optional), FreeType (optional)

Solution:
sudo yum install mesa-libGLU-devel mesa-libGLw-devel mesa-libgbm mesa-libGL mesa-libGL-devel libpng-devel libX11-devel
sudo R CMD INSTALL rgl_0.93.996.tar.gz

End result:

** building package indices
** testing if installed package can be loaded
Warning in rgl.init(initValue, onlyNULL) :
  RGL: unable to open X11 display
Warning in fun(libname, pkgname) : error in rgl_init

* DONE (rgl)
Making packages.html  ... done

Virtualbox this kernel requires an x86-64 cpu

 

Just make sure that you have selected the correct version of the OS while creating the virtual machine.

When choosing the OS version, select the 64-bit option.


Making Intel Dual Band Wireless-AC 7260 work in Ubuntu or other Linux

Before that, a quick summary of how it works.

The Linux drivers are part of the upstream Linux kernel, but some devices need additional firmware that tells the kernel what the device is and how to operate it.

So we need to download the firmware from the Intel website and place it in /lib/firmware on the system. Simple.

So head to

http://www.intel.com/support/wireless/wlan/sb/CS-034398.htm

  • Download the tarball matching your kernel and device
  • Extract it
  • Place the firmware files in /lib/firmware

Reboot your machine

Done!!
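The steps above might look like this in practice. The tarball and firmware file names are illustrative; the actual names depend on your kernel version and the bundle you download from Intel:

```shell
# Extract the firmware bundle downloaded from Intel's site
# (file names illustrative)
tar xzf iwlwifi-7260-ucode-*.tgz
cd iwlwifi-7260-ucode-*

# Copy the microcode into the kernel's firmware search path
sudo cp iwlwifi-7260-*.ucode /lib/firmware/

# Reboot so the kernel picks up the new firmware
sudo reboot
```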

Additional notes:

All of this firmware is also part of the Linux community firmware tree:

http://git.kernel.org/cgit/linux/kernel/git/firmware/linux-firmware.git/tree/

In Ubuntu, firmware comes from one of the following sources:

    The linux-image package (which contains the Linux kernel and licensed firmware)
    The linux-firmware package (which contains other licensed firmware)
    The linux-firmware-nonfree package in multiverse (which contains firmware that are missing redistribution licenses)
    A separate driver package
    Elsewhere (driver CD, email attachment, website)

The ideal scenario would have been someone backporting the latest firmware into linux-firmware for Saucy. Since this has not been done, we can just manually drop in the file.

An additional shortcut is to just download the Trusty (14.04) deb file and install it; Trusty already includes this firmware.

http://packages.ubuntu.com/trusty/all/linux-firmware/download
   
https://wiki.ubuntu.com/Kernel/Firmware



Basic concepts for functional programming

 

Some of the basic concepts for learning functional programming. I noted all of them from Wikipedia.

First class functions

In computer science, a programming language is said to have first-class functions if it treats functions as first-class citizens. Specifically, this means the language supports passing functions as arguments to other functions, returning them as the values from other functions, and assigning them to variables or storing them in data structures.
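A small Scala sketch (the names are invented for illustration) showing all three properties: a function assigned to a variable, passed as an argument, and returned as a value:

```scala
// A function value assigned to a variable
val square: Int => Int = x => x * x

// A function passed as an argument to another function
def applyTwice(f: Int => Int, x: Int): Int = f(f(x))

// A function returned as the value of another function
def makeAdder(n: Int): Int => Int = x => x + n

println(applyTwice(square, 2))  // prints 16: square(square(2))
println(makeAdder(3)(4))        // prints 7
```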

Higher order functions

In mathematics and computer science, a higher-order function (also functional form, functional or functor) is a function that does at least one of the following:

  • takes one or more functions as an input
  • outputs a function

Map function

In many programming languages, map is the name of a higher-order function that applies a given function to each element of a list, returning a list of results. It is often called apply-to-all when considered in functional form. This is an example of functoriality.

For example, if we define a function square as follows:

square x = x * x

Then calling map square [1,2,3,4,5] will return [1,4,9,16,25], as map will go through the list and apply the function square to each element.

Filter

In functional programming, filter is a higher-order function that processes a data structure (typically a list) in some order to produce a new data structure containing exactly those elements of the original data structure for which a given predicate returns the boolean value true.

Example Scala


list.filter(pred)



Or, via for-comprehension: for (x <- list if pred(x)) yield x


Scope

The term "scope" is also used to refer to the set of all identifiers that are visible within a portion of the program or at a given point in a program, which is more correctly referred to as context or environment.

A fundamental distinction in scoping is what "part of a program" means – whether name resolution depends on the location in the source code (lexical scope, static scope, which depends on the lexical context) or depends on the program state when the name is encountered (dynamic scope, which depends on the execution context or calling context). Lexical resolution can be determined at compile time, and is also known as early binding, while dynamic resolution can in general only be determined at run time, and thus is known as late binding.

http://en.wikipedia.org/wiki/Scope_(computer_science)
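A quick Scala sketch of lexical scoping (identifiers invented for illustration): the `x` referenced inside `f` is resolved where `f` is defined, not where it is called:

```scala
val x = 10                          // outer binding
def f(): Int = x                    // lexically refers to the outer x
def g(): Int = { val x = 99; f() }  // g's local x does not affect f

println(g())  // prints 10 under lexical scope (dynamic scope would give 99)
```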

Closure

In programming languages, a closure (also lexical closure or function closure) is a function or reference to a function together with a referencing environment—a table storing a reference to each of the non-local variables (also called free variables or upvalues) of that function
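Sketched in Scala (the counter is an invented example): the returned function closes over the free variable `count` from its defining environment, so the state survives between calls:

```scala
def makeCounter(): () => Int = {
  var count = 0                // non-local (free) variable captured by the closure
  () => { count += 1; count }
}

val next = makeCounter()
println(next())  // prints 1
println(next())  // prints 2 -- state lives on in the closure's environment
```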

Anonymous function

In computer programming, an anonymous function (also function constant, function literal, or lambda function) is a function defined, and possibly called, without being bound to an identifier. Anonymous functions are convenient to pass as arguments to higher-order functions and are ubiquitous in languages with first-class functions such as Haskell. Anonymous functions are a form of nested function.
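In Scala, anonymous functions can be written as explicit lambdas or with placeholder syntax. The lists below are invented for illustration:

```scala
// Explicit lambda passed straight to a higher-order function
val doubled = List(1, 2, 3).map(x => x * 2)

// Placeholder syntax: the underscore stands for the single argument
val evens = List(1, 2, 3, 4).filter(_ % 2 == 0)

println(doubled)  // prints List(2, 4, 6)
println(evens)    // prints List(2, 4)
```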

List Comprehension


A list comprehension is a syntactic construct available in some programming languages for creating a list based on existing lists. It follows the form of the mathematical set-builder notation (set comprehension) as distinct from the use of map and filter functions.

Example scala

val s = for (x <- Stream.from(0) if x*x > 3) yield 2*x

How to give input to command from a file

<
The < symbol connects the command’s STDIN to the contents of an existing file.
Example:
$ mail -s "Mail test" johndoe < /tmp/mymessage
This reads the message from the text file and sends it as mail.

How to store command output in text file

>
Replaces the file’s existing contents.
Example:
echo "This is a test message." > /tmp/mymessage

To redirect both STDOUT and STDERR to the same place, use the >& symbol.
To redirect STDERR only, use 2>.

Example

$ find / -name core > /tmp/corefiles 2> /dev/null
This command line sends matching paths to /tmp/corefiles, discards errors, and
sends nothing to the terminal window.

How to append to contents of a file

>> 
Appends to the file
echo "This is a test message." >> /tmp/mymessage


How to execute a command conditionally on the success of another command

To execute a second command only if its precursor completes successfully, you
can separate the commands with an && symbol:

$ lpr /tmp/t2 && rm /tmp/t2

Conversely, the || symbol executes the following command only if the preceding
command fails (produces a nonzero exit status):

cp --preserve --recursive /etc/* /spare/backup \
|| echo "Did NOT make backup"