Samsung Developer Conference 2018 Registration Open

I previously mentioned that the dates for the Samsung Developer Conference were announced. Registration is now open. If you register before September 12 you get the lowest available price. The registration form is available over at https://www.samsungdeveloperconference.com/ . Registration is also possible on site. Currently registration is 299 USD (+tax). After the 12th it goes up to 399 USD. On-site registration is 499 USD.

Registration for the Samsung Developer Conference, 2018

Obtaining the Connection String for a Provisioned Windows IoT Device

Playing with the code that I was using to get data from my car and stream it to the cloud, I did something that I knew was a no-no: I hard-coded the connection string in the code. There are a number of reasons not to do this. It's less secure, since someone can potentially extract the connection string and use it for unauthorized access, and if the connection string ever needs to change then the code needs to be recompiled and redeployed.

When a Windows IoT device is provisioned there is a connection string that is managed by the device; your application can take advantage of this and need not worry about the details of how it is stored. To make use of this there are a few libraries that you need to add to your UWP project. These include the following.

  • Microsoft.Azure.Devices
  • Microsoft.Azure.Devices.Client
  • Microsoft.Devices.Tpm

With the classes in these libraries you can obtain the ID of the device and then use that ID to create an Azure DeviceClient instance that is initialized with the connection string that the device is managing.

Here’s the code to do this.

DeviceClient _deviceClient;

void InitClient()
{
    // Read the identity and SAS token that the device's TPM is managing.
    TpmDevice tpm = new TpmDevice(0);
    string hostName = tpm.GetHostName();
    string deviceId = tpm.GetDeviceId();
    string sasToken = tpm.GetSASToken();

    // Create a client that authenticates with the token rather than a hard-coded connection string.
    _deviceClient = DeviceClient.Create(
        hostName,
        AuthenticationMethodFactory.CreateAuthenticationWithToken(deviceId, sasToken),
        TransportType.Mqtt
    );
}
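
With the client stored in _deviceClient, sending telemetry to the IoT hub doesn't require any connection string in your code at all. Here's a minimal sketch of what sending a reading might look like; the method name and the JSON payload are placeholders of my own, not part of the provisioning library.

async Task SendEngineDataAsync(double rpm)
{
    // Hypothetical payload; substitute whatever readings your project actually streams.
    string json = "{\"rpm\": " + rpm.ToString(System.Globalization.CultureInfo.InvariantCulture) + "}";

    using (var message = new Message(Encoding.UTF8.GetBytes(json)))
    {
        await _deviceClient.SendEventAsync(message);
    }
}

The nice part is that nothing here mentions where the connection string lives; the TPM keeps that detail out of the application code.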

Connecting to Bluetooth RFCOMM with UWP

Soon I will be finished with a project for streaming engine data from a car into an Azure Data Lake for analysis. I plan to post the full code for this project, but before I do I wanted to talk about making the Bluetooth connection. The last time I wrote on this procedure it was for a Windows Phone 8 project. While that is part of the UWP ancestry, things have changed since then and the code I used there for establishing a connection would not work in a UWP application today. Treat the code in this post as pseudocode for now; while it was copied from a working project, I've made some modifications to simplify it and focus on a few things here.

Getting information from the engine is easy. Every car sold in the USA (and many parts of the world) has a diagnostic connector that can be used to query various data from the car. Getting this data is just a matter of having an interface that speaks whatever protocol (there's more than one) your car happens to use. Don't worry, it's not necessary to figure this out yourself. There are products that abstract this away and provide a common interface that can be used to communicate over all of these different protocols. One of the most popular solutions is based on a chipset known as the ELM327. Solutions based on the ELM327 (or an ELM327 clone) give you a serial interface through which you can send AT commands and other queries to get this information. The products that I am using communicate with a computing device over Bluetooth. You can also find implementations that use WiFi and RS232.

For development and testing there's going to be a lot of time spent with a computer in or near a car (unless you also go out and purchase an OBD emulator). To minimize the amount of time that you have to spend around a car you may want to power one of the adapters outside of a car. I did this using a battery (sometimes a power supply) and an OBD wiring harness. Pin 4 on the harness needs to be connected to ground and pin 16 to positive. A voltage range of 9 to 16 volts should work fine. With the device connected to power you can query information about the device itself but nothing about the car. While this doesn't sound very useful at first, it's enough to get the basic Bluetooth connectivity code working.

[Image: OBD setup]

For development hardware I am using a DragonBoard 410c running Windows IoT. The UWP-based solution, while targeting Windows IoT, runs just fine from a PC, and I believe it will run fine from a Raspberry Pi 3 also. Just for fun I tried an old Windows 10 device that I still have in the house, and the code ran there without any modifications. So grab whatever Windows 10 device you have and try it out!

For the application to be able to connect to the device over Bluetooth there are some capabilities that must be declared in the UWP application manifest. I don't suggest using the Visual Studio manifest designer for this; it doesn't expose all of the capability settings that are needed. It's easier to do this in the text editor.

Right-clicking on the project's manifest gives the option to open it in the code editor. Scrolling down to the Capabilities section, I added the following. Some of these capabilities are needed to access the Bluetooth serial port. Others are there because of parts of the project that I will talk about in future posts.

  
  <Capabilities>
    <Capability Name="internetClient" />
    <DeviceCapability Name="bluetooth" />
    <DeviceCapability Name="proximity" />
    <DeviceCapability Name="location" />
    <DeviceCapability Name="bluetooth.rfcomm">
      <Device Id="any">
        <Function Type="name:serialPort" />
      </Device>
    </DeviceCapability>
  </Capabilities>

I had already paired the Bluetooth device with my Windows IoT device. While the devices are paired with each other, the application doesn't know about the pairing; it still needs to discover the device. To do this I use a device watcher to scan for devices. I know the one that I want to find is named "OBDII", and I've hard-coded this into my solution. The end solution that I'm making doesn't have a UI; if it did, I would have also made use of the UI that allows a user to select the device being paired.

DataReader _receiver;
DataWriter _transmitter;
StreamSocket _stream;
DeviceWatcher _deviceWatcher;
bool _isScanning;                                   // true while the device watcher is running
StringBuilder receiveBuffer = new StringBuilder();  // accumulates characters read in ReceiveLoop

string[] requestedProperties = new string[] { "System.Devices.Aep.DeviceAddress", "System.Devices.Aep.IsConnected" };
_deviceWatcher = DeviceInformation.CreateWatcher("(System.Devices.Aep.ProtocolId:=\"{e0cbf06c-cd8b-4647-bb8a-263b43f0f974}\")",
                                               requestedProperties,
                                               DeviceInformationKind.AssociationEndpoint);
_deviceWatcher.Stopped += (sender,x)=> {
    _isScanning = false;
    Log("Device Scan Halted");
};
_deviceWatcher.Added += async (sender, devInfo) =>
{
    Log($"Found device {devInfo.Name}");
    System.Diagnostics.Debug.WriteLine(devInfo.Name + "|" + devInfo.Kind + "|" + devInfo.Pairing);
    if (devInfo.Name.Equals("OBDII"))
    {
     // More Code Goes Here
     //Lets talk about that in a sec
    }
};

_deviceWatcher.Start();

The above code will result in the DeviceInformation object for the Bluetooth adapter being returned through the Added handler. Now we can connect to it and open up streams for reading and writing. The code for creating these streams goes where I placed the comment above. The following is the code that would go in that place.

try
{
    DeviceAccessStatus accessStatus = DeviceAccessInformation.CreateFromId(devInfo.Id).CurrentStatus;
    if (accessStatus == DeviceAccessStatus.DeniedByUser)
    {
        Debug.WriteLine("This app does not have access to connect to the remote device (please grant access in Settings > Privacy > Other Devices)");
        return;
    }
    var device = await BluetoothDevice.FromIdAsync(devInfo.Id);
    Debug.WriteLine(device.ClassOfDevice);

    var services = await device.GetRfcommServicesAsync();
    if (services.Services.Count > 0)
    {
        Log("Connecting to device stream");
        var service = services.Services[0];
        _stream = new StreamSocket();
        await _stream.ConnectAsync(service.ConnectionHostName,
                                   service.ConnectionServiceName);
        _receiver = new DataReader(_stream.InputStream);
        _transmitter = new DataWriter(_stream.OutputStream);
        
        ReceiveLoop();
        QueryLoop();
        _deviceWatcher.Stop();
    }
}
catch (Exception exc)
{
    Log(exc.Message);
}

After this runs the devices are connected to each other. The devices interact by the Windows IoT device sending a query and getting a response. For now the only query that will be sent is the string "AT" followed by a carriage return. On devices that accept the AT command set, the bare AT command should only result in the response OK. This is useful for testing the connection without having the device perform an operation that would change the state of the system being queried.

void QueryLoop()
{
    Task t = Task.Run(async () =>
    {
        // "AT" by itself should only provoke an "OK" from the adapter.
        string msg = "AT";
        while (true)
        {
            _transmitter.WriteString(msg);
            _transmitter.WriteString("\r");
            await _transmitter.StoreAsync();
            await Task.Delay(50);
        }
    }
    );
}

void ReceiveLoop()
{
    Task t = Task.Run(async () => {
        while (true)
        {
            // Read a single character and accumulate it until a line ending arrives.
            uint loaded = await _receiver.LoadAsync(1);
            if (_receiver.UnconsumedBufferLength > 0)
            {
                string s = _receiver.ReadString(1);
                receiveBuffer.Append(s);
                if (s.Equals("\n") || s.Equals("\r"))
                {
                    try
                    {
                        Debug.WriteLine("Message Received:" + receiveBuffer.ToString());
                        receiveBuffer.Clear();
                    }
                    catch (Exception exc)
                    {
                        Log(exc.Message);
                    }
                }
            }
            else
            {
                await Task.Delay(TimeSpan.FromSeconds(0));
            }
        }
    });
}

When the above runs, the debug output stream will print a series of OK responses. To do something more useful we need to know what commands to send and how to parse the responses. If you send the command 01 0C while the adapter is connected to a car you will get back a hex response that starts with 41 0C followed by some additional hex digits. Those additional hex digits encode the engine's RPM. That's one of the metrics I started off querying because it's something that I can change without driving the car; I can pull the car out of the garage, put it in the driveway, and rev the engine without worrying about carbon monoxide poisoning. I'll talk about how to find out what metrics you can query on your car in my next post on this project.
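
To give an idea of what that parsing looks like, here's a sketch of decoding the RPM. The standard OBD-II scaling for PID 0C is ((A*256)+B)/4, where A and B are the two data bytes that follow the 41 0C header; the helper name and the assumption that the adapter returns space-separated hex pairs are mine, so adjust to whatever your adapter actually sends.

// Decodes a response such as "41 0C 1A F8" into engine RPM.
static double? ParseRpm(string response)
{
    string[] parts = response.Trim().Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);
    if (parts.Length < 4 || parts[0] != "41" || parts[1] != "0C")
        return null;
    int a = Convert.ToInt32(parts[2], 16);   // first data byte (A)
    int b = Convert.ToInt32(parts[3], 16);   // second data byte (B)
    return ((a * 256) + b) / 4.0;            // standard OBD-II formula for PID 0C
}

For example, "41 0C 1A F8" works out to ((0x1A * 256) + 0xF8) / 4 = 1726 RPM.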

Samsung Developer Conference 2018 Dates

[Image: Samsung Developer Conference logo]

The dates for the 2018 Samsung Developer Conference have been announced. Outside of the dates and the location (which has been the same for the past several years) there isn't any additional information available yet. But if you think you might go, now is the time to mark your calendar.

It's November 7 and 8; that's a Wednesday and Thursday. As per usual it is going to be at the Moscone Center (the West building).

Kernel Filters in HTML+JavaScript

[Image: balloon header photo]


Kernel filters are a common approach to modifying images in various image processing applications. They can be used to sharpen an image, blur it, or extract attributes about a picture for further processing. Implementing the filters is simple and straightforward. I wanted to do some experiments with kernel filters on my phone, but to my surprise there weren't many options available, so I decided to make my own. Before developing something for my phone I started off in the browser, since my Chromebook was handy. Here I'm sharing the results.

What is a Kernel

Kernels are known by many names: kernel, convolution matrix, and mask all refer to the same thing. Convolution is the process of adding together the values of a pixel and its neighboring pixels, applying some weight to each one. The weights, or kernel, are often expressed using matrix notation. For each pixel in an image the kernel is applied to that pixel and its neighbors to determine the pixel's new intensity.
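
Written as a formula, in the same style as the code later in this post, the new intensity for the pixel at (x, y) is the weighted sum of its neighborhood:

newIntensity(x,y) = sum over every kernel position (fx,fy) of weight[fy][fx] * oldIntensity(x+fx-centerX, y+fy-centerY)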

Manipulating Images in HTML and JavaScript

In HTML and JavaScript the image element doesn't give direct access to its pixels for manipulation. Instead the canvas element can be used to read and write pixels. Well, not directly. The canvas's 2D context has a method named getImageData() that returns a structure containing a number array of the pixel intensities. After manipulating the elements, the result can be copied back into the canvas with putImageData().

Visually we see the pixel data as being organized in rows and columns. In memory it is organized as a single-dimensional array. To read and write the correct pixel you'll need to know how it's organized. A single pixel is composed of 4 numbers; 3 of the numbers are the intensities of red, green, and blue, and the fourth number is for transparency. These 4 elements make up a single pixel. Pixel data is stored contiguously, starting with the upper-left pixel of the image and moving to the right from there. Once the end of a row is reached, the encoding continues with the left-most pixel of the next row.

Pretend that you had an image that was 10 pixels wide and 10 pixels tall. To read the pixel in the third row and fourth column (keeping in mind that zero-based addressing is being used, so row index 2 and column index 3), we would need to skip 20 pixels to get past the first two rows and then another 3 pixels to get to the fourth column. In other words we need to read the pixel at index 23. Since pixels are composed of four elements, this works out to reading index 92 of the array for the red portion of the pixel and indices 93, 94, and 95 for the green, blue, and transparency portions. Given an X and Y coordinate, the equation for determining the starting index is as follows.

PixelIndex = (y*imageWidth+x)*4;

Since application of the kernel can overlap pixels that are outside the bounds of the image, I needed to decide how to deal with attempts to read out-of-range pixels. I could return a constant value (like zero for all elements), have the read address wrap around to the other side of the image, or cap the read coordinates. I chose to cap the read coordinates. An attempt to read a coordinate that is less than zero results in the coordinate being changed to zero. An attempt to read beyond the edge of the image results in the edge of the image being read.

I’ve covered enough theory for us to build our first kernel filter in JavaScript. Now to get to building. Kernel filters are arrays of multipliers. They can be of any dimension. The basic pieces of information that we’ll need are the dimensions of the kernel and an array holding the values for each element of the kernel. We also need to mark which position in a filter represents the center pixel.

function kernel(width, height, centerX, centerY) {
	this.width = width;
	this.height = height;
	this.centerX = centerX || Math.floor(width/2);
	this.centerY = centerY || Math.floor(height/2);
	this.weightArray = [];
	for(var h=0;h<height;++h) {
		this.weightArray.push([]);
		for(var w=0;w<width;++w) {
			this.weightArray[h].push(0);
		}
	}
}

Given an image we need to get its pixel data into the canvas. The canvas's 2D context has a method named drawImage that will do this.

var width  = imageElement.naturalWidth;
var height = imageElement.naturalHeight;
// Create an offscreen canvas the same size as the image to work in.
var canvas = $('<canvas></canvas>')[0];
canvas.width  = width;
canvas.height = height;
var ctx = canvas.getContext('2d');
ctx.drawImage(imageElement,0,0);
var image = ctx.getImageData(0,0,width,height);
var pix = image.data;

To apply the filter, we will need to have a structure that contains the source data and another for writing the results. The results cannot be written to the same structure that we are reading from as this would overwrite some of the pixels that still need to be read for other processing.

var getPix = function(x,y) {
  // Cap the coordinates so reads outside of the image return the nearest edge pixel.
  x = Math.max(0, Math.min(x, width -1));
  y = Math.max(0, Math.min(y, height-1));
  var address = (y*width+x)*4;
  return [pix[address+0], pix[address+1], pix[address+2], pix[address+3]];
}

var getFilteredPix = function(x,y, kernelFilter) {
  var retVal = [0,0,0,0];
  for(var fy=0;fy<kernelFilter.height;++fy) {
    for(var fx=0;fx<kernelFilter.width;++fx) {
      var m = kernelFilter.weightArray[fy][fx];
      var p = getPix(x+fx-kernelFilter.centerX, y+fy-kernelFilter.centerY);
      retVal[0]+=p[0]*m;
      retVal[1]+=p[1]*m;
      retVal[2]+=p[2]*m;
      retVal[3]+=p[3];
    }
  }
  return retVal;
}

// kernelFilter is an instance of the kernel object defined earlier with its weightArray filled in.
var resultImage = ctx.createImageData(width, height);
var resultPixelData = resultImage.data;

for(var yp=0;yp<height;++yp) {
  for(var xp=0;xp<width;++xp) {
    var newVal = getFilteredPix(xp,yp,kernelFilter);
    var address = (yp*width+xp)*4;
    resultPixelData[address+0] = newVal[0];
    resultPixelData[address+1] = newVal[1];
    resultPixelData[address+2] = newVal[2];
    resultPixelData[address+3] = newVal[3];
  }
}
// Write the filtered pixels back so the result can be displayed or exported.
ctx.putImageData(resultImage, 0, 0);

With that in place we can now view the results of various kernel filters. Using the same source image here are a few filters and the result of them being applied. This is the original image that I’ll be working with.

[Image: original balloon photo]

Identity

0 0 0
0 1 0
0 0 0

As suggested by the name, the identity filter does not result in any change to the image, much like other identity operations in math such as adding 0 to a number or multiplying or dividing by 1.

[Image: identity filter result]

Edge Detection

-1 -1 -1
-1 8 -1
-1 -1 -1

The edge detection filter highlights high-contrast areas of an image, resulting in lines showing where these areas meet. If you wanted to produce an outline of a subject, this would be one of your go-to filters.

[Image: edge detection result]

 

Emboss

-2 -1 0
-1 1 1
0 1 2

The emboss filter produces an image with a 3D effect, making it look like the image has been pressed into a material. Various areas of the image will appear to be raised or depressed.

[Image: emboss filter result]

Box Blur

0.111 0.111 0.111
0.111 0.111 0.111
0.111 0.111 0.111

The box blur simply averages the pixels in an area together. Here I show a 3×3 filter; each weight is 1/9 (about 0.111), so the nine pixels average to the original brightness. For the image shown here I actually used a 10×10 filter to exaggerate the effect and make it more visible.

[Image: box blur result]

This gives me something quick I can use for testing out image filters. It could be better though. Right now, to apply a different filter I need to modify code. Wouldn’t it be nice if the filter data were externalized allowing for filters to be saved and shared? I’ll look at that the next time I revisit this project.

GTX 1050, WDDM 2.2, and Windows Mixed Reality

I've got some Windows Mixed Reality immersive headsets in hand. The experience is pretty cool. But I wanted to figure out what the minimum requirements are to use them so that we could get new hardware for some of the other developers. Microsoft has the minimum requirements listed on a page. Not being one to take such a thing at face value (especially not for a new product), I decided to validate these requirements. The item I was questioning was the video card. The requirements list the NVidia GTX 1050 as the minimum video card, so I made my way over to my local Best Buy and picked one up.

It was installed into a machine that already had the Windows 10 Creators Update on it. When I started the Mixed Reality application I got the following.

[Image: "Can't run mixed reality" error message]

I tried several driver versions, from the ones released in April (version 381.65, the first to have VR support) to the most recent at the time of this writing (385.28).

Digging a little deeper, I received a rather cryptic message from the NVidia GeForce software about Virtual Reality support. The software told me that this video card didn't meet the requirements for Virtual Reality: I needed to have at least an NVidia GTX 1050, and the card in the machine was only an NVidia GTX 1050. That's not a typo; it showed the same card for both the required minimum and what was installed. I get the impression that there was an intention to support VR on this card but it just never happened.

As of yet the consumer release of the Mixed Reality features has not occurred. We are still in a time frame in which things could change rapidly, and this card might be supported by then. But from some exchanges with others, if you are looking to get a card that supports the Windows Mixed Reality headsets, start off with NVidia's GTX 1060 as a minimum.