Collection of newsfeeds

Vectors from coarse motion estimation

Raspberry Pi -

Liz: Gordon Hollingworth, our Director of Software, has been pointing the camera board at things, looking at dots on a screen, and cackling a lot over the last couple of weeks. We asked him what he was doing, so he wrote this for me. Thanks Gordon!

The Raspberry Pi is based on a BCM2835 System on a Chip (SoC), which was originally developed to do lots of media acceleration for mobile phones. Mobile phone media systems tend to follow behind desktop systems, but are far more energy efficient. You can see this efficiency at work in your Raspberry Pi: to decode H264 video on a standard Intel desktop processor requires GHz of processing capability, and many (30-40) Watts of power; whereas the BCM2835 on your Raspberry Pi can decode full 1080p30 video at a clock rate of 250MHz, and only burn 200mW.

Because we have this amazing hardware, we can do things like video encode and decode in real time without doing much work at all on the processor (all the work is done on the GPU, leaving the ARM free to shuffle bits around!). This also means we have access to very interesting bits of the encode pipeline that you’d otherwise not be able to look at.

One of the most interesting of these parts is the motion estimation block in the H264 encoder. To encode video, one of the things the hardware does is to compare the current frame with the previous (or a fixed) reference frame, and work out where the current macroblock (16×16 pixels) best matches the reference frame. It then outputs a set of vectors which tell you where the block came from – i.e. a measure of the motion in the image.

In general, this is the mechanism used within the application motion. It compares the image on the screen with the previous image (or a long-term reference), and uses the information to trigger events, like recording the video, writing an image to disk, or triggering an alarm. Unfortunately, at this resolution it takes a huge amount of processing to achieve this in the pixel domain, which is silly if the hardware has already done all the hard work for you!

So over the last few weeks I’ve been trying to get the vectors out of the video encoder for you, and the attached animated gif shows you the results of that work. What you are seeing is the magnitude of the vector for each 16×16 macroblock, equivalent to the speed at which it is moving! The information comes out of the encoder as side information (it can be enabled in raspivid with the -x command line argument). It is one integer per macroblock and is ((mb_width+1) × mb_height) × 4 bytes per frame, so for 1080p30 that is 120 × 68 × 4 == 32KByte per frame. And here are the results. (If you think you can guess what the movement you’re looking at here represents, let us know in the comments.)

Since this represents such a small amount of data, it can be processed very easily which should lead to 30fps motion identification and object tracking with very little actual work!
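As a rough illustration of how little work is involved, here is a sketch (my own, not Gordon’s code) of reading a vector file saved with raspivid’s -x option. It assumes the commonly documented layout of 4 bytes per macroblock (a signed x component, a signed y component and a 16-bit SAD value), with (mb_width + 1) × mb_height records per 1080p frame:

import numpy as np

COLS = (1920 // 16) + 1     # macroblocks per row, plus one extra column
ROWS = (1080 + 15) // 16    # 68 macroblock rows for 1080p
FRAME = COLS * ROWS

# Assumed record layout: signed x, signed y, 16-bit SAD (4 bytes per macroblock)
record = np.dtype([('x', 'i1'), ('y', 'i1'), ('sad', '<u2')])

data = np.fromfile('vectors.bin', dtype=record)
frames = data[:(len(data) // FRAME) * FRAME].reshape(-1, ROWS, COLS)

for i, frame in enumerate(frames):
    # Vector magnitude per macroblock - the quantity shown in the animation
    mag = np.hypot(frame['x'].astype(float), frame['y'].astype(float))
    print("frame {}: mean motion {:.2f}, max {:.2f}".format(i, mag.mean(), mag.max()))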

Go forth and track your motion!

Is the toilet free?

Raspberry Pi -

Here at Pi Towers, we are lucky enough to have more toilets than we have people. Some offices don’t. And it’s embarrassing to hear your colleagues micturating (at least for some people – the rest of us chatter through it all and make fun of each other’s shy bladders), so the guys at Made by Many have come up with a Pi-based solution.

It started quite simply. Reed switches on a toilet door would send information to a Pi, which would publish the data to a website, so the folks at Made by Many could check online before going to the loo. They made a LEGO prototype to make sure everything worked.
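For the curious, a minimal sketch of that sort of setup might look something like this (the pin number and endpoint are made up for illustration; this is not Made by Many’s actual code): a reed switch wired between a GPIO pin and ground, the Pi’s internal pull-up enabled, and the door state pushed to a web service whenever it changes.

import time
import requests
import RPi.GPIO as GPIO

DOOR_PIN = 17                                     # assumed GPIO pin for the reed switch
STATUS_URL = "http://example.com/toilet/status"   # hypothetical web endpoint

GPIO.setmode(GPIO.BCM)
GPIO.setup(DOOR_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)

last_state = None
try:
    while True:
        # With the pull-up enabled the pin reads low while the magnet holds
        # the switch closed, i.e. while the door is shut.
        occupied = (GPIO.input(DOOR_PIN) == GPIO.LOW)
        if occupied != last_state:
            requests.post(STATUS_URL, json={"occupied": occupied})
            last_state = occupied
        time.sleep(1)
finally:
    GPIO.cleanup()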

And after applying the switches to the real toilet doors, they ended up with the real thing serving up a result like this when the website was polled.

Of course, it’s axiomatic that if you can overcomplicate something, you should.

So the Made by Many team started looking at what data they could collect without invading people’s lavatorial privacy (with a privacy document being uploaded to GitHub). No identifying information or information about exactly what was going on in the cubicle was collected at any time.  Over three weeks they ended up with sufficient data points to work some SQL magic and be able to detect:

  • if the toilets are free
  • the total number of visits
  • minimum visit duration
  • maximum visit duration
  • average visit duration
  • total visits by hour
  • total visits by day

From which they could infer:

  • the office’s favourite toilet
  • peak times
  • off-peak times
  • an estimated wait time.

And then they made a command-line-style stats page.

And because a job half-done is no job at all, they also made a little toilet notifier to live in the menu bar in Mac OS.

 

They’ve made LED signs. They’ve irritated their colleagues so much that one of them dismantled and abducted one of the reed switches. They’ve demonstrated elegantly that the Internet of Things is always informative, and not always as useful as we think it is. We think this is one of the most entertaining projects we’ve seen in a while. We salute you, Made by Many. And if you’ll excuse me, I drank rather too much coffee after lunch. I’ll just be a minute.

How To Use Gnuplot To Graph Data On The Raspberry Pi

Raspberry Pi Spy -

I was recently testing a sensor which needed calibrating. This involved plotting some data and making some adjustments based on the resulting graph. As a Windows user this is a task I would normally perform in Microsoft Excel or LibreOffice Calc.

In this case I decided to try to do it on the Pi, given that I was already working within the LXDE environment. Could I do some simple plots without getting frustrated with tons of obscure command line syntax?

The answer was yes and came in the form of “gnuplot”, a command-line driven graphing utility. It’s got a lot of options but it only takes five minutes to master the basics.

This tutorial just scratches the surface but aims to provide a quick reference for creating graphs from simple datasets.

Install gnuplot

To install gnuplot on the Raspberry Pi use the following command :

sudo apt-get install gnuplot-x11

You may have to answer “Y” if prompted.

Generate Some Example Data

In order to do the example plots I needed some test data. Here is a simple Python script to create some test data. It creates a plain text file called “mydata.dat”. Each line contains a set of data points where each number is separated with a space character.

#!/usr/bin/python
import math

f = open('mydata.dat', 'w')

# Loop
for degrees in range(720):

  # Generate three data points
  si = math.sin(math.radians(degrees))
  co = 0.5 * math.cos(math.radians(degrees))
  if si > 0:
    sq = 0.6
  else:
    sq = -0.6

  # Write 3 data points to text file
  data = "{} {} {} {}\n".format(degrees, si, co, sq)
  f.write(data)

f.close()

You can download this script directly to your Pi using :

wget https://bitbucket.org/MattHawkinsUK/rpispy-misc/raw/master/gnuplot/gnuplot_generate_data.py

Run it using :

sudo python gnuplot_generate_data.py

The script will create “mydata.dat” and the contents will look a bit like this :

0 0.00 0.50 -0.60
1 0.02 0.50 0.60
2 0.03 0.50 0.60
3 0.05 0.50 0.60
4 0.07 0.50 0.60
5 0.09 0.50 0.60
6 0.10 0.50 0.60
7 0.12 0.50 0.60
8 0.14 0.50 0.60
9 0.16 0.49 0.60
10 0.17 0.49 0.60
....

The first column is just a number sequence. The other three columns are our data. If you were plotting temperature you may only have 1 column of data. I used three for this tutorial to make the example plots a bit more interesting.

Plot The Data

If you haven’t already, launch the graphical environment by typing :

startx

You can launch gnuplot by either typing “gnuplot” in a terminal window or using the shortcut under XXXXXX. You will be presented with a command prompt awaiting your instructions.

To plot data you can enter :

plot "mydata.dat"

This only plots the data from the 2nd column. To plot the two other sets you can type :

plot "mydata.dat" using 1:2, "mydata.dat" using 1:3, "mydata.dat" using 1:4

This tells gnuplot to plot three sets of data using columns 2, 3 and 4.

Many of the customisations you can make to your graph are either made by adding parameters to the “plot” command or issuing “set” commands.

Setting a Plot Title and Axis Labels

To change the plot title you can type the following command :

set title "Example Plot"

To change the data labels you can modify your plot command :

plot "mydata.dat" using 1:2 title "Sin", "mydata.dat" using 1:3 title "Cos"

This can be abbreviated to :

plot "mydata.dat" u 1:2 t "Sin", "mydata.dat" u 1:3 t "Cos"

To change the axis labels you can use the following commands :

set xlabel "Minutes (mins)"
set ylabel "Temperature (degrees)"

Lines and Points

You can also change the way the data points are represented on the graph. By default points are used. You can change the style to “lines” or “linespoints” using the “with” keyword :

plot "mydata.dat" u 1:2 t "Sin", "mydata.dat" u 1:3 t "Cos" with lines

Using lines rather than points makes waveforms like the square wave look much nicer.

Colours

It’s easy to change the colour of your data. Just use the “lt” parameter :

plot "mydata.dat" using 1:2 lt rgb "blue", "mydata.dat" using 1:3 lt rgb "violet"

You can use a range of colours including black, red, green, blue, magenta, cyan, brown and light red.

Customising the Axis Scale

By default the axis will autoscale. Most of the time this is fine but you may want to tweak the axis values to make it look a bit nicer. You can either use :

set yrange [-1.5:1.5]

or modify your plot command :

plot [] [-1.5:1.5] "mydata.dat" u 1:2 t "Sine", "mydata.dat" u 1:3 t "Cosine"

In our example we modified the y-axis and this would give us a plot like this :

You can reset the axis auto-scaling with “set autoscale”.

Here is a final example :

plot [] [-1.5:1.5] "mydata.dat" u 1:2 t "Sine" with lines lw 2, "mydata.dat" u 1:3 t "Cosine" with lines lw 2, "mydata.dat" u 1:4 t "Square" with lines lw 2

In this final example I’ve changed the data to use “lines” and added a “lw” parameter to increase the line width to 2. Modifying the y-axis has also ensured the key doesn’t clash with the data lines :

Other Tips

When you’ve changed settings using the “set” commands you can quickly re-draw your graph using “replot” rather than using the complete plot command. Don’t forget to use the up and down arrow keys to recall previous commands. That saves a lot of typing!
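If you want to go a step further and script your plots rather than typing them interactively, one option (not covered in the original tutorial, so treat this as a sketch) is to write the commands to a file and hand it to gnuplot from Python, sending the output to a PNG file:

import subprocess

script = """
set terminal png size 800,600
set output 'myplot.png'
set title "Example Plot"
set xlabel "Minutes (mins)"
set ylabel "Temperature (degrees)"
plot [] [-1.5:1.5] "mydata.dat" u 1:2 t "Sine" with lines lw 2, \\
     "mydata.dat" u 1:3 t "Cosine" with lines lw 2, \\
     "mydata.dat" u 1:4 t "Square" with lines lw 2
"""

# Write the commands to a script file and run gnuplot in batch mode
with open("myplot.gnu", "w") as f:
    f.write(script)

subprocess.call(["gnuplot", "myplot.gnu"])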

gnuplot can do lots, lots more but hopefully the information here is enough to get you started.

Spend £50 on stuffs and get a free limited-edition Pibow!

Pimoroni -

As part of our 100,000th Pibow celebrations we've made a batch of limited-edition Pibows called Sapphire, Emerald, and Ruby to give away!

These little gems aren't available for sale and will give your Raspberry Pi a truly unique look!

Introducing Pibow Emerald, Sapphire, and Ruby

Each limited-edition case is crafted out of nine unique translucent layers, laser cut from colourful, high-quality cast acrylic. Once stacked, they securely contain a Raspberry Pi while leaving the primary ports, including the CSI camera port, accessible.

Weighing only 92 grams, the case is lightweight and ideal for mounting to any surface. It is held together by nylon bolts, so no tools are required for assembly or disassembly.

  • Limited-edition case not available for sale
  • Slim profile
  • Clear top and base to leave Raspberry Pi visible
  • Etched port markings
  • Lightweight
  • High-quality cast acrylic
  • Protects your Raspberry Pi
  • Leaves primary ports accessible
  • Colourful, durable, and most of all fun!
  • No tools required
How do I get one?

If you place an order containing items worth at least £50 before Sunday (27th April) on the Pimoroni Shop we'll include a free Pibow Sapphire, Ruby, or Emerald (make your choice during checkout) while stocks last!

Free limited-edition Pibow with orders over £50

Pimoroni -

As part of our 100,000th Pibow celebrations we've made a batch of one-off limited-edition Pibows called Sapphire, Emerald, and Ruby to give away!

These little gems aren't available for sale so will give your Raspberry Pi a truly unique look!

Introducing Pibow Sapphire, Emerald, and Ruby

Each limited-edition case is crafted out of nine unique translucent layers.

Each layer is laser cut from colourful, high-quality cast acrylic, and once stacked they securely contain a Raspberry Pi while leaving the primary ports, including the CSI camera port, accessible.

Weighing only 92 grams, the case is lightweight and ideal for mounting to any surface. It is held together by nylon bolts, so no tools are required for assembly or disassembly.

  • Limited-edition case not available for sale
  • Slim profile
  • Clear top and base to leave Raspberry Pi visible
  • Etched port markings
  • Lightweight
  • High-quality cast acrylic
  • Protects your Raspberry Pi
  • Leaves primary ports accessible
  • Colourful, durable, and most of all fun!
  • No tools required
How do I get one?

If you place an order containing items worth over £50 before Sunday (27th April) on the Pimoroni Shop we'll include a free Pibow Sapphire, Ruby, or Emerald (make your choice during checkout) while stocks last!

Sapphire, Ruby, and Emerald limited-edition

Pimoroni -

As part of our 100,000th Pibow celebrations we've made a batch of one-off limited-edition Pibows to give away!

These little gems aren't available for sale anywhere so will give your Raspberry Pi a truly unique look!

Introducing Pibow Sapphire, Emerald, and Ruby

How do I get one of those stunners?!

If you place an order containing items worth over £50 before Sunday (27th April) on the Pimoroni Shop we'll include a free Pibow Sapphire, Ruby, or Emerald while stocks last!

During checkout you'll be asked which model you want, pick the right option and await your limited-edition Pibow glory!

Free goodies for good causes from Pimoroni

Raspberry Pi -

Our  friends at Pimoroni have some good news for you. To celebrate making their 100,000th Pibow case, they’re giving away 512 Pibow Rainbow cases (and some accessories) to good causes.

The Pibow Rainbow – Raspberry Pi case and thing of beauty

Are you a charity, educational establishment or other worthy cause with a bunch of naked Model B Raspberry Pis? Maybe you’re such a place and you want to buy a bunch of Pis with a free case, or upgrade to something a bit more shiny?

All you need to do is comment below with a valid email address, or email support@pimoroni.com with the subject “WE NEED FREE PIBOWS”.

Say briefly who you are (School, Charity, Good Cause), what you do, and why a classroom kit would be really useful to you. Each kit contains 10 lovely Pibow Rainbow cases (or more!) plus a PiHub, Pibrella and PiGlow to play with. Here’s a video of a PiGlow doing its thing to whet your appetite – you’ll find a tutorial in our Resources section to get you programming yours using Python in easy steps.

Paul, who is half of Pimoroni and who also designed the very fruity Raspberry Pi logo, says:

“We love the things people do with the Pi and Pibow already, and now seems like a perfect time for us to spread a bit of colour and joy to the places where the Pi makes the most difference. Learning about computers, electronics and other geekery should be fun and friendly and for everyone.”

Using A Joystick On The Raspberry Pi Using An MCP3008

Raspberry Pi Spy -

While browsing eBay looking at electronics stuff I found a few interesting items to connect to the Pi. The first item was a small 2-axis analogue joystick, similar to the thumb-sticks you would find on a modern games console controller. These modules are cheap and easy to connect to a circuit so I decided to get one. The outputs are analogue, so you need a mechanism for the Pi to read these voltages.

In this post I’ll show how you can you use this device with the Pi. Once working this could be used as an input device for all sorts of projects. Perhaps a Python game written using the Pygame module?

The device I bought was labelled “Keyes_SJoyes”. It consists of two potentiometers which give an analogue voltage based on the horizontal and vertical position of the thumb-stick. Pressing the stick activates a small switch. There are no fancy components on the board, and because it is really just two variable resistors it works fine with 3.3V despite the 5V PCB label.

In order to measure the X and Y voltages I decided to use an MCP3008 10-bit Analogue to Digital Converter. These devices are cheap, easy to set up and allow 8 analogue inputs to be read by the Pi using its SPI interface. In this tutorial we will only need three of its inputs.

See my previous MCP3008 post for details of how I used one to read light levels and temperature.

Breadboard Circuit

Here is my test circuit. The pin-out of my joystick is slightly different to the Sparkfun symbol I used in this diagram but the wire colour coding matches the photos.

Here is the wiring information for the joystick module :

Joystick          Pi/MCP3008              Wire Colour
--------------    ----------------------  -----------
GND (Ground)      Pi GPIO Pin 6 (Ground)  Black
5V (3.3V)         Pi GPIO Pin 1 (3.3V)    White
SW (Switch)       MCP3008 Pin 1 (CH0)     Purple
VRx (X voltage)   MCP3008 Pin 2 (CH1)     Blue
VRy (Y voltage)   MCP3008 Pin 3 (CH2)     Green

The MCP3008 is wired up just as it was in my previous post :

MCP3008          Pi                  Wire Colour
---------------  ------------------  -----------
Pin 1 (CH0)      -                   Purple
Pin 2 (CH1)      -                   Blue
Pin 3 (CH2)      -                   Green
Pin 9 (DGND)     Pin 6 (Ground)      Black
Pin 10 (CS)      Pin 24 (GPIO8)      Orange
Pin 11 (DIN)     Pin 19 (GPIO10)     Yellow
Pin 12 (DOUT)    Pin 21 (GPIO9)      Green
Pin 13 (CLK)     Pin 23 (GPIO11)     Blue
Pin 14 (AGND)    Pin 6 (Ground)      Black
Pin 15 (VREF)    Pin 1 (3.3V)        Red
Pin 16 (VDD)     Pin 1 (3.3V)        Red

In this case we are using three of the analogue inputs. You could read the Switch value using a normal GPIO pin but in this case I decided to use an analogue input for convenience.

The 10K resistor is used to pull the switch input High (3.3V). When the switch is pressed the input is connected to ground (0V). Without the resistor the input would be in an undefined state when the switch wasn’t being pressed and read random values. Give it a try.

Pi SPI Configuration

In order to use the MCP3008 we need to configure the SPI bus on the Pi first. Rather than repeat the instructions here open the Analogue Sensors On The Raspberry Pi Using An MCP3008 tutorial in a new browser window and complete the sections :

  • Enable Hardware SPI
  • Install Python SPI Wrapper
Python Test Script

Hopefully you’ve wired it up correctly and have the SPI interface configured, so we are ready to run a Python script to read the joystick values.

The ADC is 10-bit so it can report a range of numbers from 0 to 1023 (2 to the power of 10 gives 1024 levels). A reading of 0 means the input is 0V and a reading of 1023 means the input is 3.3V. In our circuit the switch will read 3.3V (1023) until it is pressed, when it will read 0V (0). The X and Y joystick values will vary between 0 and 1023 as they are moved from one extreme to another. In the centre position we would expect a value of 511.5. In reality this is going to vary between 509 and 514.
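If you want the readings as voltages rather than raw counts, the conversion is just a scaling by the 3.3V reference. A quick illustrative helper (not part of the script below):

def adc_to_volts(reading, vref=3.3):
    # 10-bit ADC: 0 maps to 0V, 1023 maps to the reference voltage
    return (reading * vref) / 1023.0

print(adc_to_volts(0))       # 0.0
print(adc_to_volts(512))     # about 1.65, roughly mid-scale
print(adc_to_volts(1023))    # 3.3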

#!/usr/bin/python
#--------------------------------------
# This script reads data from a
# MCP3008 ADC device using the SPI bus.
#
# Analogue joystick version!
#
# Author : Matt Hawkins
# Date   : 17/04/2014
#
# http://www.raspberrypi-spy.co.uk/
#
#--------------------------------------
import spidev
import time
import os

# Open SPI bus
spi = spidev.SpiDev()
spi.open(0,0)

# Function to read SPI data from MCP3008 chip
# Channel must be an integer 0-7
def ReadChannel(channel):
  adc = spi.xfer2([1,(8+channel)<<4,0])
  data = ((adc[1]&3) << 8) + adc[2]
  return data

# Define sensor channels
# (channels 3 to 7 unused)
swt_channel = 0
vrx_channel = 1
vry_channel = 2

# Define delay between readings (s)
delay = 0.5

while True:

  # Read the joystick position data
  vrx_pos = ReadChannel(vrx_channel)
  vry_pos = ReadChannel(vry_channel)

  # Read switch state
  swt_val = ReadChannel(swt_channel)

  # Print out results
  print "--------------------------------------------"
  print("X : {} Y : {} Switch : {}".format(vrx_pos,vry_pos,swt_val))

  # Wait before repeating loop
  time.sleep(delay)

You can download this script directly to your Pi using :

wget https://bitbucket.org/MattHawkinsUK/rpispy-misc/raw/master/mcp3008/mcp3008_joystick.py

This can then be run using :

sudo python mcp3008_joystick.py

If everything has worked correctly you should see an output that looks something like :

The switch reading varies but is always >1010 when not pressed and <10 when pressed.

As you move the thumb-stick you should see the X and Y values changing. You can reduce the value of the “delay” variable to increase the update rate.

Now that you can read values from a joystick you just need to think of a project to use it in! You could add an additional module and use another three channels on the MCP3008.

Here are some photos of the test circuit and the thumb-stick joystick module :

You may notice my breadboard in the photos has a few extra wires on it (long red wire, long blue wire and bent black wire). These were left over from previous MCP3008 tutorials and can be ignored.

Here are some other blog posts I found using an analogue 2-axis joystick with a Raspberry Pi :

http://devilqube.blogspot.co.uk/2014/02/analog-thumbstick-and-raspberry-pi.html

https://learn.adafruit.com/cupcade-raspberry-pi-micro-mini-arcade-game-cabinet

Preview the upcoming Maynard desktop

Raspberry Pi -

Some of you will be aware that we’ve been working on a new, more responsive and more modern desktop experience for the Raspberry Pi. We thought you might like an update on where we are with the project.

The chip at the heart of the Raspberry Pi, BCM2835, contains an extremely powerful and flexible hardware video scaler (HVS), which can be used to assemble a stack of windows on the fly for output to the screen. You can see this as being a much more complicated version of the sprite capabilities you may remember from 8- and 16-bit computers and games consoles from the Commodore 64 onward.

The Wayland compositor API gives people like us a way to present the HVS to applications in a standards-based way. Over the last year we’ve been working with Collabora to implement a custom backend for the Weston reference compositor which uses the HVS to assemble the display. Last year we shipped a technology demonstration of this, and we’ve been working hard since then to improve its stability and performance.

The “missing piece” required before we can consider shipping a Wayland desktop as standard on the Pi is a graphical shell. This is the thing that adds task launching and task switching on top of the raw compositor capabilities provided by Wayland/Weston. The LXDE shell we use under X on the Pi doesn’t support Wayland, while those shells that do (such as Gnome) are too heavyweight to run well on the Pi. We’ve therefore been working with Collabora since the start of the year to develop a lightweight Wayland shell, which we’ve christened Maynard (maintaining the tradition of New England placenames). While it’s some distance from being ready for the prime time, we thought we’d share a preview so you can see where we’re going.

Packages for Raspbian are available (this is a work in progress, so you won’t be able to replace your regular Raspbian desktop with this for general use just yet, and you’ll find that some features are slow, and others are missing). Collabora have made a Wiki page with compilation instructions available, and there’s a Git repository you can have a poke around in too.

Mudra: a Braille dicta-teacher

Raspberry Pi -

Sanskriti Dawle and Aman Srivastav are second-year students at the Birla Institute of Technology and Science in Goa. After a Raspberry Pi workshop they decided they wanted to do something more meaningful than just flash LEDs on and off, and set this month’s PyCon in Montreal as their deadline.

Aman Srivastav and Sanskriti Dawle

They ended up producing something really special. Mudra means “sign” in Sanskrit: the Raspberry Pi-based device is a learning tool for visually impaired people, which teaches Braille by translating speech to Braille symbols. Braille literacy among blind people is poor even in the developed world: in India, it’s extremely low, and braille teachers are very, very few. So automating the teaching process – especially in an open and inexpensive way like this – is invaluable.

In its learning mode, Mudra uses Google’s speech API to translate single letters and numbers into Braille, so learners can go at their own speed. Exam modes and auto modes are also available. This whole video is well worth your time, but if you’re anxious to see the device in action, fast-forward to 1:30.

Sanskriti and Aman say:

Mudra is an excellent example of what even programming newbies can achieve using Python. It is built on a Raspi to make it as out-of-the-box as possible. We have close to zero coding experience, yet Python has empowered us enough to make a social impact with Mudra, the braille dicta-teacher, which just might be the future of Braille instruction and learning.

We think Mudra’s a real achievement, and a great example of clean and simple ideas which can have exceptional impact. You can see the Mudra repository on GitHub if you’d like a nose around how things work; we’re hoping that Sanskriti and Aman are able to productise their idea and make it widely available to people all over the world.

Books, the digitising and text-to-speechifying thereof

Raspberry Pi -

A couple of books projects for you today. One is simple, practical and of great use to the visually-impaired. The other is over-complicated, and a little bit nuts; nonetheless, we think it’s rather wonderful; and actually kind of useful if you’ve got a lot of patience.

We’ll start with the simple and practical one first: Kolibre is a Finnish non-profit making open-source audiobook software so you can build a reader with very simple controls. This is Vadelma, an internet-enabled audio e-reader. It’s very easy to put together at home with a Raspberry Pi: you can find full instructions and discussion of the project at Kolibre’s website.

The overriding problem with automated audio e-readers is always the quality of the text-to-speech voice, and it’s the reason that books recorded with real, live actors reading them are currently so much more popular; but those are expensive, and it’s likely we’ll see innovations in text-to-speech as natural language processing research progresses (it’s challenging: people have been hammering away at this problem for half a century), and as this stuff becomes easier to automate and more widespread.

How easy is automation? Well, the good people at Dexter Industries decided that what the Pi community (which, you’ll have noticed, has a distinct crossover with the LEGO community) really needed was a robot that could use optical character recognition (OCR) to digitise the text of a book, Google Books style. They got that up and running with a Pi and a camera module, using the text on a Kindle as proof of concept, pretty quickly.
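The Dexter Industries write-up has the details of their setup; purely as a sketch of the general idea (the tools here are my assumption, not necessarily what they used), you can grab a still with the camera module and run it through the Tesseract OCR engine from Python:

import subprocess
from PIL import Image
import pytesseract   # wrapper around the Tesseract OCR engine

# Capture a still with the camera module (raspistill ships with Raspbian)
subprocess.call(["raspistill", "-o", "page.jpg", "-w", "1280", "-h", "960"])

# Run OCR on the captured image and print the recognised text
text = pytesseract.image_to_string(Image.open("page.jpg"))
print(text)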

But if you’re that far along, why stop there? The Dexter team went on to add Lego features, until they ended up with a robot capable of wrangling real paper books, down to turning pages with one of those rubber wheels when the device has finished scanning the current text.

So there you have it: a Google Books project you can make at home, and a machine you can make to read the books to you when you’re done. If you want to read more about what Dexter Industries did, they’ve made a comprehensive writeup available at Makezine. Let us know how you get on if you decide to reduce your own library to bits.

New Raspbmc update!

Raspbmc -

Hi!

There’s a lot that’s new — after all, it has been three months since the last update. The update before the New Year put the project in solid standing and I felt that the project was maturing. Rather than release small incremental updates each month, I decided to let things rest a while. There wasn’t that much to fix or push — so I took a step back.

In the past couple of months, I’ve been working on a few things, including working with two hardware developers to establish a reference platform for the upcoming linXBMC project, speaking to a prominent Internet streaming company about adding their service in a less ‘hacky’ way and trying to get more resources for the upcoming project. More will be revealed on that soon.

As we get increasingly near to the release of XBMC 13 (Gotham), I’ve done the following:

  • I’ve cleaned up the nightly builds list. Although there were plenty of builds available, it was quite messy, and users were not sure why they should try one build over another.
  • I’m producing 24 hour nightly builds of XBMC 14.0
  • I’ve published all XBMC 13.x Betas — which are installable via Raspbmc Settings
  • I have now prepared all the patches for XBMC 13 (Gotham), meaning that upon its announcement by Team-XBMC as final, I will release a build for Raspbmc a few hours later as an update.
  • Those wishing to stay with Frodo will not be left in the dark however. If you’d like to stay with Frodo, perhaps because it’s tried and tested, or perhaps because you have a shared library and you need to stay on the 12.x series, then not to worry. I have made a stable 12.3 build and that’s pushed as an update today. Even when Raspbmc moves to Gotham, this Frodo build will be kept available to install via Raspbmc Settings.

Here’s what’s new to Raspbmc as a whole:

  • Updated build filesystem to satisfy new XBMC build dependencies and fix a locales issue
  • Fix an overclock setting for ‘Fast’ mode that would force a high (and potentially incompatible) PLL divisor
  • Allow XBMC to adjust task priority for improved playback performance
  • Fix for the Heartbleed vulnerability. Note that this affects both clients as well as public facing servers, so fixing this issue was important.
  • Fix a bug where playback fails when accessing files from WebDAV or HTTPS shares
  • Firmware is updated to resolve issues with CEC on Panasonic sets and bring improvements to playback
  • Updated the standalone image to the latest version of XBMC, kernel and firmware

Here’s what’s new thanks to XBMC. Gotham will bring the following features and improvements:

  • Issues streaming with iOS 7 using AirPlay are now fixed completely
  • In the past couple of months, some new sound cards for Raspberry Pi have come out, so I’m adding support for the following sound cards:
    • Wolfson Microelectronics Raspberry Pi Module – Wolfson’s patches for this had issues, so I’ve done my best to manually resolve these myself. I have reached out to a developer at Wolfson who tells me patches will be released in the future.
    • HiFiBerry sound cards
    • IqAudio sound card
  • Add ALSA support to XBMC Frodo without need for manually enabling in Raspbmc Settings. This approach is done with ‘dvdplayer’ rather than an OpenMAX ALSA component.
  • Improved JPEG to texture decoding (thanks Ben Avison)
  • Hardware accelerated resampling and downmixing (thanks Dom)
  • dvdplayer with OMXPlayer acceleration:
    • this provides full DVD menu support and is suitable for playing back most content. To use ‘dvdplayer’ instead of the standard omxplayer, you need to select ‘Play with’ which can be done by invoking the context menu on the file that you would like to play. This is necessary for sound output with ALSA. omxplayer is being kept as the default player as it is more capable of playing back HD content; dvdplayer with OMX acceleration falls down with Blu-ray playback.
  • I have added support for encrypted DVDs — and in turn, the ability to play straight from DVDs with an external drive
  • ALSA sequencer support added for external sound cards
  • Adjust read buffer factor for better buffering of content and less pausing during playback
  • Ensure the web server is on by default with no username necessary for XBMC Gotham — allowing the user to use their smartphone to control Raspbmc out of the box without additional input devices

To accelerate development on the new project, linXBMC, I’ll be holding a competition soon, stay tuned for an announcement! I think I’ll be changing the name soon, so that may give you a hint as to what the competition might involve! The new content delivery network is coming along soon, and I hope to make the switchover to the new system later this month. The Raspbmc Shop will offer international shipping by the end of the week and more competitive pricing too!

To get the update, all you need to do is reboot your Raspberry Pi. If you’re running an XBMC nightly, be sure to switch to ‘xbmc release’ in Raspbmc Settings to get back on the stable Frodo build. If you’d like to try the vanilla Gotham builds: they are installable via Raspbmc Settings; however I’d recommend the custom Raspbmc build ‘Gotham-Raspbmc-Release’ which has support for sound cards, DvdPlayer support and the JPEG texture handling improvements. The process for playing back with an external sound card is not yet streamlined (it will soon simply involve a Raspbmc Settings based checkbox to enable), so for now you should see this thread for information.

If you enjoy Raspbmc, and this update, and would like to support continued development, you can make a donation here.

As always, enjoy!

MagPi issue 22

Raspberry Pi -

I’m about two weeks late to the party on this one – massive apologies to all at The MagPi. It’s been a bit busy around here so far this month. Right now, Picademy’s underway in the office space we’ve got set up as a classroom, and 24 teachers are busy making blooping noises with Sonic Pi while Clive booms at them in Teachervoice. It’s distracting but curiously enjoyable.

Alongside the preparation for Picademy, this month we’ve seen the launch of this new website, and the announcement about the new Compute Module. While all this was going on, the April edition of The MagPi came out, and I didn’t notice because I was too busy glueing Raspberry Pi logos on sticks and sending boxes of jam to Johnny Ball (true story).

 As usual, The MagPi is full of wonderful things like internet-enabled garage doors, night lights that repel under-bed goblins, reviews, competitions, tutorials and much more. My favourite article this month discusses a solar cell (this month’s cover star) that tracks the sun to provide 140% more energy than a static cell. Go and read it online for free: you can also order a printed copy for your personal library or for your school. Thanks MagPi folks – I promise to be more timely about letting people know about next month’s issue!

Lumsing 11000mAh Li-ion Battery Power Bank Test

Raspberry Pi Spy -

Over the last two years I’ve acquired a set of portable USB power banks. These are great for powering the Pi, and the larger ones can keep a Pi running for many hours. In previous posts I’ve tested standard AA batteries, a generic Li-ion Power Pack and a RAVPower 10400mAh Power Bank.

Now it’s the turn of the Lumsing 11000mAh power bank. This device is slightly unusual in that it has 5 USB outputs. Yes, five. This gives you the chance to run 5 devices from it. In a future post I’ll see what I can run off it, but for this test I’ll just be using a single 1A rated port.

Lumsing Power Bank (PB-AS008)

The package contained the following items :

  • Power Bank unit
  • 2 USB-to-MicroUSB cables
  • Carrying pouch
  • Instruction manual

and has the following features

  • 11000mAh capacity
  • 40.7Wh
  • 5 USB output ports (0.5,1,1,1.3,2.1A)
  • 1 MicroUSB input port for charging
  • 4 stage blue LED status bar
  • On/Off switch

The battery had some charge in it when I first switched it on but I charged it fully before I attempted to use it. As it charged the blue LEDs showed the progress.

Test Setup

Once it was charged I got ready for the test. The Pi was set up with the following equipment :

  • Raspberry Pi (Rev 2)
  • BerryClip addon board
  • SD card with Raspbian
    “Wheezy” (2013_02_09)
  • Lumsing 11000mAh Power Bank
  • USB to MicroUSB cable
    (as supplied with the Power Bank)
  • Python script as used in my AA battery shootout post

The Pi was networked and I used Putty to connect to it from a PC. This terminal was used to set off the Python script with the following command :

sudo python battery_uptime_test.py

The current uptime was updated in the Putty window and I just left it running. When the power runs out, the last time shown in the Putty window gives me the total uptime.
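The script itself isn’t reproduced in this post, but the idea is simple enough that a minimal stand-in (an assumption on my part, not the original battery_uptime_test.py) would look something like this: print the system uptime every minute, so the last value visible in the terminal when the power dies is the total runtime.

import time

while True:
    # /proc/uptime holds the seconds since boot as its first field
    with open("/proc/uptime") as f:
        seconds = float(f.read().split()[0])
    hours, remainder = divmod(int(seconds), 3600)
    minutes = remainder // 60
    print("Uptime : {} hours {} minutes".format(hours, minutes))
    time.sleep(60)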

Predicted Battery Stamina

My eBay 12000mAh Power Bank lasted 18 hours 40 minutes (1120 minutes).

The RAVPower 10400mAh Power Bank lasted 17 hours 55 minutes (1075 minutes).

So, scaling by the RAVPower’s slightly lower stated capacity, a good guess would be (11000/10400) × 1075 = 1137 minutes. That would give us 18 hours and 57 minutes.
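As a quick sanity check of that arithmetic:

# Scale the RAVPower's measured runtime by the ratio of the stated capacities
ravpower_minutes = 1075
estimate = (11000 / 10400.0) * ravpower_minutes
print(round(estimate))                   # 1137 minutes
print(divmod(int(round(estimate)), 60))  # (18, 57) -> 18 hours 57 minutes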

Results

The Lumsing pack lasted 19 hours 22 minutes (1162 minutes). This is almost 30 minutes longer than the prediction. The Pi isn’t doing much in this test but it showed the Lumsing was in the same league as the RAVPower.

The Lumsing 5-port 11000mAh External Backup Battery Pack is available from Amazon.com :

Lumsing 11000mAh 5 x USB External Battery Pack

Photos

Here is a set of additional photos I took of the battery pack and my Pi setup:
