
Beacon Mountain for Linux* Release Notes


New Features

0.5.0

  • Android* Development Tools (ADT Bundle)
  • Android* NDK
  • Android* Design downloads
  • Intel® Graphics Performance Analyzers (Intel® GPA)
  • Intel® Threading Building Blocks (Intel® TBB)
  • Intel® Integrated Performance Primitives (Intel® IPP)
  • Samples

System Requirements

  • Supported on Ubuntu 12.04 with all available updates applied
  • Run Android emulators with the “Use Host GPU” emulation option turned off

Meshcentral.com - Android Remote Desktop


    In the latest version of the Mesh Agent for Android, Rick was also able to get remote desktop working. This lets you take control of an Android device using MeshCentral.com's web-based "Desktop" tab. The catch is that the Android mesh agent must be running as root on the device to do this. This is not typical at all, but if you happen to have a rooted device and can install the mesh agent this way, you get this extra bonus feature!

    Below is a demonstration video of Android remote desktop on YouTube. The demonstration is on a small PC, but the feature works on tablets and phones. The screen-capture process is not as fast as on Windows or Linux, but Rick opted for an approach that is as compatible as possible with most devices on the market today.

    As usual, we are always looking for feedback.

    Enjoy!
    Ylian
    meshcentral.com


Optimizing H.265/HEVC Decoder on Intel® Atom™ Processor-Based Platforms


    Introduction

    Watching video is the top usage for mobile devices. Multimedia processing is compute intensive and has a big impact on battery life and user experience. LCD resolutions on mobile devices have improved from 480p to 720p and now 1080p. End users want to watch high-quality videos, but for online video providers, such as Youku, iQiyi, and LeTV, purchasing network bandwidth becomes more expensive every year.

    H.265/HEVC (High-Efficiency Video Coding), introduced last year, is the latest video codec standard developed by ISO/IEC and ITU-T. H.265/HEVC doubles the compression ratio of the previous H.264/AVC standard at the same subjective quality. HEVC technology helps online video providers deliver high-quality video with less bandwidth, making it the next video codec revolution.

    Android has a multimedia API, but ISVs often find it difficult to use, and it doesn't always meet their requirements. I will show a case study of how to optimize an H.265/HEVC decoder for Intel® Atom™ processor-based platforms using the YASM Modular Assembler and several Intel® software tools to obtain even better performance.

    Android multimedia codec introduction

    As we know, Android provides a standard multimedia player interface for developers to play video in the Java* layer. The MediaCodec API was introduced in Android 4.1. Developers can use MediaCodec APIs to customize their players in the Java* layer. The Android multimedia workflow is shown below:

    Figure 1. Android* multimedia workflow

    MediaPlayerService chooses between AwesomePlayer and NuPlayer based on the video data source and format. AwesomePlayer supports local "fd://" files and some integrated "http://" URL streaming. NuPlayer was introduced in Android 4.0 to support streaming; it mainly supports "rtsp://" URLs and some segmented "http://" (m3u8) videos.

    AwesomePlayer is based on the TimedEventQueue model, while NuPlayer is based on ALooper, AHandler, and AMessage, so NuPlayer can respond quickly when playing streaming video. ACodec was introduced to support NuPlayer but does not provide open APIs. MediaCodec also calls ACodec.

    AwesomePlayer, NuPlayer, and ACodec all directly call OMXCodec, the new hardware multimedia interface, which seems isolated from the rest of Android. Unfortunately, OMXCodec is not cross-platform compatible, but it does call the hardware platform driver.

    For Android on Intel Atom processor-based platforms, we have to keep pace with Google's developments to optimize the standard multimedia players. So if you can use the standard Android multimedia player to play video on Intel Atom processor-based platforms, you can achieve good performance, because the hardware decoder is readily available on this platform.

    In the PRC, more than 20 multimedia apps are available in the Android market, as shown below:

    Figure 2: online video market in the PRC

    Although Intel encourages ISVs to use the optimized Android multimedia players for best performance on Intel Atom processor-based platforms, most online video players in the market do not adopt the standard Android multimedia players; they prefer to use open source or develop their own codecs.

    The reasons are as follows:

    1. The standard Android multimedia player only supports MP4 and 3GP formats; it can't fully support many other popular formats, such as RM, RMVB, FLV, MKV, DIVX, and WMV. These formats have to be decoded by software codecs.
    2. There will be some compatibility problems when using the Android multimedia APIs to parse the streaming packages. ISVs usually claim that Android multimedia’s parsing solution is only a lab solution.
    3. ISVs also claim that Google updates the Android OS often, usually changing the Android multimedia APIs along with it. These changes force the ISVs to change their players and most ISVs are reluctant to do that.
    4. The APIs of standard Android multimedia players are not flexible, so ISVs have a difficult time using them to customize their players.

    Given these reasons, ISVs prefer to modify open source projects such as FFMPEG or develop their own codecs to play videos. Because they don't have much experience optimizing these open source codecs on Android for Intel® Architecture (IA), ISVs end up with software decoder solutions that support all the video formats but carry higher CPU loads, resulting in poor performance.

    We usually encourage ISVs to use the following tools to help optimize their online video players.

    • Yasm Modular Assembler.
    • Intel® C++ Compiler (Intel® ICC).
    • Intel® Streaming SIMD Extensions (Intel® SSE).
    • Intel® Threading Building Blocks (Intel® TBB).
    • Intel® Graphics Performance Analyzers (Intel® GPA).

    These tools allow ISVs to optimize the open source or their own SW codec on IA-based platforms and obtain good performance. We have optimized Youku, LeTV, and QQ video on Lenovo K900 for better performance. The CPU load of the same video decreased from an average 40% to 8% after optimization[1]. The ISVs and the OEMs are all satisfied with the performance.

    Case study: Optimizing an H.265/HEVC player on Intel® Atom™ processor-based platforms

    Strongene is a Chinese company focusing on kernel video coding technology. It provides advanced H.265/HEVC encoder/decoder codecs, which have been adopted by Xunlei Kankan online video. Their encoder/decoder solution is integrated into FFMPEG open source for ISVs to use.

    We used the Intel® VTune™ tools to profile Strongene's H.265/HEVC decoder. Then we optimized it using the toolsets explained in the next three subsections, achieving fast decoding speed and low CPU occupancy on Intel Atom processor-based platforms.

    1. Optimized by YASM & Intel® C++ Compiler (Intel® ICC).

    Instead of compiling the optimized ASM assembly code in open source FFMPEG with the default Android compiler, we used YASM and the Intel C++ Compiler.

    YASM is a complete rewrite of the NASM assembler under the "new" BSD License, and it can assemble the SIMD-optimized assembly code for x86 platforms. Developers can download and install the YASM assembler from http://yasm.tortall.net. To use it, modify the configure.sh file to enable the YASM and ASM options before compiling FFMPEG, as shown below:

    Figure 3: Modify the FFMPEG configure file
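The change amounts to passing the x86 and assembly options through to FFMPEG's configure script. A sketch of such an invocation follows; the exact flag names vary across FFMPEG versions, so verify them against `./configure --help` before use:

```shell
# Configure FFMPEG for x86 with YASM-assembled SIMD code enabled.
# Flag availability differs between FFMPEG releases -- check ./configure --help.
./configure \
    --arch=x86 \
    --target-os=linux \
    --enable-cross-compile \
    --enable-yasm \
    --enable-asm
```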

    We also encouraged the ISVs to use the Intel® ICC tool to compile the native code.

    2. Optimized with Intel® Streaming SIMD Extensions (Intel® SSE) instructions:

    Profiling with the Intel® VTune™ tools, we found that Strongene's codec used only C code to implement YUV2RGB, so the performance was not optimal.

    Intel Atom processor-based platforms support the Intel SSE instruction sets, including MMX, MMXEXT, Intel SSE, SSE2, SSE3, SSSE3, and SSE4. Enabling Intel SSE code in open source FFMPEG can greatly improve YUV2RGB performance.

    We enabled the SSSE3 compiler option in FFMPEG, which uses the MMXEXT code path, as shown in the code snippet below.

    Figure 4: Enable SSE code in the FFMPEG
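To illustrate the kind of speedup SIMD brings to color conversion, here is a minimal sketch (not FFMPEG's or Strongene's actual code) that computes the R channel of a YUV-to-RGB conversion for 8 pixels at a time with SSE2 intrinsics, using the fixed-point approximation R = clamp(Y + 1.402·(V−128)), with 1.402 ≈ 90/64:

```cpp
#include <emmintrin.h>  // SSE2 intrinsics
#include <cstdint>

// Scalar reference: R = clamp(Y + 1.402*(V-128)), with 1.402 approximated
// by the fixed-point constant 90/64.
static inline uint8_t red_scalar(uint8_t y, uint8_t v) {
    int r = y + ((90 * (v - 128)) >> 6);
    return (uint8_t)(r < 0 ? 0 : (r > 255 ? 255 : r));
}

// SSE2 version: computes the R channel for 8 pixels per call.
void red_sse2(const uint8_t* y, const uint8_t* v, uint8_t* out) {
    const __m128i zero = _mm_setzero_si128();
    const __m128i c90  = _mm_set1_epi16(90);
    const __m128i c128 = _mm_set1_epi16(128);
    // Widen 8 unsigned bytes to 8 16-bit lanes.
    __m128i Y = _mm_unpacklo_epi8(_mm_loadl_epi64((const __m128i*)y), zero);
    __m128i V = _mm_unpacklo_epi8(_mm_loadl_epi64((const __m128i*)v), zero);
    // r = (90 * (V - 128)) >> 6, computed in 16-bit lanes.
    __m128i r = _mm_srai_epi16(_mm_mullo_epi16(_mm_sub_epi16(V, c128), c90), 6);
    r = _mm_add_epi16(Y, r);
    // packus saturates each 16-bit lane into [0, 255], matching the scalar clamp.
    _mm_storel_epi64((__m128i*)out, _mm_packus_epi16(r, r));
}
```

Processing eight pixels per iteration, with the clamp folded into the pack instruction, is where SIMD speedups of the magnitude reported below come from.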

    3. Optimized by Intel® Threading Building Blocks (Intel® TBB) tool:

    Running the Intel® VTune™ tool, we found that Strongene's codec created four threads. However, the fastest thread had to wait for the slowest one, leaving cores idle.

    Intel® SSE alone can only work on a single core. Using Intel® TBB together with Intel® SSE lets the code run on multiple cores, improving performance.

    We modified their multi-threaded code to split the work into multiple tasks, then used the Intel® TBB tool to allocate tasks to the idle cores and fully utilize all of them.

    Intel TBB can be downloaded from http://threadingbuildingblocks.org/download.
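The task-splitting idea can be sketched as follows. TBB itself would schedule such tasks with tbb::parallel_for and work stealing; this hypothetical example uses std::async from the standard library so it stands alone, with decode_slice standing in for decoding one slice of a frame:

```cpp
#include <future>
#include <numeric>
#include <vector>
#include <algorithm>
#include <cstddef>

// Hypothetical per-slice work; stands in for decoding one slice of a frame.
static long decode_slice(const std::vector<int>& data, size_t begin, size_t end) {
    return std::accumulate(data.begin() + begin, data.begin() + end, 0L);
}

// Split one frame's work into small tasks so idle cores pick up the slack.
// tbb::parallel_for would do this with work stealing; std::async is used
// here only to keep the sketch self-contained.
long decode_frame_parallel(const std::vector<int>& data, unsigned tasks) {
    std::vector<std::future<long>> futs;
    size_t chunk = (data.size() + tasks - 1) / tasks;
    for (unsigned t = 0; t < tasks; ++t) {
        size_t b = t * chunk;
        size_t e = std::min(data.size(), b + chunk);
        if (b >= e) break;
        futs.push_back(std::async(std::launch::async,
                                  decode_slice, std::cref(data), b, e));
    }
    long total = 0;
    for (auto& f : futs) total += f.get();  // join and combine partial results
    return total;
}
```

The key point is that no task is pinned to a thread: when one chunk finishes early, that core immediately picks up another chunk instead of idling, which is what TBB's scheduler automates.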

    H.265/HEVC Decoder Performance Comparison[2]

    In our testing, optimization with YASM and the Intel® ICC tools yielded up to a 1.5x performance improvement, optimization with Intel® SSE yielded up to a 6x improvement over the C code, and optimization with Intel® TBB yielded up to a 2.6x improvement.

    We used the Intel® Graphics Performance Analyzers (Intel® GPA) tool to test the refresh rate during video playback. Without optimization, the average refresh rate when playing 1080p HEVC video on the Lenovo K900 was 11.7 FPS (frames per second); after optimization with the above methods, the average refresh rate on the same video reached up to 29.6 FPS.

    Figure 5: Performance comparison

    When tested with the optimized H.265/HEVC decoder on a tablet based on the platform codenamed Bay Trail, performance was better than on the Lenovo K900 phone: the refresh rate reached 52.6 FPS.

    If we cap the refresh rate at 24 FPS on the Bay Trail tablet, the CPU workload when playing the 1080p video is less than 35%. So we readily recommend Strongene's HEVC decoder solution to the popular online video providers in the PRC for commercial use.

    Summary

    Multimedia apps are very popular on Android phones and tablets, and their performance is very important to the user experience. A series of tools can optimize the performance of Android apps on Intel® Atom™ processor-based platforms, and the H.265/HEVC decoder can be optimized with these tools to obtain even better performance.

    Recommendations for ISVs in the new Android world:

    Don’t hesitate to optimize your multimedia apps with the YASM, Intel® ICC, Intel® SSE, and Intel® TBB tools; these trusted tools can provide amazing performance boosts.

    References

    1. http://www.strongene.com/en/homepage.jsp
    2. http://yasm.tortall.net
    3. http://threadingbuildingblocks.org

     


    [1] Software and workloads used in performance tests may have been optimized for performance only on Intel microprocessors. Performance tests, such as SYSmark* and MobileMark*, are measured using specific computer systems, components, software, operations and functions. Any change to any of those factors may cause the results to vary. You should consult other information and performance tests to assist you in fully evaluating your contemplated purchases, including the performance of that product when combined with other products.
    Configurations: [describe config + what test used + who did testing]. For more information go to http://www.intel.com/performance

    [2] Software and workloads used in performance tests may have been optimized for performance only on Intel microprocessors. Performance tests, such as SYSmark* and MobileMark*, are measured using specific computer systems, components, software, operations and functions. Any change to any of those factors may cause the results to vary. You should consult other information and performance tests to assist you in fully evaluating your contemplated purchases, including the performance of that product when combined with other products.
    Configurations: [describe config + what test used + who did testing]. For more information go to http://www.intel.com/performance

App Privacy: Is it really an “anomaly”?

    This week, Google's chief internet evangelist, Vint Cerf, spoke on privacy at an FTC event, suggesting that privacy might not necessarily be sustainable, especially in regard to apps and social networks:
     
    “Our social behavior is also quite damaging with regard to privacy," Cerf says. He gives an example how a person could be exposed doing something that they wanted to keep secret by being tagged in the background of a stranger's photo — a photo they never expected to be caught in. "The technology that we use today has far outraced our social intuition, our headlights. ... [There's a] need to develop social conventions that are more respectful of people’s privacy."
     
    "We are gonna live through situations where some people get embarrassed, some people end up going to jail, some other people have other problems as a consequence of some of these experiences," Cerf said. More respectful privacy conventions will likely develop as we move forward, he says, but for now, "This is something we're gonna have to live through. I don't think it’s easy to dictate this."– “Google’s chief internet evangelist says “privacy may actually be an anomaly”, ReadWriteWeb
     
    We’ve all probably experienced this lack of privacy on a social network – perhaps you were tagged in a photo from an event that you didn’t realize would be public, or perhaps information was shared that wasn’t exactly for public consumption. This kind of thing is becoming more commonplace, unfortunately, especially as privacy policies are changed often without keeping the consumer informed of changes that might impact their public-facing personas. 
     
    Privacy in regard to apps, especially when apps ask for information in order to function, is also becoming a common issue, especially given the glut of apps out there, the number of apps that consumers download on their devices, and the lack of oversight for app privacy. Most people actually expect quite a bit of privacy from their apps, even though, for the most part, that privacy is only perceived. According to a recent study:
     
    “46 percent, for example, believe that carriers should not store location information for any length of time at all, while 59 percent believe data on a phone is "about as private" as data on a personal computer — which isn't necessarily the case depending on how a phone is loaded up.”
     
    Is app privacy an illusion – an “anomaly”, as Mr. Cerf suggests in his talk referenced above? After all, we give data to our favorite social networking sites which then use that data to find friends, events, and organizations for us to continue to interact with. When we use a certain large search engine along with its peripheral services, we are essentially giving it the “key to the castle” with how much data we’re allowing it to see and use. When we go shopping, the pair of boots that we liked is going to show up on a popup ad sometime in the future of our Web browsing. This sounds potentially intrusive when written down in black and white, but in reality, this is something that is expected as part of the overall customization and personalization of the services we use every day – both web-based and app-based. 
     
    Even though most people realize that many apps and social networking services do gather information, like location, names, usernames, and other data, users still value their privacy very highly and want control over how that data is collected, used, or shared. To make privacy a reality and not just an “anomaly”, apps need reasonable disclosure of the information they plan to collect, with a clearly written menu that helps users make thoughtful choices about what information they want to share. While information gathering is the standard, it doesn’t have to be overly intrusive; developers should respect users, with safety measures in place to protect sensitive data.
     
    Is there really a problem, or are a few consumers just overreacting? A recent study from the Privacy Rights Clearinghouse strongly suggests the concern is justified:
     
    • Many apps send data in the clear – unencrypted -- without user knowledge.
    • Many apps connect to several third-party sites without user knowledge.
    • Unencrypted connections potentially expose sensitive and embarrassing data to everyone on a network.
    • Nearly three-fourths, or 72%, of the apps we assessed presented medium (32%) to high (40%) risk regarding personal privacy.
    • The apps that presented the lowest privacy risk to users were paid apps. This is primarily because they don't have to rely on advertising to make money, which means the data is less likely to be made available to other parties.
     
    The Electronic Frontier Foundation (EFF) has a very thoughtful “Mobile User Privacy Bill of Rights” that includes these concrete, practical recommendations that developers can take into account to guard user privacy:
     
    • Anonymizing and obfuscation: Wherever possible, information should be hashed, obfuscated, or otherwise anonymized. A "find friends" feature, for example, could match email addresses even if it only uploaded hashes of the address book.
    • Secure data transit: TLS connections should be the default for transferring any personally identifiable information, and must be the default for sensitive information.
    • Secure data storage: Developers should retain information only for the duration necessary to provide their service, and the information they store should be properly encrypted.
    • Internal security: Companies should provide security not just against external attackers, but against the threat of employees abusing their power to view sensitive information.
    • Penetration testing: Remember Schneier's Law: "Anyone, from the most clueless amateur to the best cryptographer, can create an algorithm that he himself can't break." Security systems should be independently tested and verified before they are compromised.
    • Do Not Track: One way for users to effectively indicate their privacy preferences is through a Do Not Track (DNT) setting at the operating system (OS) level.
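The anonymizing recommendation can be sketched in a few lines. This hypothetical example matches contacts by digest so raw addresses never leave the device; std::hash is used only to keep the sketch self-contained and is NOT cryptographic, so a real app would substitute a salted cryptographic hash such as SHA-256:

```cpp
#include <functional>
#include <string>
#include <vector>
#include <algorithm>
#include <cctype>

// Normalize then hash an address so only the digest is ever uploaded.
// NOTE: std::hash is not cryptographic -- illustration only; a real app
// should use a salted SHA-256 or similar.
std::string email_digest(std::string addr) {
    std::transform(addr.begin(), addr.end(), addr.begin(),
                   [](unsigned char c) { return static_cast<char>(std::tolower(c)); });
    return std::to_string(std::hash<std::string>{}(addr));
}

// "Find friends" match: the server compares digests and never sees addresses.
bool is_known(const std::string& addr,
              const std::vector<std::string>& server_digests) {
    const std::string d = email_digest(addr);
    return std::find(server_digests.begin(), server_digests.end(), d)
           != server_digests.end();
}
```

The point of the design is that a breach of the server's digest list exposes far less than a breach of a raw address book would.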
     
    This “hard coding” of privacy boundaries is known as “privacy by design”, and as consumers become increasingly concerned and informed about what, exactly, their apps are asking for, it’s going to become the norm rather than the exception. Developers can get even more help from a convenient privacy policy generator called App Privacy, which walks you through a wizard about how your app will use data and then generates a policy that lives right in the app itself. One user made their privacy policy as obnoxiously privacy-intruding as possible to see what would happen:
     

    “To truly test AppPrivacy, I decided to make my “HappyFunTimes” app as obnoxious as possible, so I checked the box that said I wanted to send marketing messages to my users’ contacts. “Warning!” a pop-up box read. “If you are going to access the user’s contacts database and use it for marketing purpose, you must have their permission first. Also, you should gain consent from any contact you plan to send marketing messages to.”” – “How coders should make their apps more privacy-friendly”, BusinessWeek.com
     
    How concerned are you about privacy – both for yourself and for your users? How are you providing privacy boundaries in the apps you are building? Share with us in the comments below.
     
     

     

Havok Ends Beta, Officially Launches Project Anarchy; Announces PC Exporter


    Official Release Expands Free Offering with Enhanced Prototyping and Autodesk® Scaleform®; PC Exporter Announced at $499 per Seat

    Havok™, a leading provider of 3D game development technology, recently announced the first full release of its free end-to-end mobile game development engine, Project Anarchy™. The new version adds a range of features and enhancements, including Autodesk Scaleform® for free, full support for Tizen OS, a new packaging tool for application data management, and expanded rapid prototyping through both Lua binding extensions and support for a new vSceneViewer mobile app that lets developers quickly preview and share their work.

    Additionally, Havok announced the details of its Project Anarchy PC Exporter upgrade. Allowing users to utilize the familiar Project Anarchy toolset and release their mobile projects on PC, the Project Anarchy PC Exporter will ship in Q1 2014 and will be priced at $499 per seat.

    “We are thrilled to take Project Anarchy out of beta with this release,” said Ross O’Dwyer, Head of Developer Relations at Havok.  “Since making the beta available this summer, we’ve been listening to the users and working to expand our offering, not only with the features, but also the business models the community has asked for. With the accelerated prototyping and powerful UI experiences offered by this first full Project Anarchy release, we’re looking forward to even more developers coming on board and creating anarchy with us.”

    Project Anarchy includes Havok’s Vision Engine together with access to Havok’s industry-leading suite of Physics, Animation and AI tools as used in cutting-edge franchises such as Skyrim™, Halo, Assassin’s Creed®, and Uncharted. With features like an extensible C++ architecture, a flexible asset management system, advanced Lua debugging, rapid prototyping with the vSceneViewer functionality, full integration with fmod® and Autodesk Scaleform, and customizable game samples and tutorials, Project Anarchy offers game developers the ability to quickly iterate on their ideas and create incredible gaming experiences.

    For further information on Project Anarchy, developers can visit www.projectanarchy.com and www.havok.com

    About Havok:

    As a leading provider of game development technologies, Havok has over 13 years of experience servicing the most demanding technology requirements of leading customers in the commercial games and entertainment industry. A combination of superior technology and dedication to delivering industry-leading support to its customers has led to the company’s technologies being used in over 500 of the best known and award-winning titles, including Halo 4, Halo: Spartan Assault, Assassin’s Creed 4, Metal Gear Rising: Revengeance, Injustice: Gods Among Us, DmC: Devil May Cry, The Elder Scrolls V: Skyrim, Guild Wars 2, and Modern Combat 4: Zero Hour.

    Havok works in partnership with the world’s best known publishers, developer studios and developer teams, including Microsoft Game Studios®, Sony Computer Entertainment Inc., Nintendo®, Ubisoft®, NC Soft, Rockstar, EA, Bethesda™, Insomniac, Relic, Bungie, Naughty Dog, Evolution Studios and Guerrilla Games. Its cross-platform, professionally supported technology is available for the Xbox One® and Xbox 360™ video game and entertainment systems, PlayStation®4 and PlayStation®3 computer entertainment systems, Wii™, Wii U, PlayStation Vita®, Android™, iOS, Windows® 8 (Desktop, Tablet and Phone), Windows 7, Apple Mac OS and Linux.

    Havok’s products have also been used to drive special effects in movies such as Harry Potter, Watchmen, James Bond, and The Matrix. Havok has offices in Dublin (Ireland), San Francisco, Seoul, Tokyo, Shanghai, and Germany. Havok is an Intel® owned company.

    Autodesk and Scaleform are registered trademarks or trademarks of Autodesk, Inc., and/or its subsidiaries and/or affiliates in the USA and/or other countries.

     

Developing and Optimizing Android* Applications for the Intel® Atom™ Platform


    Abstract

    This document introduces detailed methods for developing an Android application on the Intel Atom platform and migrating one to it, and describes best-known methods for developing applications with the Android Native Development Kit (NDK) and optimizing performance. Android developers can use this document as a reference for creating high-quality applications for Intel architecture.

    1. Classification of Android applications

    Android applications can be classified into two types, as shown in Figure 1.

    • Dalvik applications, which include Java* code and use only the official Android SDK API, plus the necessary resource files, such as .xml and .png, compiled into an APK file.
    • Android NDK applications, which include Java code and resource files as well as C/C++ source code and, sometimes, assembly code. All native code is compiled into a dynamically linked library (a .so file) that the main Java program then calls through a JNI mechanism.


    Figure 1: Two types of Android applications

    2. Android Native Development Kit

    2.1 Introduction

    The Android Native Development Kit (NDK) is a companion tool to the Android SDK. The NDK is a powerful tool for Android application development because it:

    • Builds the performance-critical portions of your applications in native code. With Java code, the Java-based source must be interpreted into machine language by a virtual machine. Native code, in contrast, is compiled and optimized into binary directly before execution. With appropriate use of native code, you can build high-performance code into your application, such as hardware video encoding and decoding, graphics processing, and arithmetic operations.
    • Reuses legacy native code. C/C++ code is compiled into a dynamic library that Java code can call through a JNI mechanism.

    2.2 Tools overview

    During development, you can use the Intel® Hardware Accelerated Execution Manager (HAXM) to improve the performance of the Android emulator. HAXM is a hardware-assisted virtualization engine (hypervisor) that uses Intel® Virtualization Technology to speed up emulation of Android applications on a host machine. Combined with the Android x86 emulator images provided by Intel and the official Android SDK Manager, HAXM delivers a faster Android emulation experience on systems with Intel® Virtualization Technology enabled. For more information on HAXM, visit: http://software.intel.com/es-es.

    2.3 Installing HAXM

    Use the Android SDK Manager to install HAXM (recommended) or install it manually by downloading the installer from the Intel website. If you want it to update automatically, install it with the Android SDK Manager, as shown in Figure 2. [1]


    Figure 2: Installing Intel HAXM with the Android SDK Manager

    You can also download an installation package suitable for your host platform from http://software.intel.com/es-es/android and follow the step-by-step instructions to install it.

    2.3.1 Configuring HAXM

    Running HAXM requires the Android x86 system image provided by Intel. You can download the image with the Android SDK Manager or manually from the Intel® Developer Zone website.

    After the images are installed successfully, the Intel® x86 Android emulator images run automatically using the “emulator-x86” binary provided with the Android SDK. Intel® Virtualization Technology accelerates the Android emulator, speeding up your development process.

    3. Developing and porting NDK applications for the Intel Atom architecture

    3.1 Developing NDK applications for Intel Atom processor-based devices

    After installing the NDK successfully, take a few minutes to read the documents in the /docs/ directory, especially OVERVIEW.html and CPU-X86.html, to familiarize yourself with the NDK mechanism and how to use it.

    NDK application development can be divided into the five steps shown in Figure 3:


    Figure 3: NDK application development process

    The hello-jni demo is used to illustrate these five steps. You can find this demo in the NDK Root\samples\hello-jni folder [5]. Hello-jni is a simple application included in the NDK that gets a string from a native method in a shared library and uses it in the application's UI.

    3.1.1. Creating the native code

    Create a new Android project and place your native source code under /jni/. The project contents are shown in Figure 4. This demo includes a simple native function named Java_com_example_hellojni_HelloJni_stringFromJNI(). As the source code shows, it returns a simple string from JNI.


    Figure 4: Creating the native code
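    The native function in the hello-jni sample is essentially the following (it compiles against the JNI headers shipped with the NDK, so it is shown here as an illustrative fragment):

    ```c
    #include <string.h>
    #include <jni.h>

    /* The function name encodes the Java package, class, and method:
       com.example.hellojni.HelloJni.stringFromJNI() */
    jstring
    Java_com_example_hellojni_HelloJni_stringFromJNI(JNIEnv* env, jobject thiz)
    {
        return (*env)->NewStringUTF(env, "Hello from JNI !");
    }
    ```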

    3.1.2 Creating the ‘Android.mk’ makefile

    NDK applications are built for the ARM platform by default. To build NDK applications for the Intel Atom platform, you need to add APP_ABI := x86 to the makefile.


    Figure 5: Creating the makefile
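    The build files for the sample are short; a sketch follows. Note that in a standard NDK project APP_ABI normally lives in jni/Application.mk, though it can also be passed on the ndk-build command line:

    ```make
    # jni/Android.mk -- builds hello-jni as a shared library
    LOCAL_PATH := $(call my-dir)
    include $(CLEAR_VARS)
    LOCAL_MODULE    := hello-jni
    LOCAL_SRC_FILES := hello-jni.c
    include $(BUILD_SHARED_LIBRARY)

    # In jni/Application.mk (or on the ndk-build command line):
    # APP_ABI := x86
    ```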

    3.1.3 Compiling the native code

    Build the native code by running the ‘ndk-build’ script from the project directory. The script is located in the top-level NDK directory. The result is shown in Figure 6.


    Figure 6: Compiled native code

    The build tools automatically copy the required shared libraries to the appropriate location in the application’s project directory.

    3.1.4 Calling the native code from Java

    Once the shared library is deployed successfully, you can call the function from Java. The code is shown in Figure 7. A public native function stringFromJNI() is declared in the Java code, and the shared library is loaded with System.loadLibrary().


    Figure 7: Calling native code from Java

    3.1.5 Debugging with GDB

    To debug an NDK application with GDB, the following conditions must be met:

    • The NDK application is built with 'ndk-build'
    • The NDK application is marked 'debuggable' in AndroidManifest.xml
    • The NDK application runs on Android 2.2 (or higher)
    • Only one target device or emulator is running
    • The adb directory has been added to PATH

    Use the ndk-gdb command to debug the application. You can set a breakpoint or single-step through the code to track how a variable's value changes, as shown in Figure 8.


    Figure 8: Debugging the NDK application with GDB
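    A typical session, started from the project directory with a device attached, might look like this (commands are illustrative):

    ```shell
    ndk-gdb --start    # launch the app and attach gdb
    # then, at the (gdb) prompt:
    #   break Java_com_example_hellojni_HelloJni_stringFromJNI
    #   continue        # run until the native function is hit
    #   step            # single-step through the native code
    ```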

    3.2 Porting existing NDK applications to Intel Atom processor-based devices

    This section assumes you have an Android application written for the ARM platform and need to port it before deploying it on the Intel Atom platform.

    Porting Android applications to the Intel Atom platform is similar to the development process. The steps are shown in Figure 9.


    Figure 9: Porting Android applications to the Intel Atom platform

    3.2.1 Porting Dalvik applications

    Dalvik applications can run directly on Intel Atom processor-based devices. The user interface needs to be adjusted for the target device. For a high-resolution device, such as a tablet with a resolution of 1280×800 or higher, the default memory allocation may not meet the application's requirements and can prevent it from launching. Increasing the default memory allocation is recommended for high-resolution devices.

    3.2.2 Porting Android NDK applications

    Porting NDK applications is somewhat more complicated than porting Dalvik applications. NDK applications can be divided into three types according to the following properties of their native code:

    • It consists only of C/C++ code with no hardware dependencies
    • It uses a third-party dynamically linked library
    • It includes assembly code that is tightly bound to non-Intel architecture platforms

    Native code consisting only of hardware-independent C/C++ code

    1. Recompile the native code so the application runs successfully on the Intel Atom platform.
    2. Open the NDK project, find the Android.mk file, add APP_ABI := armeabi armeabi-v7a x86 to it, and rebuild the native code with ndk-build.
    3. If there is no Android.mk file, build the project with the command ndk-build APP_ABI="armeabi armeabi-v7a x86".
    4. Repackage the application with the supported x86 platforms included.

    If the native code uses a third-party dynamically linked library, the shared library must be recompiled as an x86 version for the Intel Atom platform.

    If the native code includes assembly code tightly bound to non-Intel architecture platforms, that code must be rewritten in Intel architecture assembly or in C/C++.

    4. Best practices for developing native code

    4.1 Enforcing memory alignment

    Because of differences among architectures, platforms, and compilers, the same data structure can have different sizes on different platforms. Without enforced memory alignment, inconsistent data sizes can cause loading errors. [2]

    The following example shows the size of the same data structure on different platforms:

    struct TestStruct {
        int mVar1;
        long long mVar2;
        int mVar3;
    };

    This is a simple structure with three member variables named mVar1, mVar2, and mVar3.

    • mVar1 is an int and uses 4 bytes
    • mVar2 is a long long int and uses 8 bytes
    • mVar3 is also an int and uses 4 bytes

    How much space does this structure require on the ARM and Intel Atom platforms?

    Figure 10 shows the compiled data sizes on the ARM and Intel Atom platforms with the default compiler switches. ARM automatically applies 8-byte (double-word) alignment and the structure occupies 24 bytes, while on x86 it occupies 16 bytes.


    Figure 10: Memory allocated with default compiler flags

    The 8-byte (64-bit) variable mVar2 results in a different layout for TestStruct because ARM requires 8-byte alignment for 64-bit variables such as mVar2. In most cases this causes no problems, since building for x86 vs. ARM requires a full rebuild anyway.

    However, a size mismatch can appear if an application serializes classes or structures. For example, suppose an ARM application creates a file and writes a TestStruct to it. If you later load that data from the file on an x86 platform, the size of the structure in the application differs from the size in the file. Similar memory alignment problems can occur with network traffic that expects a specific memory layout.

    The GCC compiler option "-malign-double" produces the same memory alignment on x86 as on ARM.


    Figure 11: Memory allocated when the -malign-double flag is added
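    The layout difference can be checked directly. This small standalone program (an illustration, not from the original article) prints the structure's size and member offsets; with default flags it reports 16 bytes on 32-bit x86 and 24 bytes wherever 8-byte alignment of long long applies (ARM, or x86 with -malign-double):

    ```c
    #include <assert.h>
    #include <stddef.h>
    #include <stdio.h>

    struct TestStruct {
        int mVar1;
        long long mVar2;
        int mVar3;
    };

    int main(void)
    {
        /* 32-bit x86 aligns long long to 4 bytes by default (sizeof == 16);
         * ARM pads mVar2 to an 8-byte boundary and the struct to 24 bytes. */
        printf("sizeof(struct TestStruct) = %zu\n", sizeof(struct TestStruct));
        printf("offsetof mVar2 = %zu\n", offsetof(struct TestStruct, mVar2));
        printf("offsetof mVar3 = %zu\n", offsetof(struct TestStruct, mVar3));

        assert(sizeof(struct TestStruct) >= 16 && sizeof(struct TestStruct) <= 24);
        return 0;
    }
    ```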

    4.2 Porting NEON* instructions to SSE [3]

    4.2.1 NEON

    ARM's NEON* technology is used mainly in multimedia applications such as smartphones and HDTV. According to ARM's documentation, its 128-bit SIMD engine, an extension of the ARM Cortex*-A series, delivers at least 3x the performance of the ARMv5 architecture and at least 2x that of its successor, ARMv6. For more information about NEON technology, visit: http://www.arm.com/products/processors/technologies/neon.php.

    4.2.2 SSE: Intel's equivalent

    SSE is the Streaming SIMD Extension for Intel architecture (IA). Currently, the Intel Atom processor supports SSSE3 (Supplemental Streaming SIMD Extensions 3) and earlier versions, but does not yet support SSE4.x. SSE is a 128-bit engine that supports packed floating-point data. The execution model began with MMX technology, and SSx is essentially the newer generation that replaces the need for MMX. For more information, see "Volume 1: Basic Architecture" of the Intel 64 and IA-32 Architectures Software Developer's Manuals. The SSE overview in section 5.5 describes the instructions for SSE, SSE2, SSE3, and SSSE3. These data operations move precision-based, packed floating-point values between XMM registers, or between XMM registers and memory. The XMM registers were designed to replace the MMX registers.

    4.2.3 NEON to SSE at the assembly level

    While using the Intel Software Developer's Manual as a cross-reference for the individual SSE(x) mnemonics, also review the various SSE assembly-level instructions at: http://neilkemp.us/src/sse_tutorial/sse_tutorial.html. Use the table of contents to access the code examples or to review the background material.

    Similarly, the following ARM manual provides information on NEON and contains small assembly code snippets in section 1.4, "Developing for NEON": http://infocenter.arm.com/help/topic/com.arm.doc.dht0002a/DHT0002A_introducing_neon.pdf.

    Key differences between NEON and SSE assembly code:

    • Endianness. Intel supports only little-endian assembly, while ARM supports either big- or little-endian byte order (ARM is bi-endian). In the code examples provided, the ARM code is little-endian, like Intel's. Note: there may be compiler implications on ARM; for example, GCC for ARM provides the -mlittle-endian and -mbig-endian flags. For more information, see http://gcc.gnu.org/onlinedocs/gcc/ARM-Options.html.
    • Granularity. In the simple assembly examples referenced above, compare the SSE (Intel) instruction ADDPS with NEON's VADD.ix, where x = 8 or 16. Note that the latter encodes the granularity of the data to be handled as part of the mnemonic itself.

    Note: these are not the only differences; you may find others when comparing NEON and SSE.

    4.2.4 NEON to SSE at the C/C++ level

    Many API difficulties can arise when porting NEON C/C++ code to SSE. The assumption here is that no inline assembly is used, only true C/C++ code. NEON also provides native C libraries (intrinsics). Although these are C code, they cannot run on the Intel Atom platform and must be rewritten.
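    As a small illustration of the kind of rewrite involved (a sketch, not from the original article): the NEON intrinsic vaddq_f32(), which adds four packed floats, maps naturally onto the SSE intrinsic _mm_add_ps(). A real port must also review alignment assumptions and the target's instruction-set level:

    ```c
    #include <assert.h>
    #include <xmmintrin.h>  /* SSE intrinsics; the NEON original uses <arm_neon.h> */

    /* NEON version (for reference):
     *   float32x4_t vc = vaddq_f32(va, vb);
     * SSE equivalent: */
    void add4(const float *a, const float *b, float *c)
    {
        __m128 va = _mm_loadu_ps(a);             /* load 4 packed floats */
        __m128 vb = _mm_loadu_ps(b);
        _mm_storeu_ps(c, _mm_add_ps(va, vb));    /* c[i] = a[i] + b[i] */
    }

    int main(void)
    {
        float a[4] = {1, 2, 3, 4}, b[4] = {10, 20, 30, 40}, c[4];
        add4(a, b, c);
        assert(c[0] == 11 && c[3] == 44);
        return 0;
    }
    ```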

    5. Optimizing application performance

    5.1 Performance tuning

    During coding, use the following techniques to optimize your application's performance on the Intel Atom platform.

    5.1.1 Inline frequently used short functions

    Inlining works best for small functions, such as accessors for private data members. Short functions are sensitive to function-call overhead. Longer functions spend proportionally less time in the call/return sequence and benefit less from inlining. [4]

    Inlining a function saves the overhead of:

    • The function call (including parameter passing and placing the object's address on the stack)
    • Preserving the caller's stack frame
    • Setting up the new stack and frame
    • Communicating the return value
    • Restoring the old stack and frame
    • The return
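    A minimal standalone illustration (hypothetical code, not from the article): a one-line accessor declared static inline lets the compiler remove all of the overhead listed above at each call site:

    ```c
    #include <assert.h>

    /* A short accessor is a good inlining candidate: the call overhead
     * would rival the cost of the body itself. */
    typedef struct { int value; } Counter;

    static inline int  counter_get(const Counter *c)        { return c->value; }
    static inline void counter_add(Counter *c, int n)       { c->value += n; }

    int main(void)
    {
        Counter c = {0};
        for (int i = 0; i < 1000; ++i)
            counter_add(&c, 2);   /* inlined: no call/return per iteration */
        assert(counter_get(&c) == 2000);
        return 0;
    }
    ```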

    5.1.2 Use float instead of double

    The FPU (floating-point unit) is the part of a system specially designed to perform operations on floating-point numbers, such as addition, subtraction, multiplication, division, and square root. Some systems (especially older, microcode-based architectures) can also compute transcendental functions such as exponential or trigonometric calculations. Current processors perform these calculations with software library routines. In most modern general-purpose computer architectures, one or more FPUs are integrated into the CPU. [6]

    The Intel Atom platform has the FPU enabled. In most cases, using float instead of double speeds up computation and saves memory bandwidth on Intel Atom processor-based devices.
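    The bandwidth argument is easy to quantify. This standalone snippet (illustrative) shows that a float array needs half the memory, and therefore half the memory bandwidth, of the equivalent double array:

    ```c
    #include <assert.h>
    #include <stdio.h>

    int main(void)
    {
        assert(sizeof(float) == 4 && sizeof(double) == 8);

        /* A million-element array: float uses half the footprint. */
        enum { N = 1000000 };
        printf("1M floats:  %zu bytes\n", (size_t)N * sizeof(float));
        printf("1M doubles: %zu bytes\n", (size_t)N * sizeof(double));

        /* If float precision (~7 significant digits) suffices for your
         * data, prefer it on Atom for both speed and bandwidth. */
        return 0;
    }
    ```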

    5.1.3 Multithreaded coding

    Multithreaded code lets you exploit the Intel Atom processor's Hyper-Threading support to increase throughput and improve overall performance. For more information on multithreading, see: http://www.intel.com/content/www/es/es/architecture-and-technology/hyper-threading/hyper-threading-technology.html.
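    As a minimal sketch of the idea (illustrative code using POSIX threads, which the NDK supports): split the work across two threads so both hardware threads can stay busy:

    ```c
    #include <assert.h>
    #include <pthread.h>

    /* Split a sum across two threads. */
    typedef struct { const int *data; int n; long sum; } Chunk;

    static void *sum_chunk(void *arg)
    {
        Chunk *c = (Chunk *)arg;
        c->sum = 0;
        for (int i = 0; i < c->n; ++i)
            c->sum += c->data[i];
        return NULL;
    }

    long parallel_sum(const int *data, int n)
    {
        Chunk lo = {data, n / 2, 0};
        Chunk hi = {data + n / 2, n - n / 2, 0};
        pthread_t t;
        pthread_create(&t, NULL, sum_chunk, &lo);  /* first half on a new thread */
        sum_chunk(&hi);                            /* second half on this thread */
        pthread_join(t, NULL);
        return lo.sum + hi.sum;
    }

    int main(void)
    {
        int v[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        assert(parallel_sum(v, 8) == 36);
        return 0;
    }
    ```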

    5.2 Building high-performance applications with compiler flags

    As you know, the native code in Android applications is built with GCC. But do you know GCC's default target? It is the Pentium® Pro processor. If you compile native code without any extra flags, the resulting binary is tuned for the Pentium Pro platform, yet most Android applications run on the Intel Atom platform instead. It is strongly recommended that you add flags appropriate to your target platform. When compiling for the Intel Atom platform, you can add the following recommended flags:

    -march=atom
    -msse4
    -mavx
    -maes

    (As noted in section 4.2.2, the Intel Atom processor supports instruction sets up to SSSE3, so verify your target's capabilities before enabling flags such as -msse4, -mavx, or -maes.)

    For more information about compiler parameters, see: http://gcc.gnu.org/onlinedocs/gcc/i386-and-x86_002d64-Options.html.
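    In an NDK project these switches would typically be supplied through the build files, for example (a sketch; check each flag against your toolchain and target ISA before enabling it):

    ```makefile
    # jni/Application.mk
    APP_ABI := x86

    # jni/Android.mk (excerpt)
    LOCAL_CFLAGS += -march=atom
    ```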

    6. Conclusion

    This document has described how to develop and optimize Android applications on Intel Atom platforms, as well as how to develop and port NDK applications.

    A summary of the key points:

    • Most Android applications can run directly on the Intel Atom platform. NDK applications need their native code recompiled, and any assembly code included in the application must be rewritten.
    • Take full advantage of Intel architecture features to improve the performance of your Android application.
    • Add platform-specific compiler switches to make the code GCC generates more effective.

    References

    1. http://software.intel.com/es-es/articles/installation-instructions-for-intel-hardware-accelerated-execution-manager-windows/
    2. http://software.intel.com/es-es/blogs/2011/08/18/understanding-x86-vs-arm-memory-alignment-on-android/
    3. http://software.intel.com/es-es/articles/ndk-android-application-porting-methodologies/
    4. http://msdn.microsoft.com/es-es/library/1w2887zk.aspx
    5. http://developer.android.com/sdk/ndk/index.html
    6. http://es.wikipedia.org/wiki/Unidad_de_coma_flotante

    About the author

    Dawei is an application engineer focused on mobile application enabling, including Android application development and optimization for x86 devices and HTML5 web application development. He also has extensive experience in application user interface and user experience design.

    Notices

    INFORMATION IN THIS DOCUMENT IS PROVIDED IN CONNECTION WITH INTEL PRODUCTS. NO LICENSE, EXPRESS OR IMPLIED, BY ESTOPPEL OR OTHERWISE, TO ANY INTELLECTUAL PROPERTY RIGHTS IS GRANTED BY THIS DOCUMENT. EXCEPT AS PROVIDED IN INTEL'S TERMS AND CONDITIONS OF SALE FOR SUCH PRODUCTS, INTEL ASSUMES NO LIABILITY WHATSOEVER AND INTEL DISCLAIMS ANY EXPRESS OR IMPLIED WARRANTY RELATING TO SALE AND/OR USE OF INTEL PRODUCTS, INCLUDING LIABILITY OR WARRANTIES RELATING TO FITNESS FOR A PARTICULAR PURPOSE, MERCHANTABILITY, OR INFRINGEMENT OF ANY PATENT, COPYRIGHT, OR OTHER INTELLECTUAL PROPERTY RIGHT.

    UNLESS OTHERWISE AGREED IN WRITING BY INTEL, THE INTEL PRODUCTS ARE NOT DESIGNED NOR INTENDED FOR ANY APPLICATION IN WHICH THE FAILURE OF THE INTEL PRODUCT COULD CREATE A SITUATION WHERE PERSONAL INJURY OR DEATH MAY OCCUR.

    Intel may make changes to specifications and product descriptions at any time, without notice. Designers must not rely on the absence or characteristics of any features or instructions marked "reserved" or "undefined." Intel reserves these for future definition and shall have no responsibility whatsoever for conflicts or incompatibilities arising from future changes to them. The information here is subject to change without notice. Do not finalize a design with this information.

    The products described in this document may contain design defects or errors known as errata which may cause the product to deviate from published specifications. Current characterized errata are available on request.

    Contact your local Intel sales office or your distributor to obtain the latest specifications before placing your product order.

    For copies of documents that have an order number and are referenced in this document, or other Intel literature, call 1-800-548-4725 or visit: http://www.intel.com/design/literature.htm

    Software and workloads used in performance tests may have been optimized for performance only on Intel microprocessors. Performance tests, such as SYSmark and MobileMark, are measured using specific computer systems, components, software, operations, and functions. Any change to any of those factors may cause the results to vary. You should consult other information and performance tests to assist you in fully evaluating your contemplated purchases, including the performance of that product when combined with other products.

    Any software source code reprinted in this document is furnished under a software license and may only be used or copied in accordance with the terms of that license.

    Intel, Atom, Pentium, and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.

    Copyright © 2012 Intel Corporation. All rights reserved.

    *Other names and brands may be claimed as the property of others.

    **This sample source code is released under the Intel Sample Source Code License Agreement.

  • education
  • Developers
  • Intel AppUp® Developers
  • Android*
  • Education
  • Intel® Atom™ Processors
  • Phone
  • URL
  • Education
  • The fifth-generation Intel® Atom™ mobile platform: Merrifield


    In February 2013, Intel® officially introduced the new Intel® Atom™ Clover Trail+ platform, and a number of devices appeared on it: phones and tablets running Windows 8 and Android. Now, as the year draws to a close, it is fair to say that Intel® has established itself as a promising player in the mobile device market, and going forward it will work to expand its market presence by improving its products in line with its well-known Tick-tock strategy.

    What do we know today?

    The successor to the Clover Trail+ SoC will be a mobile solution based on the Merrifield SoC (22 nm), already shown to the public this September at the Intel Developer Forum.

    What is interesting is its Silvermont microarchitecture, with significantly lower power consumption (some analysts cite figures 4.7 times lower than its predecessor, Saltwell). Also, unlike its predecessors, which default to the original Intel® Atom™ microarchitecture for compile-time optimizations, Merrifield in particular, and Silvermont in general, get a dedicated compiler switch: -march=slm for gcc/Android NDK/LLVM, and -xatom_sse4.2 for ICC, enabling the following optimizations:

    Besides moderate OOOE capabilities, Silvermont is better than 
    the Atom v1 uarch in the following ways 
    - branch misprediction penalty is 3 cycles shorter 
    - cache misses are now non-blocking (up to 8 outstanding misses) 
    - far fewer instructions are microcoded 
    - can decode 2 x87 instructions in one cycle 
    - no penalty decoding non-branch following branch 
    - no ADD vs LEA penalty for generating effective addresses 
    - dedicated integer multiplier in integer section. 
    - special handling of zeroing idioms 
    - store forwarding is significantly improved


    The BayTrail SoC, intended for use in tablet computers, will also be based on this microarchitecture, Intel® Atom™ Silvermont.

    A detailed description can be found in the manuals linked above; the architectures and a comparison between them are also covered quite accessibly on Tom's Hardware Guide.

    Judging by how devices with previously announced SoCs have reached the market, the first flagship devices from vendors will most likely appear next year. One can hope for models as interesting and solid as the Lenovo K900 smartphone was for Clover Trail+.

    Following the Intel® Atom™ Silvermont microarchitecture, Intel® plans to announce Intel® Atom™ Airmont (14 nm) in 2014, about which, for now, one can only speculate.

    Devices on the Anniedale/Cherryview SoCs, based on the Airmont microarchitecture, will carry the code names Moorefield (for smartphones) and Cherry Trail (for tablets). Morganfield/Willow Trail, planned for 2015, will be based on the next-generation Goldmont architecture on the Broxton SoC.

  • merrifield
  • clovertrail
  • airmont
  • silvermont
  • baytrail
  • atom
  • Intel
  • moorefield
  • morganfield
  • willow trail
  • cherry trail
  • goldmont
  • broxton
  • Icon Image: 

  • Android*
  • Phone
  • Tablet
  • Android*
  • Microsoft Windows* 8
  • Android in China: An Undiscovered App Market?


    A new report from Chinese search engine company Baidu this week revealed mobile trends with Android users in China. A few of the most intriguing stats:

    • There are now over 270 million active daily Android users in China
    • This reflects 13% overall growth in Q3 2013, compared to a 55% growth rate in the same quarter a year ago
    • Most Android device sales (52%) come from users upgrading to new Android phones; 48% are users purchasing a smartphone for the very first time
    • A large part of Android growth (45%) is focused in rural areas and small cities
    • Android owners spend upwards of 150 minutes a day on their smartphones (an increase of 26 minutes from the previous year), checking their devices an average of 53 times a day
    • 44% of Android users in China use Wi-Fi for their access to the Internet, especially for video. 31% get their information from 2G networks, and 23% use 3G.
    • App downloads for Chinese Android device owners are growing exponentially: the average user downloaded 10.5 apps per month in Q3 2013; the previous year, it was 8.2 apps monthly
    • 15% of Android users in China install at least one new app a day vs. 11% in Q3 2012
    • 59% use app stores to download their apps, while 13% use online app searches and 21% use their PCs to sideload apps onto their Android devices

    It’s worth pointing out that Chinese smartphone data is infamously difficult to obtain, mostly because a centralized app store (such as Google Play) does not work in the Chinese market and most Android devices don’t use Google services, therefore there is very little data for Google to work with. According to Chinese Android app store Wandoujia, more than 70 percent of Android smartphones in China do not offer Google Play services, which makes this data from Baidu quite valuable.  China and SE Asia are considered by most data measurement services to be the world’s largest smartphone markets; with over one billion Android devices being activated worldwide, the numbers coming out of  both China and SE Asia are mind-boggling when compared to global usage:

    From January to September 2013, consumers from Singapore, Malaysia, Thailand, Indonesia, Vietnam, Cambodia and the Philippines spent $10.8 billion on nearly 41.5 million smartphones, according to a new report from market research agency GfK Asia. Last year, they spent $7.54 billion on 25.8 million smartphones. – “Smartphone sales surge in SE Asia”, NextWeb.com

    However, since the road to app publishing and app stores is quite different from what most developers are used to, it may be prudent to target third-party app distributors (such as the aforementioned Wandoujia).

    Lost revenue?

    The lack of a centralized app store in China is potentially damaging to developer revenues simply because of lack of access:

    "Reports estimate that China’s game industry brought in $9.7 billion in revenue last year across all segments, and the figure could grow to $21.7 billion by 2017. However, a latest report released by Chinese Android app store Wandoujia – which monitors trends in China’s mobile market based on its downloads — notes that foreign games, even if they are popular in China, are losing out on potential revenue by linking up with Google’s billing system. Wandoujia estimates that more than 70 percent of Android smartphones in China lack Google Play services. This has led to Chinese users being unable to make in-app purchases when playing foreign games such as Clash of Clans, which use Google’s in-app billing system." -  “Foreign games in China lose potential revenue by using Google in app billing”, The Next Web

    The sheer numbers of smartphone users – especially Android users – in China are enticing for developers looking to expand into new markets.  But it’s not just a matter of “getting in there”; there are many more factors that influence a successful app in the Chinese market as a recent article from Venture Beat  points out:

    • China’s feature phone market has dropped dramatically in the past 18 months, and 90%+ of those consumers are upgrading to entry-level Android smartphones. Since the capabilities of most Chinese smartphones still trail behind their EU/US counterparts, developers should expect to re-optimize their apps to run well on them.
    • China has several hundred Android app stores, many devoted to specific kinds of apps or users, while others hawk knock-off or hacked apps. However, just around 20 of all these app stores count as major players in the overall ecosystem. 
    • Many Western developers opt to work with a third-party company with a local presence in China and focus their distribution efforts only on China’s very largest app stores. 
    • For foreign developers operating without a local partner, it can be quite frustrating to contend with monetization, let alone the Chinese government’s 30% tax hit.

    In addition, while developers are used to certain app publishing/development guidelines, in China, things are very different. For example, each app must not only be localized for language, but also customized for Chinese social media products and cloud services. Between Chinese publishers and the government, developers should expect a hit of at least 30% off the top of their revenues. There are many, many rules and regulations you’ll have to follow, and each app store has their own specific guidelines. And how about getting paid? ReadWrite.com has this to say on the subject:

    “In many ways, China is still a cash-based society. This makes it difficult for developers to make money through app store purchases. In a similar way, Google Play is not easily accessible in China, which hampers Android app monetization. This means you’ll need to integrate the local online payment options that are popular, such as Alipay. You’ll also want to work directly with China’s three mobile carriers—China Telecom, China Unicom and China Mobile—because they allow in-app payments directly billed on the consumer’s carrier payment plan. About 75% of app payment in China comes through direct carrier billing. International digital payments processor Fortumo has a relationship with all three carriers, which creates a doorway for Western developers.” – “7 things developers don’t know about the China mobile market but should”, ReadWrite.com

    Before you decide that it sounds like too much trouble to get an app into the Chinese market, consider these recent stats from App Annie:

    • According to Niko Partners’ 2013 Chinese Mobile Games Market Report, in 2012 there were 192 million mobile gamers in China. This year there will be 288 million. In 2014 there will be 390 million.
    • Mobile gaming is the fastest growing segment of the entire Chinese games market.
    • Even in smaller cities where not everyone owns a phone, the ratio of phones to people is more than 125%.
    • Mobile app users are spending 40% more time on their devices playing games in 2013 than they did in 2012, and they visit their favorite games 41% more often than in 2012
    • The official App Store is the only legitimate point of entry for iOS games, but for Android there are so many options that it is difficult to know where to turn.

    Have you made the move to the Chinese app market? What was your experience? Please share your knowledge here in the comments.

     

  • android
  • China app market
  • Android apps
  • Icon Image: 

  • Android*
  • Windows*
  • Developers

  • User Group Training Materials and Resources


    This is training material that Intel will cover in your meetup.

    1. Intel Android Tools: This presentation demonstrates how to use Beacon Mountain, the Intel Hardware Accelerated Execution Manager (HAXM), Threading Building Blocks (TBB), Graphics Performance Analyzer (GPA), and the Intel Integrated Performance Primitives (IPP). In addition it provides links to the download locations for all of the tools discussed. (PDF)
    2. Debugging NDK Applications on IA: This presentation gives an overview of how to use the Android Debug Bridge (adb) and the new helper script (ndk-gdb) for debugging NDK-generated machine code. In addition, an example using the Valgrind tool suite is covered. (PDF)
    3. Intel Android Platforms Targeting: This presentation gives an overview on how one can target their new or existing application for IA Android. Packaging APKs for multiple CPU architectures is discussed as well as 3rd party library considerations. (PDF)
    4. Testing Native Android Applications: This presentation gives an example of testing your application using the Intel HAXM emulator. (PDF)
    5. Using WiDi on Android: This presentation covers what WiDi is and how one can use it to differentiate their application on IA Android devices. This implementation is being proposed in a W3C working group. (PDF)
    6. NFC Application Development on Android: This presentation walks one through the steps of using the NFC capability available on many devices. An example of how to read and format an NFC tag is illustrated. (PDF)
    7. Migrate Phone Apps to Tablet: This presentation covers some of the strategies to consider when migrating your phone app to a tablet. (PDF)
    8. Android on IA Resources. (DOC)
    9. Intel HTML5 for Android presentation (PDF)
  • Developers
  • Android*
  • Android*
  • URL
  • Intel® Graphics Performance Analyzers for Android* OS


    Introduction

    The Intel® Graphics Performance Analyzers (Intel® GPA) suite is a set of powerful graphics and gaming analysis tools that are designed to work the way game developers do, saving valuable optimization time by quickly providing actionable data to help developers find performance opportunities from the system level down to the individual draw call.

    Intel® GPA now supports Intel® Atom™ based phones running the Google* Android* OS. This version of the toolset allows application and driver engineers to optimize their OpenGL* ES 1.0/2.0 workloads on these phones using your choice of development systems: Windows*, OS X*, or Ubuntu* OS. With this capability, as an Android* developer you can:

    • get a real-time view of over two dozen critical system metrics covering the CPU, GPU, and OpenGL* ES API
    • conduct a number of graphics pipeline experiments to isolate graphics bottlenecks

    To download a free copy of Intel GPA, browse to the Intel GPA Home Page, and click the Download button.

    Next Steps

    For more details on getting started with Intel GPA on the Android* OS, please refer to this article. You can also find more details on using Intel GPA by browsing the product's online help. The Intel GPA home page also contains links to product information, including information about analyzing DirectX* games on the Windows* OS platform, and related products that work with Intel GPA.

    If you want to be notified of Intel GPA product updates, use this link.

    As always, we welcome your suggestions, so please let us know what we can do to improve your use of these tools by posting your comments on the Intel GPA Support Forum.

    *Other names and brands may be claimed as the property of others.

  • vcsource_type_techarticle
  • vcsource_product_gpa
  • vcsource_domain_gamedev
  • vcsource_index
  • Developers
  • Android*
  • Android*
  • Intel® Graphics Performance Analyzers
  • Game Development
  • Phone
  • URL
  • Pros and Cons of HTML5 Cross-Platform Android* Mobile App Development Tools on Intel® Processor-based Devices


    The success of mobile applications depends on reaching target customers regardless of the device they use. And cross-platform mobile application development tools help developers do that. A typical customer might use an Android smartphone to access the Internet on the go, a Windows* PC at work, or an Apple iPad* in a café.

    Instead of using complex technologies like Objective-C*, Xcode*, and iOS* APIs for Mac*, iPhone*, and iPad; Windows API, Visual Studio*, and C# for Windows PC; and Android APIs, Java*, and Eclipse* for Android smartphones, it is much easier and more effective to use HTML5 cross-platform mobile application development tools. The key is to have the right features in your application to deliver a perfect user experience. Second, the app should have an inherent ability to work on the new devices that continually hit the market.

    HTML5, a markup language, has several features that allow it to run on devices designed to consume less power. For example, laptops, smartphones, and tablets have processors that consume significantly less power than desktop machines. HTML5 is the preferred markup language used today to structure and present content on the Internet. Strategy Analytics predicts that more than a billion HTML5-compatible devices will be in use worldwide by the end of 2013. HTML5 can be used to develop interactive web pages for deployment across multiple operating system platforms and browsers. Hence, cross-platform mobile app development tools based on HTML5 are appealing to mobile app developers.

    Pros and cons of HTML5 cross-platform mobile application development tools

    Pros:

    HTML5 is relatively easy to learn and use compared to most of the technologies mentioned above. Organizations can save money by writing apps that work on all operating systems instead of reworking the app for each OS. The code can also serve as a reference for projects based exclusively on Android, Windows, or iOS.

    Other advantages of its use are:

    • Allows development of applications that easily adapt to different resolutions, screen size, aspect ratio, and orientation.
    • Enables leveraging advanced capabilities like GPS, camera, and accelerometers in modern devices and deliver rich, contextual user experience across a range of devices like smartphones and tablets.
    • Applications can be deployed as local web applications and can also be viewed in browsers.
    • Mobile apps can use the same monetization and distribution channels as native apps.
    • Apps are not restricted by browser window frames and can run in a full-screen mode.
    • Users have full control over devices and display screen real estate.
    • Centralized code can be modified to interface with several devices.
    • JavaScript*, HTML, and CSS are the backbone of the Internet and web applications; therefore, migration of development tools to mobile devices is easier.
    • One-time exchange integration coding allows mobile applications to work similarly on all platforms, regardless of the device.

    Cons:

    Whenever there is a change or an added feature in iOS, Android, or Windows, the HTML5 cross-platform mobile app development tool has to reflect or factor in the change and make necessary adjustments to the code. This means cross-platform mobile application tools will always lag behind official SDKs.

    Other potential downsides are:

    • Sometimes delivery of mobile apps can take time as developers have to write code for each platform.
    • Developers work in languages that are not native to platforms, hence the efficacy of the code depends on the translation engine. Inefficient coding techniques and bloated code are a common occurrence.
    • Difference in platform runtimes often causes complications in development of cross-platform mobile applications.
    • The HTML5 standard evolved in a relatively short time, thus causing discrepancies in the implementation of CSS attributes, HTML tags, and JavaScript APIs. At times, these features behave differently on different platforms. However, there are tools available to address these discrepancies.

    These are some of the generic pros and cons of HTML5 cross-platform mobile app development tools.
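
One common way to cope with the API discrepancies listed among the cons is runtime feature detection, so the app degrades gracefully when a platform lacks a capability. A minimal sketch; the function names are illustrative, not part of any library mentioned above:

```javascript
// Detect capabilities at runtime instead of assuming the platform.
// Guards on typeof keep the code loadable outside a browser too.
function supportsTouch() {
  // True only when a window object exists and exposes touch events.
  return (typeof window !== "undefined") && ("ontouchstart" in window);
}

function supportsAudioFormat(mime) {
  // canPlayType returns "probably", "maybe", or "" (definitely not).
  var a = (typeof Audio !== "undefined") ? new Audio() : null;
  return !!(a && a.canPlayType && a.canPlayType(mime) !== "");
}
```

An app can then branch on these checks once at startup rather than sprinkling platform conditionals through the code.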

    It is clear that the pros far outweigh the cons when it comes to HTML5 cross-platform mobile app development tools. HTML5 will continue to remain ‘the trend’ in the app development world. With this in mind, Intel has recently launched an HTML5 Development Environment to help developers create great mobile applications for all target devices, especially Intel® processor-based devices running Android.

    Pros and cons of HTML5 tools for developing Android cross-platform mobile apps for Intel processor-based devices

    With respect to Intel processor-based devices running Android, the HTML5 cross-platform development environment offers all the pros mentioned above, but there are several unique cons important to point out.

    Cons:

    • The HTML5 cross-platform mobile app development tool is a one-size-fits-all model. Over a period of time user needs change, and when this happens, the optimum solution would be to develop pure native apps that deliver great user experience and high device fidelity. This is possible only with a native platform.
    • Best-of-breed applications are not possible with HTML5 cross-platform development tools. Applications developed using these tools are essentially customized web sites that look and feel like custom applications.

    Even with the constraints listed above, HTML5 tools for Android cross-platform mobile apps for Intel processor-based devices help developers adapt to this new cross-platform approach so that they can deploy their apps and games on nearly all modern computing platforms. Thus trying these tools is indeed worthwhile.

    Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.

    Copyright © 2013 Intel Corporation. All rights reserved.

    *Other names and brands may be claimed as the property of others.

  • Android*
  • URL
  • Using the touch screen in your HTML5 games on Android* devices powered by Intel


    With the dramatic adoption of smartphones and tablets has come an enormous shift in the way interactive experiences are delivered on these touch devices. Native application developers have incorporated touch events to improve the user experience and change the way users interact with their content. Mobile devices such as smartphones and tablets generally have a capacitive touch-sensitive screen to capture interactions made with the user's fingers. As the mobile web evolves to enable increasingly sophisticated applications, web developers need a way to handle these events. For instance, almost any fast game requires players to press several buttons at once, which, from the perspective of a touchscreen, implies multi-touch.

    Using the Canvas element in HTML5

    In this article we will dive into the touch events API provided on Android* devices using HTML5. Further, we investigate what kinds of applications can be developed and present some best techniques and strategies that make it easier to develop touch-enabled applications. Only recently have mobile applications been able to handle touch events using the HTML5 cross-platform technology. This opens up an entirely new path for making mobile applications more "native-like." If we combine this touch event handling with HTML5's new canvas element, we can capture the user's movements and allow them to draw on the canvas. HTML5 defines the <canvas> element as "a resolution-dependent bitmap canvas which can be used for rendering graphs, game graphics or other visual images on the fly." Per the web definition, "Canvas consists of a drawable region defined in HTML code with height and width attributes. JavaScript* code may access the area through a full set of drawing functions."

    Using HTML5 touch events, applications can give users the same great experience that they find with native applications. By offering touch and different gesture-based controls, app developers can help users swiftly traverse through apps or build fun and interactive games.

    Developer Guidelines

    Let’s plunge in and see how we can use the HTML5 canvas component to capture user touch events.

    As stated by W3C, “This specification defines the 5th major version, first minor revision of the core language of the World Wide Web: the Hypertext Markup Language (HTML).”

    There are four types of touch events:

    • touchstart
    • touchmove
    • touchend
    • touchcancel

    Each is described further below.

    touchstart

    This event is triggered when the user places a touch point on the touch surface; the target of this event must be an element.

    touchmove

    This event is triggered when the user moves a touch point along the touch surface.

    The target of this event must be the same element that received the touchstart event when this touch point was first placed on the surface, even if the touch point has since moved outside the interactive area of the target element.

    Note that the rate at which touchmove events are sent depends on the speed of the app’s execution, which depends on the hardware it’s running on.

    touchend

    This event is triggered when the user removes a touch point from the touch surface, including cases where the touch point physically leaves the touch surface, such as being dragged off the edge of the screen.

    The target of this event must be the same element that received the touchstart event when this touch point was placed on the surface, even if the touch point has since moved outside the interactive area of the target element.

    touchcancel

    This event is triggered when a touch point has been disrupted in an implementation-specific way. For example, a synchronous event or action originating in the UI cancels the touch, or the touch point leaves the document window into a non-document area that is capable of handling user interactions. This event type may also be dispatched when the user places more touch points on the touch surface than the device or implementation can store; in that case the earliest Touch object in the TouchList should be removed.
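
The four event types above can be combined in a single handler that tracks the live touch points by their identifiers. The sketch below is illustrative: the element id "touch" and the bookkeeping object are assumptions, and the registration is guarded so the snippet also loads outside a browser:

```javascript
// Track live touch points across touchstart/touchmove/touchend/touchcancel.
var active = {}; // live touch points keyed by Touch.identifier

function handleTouches(event) {
  event.preventDefault(); // stop scrolling/zooming while drawing
  for (var i = 0; i < event.changedTouches.length; i++) {
    var t = event.changedTouches[i];
    if (event.type === "touchstart" || event.type === "touchmove") {
      active[t.identifier] = { x: t.pageX, y: t.pageY }; // add or update
    } else {
      delete active[t.identifier]; // touchend / touchcancel removes it
    }
  }
}

if (typeof document !== "undefined") {
  ["touchstart", "touchmove", "touchend", "touchcancel"].forEach(function (name) {
    document.getElementById("touch").addEventListener(name, handleTouches, false);
  });
}
```

Because every handler receives the same event object shape, one function can service all four registrations; a game loop can then read `active` each frame to see every finger currently on the screen.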

    To capture touch events and translate them onto the canvas element, we first need to learn how to use the canvas element. There are a couple of things to keep in mind when using standard HTML inputs with canvas.

    • Text input fields raise the on-screen keyboard on mobile devices, covering up to 50% of the screen. To avoid this, ensure that the significant parts of the canvas are not hidden by the keyboard, or choose a different type of input.
    • Buttons with default settings tend to be very small on mobile devices. To make buttons easier to press with a finger, set a smaller viewport or a bigger initial scale using the <meta> tag, or make the button font bigger with the CSS font style.

    Let’s examine the canvas API itself. We begin by describing the following methods:

    • Context getContext(String contextId);
    • addEventListener()

    The getContext() method is used to acquire the rendering context and its drawing functions. addEventListener() registers the designated listener on the EventTarget it is called on. The event target may be an element in a document, the document itself, a window, or any other object that supports events.

    Syntax

    target.addEventListener(type, listener, useCapture)
    

    type

    A string representing the event type to listen for (touchstart, touchmove, touchend, touchcancel).

    listener

    Receives a notification when an event of the specified type occurs. This must be an object implementing the EventListener interface, or a JavaScript function.

    useCapture

    A boolean. If the value is true, the listener captures the event: all the specified events will be dispatched to the registered listener before being dispatched to any EventTarget beneath it in the DOM tree.

    Let’s look at some code.

    <canvas id="touch" width="150" height="150">
      Sorry, your browser doesn't support the HTML5 canvas element.
    </canvas>
    

    This looks very similar to the <img> element; the main difference is that it doesn't have the src and alt attributes. The <canvas> element has just two attributes, width and height. If your renderings appear incorrect, try specifying the width and height attributes explicitly on the canvas element instead of with CSS. The width and height default to 300 and 150, respectively. The id is used to access the canvas from JavaScript, and the content inside is displayed when the browser does not support the element.

    var cans = document.getElementById("touch");
    var canctx = cans.getContext("2d");
    

    The variable cans holds the canvas on which the graphics objects are drawn, and canctx holds the rendering context; in this case it is a 2d context.

    This context provides the basic methods for drawing on the canvas, such as arc(), lineTo(), and fill().

    document.addEventListener("touchstart", startTouchDrawing, true);
    document.addEventListener("touchmove", startMoveDrawing, false);
    document.addEventListener("touchend", endDrawing, false);
    document.addEventListener("touchcancel", cancelDrawing, false);
    

    When users touch or move their fingers on the phone screen, the corresponding events are generated. We have to handle those events and draw the line accordingly. For example, the following snippet shows the startMoveDrawing(event) function:

    function startMoveDrawing(event) {
        event.preventDefault(); // Prevent the browser's default touch behavior
        var eventTouch, x, y;
        eventTouch = event.changedTouches[0]; // The first changed touch point
        x = eventTouch.pageX; // x coordinate of the touch
        y = eventTouch.pageY; // y coordinate of the touch
        OnTouchMoveDraw(x, y);
    }
    
    function OnTouchMoveDraw(x, y) {
        if (x || y) {
            if ((start_x === -1) || (start_y === -1)) {
                start_x = x;
                start_y = y;
            }
            drawLine(x, y);
            start_x = x; // Remember this point so the next segment starts here
            start_y = y;
        }
    }
    

    Here, the first argument, for example "touchstart", is the event type for which the user-defined handler is registered. In the code above, the preventDefault method suppresses the browser's default behavior before the drawing begins. start_x and start_y are the previous touch positions, and the drawLine(x, y) method draws the line.

    function drawLine(x, y) {
        canctx.beginPath();
        canctx.moveTo(start_x, start_y);
        canctx.lineTo(x, y);
        canctx.stroke();
    }
    

    The beginPath() method ensures that we are starting a new path, discarding the current path (if any). If in an animation you're getting a connecting line from the end of your path back to the start, you may not have called beginPath() at the start. To generate a path with HTML5 canvas, we can connect multiple subpaths; the end point of each new subpath becomes the new context point. We can use lineTo(), arcTo(), quadraticCurveTo(), and bezierCurveTo() to build each subpath that makes up the drawing. We can also call beginPath() whenever we want to start drawing a new path (circle, line, rectangle, and so on).

    The line is drawn from the x, y coordinates passed to the moveTo function to the x, y coordinates passed to the lineTo function.

    To conclude, the touch screen has revolutionized the world of computing, and by using the canvas element in HTML5, touch events can be handled with great ease, making developers’ jobs much easier.

    Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.

    Copyright © 2013 Intel Corporation. All rights reserved.

    *Other names and brands may be claimed as the property of others.

  • Android*
  • URL
  • Adding sound to your HTML5 games for Intel® Architecture-based Android* devices


    Introduction

    Sound is one of the most significant components of an interactive game. A game requires not only high-quality graphics and an appealing story line, but also sound effects to impress players. Adding sound effects to your game or application not only enhances its entertainment value, but also contributes to its overall impression of quality.

    Audio Tag

    Among the most captivating new features of HTML5 are the audio and video tags, which could conceivably, in the long term, replace some of today’s popular media technologies. To use HTML5 audio, start by creating an <audio> element, specifying a source URL for the audio, and including the controls attribute.

    <audio controls>
      <source src="horse.ogg" type="audio/ogg">
      <source src="horse.mp3" type="audio/mpeg">
    Your browser does not support the audio element.
    </audio> 
    

    The controls attribute adds audio controls, such as play, pause, and volume. The <audio> element permits multiple <source> elements, and <source> elements can link to different audio files. MIME types (also known as Internet Media Types) are a way of characterizing file formats so that your system knows how to handle them. Along with the source, we specify a type attribute, which tells the browser the MIME type and the codecs of the supplied media before it downloads them. If the attribute is not provided, the browser uses a heuristic approach to try to recognize the media type. The browser uses the first format it recognizes; if it recognizes none, it displays the fallback content.

    canPlayType Method

    Luckily, the audio API provides a way for us to discover whether a certain format is supported by the mobile browser. We can grab the <audio> element by marking it up in HTML as below:

    	
    var audio = document.getElementById('myaudio');
    

    Otherwise, we can also generate our element completely in JavaScript*:

    var audio = new Audio();
    

    Once we have our audio element, we are ready to access its methods and properties. To test format support, we can use the canPlayType method, which takes a MIME type as a parameter:

    	
    audio.canPlayType('audio/mpeg');
    

    canPlayType yields one of three values:

    1. probably
    2. maybe
    3. “” (the empty string)

    The reason for these odd return values is the general uncertainty surrounding codecs: the browser can only guess whether a codec is playable without actually attempting to play it.
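
canPlayType's three return values can be used to pick the first playable source from a list of candidates at runtime. A minimal sketch; the `{type, url}` structure and the function name are illustrative, not part of the audio API:

```javascript
// Return the URL of the first candidate the browser reports it can play.
// canPlayType returns "probably", "maybe", or "" (the empty string);
// anything non-empty counts as potentially playable.
function pickSource(audio, candidates) {
  for (var i = 0; i < candidates.length; i++) {
    if (audio.canPlayType(candidates[i].type) !== "") {
      return candidates[i].url;
    }
  }
  return null; // no supported format among the candidates
}
```

For example, `pickSource(new Audio(), [{type: "audio/ogg", url: "horse.ogg"}, {type: "audio/mpeg", url: "horse.mp3"}])` returns the first URL whose MIME type the browser accepts, mirroring how the browser itself walks a list of `<source>` elements.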

    MIME Types for Audio Formats

    The common audio MIME types are audio/ogg for Ogg Vorbis, audio/mpeg for MP3, and audio/wav for WAV files.

    Attributes

    HTML tags are composed of one or more attributes. Attributes are added to a tag to provide the browser with additional information about how the tag should appear or behave. An attribute consists of a name and a value separated by an equals (=) sign, with the value enclosed in double quotes. Here's an example: style="color: blue".

    The following section briefly describes the attributes that are particular to the <audio> tag/element.

    src: States the location of the audio file. Its value must be the URL of an audio file.

    preload: While playing large files, it is best to buffer the audio. To do this, use the preload attribute. This attribute lets us hint to the browser that we intend to buffer the file before playing it, delivering the best user experience. Possible values are:

    • none
    • metadata
    • auto

    autoplay:

    States whether or not to start playing the audio as soon as the object has loaded.

    This is a boolean attribute. Accordingly, the presence of this attribute equates to a true value. We can also specify a value that is a case-insensitive match for the attribute's canonical name, with no leading or trailing whitespace (i.e., either autoplay or autoplay="autoplay").

    Possible values:

    • [Empty string]
    • autoplay

    mediagroup:

    This attribute is used for synchronizing playback of audio files (or media elements). It allows us to specify media elements to link together. The value is a string of text, for example: mediagroup=album. Audio files/media elements with the same value are automatically linked by the user agent/browser.

    One case where the mediagroup attribute could be used is when you need to overlay a sign-language interpreter track from one video on top of another.

    loop:

    This attribute states whether to keep replaying the audio once it has finished.

    This is a boolean attribute. Accordingly, the presence of this attribute equates to a true value. We can also specify a value that is a case-insensitive match for the attribute's canonical name, with no leading or trailing whitespace (i.e., either loop or loop="loop").

    Possible values:

    • [Empty string]
    • loop

    controls:

    Instead of playing sounds automatically, which is not good practice, you can let the browser present a few controls, for example volume and a play/pause button. This can be done simply by adding the controls attribute to the tag.

    This is a boolean attribute. Accordingly, the presence of this attribute equates to a true value. We can also specify a value that is a case-insensitive match for the attribute's canonical name, with no leading or trailing whitespace (i.e., either controls or controls="controls").

    Possible values:

    • [Empty string]
    • controls
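
Putting several of the attributes described above together in one element; the file names are illustrative:

```html
<!-- preload hints at buffering, loop repeats the track, and
     controls shows the browser's built-in player UI -->
<audio id="demo" preload="auto" loop controls>
  <source src="theme.ogg" type="audio/ogg">
  <source src="theme.mp3" type="audio/mpeg">
  Your browser does not support the audio element.
</audio>
```

The id attribute gives JavaScript a handle on the element, which the playback-control examples below rely on.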

    Controlling media playback

    Once we have inserted the media into our HTML document using the new elements, we can control them programmatically from JavaScript code. For instance, to begin (or restart) playback, we can do:

    	
    var v = document.getElementsByTagName("audio")[0];
    v.play();
    

    The first line fetches the first audio element in the document, and the second calls the element's play method. Controlling an HTML5 audio player to play, pause, and raise or lower the volume with some JavaScript code is straightforward:

    	
    document.getElementById('demo').play();        // Play the audio
    document.getElementById('demo').pause();       // Pause the audio
    document.getElementById('demo').volume += 0.1; // Increase volume
    document.getElementById('demo').volume -= 0.1; // Decrease volume
    

    Seeking through media

    Media elements support moving the current playback position to specific points in the media's content. This is done by setting the value of the currentTime property on the element: simply set the time to the number of seconds at which you want playback to continue.

    We can use the element's seekable property to obtain the starting and ending times of the media. It returns a TimeRanges object listing the ranges of times that you can seek to.

    	
    var audioElement = document.getElementById("myaudio");
    audioElement.seekable.start(0); // Returns the starting time (in seconds)
    audioElement.seekable.end(0);   // Returns the ending time (in seconds)
    audioElement.currentTime = 122; // Seek to 122 seconds
    audioElement.played.end(0);     // Returns the number of seconds the browser has played
    

    SimpleGame library

    The simpleGame library makes it very easy to add new sounds by creating a Sound object. The Sound object in the simpleGame library is based on the HTML5 <audio> tag.

    <script type="text/javascript" src="simpleGame.js"></script>
    <script type="text/javascript">
    

    You can easily manage the sound effects with the simpleGame library:

    1. Create the sound effect. The best formats are mp3 and ogg.
    2. Create a variable for holding your sound effect. Make sure to define the variable outside the function.
    3. The SimpleGame library has a Sound object. Create an instance of this for building your sound. The sound object requires a parameter. You can set the parameter in the init function.
    4. The sound can be played with the sound object’s play() method.
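
The steps above can be sketched as follows, assuming simpleGame.js is loaded and its Sound constructor takes the sound's file name as its parameter; the file name and function names here are illustrative:

```javascript
// Step 2: declare the variable outside any function.
var boomSound;

function init() {
  // Step 3: create the Sound instance, passing the file name
  // as the parameter (file name is illustrative).
  boomSound = new Sound("boom.mp3");
}

function onExplosion() {
  // Step 4: play the effect via the Sound object's play() method.
  boomSound.play();
}
```

Keeping the variable global and creating the Sound once in init() means the effect can be replayed from anywhere in the game loop without reloading the file.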

    Direct Canvas from AppMobi

    To supplement their HTML5 competencies, developers may want to explore the development tools and environment from AppMobi to build robust applications. The App Game Interface (AGI) technology by AppMobi provides hybrid HTML5 applications the capability to accelerate their canvas tag commands. The AGI technology was developed by AppMobi (http://www.appmobi.com), an HTML5 services company, initially known as directCanvas.

    To use the AGI, we must first understand how it is constructed. The AGI accelerated canvas commands need to be loaded into their own "view", not unlike an HTML frame, where these commands are interpreted at a lower level and executed at a faster pace. However, this separate view does not include access to the full document object model (DOM), and must rely on a bridging command to pass information between the standard web view and the accelerated view.

    The code for the accelerated "view" is drawn underneath the HTML5 web view, which means that any graphical elements included in an AGI application's HTML document or documents will always be rendered above the accelerated graphics.

    Using the AGI sound features

    The App Game Interface (AGI) technology addresses many of HTML5's sound weaknesses with its multi-sound enhancements. HTML5 was not designed to play multiple asynchronous sounds with low latency, yet that is precisely what games and other applications urgently need. The AGI multi-sound technology permits every element in a game to play sound when it should, regardless of any other simultaneous sounds. The AppMobi APIs are all accessible through the AppMobi.context object and are designed to provide enhanced performance and increased usability.

    Three methods can be used to control a single background sound:

    startBackgroundSound:

    This method begins a sound that plays continuously in the background.

    Only a single background sound is supported by the Accelerated Canvas App Game Interface. Use this method to begin a background sound or music. This command supplements the Audio object to deliver enhanced performance and ease of use.

    Syntax

    	
    AppMobi.context.startBackgroundSound("sounds/music_main.mp3",true)
    

    The first parameter is the path and filename of the background sound to play; the second parameter is an optional boolean value signifying whether the background sound should loop.

    toggleBackgroundSound

    Use this command to toggle a background sound on or off. Only a single background sound is supported by the Accelerated Canvas App Game Interface. This command supplements the Audio object to deliver enhanced performance and ease of use.

    Syntax

    	
    AppMobi.context.toggleBackgroundSound();
    

    stopBackgroundSound

    Use this command to stop the background sound. Only a single background sound is supported by the Accelerated Canvas App Game Interface. This command supplements the Audio object to deliver enhanced performance and ease of use.

    Syntax

    	
    AppMobi.context.stopBackgroundSound()
    

    Conclusion

    Despite some unpredictable browser behavior, HTML5 is an exhilarating technology for creating new and powerful applications. In this article, we have looked at how to incorporate sound into applications with the HTML5 audio element, and at the AGI technology from AppMobi, which provides additional tools for developing polished applications. We can combine other technologies and tools such as JavaScript, PhoneGap, and AppMobi with HTML5 to unlock more opportunities to write applications that would normally require native code.

    More resources

    Some interesting demos of using sound in HTML5:

    http://www.createjs.com/#!/SoundJS/demos/visualizer

    http://www.createjs.com/#!/SoundJS/demos/game

    http://www.createjs.com/#!/SoundJS/demos/explosion

    Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.

    Copyright © 2013 Intel Corporation. All rights reserved.

    *Other names and brands may be claimed as the property of others.

  • Android*
  • URL
  • Installing and Configuring OpenCV-Based Project Development on Intel Android Devices


    With the rapid growth in the computing power of Android devices, more and more developers are building image-processing applications for Android, such as face recognition, eye tracking, and photo enhancement. OpenCV, a very successful computer vision library, is widely used on the PC, and many developers also build Android image-processing applications on top of OpenCV.

    This article describes how to install and configure OpenCV-based project development on the Intel Android device MOTO MT788. The configuration for other Android devices can follow this guide.

    1. Install the JDK

    (1)     Configure environment variables

    a.       Edit the system environment variable path and append

    %JAVA_HOME%\bin;

    c.       Run cmd -> javac -version

    If the version number is printed, the setup succeeded.

     

    2. Download the Intel x86 System Image with SDK Manager

    Close Eclipse before downloading an image with SDK Manager; otherwise permission problems can occur during the update.

    Because the bundle ships with Android 4.2 and the ARM EABI v7a System Image, we need to use SDK Manager to install the Android 4.0.3 Intel x86 Atom System Image.

    (1)     After SDK Manager starts, it automatically detects the available Android versions:

    If detection is slow, set a proxy under Tools -> Options; below is the proxy I used.

    Once the proxy is set, all Android versions are detected quickly. Only Android 2.3.3 and Android 4.0.3 or later provide an Intel x86 Atom System Image. If it is not detected, close SDK Manager and reopen it; it should then be detected.

    (3)     Install the CDT

    Open Eclipse; it is recommended to create a new workspace named EyeTracking workspace:

    Help -> Install New Software ->

    -> Add

    -> Archive

    Select 

    ->Select All

    Finish the installation with the defaults and restart Eclipse.

    File -> New -> Project

    If C/C++ Project appears, the installation succeeded.

    5. Configure the NDK in Eclipse

    (1)     Open Eclipse Window -> Preferences -> C/C++ -> Build -> Environment

    Create a new variable NDKROOT whose value is the root directory of the extracted NDK.

    This variable applies to the current workspace.

    (3)     Convert to a C/C++ project

    Right-click the Android project just created -> New -> Convert to a C/C++ Project (if it is missing, look under Other).

     

    Select Makefile project and --Other Toolchain-- here.

    (5)     Set the project properties

    Right-click the project -> Properties -> C/C++ Build

    Set Behaviour as follows

     

    After clicking OK, the console prints its output:

    At this point the project still reports errors, because the header files referenced by the .cpp files have not yet been imported.

    (7)     Configure OpenCV

    (1)     Add the OpenCV Library to the project

    Right-click TestAndroidOpenCV -> Properties -> Android -> Add

    (3)     Add the following code to MainActivity.java

    static {
        if (!OpenCVLoader.initDebug()) {
            // Handle initialization error
        }
    }

    This ensures that OpenCV is initialized first.

     

    7.     Virtual devices

    a.       Add a new device

     

    c.     Physical devices

    a.       Right-click Computer -> Manage -> Device Manager

    c.       Install the driver

    Right-click Android-Phone -> Properties -> Driver -> Update Driver

     

    Enter <Android-sdk>\extras\google\usb_driver as the driver directory.

     

    After the driver installs successfully,

    open cmd -> adb devices and check the output:

    If no device is detected, installing the 91 Phone Assistant is recommended; simply opening it lets the device be detected.

    adb is a command in the Android SDK's tools directory; it is recommended to add both tools and platform-tools to the environment variables:

    (3)     Select the debug device

    Right-click the project -> Run As -> Run Configurations -> Android Application -> project name -> Target

    Choose the target as needed.

     

    At this point the Android + OpenCV environment is essentially set up. A camera-startup example taken from the OpenCV website ran successfully.


  • Media Processing
  • Mobility
  • C/C++
  • Java*
  • Android*
  • Tablet
  • Developers
  • Partners
  • Students
  • Android*
  • Android* MediaPlayer Sample Code Walkthrough on Intel® Architecture


    Introduction

    Media playback is becoming one of the most popular usage models on mobile devices. People expect to be able to play common multimedia types on their mobile devices and view video on the go. This document will discuss the basics of creating Android media applications, and provide a sample code walkthrough of the MediaPlayer API on Intel® architecture-based platforms.

    Android Multimedia Framework – MediaPlayer API

    The Android multimedia framework provides developers a way to easily integrate audio and video playback into applications, and supports most of the common media types. The MediaPlayer class is the key in Android multimedia framework. It can be used to play media on the local file system, media files stored in the application’s resources, as well as data streaming over a network connection.

    Managing the state

    MediaPlayer is a state-based class; it has an internal state machine managing all the states in the life cycle of a MediaPlayer object. The diagram below shows the state changes of the MediaPlayer object for the playback control. In this diagram, a single arrow represents synchronous method calls, and a double arrow represents asynchronous method calls and callbacks.


    Figure 1. State Changes of MediaPlayer Object. (Single arrows represent synchronous method calls, and double arrows represent asynchronous method calls)

    From the diagram, you can see that the MediaPlayer object passes through several states in its life cycle. At the beginning, creating a new MediaPlayer or calling the reset() method puts the MediaPlayer into the Idle state. The Idle state is where everything starts, but playback cannot yet occur: invoking any playback-control method such as start(), stop(), or seekTo(int) here results in a programming error.

    The next thing the app must do is call the setDataSource() method to point to a valid media source, moving the player into the Initialized state. The source can be either a local file or data streamed over a network connection.

    In the Initialized state, and before playback can be started, the app can call prepare() or prepareAsync() to move into the Prepared state. The prepare() method does the work of fetching, buffering, and decoding the media file. However, prepare() can take a very long time to return when fetching media data from a network URL, particularly if the network connection is slow, so it is not recommended to run prepare() on the application's UI thread; doing so can make the application unresponsive and hurt the user experience. Use prepareAsync() instead, which was designed to address this problem: it returns immediately and prepares the media in the background, firing the OnPreparedListener.onPrepared() callback when preparation completes and bringing the MediaPlayer object into the Prepared state.

    From the Prepared state, you can start the playback and control it by calling start() and seekTo(). Once the media playback is started, the MediaPlayer object goes into the Started state.

    After playback is started, you can control the playback using the pause() method to move into the Paused state, or call start() to bring it back to the Started state. If playback reaches the end and looping is not set to start over from the beginning, the MediaPlayer moves into the PlaybackCompleted state. At this point, you can call the start() method to begin playback again, returning to the Started state. However, if you call stop() from Started, Paused, or PlaybackCompleted, the state machine moves into the Stopped state. From here, you can go to the End state and release the MediaPlayer; if you want to play the media again, you have to prepare the media data again before calling start().

    You should always remember to call the release() method after each use to put the MediaPlayer object into the End state; otherwise, it keeps consuming system resources. If you keep creating new MediaPlayer instances without calling release(), your application can consume all the system resources very quickly.

    If you registered an OnErrorListener, the OnErrorListener.onError() callback method will be invoked in all error conditions, so that you can handle errors accordingly.

    Using MediaPlayer to play Audio and Video

    Now let’s take a look at the code for how to play audio and video using MediaPlayer.

    For Audio:

    private void playAudio(Integer media) {
        try {
            switch (media) {
                case LOCAL_AUDIO:
                    path = "/sdcard/Download/music/1.mp3";
                    mMediaPlayer = new MediaPlayer();
                    mMediaPlayer.setDataSource(path);
                    mMediaPlayer.prepare();
                    mMediaPlayer.start();
                    break;
                case RESOURCES_AUDIO:
                    mMediaPlayer = MediaPlayer.create(this, R.raw.test_cbr);
                    mMediaPlayer.start();
                    break;
            }
        } catch (Exception e) {
            Log.e(TAG, "error: " + e.getMessage(), e);
        }
    
    }
    
    @Override
    protected void onDestroy() {
        super.onDestroy();
        if (mMediaPlayer != null) {
            mMediaPlayer.release();
            mMediaPlayer = null;
        }
    }

    For Video:

    private void playVideo(Integer Media) {
        doCleanUp();
        try {
    
            switch (Media) {
                case LOCAL_VIDEO:
                    path = "/sdcard/Download/video/2.mp4";
                    break;
                case STREAM_VIDEO:
                    path = "Your URL here";
                    break;
            }
            mMediaPlayer = new MediaPlayer();
            mMediaPlayer.setDataSource(path);
            mMediaPlayer.setDisplay(holder);
            mMediaPlayer.prepare();
            mMediaPlayer.setOnBufferingUpdateListener(this);
            mMediaPlayer.setOnCompletionListener(this);
            mMediaPlayer.setOnPreparedListener(this);
            mMediaPlayer.setOnVideoSizeChangedListener(this);
            mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
    
        } catch (Exception e) {
            Log.e(TAG, "error: " + e.getMessage(), e);
        }
    }
    public void onBufferingUpdate(MediaPlayer arg0, int percent) {
        Log.d(TAG, "onBufferingUpdate percent:" + percent);
    }
    public void onCompletion(MediaPlayer arg0) {
        Log.d(TAG, "onCompletion called");
    }
    public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
        Log.v(TAG, "onVideoSizeChanged called");
        if (width == 0 || height == 0) {
            Log.e(TAG, "invalid video width(" + width + ") or height(" + height + ")");
            return;
        }
        mIsVideoSizeKnown = true;
        mVideoWidth = width;
        mVideoHeight = height;
        if (mIsVideoReadyToBePlayed && mIsVideoSizeKnown) {
            startVideoPlayback();
        }
    }
    public void onPrepared(MediaPlayer mediaplayer) {
        Log.d(TAG, "onPrepared called");
        mIsVideoReadyToBePlayed = true;
        if (mIsVideoReadyToBePlayed && mIsVideoSizeKnown) {
            startVideoPlayback();
        }
    }
    @Override
    protected void onPause() {
        super.onPause();
        releaseMediaPlayer();
        doCleanUp();
    }
    @Override
    protected void onDestroy() {
        super.onDestroy();
        releaseMediaPlayer();
        doCleanUp();
    }
    private void releaseMediaPlayer() {
        if (mMediaPlayer != null) {
            mMediaPlayer.release();
            mMediaPlayer = null;
        }
    }
    private void doCleanUp() {
        mVideoWidth = 0;
        mVideoHeight = 0;
        mIsVideoReadyToBePlayed = false;
        mIsVideoSizeKnown = false;
    }
    private void startVideoPlayback() {
        Log.v(TAG, "startVideoPlayback");
        holder.setFixedSize(mVideoWidth, mVideoHeight);
        mMediaPlayer.start();
    }

    As you can see from the code, for both audio and video, the MediaPlayer moves through the life cycle and state changes according to the state change diagram we discussed above.

    Reference

    1. http://developer.android.com/reference/android/media/MediaPlayer.html
    2. http://developer.android.com/reference/android/widget/MediaController.html

    Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.
    Copyright © 2013 Intel Corporation. All rights reserved.
    *Other names and brands may be claimed as the property of others.

  • media player
  • Developers
  • Android*
  • URL

  • Building FFMPEG for Android on x86


    FFMPEG is a popular cross-platform open source multimedia framework used in open source media applications such as Handbrake. The default builds for FFMPEG are Linux, Windows, and OS X. This blog shows a method for building the FFMPEG shared libraries for Android x86.

    The setup used for this build was a VirtualBox Virtual Machine running Ubuntu Linux 12.04 64-bit.

    To build FFMPEG for Android x86:
    1.  Download the latest Android NDK from http://developer.android.com/tools/sdk/ndk/index.html

    2.  Download the latest FFMPEG source code from http://www.ffmpeg.org/download.html

    3.  Open a Terminal and cd into the FFMPEG directory

    4.  Configure ANDROID_NDK environment variable
        ANDROID_NDK=<your ndk path>
        PATH=$PATH:$ANDROID_NDK

    5.  Setup the toolchain
        DEST=`pwd`
        PREFIX=$DEST/build/android/x86
        TOOLCHAIN=/tmp/vplayer
        $ANDROID_NDK/build/tools/make-standalone-toolchain.sh --toolchain=x86-4.8 --arch=x86 --system=linux-x86_64 --platform=android-14 --install-dir=/tmp/vplayer

        export PATH=$TOOLCHAIN/bin:$PATH
        export CC="ccache i686-linux-android-gcc-4.8"
        export LD=i686-linux-android-ld
        export AR=i686-linux-android-ar

    6.  Configure FFMPEG
    ./configure --target-os=linux --arch=x86 --cpu=i686 --cross-prefix=i686-linux-android- --enable-cross-compile --enable-shared --disable-static --disable-symver --disable-doc --disable-ffplay --disable-ffmpeg --disable-ffprobe --disable-ffserver --disable-avdevice --disable-postproc --disable-encoders --disable-muxers --disable-devices --disable-demuxer=sbg --disable-demuxer=dts --disable-parser=dca --disable-decoder=dca --disable-decoder=svq3 --enable-network --enable-version3 --disable-amd3dnow --disable-amd3dnowext --enable-asm --enable-yasm --enable-pic --prefix=$PREFIX --extra-cflags='-std=c99 -O3 -Wall -fpic -pipe   -DANDROID -DNDEBUG  -march=atom -msse3 -ffast-math -mfpmath=sse' --extra-ldflags='-lm -lz -Wl,--no-undefined -Wl,-z,noexecstack'


    7.  Build FFMPEG for Android x86.  The include and shared library files will be in the build/android/x86 folder
    make clean
    make
    make install
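As a quick sanity check after `make install`, you can list what landed in the install prefix (a sketch; it assumes you are still in the FFMPEG source directory, where `$PREFIX` was set in step 5 to the build/android/x86 folder):

```shell
# Relative form of the $PREFIX used above (build/android/x86 under the
# FFMPEG source directory).
PREFIX=build/android/x86

if [ -d "$PREFIX/lib" ]; then
    # Shared libraries such as libavcodec.so, libavformat.so, ...
    ls "$PREFIX/lib"
else
    echo "no libraries yet - run make install first"
fi
```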

     

    To learn more about FFMPEG, please visit the link below:
    http://www.ffmpeg.org/about.html    

    To learn more about VirtualBox, please visit the link below:
    https://www.virtualbox.org/

    To learn more about Ubuntu Linux, please visit the link below:
    http://www.ubuntu.com/download/desktop

    To learn more about Handbrake, please visit the link below:
    http://handbrake.fr/

     

    ++This sample source code is released under the Open Source Initiative OSI - The BSD License.

  • ffmpeg
  • android
  • encoding
  • decoding
  • transcoding

  • C/C++
  • Android*
  • Laptop
  • Tablet
  • Desktop
  • Developers
  • Android*
  • Saving and Restoring State in an Android App


    A great user experience gives a user the ability to transition seamlessly between many apps on their device and then be able to pick up right where they left off when the app is launched again.  If an Android app is stopped on a device, the app can save the app state and then restore the state when the app is launched again by the user.  This blog shows a simple method to save and restore state in an Android app.


    When an app is launched for the first time, the default values determined by the developer are used for initializing the app variables.  When the app is stopped and launched again, the values saved will be used for initializing the app variables.  The example below shows an integer called gameState that gets saved when onStop() is called and then restored when onStart() is called.  

    @Override
    protected void onStart()
    {
            super.onStart();
         
            SharedPreferences settings = getSharedPreferences(getString(R.string.appSettings), MODE_PRIVATE);
            
            //Initialize to the default value if first run or restore the saved value
            gameState = settings.getInt(getString(R.string.gameState), GAMESTATE_DEFAULT_VAL);
    }


    @Override
    protected void onStop()
    {
        super.onStop();

        SharedPreferences settings = getSharedPreferences(getString(R.string.appSettings), MODE_PRIVATE);
        SharedPreferences.Editor editor = settings.edit();

        //Save Value
        editor.putInt(getString(R.string.gameState), gameState);
        editor.commit();
    }

     

    To learn more about the lifecycle of an Android activity, please see the link below:
    http://developer.android.com/training/basics/activity-lifecycle/starting.html


    To learn more about Saving Data, please see the link below:
    http://developer.android.com/training/basics/data-storage/index.html

     

    ++This sample source code is released under the Open Source Initiative OSI - The BSD License.

  • saving state
  • restoring state
  • android activity lifecycle
  • app lifecycle

  • Java*
  • Android*
  • Laptop
  • Tablet
  • Desktop
  • Developers
  • Android*
  • Beacon Mountain: x86 processors and Intel development tools


     

    Having developed on Android for two years, and having discovered Tizen at the BeMyApp/Intel/Samsung hackathon in Palo Alto in October 2013, today I am taking a look at x86 processors and the development tools that Intel provides for mobile devices.

     

    Environment

    I am running Mac OS X (Mavericks) on a mid-2013 MacBook Air, with Android Studio 0.3, a Samsung Galaxy Tab 3 based on an Intel x86 architecture, and a Tizen smartphone.

     

    Installing the development tools

    Screenshot Beacon Mountain folder

    Head to Intel's website. After filling in my contact details, I was able to download the Beacon Mountain beta via a link received by email. Two formats are available: the first is the classic pkg, and the second is intended for slow connections. I chose the first, and the installer ran without any difficulty, downloading the other files required for the installation itself (about ten minutes on a 2 MB/s connection). Once installed, the whole package takes up about 3 GB of disk space. After installation, Intel Software Manager regularly checks that the development tools are up to date. The installed tools include the Android SDK and NDK, Android Design, as well as Intel Threading Building Blocks, Intel Integrated Performance Primitives, Intel GPA Graphics Performance Analyzer, and Intel Hardware Accelerated Execution Manager.

     

    Intel's extras

    Beacon Mountain contains versions of Eclipse, the ADT plugin, the SDK, and the NDK that are the same as those offered on developer.android.com. The user experience does not change on that front. However, Intel has added a few tools:

    TBB and IPP

    Intel Threading Building Blocks and Intel Integrated Performance Primitives are two libraries for optimizing applications, particularly with respect to multi-threading.

    GPA

    GPA usage

    With Intel GPA Graphics Performance Analyzer, you can analyze an application's execution in order to optimize it on x86 platforms. GPA displays CPU and GPU usage in real time, along with some twenty other metrics.

    HAXM

    HAXM installation screenshot

    Intel Hardware Accelerated Execution Manager accelerates the execution of Android x86 virtual machines. To install it, simply open the Android SDK Manager and select Intel x86 Emulator Accelerator (HAXM).

     

    An overview of these tools is available on video on Intel Software TV: http://www.youtube.com/watch?feature=player_embedded&v=Kfr241Mf7wY

     

    Virtual machines

    Android AVD screenshot

    While the AVD Manager installed along with the Android SDK detects the virtual machines I had previously created, it is nevertheless unable to display their details or start them. Fortunately, the AVD Manager bundled with my Android Studio installation can handle them, and the virtual machines can then be used in the debugger of the Eclipse downloaded by the Intel installer.

     

    Screenshot SDK Manager

    Intel provides an Android system image for the x86 architecture. Although this image is not yet available for version 4.4, it lets you test your application's behavior on most Android versions, and on all processor architectures. This system image can be tried from any Android SDK installation, via the Android SDK Manager.

     

    Developing with the Samsung Galaxy Tab 3

    Some may be surprised to discover that the USB debugging option has disappeared from the system settings. This is new in Android 4.2: the developer options are hidden from ordinary users. To bring them back, go to Settings→About Device and tap Build number 7 times. The Settings→Developer options menu should reappear, and you can then enable Settings→Developer options→USB debugging.

    All that remains is to connect the tablet to the computer over USB to debug your applications. Note that you can also do without a USB cable by checking Settings→Developer options→ADB over network and using the Wi-Fi network directly. For more details, see Martin Mikkelborg Syvertsen's tutorial (stuffandtech.blogspot.fr/2012/03/android-quick-tip-adb-over-wifi.html).

     

    Performance

    By default, applications using native code are compiled only for ARM processors. Devices with x86 processors, such as the Samsung Galaxy Tab 3, therefore have to translate this code in order to run it. This process slows the application down, making some games unusable. Fortunately, compiling the application for x86 processors is very simple, and the performance gain is huge.

     

    Take the Native-Plasma sample shipped with the NDK. You can import it into your workspace from the <ANDROID_NDK_PATH>/samples/ folder. After importing the project, compile the native code by opening a command prompt in the <PROJECT_PATH>/jni/ folder and running the ndk-build command. Then generate the apk from Eclipse and watch LogCat to observe the performance:

        frame/s (avg,min,max) = (3.9,3.5,5.8)

        render time ms (avg,min,max) = (254.9,172.1,278.7) 

    That is an average of 254.9 ms to render a single frame!

     

    Now let's set up compilation for the x86 architecture. Simply add the following line to (or complete) the <PROJECT_PATH>/jni/Application.mk file:

        APP_ABI := armeabi armeabi-v7a x86

     

    Rerun the native code build with the ndk-build command, and test the application again on the Samsung Galaxy Tab 3:

        frame/s (avg,min,max) = (9.5,7.1,11.2)

        render time ms (avg,min,max) = (104.5,88.8,140.3)

    It now takes 104.5 ms to render a frame, about 40% of the previous time. The benefit of compiling for the x86 architecture is clearly felt.

     

     

     

    Sources

    http://gs4.wonderhowto.com/how-to/enable-hidden-developer-options-your-samsung-galaxy-s4-0146687/

    http://software.intel.com/en-us/vcsource/tools/beaconmountain

    http://software.intel.com/en-us/articles/speeding-up-the-android-emulator-on-intel-architecture

    http://www.youtube.com/watch?feature=player_embedded&v=Kfr241Mf7wY

    http://software.intel.com/en-us/articles/intel-graphics-performance-analyzers-for-android-os

    http://static.electronicsweekly.com/eyes-on-android/wp-content/uploads/sites/8/2013/05/Intel-BeconMountain-GPA.jpg


  • Development Tools
  • Intel® Atom™ Processors
  • Mobility
  • Optimization
  • Threading
  • C/C++
  • Java*
  • Android*
  • Tablet
  • Developers
  • Students
  • Android*
  • Apple Mac OS X*
  • Creating your first HTML5 spaceship game for the Android* OS on Intel® Architecture


    Introduction

    I'm certain most of us have a video game idea or two in mind, insane or not. Most of these ideas are never acted on, because many people think game coding is exceptionally hard to do. That is true to a degree, but it is not as hard as you may think.

    If you have a fundamental understanding of HTML, CSS, and JavaScript*, you have all the prerequisites to start a straightforward project.

    Adding a Canvas element to a web page

    One of the most exciting features of HTML5 is the <canvas> element, which can be used to draw vector graphics and create astonishing effects, interactive games, and animations. The web defines canvas as a rectangular area that allows for dynamic, scriptable rendering of 2D shapes and bitmap images. The HTML5 Canvas is perfect for creating great visual results that augment UIs, diagrams, photo albums, charts, graphs, animations, and embedded drawing applications. HTML5 Canvas works with JavaScript libraries and CSS3, enabling you to create interactive web-based games and animations.

    The elementary code for using and setting a canvas looks like this:

    <body onload="spaceShipGame()">
        <h1>
          SpaceShipGame
        </h1>
        <canvas id="spaceCanvas" width="300" height="300">
        </canvas>
     </body>
    

    This looks very similar to the <img> element, the difference being that it doesn't have the src and alt attributes. The <canvas> element has only two attributes, width and height, which default to 300 and 300, respectively. If your renderings seem inaccurate, try specifying your width and height explicitly in the <canvas> attributes instead of in CSS. The id will be used to initialize the canvas from JavaScript, and the text between the opening and closing tags will be used as a fallback when the mobile browser doesn't support the element.

    Drawing the background and spaceship for a game using HTML5 canvas and JavaScript

    canvas = document.getElementById("spaceCanvas");
    ctx = canvas.getContext("2d");
    

    The variable canvas creates the canvas that we need to draw graphics objects, and ctx holds the rendering context. In this case it is a 2d graphics object.

    This context contains the elementary methods for drawing on the canvas, such as arc(), lineTo(), and fill().

    Next we paint the background black, place shiny asteroids on it, and draw the spaceship using the context object.

    
    // Paint it black
              ctx.fillStyle = "black";
              ctx.rect(0, 0, 300, 300);
              ctx.fill();
    
         // Draw 100 stars.
         for (i = 0; i < 100; i++) {
             // Get random positions for stars.
             var x = Math.floor(Math.random() * 299)
             var y = Math.floor(Math.random() * 299)
    
              // Make the stars white
              ctx.fillStyle = "white";
    
              // Give the spaceship some room.
              if (x < 20 || y < 20) ctx.fillStyle = "black";
    
              // Draw an individual star.
              ctx.beginPath();
              ctx.arc(x, y, 3, 0, Math.PI * 2, true);
              ctx.closePath();
              ctx.fill();
            }
    
    //drawing the spaceship
           ctx.beginPath();
            ctx.moveTo(28.4, 16.9);
            ctx.bezierCurveTo(28.4, 19.7, 22.9, 22.0, 16.0, 22.0);
            ctx.bezierCurveTo(9.1, 22.0, 3.6, 19.7, 3.6, 16.9);
            ctx.bezierCurveTo(3.6, 14.1, 9.1, 11.8, 16.0, 11.8);
            ctx.bezierCurveTo(22.9, 11.8, 28.4, 14.1, 28.4, 16.9);
            ctx.closePath();
            ctx.fillStyle = "rgb(0, 0, 255)";
            ctx.fill();
            ctx.beginPath();
            ctx.moveTo(22.3, 12.0);
            ctx.bezierCurveTo(22.3, 13.3, 19.4, 14.3, 15.9, 14.3);
            ctx.bezierCurveTo(12.4, 14.3, 9.6, 13.3, 9.6, 12.0);
            ctx.bezierCurveTo(9.6, 10.8, 12.4, 9.7, 15.9, 9.7);
            ctx.bezierCurveTo(19.4, 9.7, 22.3, 10.8, 22.3, 12.0);
            ctx.closePath();
            ctx.fillStyle = "rgb(255, 0, 0)";
            ctx.fill();
    
    When we execute the code, our screen looks like the picture shown in Figure 1.



    Figure 1

    Next we are going to move the spaceship in our game using HTML5 canvas and JavaScript. The code sample is composed of HTML5 and JavaScript and demonstrates how to move the spaceship over the star field. Canvas is an immediate-mode API, so creating a moving image takes two steps in the game application: we must redraw the ship every time it moves, and we must restore the background that was destroyed when we drew the spaceship over it.

    The code below demonstrates how we can erase the image of the spaceship as it moves and draw another one at the new location. It shows how to save and restore snapshots of the spaceship and the background. Canvas does not keep a record of where pixels are drawn, so the example demonstrates best practices for tracking each region as it is captured, erased, moved, and restored in our code.

    Let’s look at declaring some of the variables:

    var newBackground = new Image(); // This variable will be used to store the new background.
    
    var oldBackground = new Image(); // This variable will be used to store the old background.
    var ship = new Image(); // This variable captures the spaceship image
    
          var spaceShipX = 0; // The current X position of the spaceship
          var spaceShipY = 0; // The current Y position of the spaceship
          var old_SpaceShipX = 0; // The old X position of the spaceship
          var old_SpaceShipY = 0; // The old Y position of the spaceship
    

    Setting up a game loop to process the main game events

function Loopgame()
{
    

    The code below restores the old background to erase the ship and then draws the spaceship at its new position. This makes it look as if the ship has moved.

        ctx.putImageData(oldBackground, old_SpaceShipX, old_SpaceShipY);
        ctx.putImageData(ship, spaceShipX, spaceShipY);
}
    

    To get the image information for every pixel of a rectangular area on the canvas, we can obtain the image data object with the getImageData() method of the canvas context and then access the pixel data from its data property. Each pixel in the image data holds four components: red, green, blue, and alpha.

    To play the game, we need to call the Loopgame() method. We do this by calling the setInterval() method, which calls a function or evaluates an expression at fixed intervals. Here, Loopgame is called every 20 milliseconds, a very common interval in games because it accomplishes basic animation tasks at a fixed rate.

    setInterval(Loopgame, 20);
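A minimal, self-contained sketch of such a fixed-interval loop (plain JavaScript, runnable outside the browser; the frame counter stands in for the erase-and-redraw work, and the 200 ms shutdown exists only for this demonstration):

```javascript
// Count the frames processed by a 20 ms game loop, then stop it.
let frames = 0;

function loopGame() {
  frames++; // in the real game: restore the background, redraw the ship
}

const timer = setInterval(loopGame, 20);

// After ~200 ms, stop the loop and report how many frames ran.
setTimeout(() => {
  clearInterval(timer);
  console.log("frames processed:", frames);
}, 200);
```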
    

    Animation in Canvas

    The <canvas> tag in HTML can accomplish animation in two ways.

    1. The usual way, used for most animation, is to completely redraw the screen for every movement. This works well when you have a modest canvas drawing surface and a fast processor, but it is not recommended for larger or more complex animations.
    2. The second way, and the one used in this article's code sample, is recommended for larger, more complex Canvas animations. Although it takes more code to set up, it runs much faster than the more common style of animation.

    Because this loop refreshes every 20 milliseconds, the human eye does not notice that the spaceship is erased; the spaceship simply appears to move. Redrawing only a small part of the screen each time also reduces screen flicker between moves.

    We set up addEventListener to handle the touch events, capture the new X and Y positions, and place our spaceship accordingly.

    document.addEventListener("touchstart", movespaceShip, true);
    

    When a user touches or moves a finger on the phone screen, the movespaceShip() handler is triggered right away.

function movespaceShip(event)
{
    event.preventDefault(); // Prevent the default touch behavior
    var eventTouch, x, y;
    eventTouch = event.changedTouches[0]; // The first of the changed touch points
    x = eventTouch.pageX; // The x coordinate of the touch
    y = eventTouch.pageY; // The y coordinate of the touch
}
    
    

    So far we have seen how to draw and update the background and spaceship for our game. If we hit an asteroid, our spaceship will explode, so to get back safely to home base we must avoid asteroids or blow them up before colliding with them. To do this, we can use canvas to inspect each pixel on the screen.

    The first stage is to take a snapshot of the screen at the point where the spaceship is about to move. This location has already been calculated using eventTouch.pageX and eventTouch.pageY.

    The next stage is to test the snapshot data. We look for a red value (255) in a chunk of 4 bytes; if we find it, an asteroid is present. We say a chunk of 4 bytes because each pixel is assigned a color value that consists of four components: red, green, blue, and alpha. Alpha denotes the amount of transparency. For example, a black pixel has 0 for its red, green, and blue components.
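The scan described above can be sketched in plain JavaScript (the `hitsAsteroid` name and the hand-built arrays are illustrative; in the game, `data` would come from `ctx.getImageData(x, y, w, h).data`):

```javascript
// Scan RGBA pixel data for a pure-red pixel (an asteroid).
// Each pixel occupies 4 consecutive bytes: red, green, blue, alpha.
function hitsAsteroid(data) {
  for (let i = 0; i < data.length; i += 4) {
    if (data[i] === 255 && data[i + 1] === 0 && data[i + 2] === 0) {
      return true; // found a red pixel
    }
  }
  return false;
}

const black = [0, 0, 0, 255];
const red = [255, 0, 0, 255];
console.log(hitsAsteroid([...black, ...black])); // false: clear space
console.log(hitsAsteroid([...black, ...red]));   // true: asteroid ahead
```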

    Using the above-mentioned techniques and code, we know you'll be able to turn this code snippet into a full-fledged spaceship game in no time.

    Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.

    Copyright © 2013 Intel Corporation. All rights reserved.

    *Other names and brands may be claimed as the property of others.

  • Android*
  • URL
  • All about WiDi!


    Intel has been talking about it a lot lately, because its latest generations of processors support it natively, but not everyone is familiar with this new technology. To understand what it is, you first need to understand Miracast.

    Miracast

    Let's keep it simple: plug your tablet into your television with an HDMI cable and you get a dual-screen setup; now imagine doing the same thing, but without an HDMI cable restricting your movements!

Miracast is a standard from the Wi-Fi Alliance consortium that, by combining several technologies, lets you project content from your computer, tablet, or smartphone to a compatible television without connecting them with a cable. These technologies are:

    • WiFi Direct: the ability to establish a peer-to-peer connection between two machines without going through a router
    • H.264: a video compression standard (the MPEG-4 Part 10 codec, also known as MPEG-4 AVC) supporting up to 60 frames per second in Full HD (1080p)
    • A stereo audio stream (2 channels, PCM, 44 or 48 kHz)
    • HDCP 2.0, a DRM management protocol that allows playback of Blu-ray and VOD content
    • WPA2, the protocol widely used to secure Wi-Fi connections, which prevents just anyone from taking control of your television's display and sound

Technically, then, Miracast is the combination of all of the above into a single standard that manufacturers implement directly in their televisions, or in set-top boxes that connect over HDMI to a conventional TV.

    WiDi

WiDi (short for Wireless Display) is Intel's implementation of the Miracast standard, together with a hardware certification program. The two main technical differences are a guaranteed 1080p resolution (whereas Miracast only guarantees a minimum of 720p) and reduced latency (under 50 ms) in signal transmission.

The certification program covers both transmitting hardware (smartphones, tablets, and computers) and receiving hardware (televisions, set-top boxes, and projectors), and many manufacturers such as Samsung, LG, Dell, Sharp, and Philips are already part of the program and have obtained certification for their products.

Almost all processors in the Haswell family (the latest Core i3/i5/i7) and in the Bay Trail (the latest Atom for tablets) and Clover Trail+ (Atom for smartphones) families include the technology natively.

On the operating system side, WiDi is supported natively by Android 4.2 and Windows 8.1. Both systems provide user interfaces for connecting to a receiver, as well as APIs so that developers can build applications that introduce new usages based on this technology.

What WiDi makes possible

In the traditional way, you can do classic multi-screen setups, for example duplicating or extending a display. On Windows hardware you can extend your desktop to gain a second space in which to display your software or apps. On an Android smartphone, you can play a movie while mirroring the screen to your TV, enjoying a much bigger picture and better sound.
But these are far from the only possibilities WiDi offers: since the multi-screen APIs are open to Android and Windows developers in the latest versions of both systems, most usages remain to be invented and the possibilities are vast!

Intel already has a few ideas for new usages that could emerge:


Multi-angle video playback on the TV, with the control interface on the tablet.

A web browser driven from the smartphone, with pages displayed on the TV.

WiDi games
Full-screen gameplay on the TV, with controls on the smartphone.

Want to know how to use WiDi in your Windows 8.1 applications? We'll cover that in a second article.

    Sources

    http://fr.wikipedia.org/wiki/Miracast
    http://www.matablettewindows.com/haswell-bay-trail-et-widi-intel-prepare-ses-armes-t16276.html

  • Intel WiDi
  • Icon Image: 

  • Product Documentation
  • User Experience and Design
  • Android*
  • Code for Good
  • Windows*
  • Laptop
  • Phone
  • Tablet
  • Desktop
  • Developers
  • Intel AppUp® Developers
  • Partners
  • Professors
  • Students
  • Android*
  • Microsoft Windows* 8



