The Applications Prototype Lab at ESRI has created a few proof-of-concepts for the Surface using the ArcGIS API for WPF. Examples can be seen here and here.
Surface applications are essentially WPF applications running in full-screen mode. When a user interacts with an application on a Surface device, input arrives not through the standard mouse API but through a Surface-specific contact API. The code below describes how to Surface-enable the map control that comes with the ArcGIS API for WPF. This is a basic implementation and should be treated as a developer sample rather than anything official.
SurfaceMap.xaml
After installing the Microsoft Surface SDK 1.0 (or 1.1) you will see a few extra items in the New Project dialog of Microsoft Visual Studio 2008. Select Surface Application, add a reference to the ArcGIS API for WPF assemblies, and then add a new resource file called SurfaceMap.xaml. Ensure that the build action is set to Page.
<ResourceDictionary
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:s="http://schemas.microsoft.com/surface/2008"
    xmlns:local="clr-namespace:ESRI.PrototypeLab.Surface.Ccm"
    xmlns:esri="clr-namespace:ESRI.ArcGIS.Client;assembly=ESRI.ArcGIS.Client">
    <Style TargetType="{x:Type local:SurfaceMap}">
        <Setter Property="Template">
            <Setter.Value>
                <ControlTemplate TargetType="{x:Type local:SurfaceMap}">
                    <Border
                        Background="{TemplateBinding Background}"
                        BorderBrush="{TemplateBinding BorderBrush}"
                        BorderThickness="{TemplateBinding BorderThickness}"
                        Width="{TemplateBinding Width}"
                        Height="{TemplateBinding Height}">
                        <esri:Map Name="PART_map"
                                  PanDuration="0"
                                  ZoomDuration="0" />
                    </Border>
                </ControlTemplate>
            </Setter.Value>
        </Setter>
    </Style>
</ResourceDictionary>
App.xaml
Next, you need to instruct the application to load SurfaceMap.xaml as a resource. To do so, merge it into the application resources in App.xaml as shown below.
<Application x:Class="ESRI.PrototypeLab.Surface.Ccm.App"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    StartupUri="SurfaceWindow1.xaml">
    <Application.Resources>
        <ResourceDictionary>
            <ResourceDictionary.MergedDictionaries>
                <ResourceDictionary Source="...generic.xaml"/>
                <ResourceDictionary
                    Source="/ESRI.PrototypeLab.Surface.Ccm;component/SurfaceMap.xaml"/>
            </ResourceDictionary.MergedDictionaries>
        </ResourceDictionary>
    </Application.Resources>
</Application>
SurfaceMap.cs
Add a new code file called SurfaceMap.cs and make sure that the build action is set to Compile. The code below contains the bulk of the logic for interpreting contact (i.e. finger) manipulation. The two key components to note are the manipulation processor and the inertia processor. The manipulation processor raises events when a user performs a panning or zooming action. The inertia processor is used to continue panning after the user flicks the map.
using System;
using System.Windows;
using System.Windows.Input;
using ESRI.ArcGIS.Client;
using ESRI.ArcGIS.Client.Geometry;
using Microsoft.Surface;
using Microsoft.Surface.Presentation;
using Microsoft.Surface.Presentation.Controls;
using Microsoft.Surface.Presentation.Manipulations;

namespace ESRI.PrototypeLab.Surface.Ccm {
    public partial class SurfaceMap : SurfaceUserControl {
        private Map m_map = null;
        private Affine2DManipulationProcessor m_manipulationProcessor = null;
        private Affine2DInertiaProcessor m_inertiaProcessorMove = null;

        static SurfaceMap() {
            FrameworkElement.DefaultStyleKeyProperty.OverrideMetadata(
                typeof(SurfaceMap),
                new FrameworkPropertyMetadata(typeof(SurfaceMap)));
        }

        public SurfaceMap() : base() {
            // Define the manipulation processor used for panning and zooming
            this.m_manipulationProcessor = new Affine2DManipulationProcessor(
                Affine2DManipulations.TranslateX |
                Affine2DManipulations.TranslateY |
                Affine2DManipulations.Scale,
                this, true);
            this.m_manipulationProcessor.Affine2DManipulationStarted +=
                new EventHandler<Affine2DOperationStartedEventArgs>(
                    this.ManipulationProcessor_Affine2DManipulationStarted);
            this.m_manipulationProcessor.Affine2DManipulationDelta +=
                new EventHandler<Affine2DOperationDeltaEventArgs>(
                    this.ManipulationProcessor_Affine2DManipulationDelta);
            this.m_manipulationProcessor.Affine2DManipulationCompleted +=
                new EventHandler<Affine2DOperationCompletedEventArgs>(
                    this.ManipulationProcessor_Affine2DManipulationCompleted);

            // Define the inertia processor for panning
            this.m_inertiaProcessorMove = new Affine2DInertiaProcessor();
            this.m_inertiaProcessorMove.Affine2DInertiaDelta +=
                new EventHandler<Affine2DOperationDeltaEventArgs>(
                    this.InertiaProcessorMove_Affine2DInertiaDelta);
            this.m_inertiaProcessorMove.Affine2DInertiaCompleted +=
                new EventHandler<Affine2DOperationCompletedEventArgs>(
                    this.InertiaProcessorMove_Affine2DInertiaCompleted);
        }

        public override void OnApplyTemplate() {
            base.OnApplyTemplate();
            this.m_map = (Map)this.Template.FindName("PART_map", this);
        }

        public Map Map {
            get { return this.m_map; }
            set { this.m_map = value; }
        }

        protected override void OnContactDown(ContactEventArgs e) {
            base.OnContactDown(e);
            if (!e.Contact.IsFingerRecognized) { return; }
            e.Contact.Capture(this, CaptureMode.SubTree);
            this.m_manipulationProcessor.BeginTrack(e.Contact);
            e.Handled = true;
        }

        protected override void OnContactChanged(ContactEventArgs e) {
            base.OnContactChanged(e);
            if (!e.Contact.IsFingerRecognized) { return; }
            Point position = e.Contact.GetPosition(this);
            if (position.X < 0 || position.Y < 0 ||
                position.X > this.ActualWidth || position.Y > this.ActualHeight) {
                // Contact has moved off the control; release the capture
                e.Contact.Capture(this, CaptureMode.None);
                e.Handled = true;
                return;
            }
            e.Contact.Capture(this, CaptureMode.SubTree);
            this.m_manipulationProcessor.BeginTrack(e.Contact);
            e.Handled = true;
        }

        protected override void OnContactUp(ContactEventArgs e) {
            base.OnContactUp(e);
            if (!e.Contact.IsFingerRecognized) { return; }
            e.Contact.Capture(this, CaptureMode.None);
        }

        private void ManipulationProcessor_Affine2DManipulationStarted(
            object sender, Affine2DOperationStartedEventArgs e) {
            if (this.m_inertiaProcessorMove.IsRunning) {
                this.m_inertiaProcessorMove.End();
            }
        }

        private void ManipulationProcessor_Affine2DManipulationDelta(
            object sender, Affine2DOperationDeltaEventArgs e) {
            if (this.m_map == null) { return; }
            if (this.m_map.Extent == null) { return; }
            if ((e.Delta.X == 0) && (e.Delta.Y == 0) && (e.ScaleDelta == 0)) {
                return;
            }
            if ((e.Delta.X != 0) || (e.Delta.Y != 0)) {
                MapPoint center = this.m_map.Extent.GetCenter();
                Point screen = this.m_map.MapToScreen(center);
                Point newScreen;
                switch (ApplicationLauncher.Orientation) {
                    case UserOrientation.Top:
                        // Surface is upside-down
                        newScreen = Point.Add(screen, e.Delta);
                        break;
                    case UserOrientation.Bottom:
                        // Surface has normal orientation
                        newScreen = Point.Subtract(screen, e.Delta);
                        break;
                    default:
                        return;
                }
                MapPoint newCenter = this.m_map.ScreenToMap(newScreen);
                this.m_map.PanTo(newCenter);
            }
            if (e.ScaleDelta != 0) {
                // Only zoom when more than one finger is captured
                if (Contacts.GetContactsCaptured(this).Count > 1) {
                    this.m_map.ZoomToResolution(
                        this.m_map.Resolution / e.ScaleDelta);
                }
            }
        }

        private void ManipulationProcessor_Affine2DManipulationCompleted(
            object sender, Affine2DOperationCompletedEventArgs e) {
            // Hand the final velocity to the inertia processor so a flick
            // continues panning the map
            this.m_inertiaProcessorMove.InitialOrigin = new Point(0, 0);
            this.m_inertiaProcessorMove.InitialVelocity = e.Velocity * 10;
            this.m_inertiaProcessorMove.DesiredDeceleration = 0.005;
            this.m_inertiaProcessorMove.Begin();
        }

        private void InertiaProcessorMove_Affine2DInertiaDelta(
            object sender, Affine2DOperationDeltaEventArgs e) {
            if ((e.Velocity.X == 0) && (e.Velocity.Y == 0)) { return; }
            MapPoint center = this.m_map.Extent.GetCenter();
            Point screen = this.m_map.MapToScreen(center);
            Point newScreen;
            switch (ApplicationLauncher.Orientation) {
                case UserOrientation.Top:
                    // Surface is upside-down
                    newScreen = Point.Add(screen, e.Velocity);
                    break;
                case UserOrientation.Bottom:
                    // Surface has normal orientation
                    newScreen = Point.Subtract(screen, e.Velocity);
                    break;
                default:
                    return;
            }
            MapPoint newCenter = this.m_map.ScreenToMap(newScreen);
            this.m_map.PanTo(newCenter);
        }

        private void InertiaProcessorMove_Affine2DInertiaCompleted(
            object sender, Affine2DOperationCompletedEventArgs e) {
        }
    }
}
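Note that the orientation switch above handles only the Top and Bottom user orientations; panning is silently ignored when the Surface unit reports Left or Right. A hypothetical extension (not part of the original sample; `AdjustDelta` is an assumed helper name) could rotate the delta vector so that all four orientations behave consistently:

```csharp
// Hypothetical helper, not in the original sample: transform a manipulation
// delta into screen space for any Surface user orientation. Bottom and Top
// match the behavior of the switch statements above; the 90-degree rotations
// for Left and Right are a sketch and would need verifying on hardware.
private static Vector AdjustDelta(Vector delta, UserOrientation orientation) {
    switch (orientation) {
        case UserOrientation.Bottom:
            return new Vector(-delta.X, -delta.Y); // normal orientation
        case UserOrientation.Top:
            return delta;                          // upside-down
        case UserOrientation.Left:
            return new Vector(-delta.Y, delta.X);  // rotated 90 degrees
        case UserOrientation.Right:
            return new Vector(delta.Y, -delta.X);  // rotated -90 degrees
        default:
            return new Vector(0, 0);               // unknown: ignore the pan
    }
}
```

With this helper, both orientation switches could collapse to a single line, e.g. `Point newScreen = Point.Add(screen, AdjustDelta(e.Delta, ApplicationLauncher.Orientation));`.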
SurfaceWindow1.xaml
The snippet below demonstrates how to add a reference to the SurfaceMap defined in the snippets above.
<s:SurfaceWindow x:Class="ESRI.PrototypeLab.Surface.Ccm.SurfaceWindow1"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:s="http://schemas.microsoft.com/surface/2008"
    xmlns:local="clr-namespace:ESRI.PrototypeLab.Surface.Ccm"
    Title="my Title" Height="768" Width="1024"
    Loaded="SurfaceWindow_Loaded">
    <Grid>
        <local:SurfaceMap x:Name="surfaceMap" />
    </Grid>
</s:SurfaceWindow>
SurfaceWindow1.xaml.cs (code behind)
In the code-behind of the Surface window, the map extent is defined and the ArcGIS Online world street map layer is added.
private void SurfaceWindow_Loaded(object sender, RoutedEventArgs e) {
    // Add the ArcGIS Online street map
    ArcGISTiledMapServiceLayer streets = new ArcGISTiledMapServiceLayer() {
        ID = "street",
        Url = "http://server.arcgisonline.com/ArcGIS/rest/services" +
              "/ESRI_StreetMap_World_2D/MapServer",
        Opacity = 1
    };
    this.surfaceMap.Map.Extent =
        new Envelope(-116.992383, 33.126732, -116.560400, 32.797143);
    this.surfaceMap.Map.Layers.Add(streets);

    // Listen to map events
    this.surfaceMap.Map.ExtentChanging +=
        new EventHandler<ExtentEventArgs>(this.Map_ExtentChanging);
    this.surfaceMap.Map.ExtentChanged +=
        new EventHandler<ExtentEventArgs>(this.Map_ExtentChanged);
}
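The Loaded handler wires ExtentChanging and ExtentChanged to handlers whose bodies are not shown in this post. Minimal stubs, assuming nothing beyond the event signature, might look like this:

```csharp
// Hypothetical stubs for the handlers wired up above; the original sample
// does not show their implementations.
private void Map_ExtentChanging(object sender, ExtentEventArgs e) {
    // Raised repeatedly while a pan or zoom is in progress,
    // e.g. on every frame of the manipulation or inertia animation.
}

private void Map_ExtentChanged(object sender, ExtentEventArgs e) {
    // Raised once the pan or zoom has completed. This is a good place
    // to refresh any overlays that depend on the visible extent.
}
```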
This concludes this tutorial on adding Surface support to the ArcGIS API for WPF map control. For more information, please visit the ArcGIS API for Silverlight/WPF home page, support forum and resource center.