Programming

What’s New in Visual Studio 2019 | Microsoft Connect(); 2018

Programming, 04.12.2018 at 17:09 | Source: channel9.msdn.com

Visual Studio 2019 delivers the latest improvements and features to help you and your team be more productive, support modern application development, and continuously innovate. In this video, we'll walk through how Visual Studio 2019 enables you to get started writing code faster and easier with existing code repos or new projects, focus on what's important with new UI improvements, seamlessly collaborate across teams with pull requests, and more!

Download Visual Studio 2019 Preview



Announcing ML.NET 0.8 – Machine Learning for .NET

Programming, 04.12.2018 at 17:00 | Source: blogs.msdn.microsoft.com


ML.NET is an open-source and cross-platform framework (Windows, Linux, macOS) which makes machine learning accessible for .NET developers.

ML.NET allows you to create and use machine learning models targeting common tasks such as sentiment analysis, issue classification, forecasting, recommendations, fraud detection, image classification, and more. You can check out these common tasks in our GitHub repo with ML.NET samples.

Today we're happy to announce the release of ML.NET 0.8 (ML.NET 0.1 was released at //Build 2018). This release focuses on improved support for recommendation scenarios, model explainability in the form of feature importance, debuggability by previewing your in-memory datasets, and API improvements such as caching and filtering.

This blog post provides details about the following topics in the ML.NET 0.8 release:

New Recommendation Scenarios (e.g. Frequently Bought Together)


Recommender systems produce a list of recommendations for products in a product catalog, songs, movies, and more. Products such as Netflix, Amazon, and Pinterest have democratized recommendation scenarios over the last decade.

ML.NET uses Matrix Factorization and Field-Aware Factorization Machines based approaches for recommendation, which enable the following scenarios. In general, Field-Aware Factorization Machines are the more general case of Matrix Factorization and allow passing additional metadata.

With ML.NET 0.8, we have added another Matrix Factorization scenario that enables recommendations.

Recommendation scenario | Recommended solution | Availability
Product recommendations based upon Product Id, Rating, User Id, and additional metadata such as Product Description and User Demographics (age, country, etc.) | Field-Aware Factorization Machines | Since ML.NET 0.3 (sample here)
Product recommendations based upon Product Id, Rating, and User Id only | Matrix Factorization | Since ML.NET 0.7 (sample here)
Product recommendations based upon Product Id and co-purchased Product IDs | One-Class Matrix Factorization | New in ML.NET 0.8 (sample here)

Yes, product recommendations are still possible even if you only have historical order purchase data for your store.

This is a popular scenario, as in many situations you might not have ratings available to you.

With historical purchasing data you can still build recommendations by providing your users with a list of "Frequently Bought Together" product items.

For example, Amazon.com recommends a set of products based on the product the user has selected.

We now support this scenario in ML.NET 0.8, and you can try out this sample, which performs product recommendations based on an Amazon co-purchasing dataset.

Improved debuggability by previewing the data


In most cases, when you start working with your pipeline and load your dataset, it is very useful to peek at the data loaded into an ML.NET DataView, and even to look at it again after some intermediate transformation steps, to ensure the data is transformed as expected.

First, you can review the schema of your DataView.
All you need to do is hover over the IDataView object, expand it, and look for the Schema property.


If you want to take a look at the actual data loaded in the DataView, you can follow the steps below.

The steps are:

  • While debugging, open a Watch window.
  • Enter the variable name of your DataView object (in this case, testDataView) and call its Preview() method.
  • Click the rows you want to inspect to see the actual data loaded in the DataView.

By default, the first 100 values are output in ColumnView and RowView. You can change that by passing the number of rows you are interested in as an argument to Preview(), such as Preview(500).
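
As a minimal sketch, the same inspection can also be done from code rather than the debugger. This assumes the ML.NET 0.8-era Preview() extension and reuses the testDataView variable from the steps above; the exact shape of the returned preview object may vary between preview versions:

// Preview the first 500 rows instead of the default 100.
var preview = testDataView.Preview(500);

// Each previewed row exposes its column name/value pairs, which is what the debugger visualizer shows.
foreach (var row in preview.RowView)
    Console.WriteLine(string.Join(" | ", row.Values));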

Model explainability


In ML.NET 0.8 release, we have included APIs for model explainability that we use internally at Microsoft to help machine learning developers better understand the feature importance of models (“Overall Feature Importance”) and create high-capacity models that can be interpreted by others (“Generalized Additive Models”).

Overall feature importance gives a sense of which features are most important for the model overall. When creating machine learning models, it is often not enough to simply make predictions and evaluate their accuracy. Feature importance helps you understand which data features are most valuable to the model for making a good prediction. For instance, when predicting the price of a car, some features, such as mileage and make/brand, are more important, while others, such as the car's color, matter less.

The “Overall feature importance” of a model is enabled through a technique named “Permutation Feature Importance” (PFI). PFI measures feature importance by asking the question, “What would the effect on the model be if the values for a feature were set to a random value (permuted across the set of examples)?”.

The advantage of the PFI method is that it is model agnostic — it works with any model that can be evaluated — and it can use any dataset, not just the training set, to compute feature importance.

You can use PFI to produce feature importances with code like the following:

// Compute the feature importance using PFI
var permutationMetrics = mlContext.Regression.PermutationFeatureImportance(model, data);

// Get the feature names from the training set
var featureNames = data.Schema.GetColumns()
                .Select(tuple => tuple.column.Name) // Get the column names
                .Where(name => name != labelName) // Drop the Label
                .ToArray();

// Write out the feature names and their importance to the model's R-squared value
for (int i = 0; i < featureNames.Length; i++)
  Console.WriteLine($"{featureNames[i]}t{permutationMetrics[i].rSquared:G4}");

You would get console output similar to the metrics below:

Console output:

    Feature            Model Weight    Change in R - Squared
    --------------------------------------------------------
    RoomsPerDwelling      50.80             -0.3695
    EmploymentDistance   -17.79             -0.2238
    TeacherRatio         -19.83             -0.1228
    TaxRate              -8.60              -0.1042
    NitricOxides         -15.95             -0.1025
    HighwayDistance        5.37             -0.09345
    CrimesPerCapita      -15.05             -0.05797
    PercentPre40s         -4.64             -0.0385
    PercentResidental      3.98             -0.02184
    CharlesRiver           3.38             -0.01487
    PercentNonRetail      -1.94             -0.007231

Note that in the current ML.NET v0.8, PFI only works for binary classification and regression models, but we'll expand it to additional ML tasks in upcoming versions.

See the sample in the ML.NET repository for a complete example using PFI to analyze the feature importance of a model.

Generalized Additive Models, or GAMs, have very explainable predictions. They are similar to linear models in terms of ease of understanding, but are more flexible and can have better performance, and they can also be visualized and plotted for easier analysis.

An example of how to train a GAM model and inspect and interpret the results can be found here.
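
As a rough sketch only: in this preview timeframe, training a GAM regression model looks roughly like the following. The trainer is assumed to be exposed on the regression catalog as GeneralizedAdditiveModels (exact names and defaults may differ between preview versions), and trainingData stands for an IDataView whose "Label" and "Features" columns have already been prepared:

// Assumed API shape for the ML.NET 0.8-era preview; defaults use the "Label" and "Features" columns.
var gamTrainer = mlContext.Regression.Trainers.GeneralizedAdditiveModels();

// Fit on already-featurized data; each input feature gets its own shape function,
// which is what makes GAM predictions easy to inspect and plot.
var gamModel = gamTrainer.Fit(trainingData);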

Additional API improvements in ML.NET 0.8

In this release we have also added other enhancements to our APIs that help with filtering rows in DataViews, caching data, and saving data in the IDataView (IDV) binary format. You can learn about these features here.

Filtering rows in a DataView


Sometimes you might need to filter the data used for training a model. For example, you might need to remove rows where a certain column's value is lower or higher than certain boundaries, for instance to remove outliers.

This can now be done with filters such as the FilterByColumn() API, as in the following code from this sample app in the ML.NET samples. Here we want to keep only payment rows between $1 and $150 because, for this particular scenario, values higher than $150 are considered outliers (extreme data distorting the model) and values lower than $1 might be errors in the data:

IDataView trainingDataView = mlContext.Data.FilterByColumn(baseTrainingDataView, "FareAmount", lowerBound: 1, upperBound: 150);

Thanks to the DataView preview in Visual Studio mentioned above, you can now inspect the filtered data in your DataView.
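
Putting the two pieces together, a small sketch (reusing FilterByColumn() from the snippet above and the Preview() helper shown earlier; the column name and bounds are taken from that sample) could look like this:

// Keep only fares between $1 and $150, then peek at the surviving rows (Preview() defaults to 100 rows).
IDataView filteredData = mlContext.Data.FilterByColumn(baseTrainingDataView, "FareAmount",
                                                       lowerBound: 1, upperBound: 150);

var filteredPreview = filteredData.Preview();
Console.WriteLine($"Previewed {filteredPreview.RowView.Length} rows after filtering.");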

Additional sample code can be checked out here.

Caching APIs


Some estimators iterate over the data multiple times. Instead of always reading from the file, you can choose to cache the data, which can speed up training.

A good example is the following, where training uses an OVA (One Versus All) trainer that runs multiple iterations against the same data. By eliminating the need to read data from disk multiple times, you can reduce model training time by up to 50%:

var dataProcessPipeline = mlContext.Transforms.Conversion.MapValueToKey("Area", "Label")
        .Append(mlContext.Transforms.Text.FeaturizeText("Title", "TitleFeaturized"))
        .Append(mlContext.Transforms.Text.FeaturizeText("Description", "DescriptionFeaturized"))
        .Append(mlContext.Transforms.Concatenate("Features", "TitleFeaturized", "DescriptionFeaturized"))
        //Example Caching the DataView 
        .AppendCacheCheckpoint(mlContext) 
        .Append(mlContext.BinaryClassification.Trainers.AveragedPerceptron(DefaultColumnNames.Label,                                  
                                                                          DefaultColumnNames.Features,
                                                                          numIterations: 10));

This example code is implemented, and its execution time measured, in this sample app in the ML.NET Samples repo.

An additional test example can be found here.

Enabled saving and loading data in IDataView (IDV) binary format for improved performance


It is sometimes useful to save data after it has been transformed. For example, you might have featurized all the text into sparse vectors and want to perform repeated experimentation with different trainers without continuously repeating the data transformation.

IDV format is a binary dataview file format provided by ML.NET.

Saving and loading files in IDV format is often significantly faster than using a text format because it is compressed.

In addition, because the schema is stored in the file, you don't need to specify the column types as you do when using a regular TextLoader, so the code is simpler as well as faster.

Reading a binary data file can be done using this simple line of code:

mlContext.Data.ReadFromBinary("pathToFile");

Writing a binary data file can be done using this code:

mlContext.Data.SaveAsBinary("pathToFile");
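
As a hedged sketch tying the two calls together: the Stream-based SaveAsBinary overload is an assumption for this preview timeframe, transformedData stands for whichever IDataView you want to persist, and the file name is illustrative:

// Persist an already-transformed DataView to the compressed, schematized IDV format.
using (var stream = System.IO.File.Create("featurized.idv"))
{
    mlContext.Data.SaveAsBinary(transformedData, stream);
}

// Later, or in another experiment, reload it without re-specifying column types.
IDataView featurizedData = mlContext.Data.ReadFromBinary("featurized.idv");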

Enabled stateful prediction engine for time series problems such as anomaly detection


ML.NET 0.7 enabled anomaly detection scenarios based on time series. However, the prediction engine was stateless, which meant that every time you wanted to figure out whether the latest data point was anomalous, you had to provide historical data as well. This was unnatural.

The prediction engine can now keep the state of the time series data seen so far, so you can get predictions by providing just the latest data point. This is enabled by using CreateTimeSeriesPredictionFunction() instead of CreatePredictionFunction().

Example usage can be found here.

Get started!


If you haven't already, get started with ML.NET here.

Next, to go further, explore some other resources:

We would appreciate your feedback: file issues with any suggestions or enhancements in the ML.NET GitHub repo to help us shape ML.NET and make .NET a great platform of choice for machine learning.

Thanks,

The ML.NET Team.

This blog was authored by Cesar de la Torre, Gal Oshri, and Rogan Carr, with additional contributions from the ML.NET team.



Device Simulation With Azure IoT

Video | YouTube, 04.12.2018 at 14:45 | Source: youtube.com


bingbot Series: Getting most of Bingbot via Bing Webmaster Tools

Programming, 04.12.2018 at 13:00 | Source: blogs.bing.com

There are multiple features in Bing Webmaster Tools that allow webmasters to check Bingbot's performance and issues on their site, provide input on Bingbot crawl schedules, and check whether that random bot hitting their pages frequently is actually Bingbot.

In part 4 of our Bingbot series, Nikunj Daga, Program Manager for Bing Webmaster Tools, revisits some of the tools and features that assist webmasters in troubleshooting and optimizing Bingbot's performance on their site.

Crawl Information - Webmasters can get data about Bingbot's performance on their site in the Reports and Data section of Bing Webmaster Tools. The site activity chart on this page shows an overlapping view of total pages indexed, pages crawled, and pages with crawl errors for the last six months, along with impressions and clicks data. With this chart, webmasters can easily see whether the changes they made on their sites had any impact on page crawling.

Further, to get more information on the pages with crawl errors, webmasters can go to the Crawl Information page. This page provides an aggregated count of pages with the different errors Bingbot encountered, along with the list of those URLs. This makes it simple for webmasters to troubleshoot why a particular page they are looking for in Bing does not appear while searching.

Crawl Errors – In addition to webmasters checking the Crawl Information page for crawl errors, Bing Webmaster Tools also proactively notifies webmasters in case Bingbot faces a significant number of issues while crawling the site. These notifications are sent to the Message Center in Bing Webmaster Tools. These alerts can also be sent by email to users who do not visit Webmaster Tools on a regular basis, so it is a good idea for webmasters to opt in to email communication from Bing Webmaster Tools through the Profile page. Webmasters can set preferences for the kinds of alerts they want to receive emails for, along with their preferred contact frequency.

Further, Bingbot can face different kinds of errors while crawling a site; a detailed list of these errors, along with their descriptions and recommended actions, can be found here.

Crawl Control – The Crawl Control feature allows webmasters to provide input to Bingbot about the crawl speed and timing for their site. It can be found under the "Configure My Site" section in Bing Webmaster Tools. Using this feature, you can set an hourly crawl rate for your site and tell Bingbot to crawl slowly during peak business hours and faster during off-peak hours. There are preset schedules to choose from based on the most common business hours followed across the globe. In addition to the preset schedules, webmasters also have the option to fully customize the crawl schedule based on their site's traffic pattern. Customizing the crawl pattern is very easy and can be done by simply dragging and clicking on the graph in the Crawl Control feature.

Fetch as Bingbot – The Fetch as Bingbot tool returns the code that Bingbot sees when it crawls the page. Webmasters can find this feature under the "Diagnostics and Tools" section and submit a request to fetch as Bingbot. Once the fetch is completed, the status changes from "Pending" to "Completed" and webmasters can see the code that appears to Bingbot when it tries to crawl the site. This is a useful feature for webmasters who use dynamic content on their sites and is the basic check to see what data Bingbot sees among all the dynamic and static content on the site.

Verify Bingbot - Found under the "Diagnostics and Tools" section in Bing Webmaster Tools, the Verify Bingbot tool lets webmasters check whether a Bing user agent string appearing in their server logs is actually from Bing or not. This can help webmasters determine if someone is hiding their true identity and attacking the site by using Bing's name. It also helps webmasters who have manually configured IP whitelists for Bingbot on their server. Since Bing does not release its list of IPs, webmasters can use this tool to check whether the IPs allowed on the server belong to Bing and whether they are whitelisting the right set of IPs.

Thus, it is evident that webmasters can do a lot to improve Bingbot's performance on their site using the features in Bing Webmaster Tools. These features were developed and have evolved over the years based on feedback we receive from the webmaster community. So, log in to Bing Webmaster Tools now to use the features and let us know what you think.

Thanks!
Nikunj Daga
Program Manager, Bing Webmaster Tools



Channel 9 Video: Setting up and managing Android emulators

Programming, 04.12.2018 at 09:00 | Source: microsoft.com
This Channel 9 video from the Xamarin Show describes how to set up and edit new and existing Android emulator images with the current, simplified Android Device Manager. ...


App Store Best of 2018

Programming, 04.12.2018 at 06:00 | Source: developer.apple.com
Congratulations to the developers featured in the App Store Best of 2018. This showcase honors our favorite apps and games on the App Store, and celebrates the amazing experiences that developers have created for Apple platforms. See all the featured apps and games.

Rich Code Navigation

Programming, 04.12.2018 at 01:00 | Source: code.visualstudio.com


The new generation of YouTube – 40 years after "Video Killed the Radio Star"

Programming, 04.12.2018 at 01:00 | Source: thinkwithgoogle.com
These are good times for music fans. Streaming platforms let you listen to any song you like, anywhere, at any time, and as often as you want.

Architecting Web Apps

Programming, 04.12.2018 at 00:18 | Source: youtube.com



Introduction to Data Manipulation and Visualizations in R || David Sung

Programming, 03.12.2018 at 23:36 | Source: youtube.com



Rust Quiz

Programming, 03.12.2018 at 20:22 | Source: youtube.com


An Introduction to the Intelligent Kiosk

Video | YouTube, 03.12.2018 at 19:06 | Source: youtube.com


Using psql to \watch Star Wars and Other Silly Things!

Programming, 03.12.2018 at 18:00 | Source: youtube.com


AzureVM: managing virtual machines in Azure

Programming, 03.12.2018 at 17:00 | Source: blog.revolutionanalytics.com

This is the next article in my series on AzureR, a family of packages for working with Azure in R. I’ll give a short introduction on how to use AzureVM to manage Azure virtual machines, and in particular Data Science Virtual Machines (DSVMs).

Creating a VM

Creating a VM is as simple as using the create_vm method, which is available as part of the az_subscription and az_resource_group classes.

library(AzureRMR)
library(AzureVM)

## using the subscription method
sub <- az_rm$
    new(tenant="{tenant_id}", app="{app_id}", password="{password}")$
    get_subscription("{subscription_id}")

myNewVM <- sub$create_vm("myNewVM",
    location="australiaeast",
    username="datascience",
    passkey="fAs30q-2a5vF!Z")  # be sure to choose a strong password!


## using the resource group method
rg <- sub$create_resource_group("myresourcegroup",
    location="australiaeast")

myOtherVM <- rg$create_vm("myOtherVM",
    username="datascience",
    passkey="l3Kgrf21%?0DFm")

Without any other options, this will create a Windows Server 2016 Data Science Virtual Machine, which is pre-installed with several tools useful for analytics: Python, R (and RStudio), Tensorflow, XGBoost, SQL Server, and so on. The size will be a Standard DS3 V2 VM, which has 4 cores, 14GB of memory, 1TB primary disk, and up to 16x28GB data disks.

A feature of creating the VM under a subscription, as opposed to a resource group, is that it creates a new resource group specifically to hold the VM. This simplifies the task of managing and (eventually) deleting the VM considerably. See “Deleting a VM” below.

You can change the specifications for the VM by providing any of the following arguments:

  • os: either “Windows” or “Ubuntu” (for Ubuntu LTS 16.04).
  • size: the VM size. Use the az_subscription$list_vm_sizes() method to see what sizes are available in your region. Note that in Azure, a VM’s size is really a broad-ranging label that encapsulates the number of cores, memory, and disk available.
  • passkey: if creating an Ubuntu VM, you can supply a public key as the passkey argument.
  • userauth_type: set this to “key” if you supply a public key.

Setting these will determine whether you get a Windows or Ubuntu DSVM, the login details, and how powerful the VM is in terms of cores and memory/disk capacity. For example, this will create a Linux NC-series DSVM using public key authentication:

myLinuxVM <- sub$create_vm("myLinuxVM",
    location="australiaeast",
    size="Standard_NC6s_v3",
    os="Ubuntu",
    username="datascience",
    passkey=readLines("~/id_rsa.pub"),
    userauth_type="key")

The list_vm_sizes() method will show you what the available VM sizes are for your region.

# examine VM sizes available in australiaeast region
sub$list_vm_sizes("australiaeast")

Retrieving an existing VM

If you have an existing VM, you can retrieve it with the get_vm() method. As with create_vm(), this is available as part of the az_subscription and az_resource_group classes. The only argument you need to supply is the name of the VM.

## using the subscription method
sub <- az_rm$
    new(tenant="{tenant_id}", app="{app_id}", password="{password}")$
    get_subscription("{subscription_id}")

# retrieve the VM we created above
myNewVM <- sub$get_vm("myNewVM")


## and with the resource group method
rg <- sub$get_resource_group("myresourcegroup")

myOtherVM <- rg$get_vm("myOtherVM")

Working with a VM

There are various things you can do with a VM object within R.

To stop (shutdown) a VM, call its stop() method. The deallocate argument sets whether to deallocate its resources as well, with the default being TRUE; you may want to set this to FALSE if you know you will be restarting the VM soon. To restart it, call either the start() or restart() method. The main difference between the two is that restart() will shutdown the VM first if it is currently running.

To sync the object with the resource in Azure, call the sync_vm_status() method. The most common situation where you might want to do this is if you create a VM with the argument wait=FALSE. In this case, rather than waiting for provisioning to complete, the create_vm method will return an incomplete VM object; you then call its sync_vm_status() method to update it with how the provisioning is going in Azure.

To dynamically resize a VM, call the resize() method with the new size. This has an optional deallocate argument controlling whether to stop and deallocate the VM first (which is sometimes necessary for successful resizing).

To run a script in the VM (without manually logging in), call the run_script() method. This will be a PowerShell script if it is a Windows VM, or a bash script if it is Linux. The script is just a character vector.

# simple bash script for executing on a Linux VM
script <-
'#!/bin/bash

var="Hello world!"

# redirect output to a file so we know whether it ran successfully
echo "$var" > /tmp/helloworld.txt
'

vm$run_script(script)

If you log in to the VM after running this script, you should find the file helloworld.txt in the /tmp directory.

Deleting a VM

Simply deleting a virtual machine object in R, e.g. with rm(), will not do anything to the VM itself in Azure. To delete the VM and its resources, call the object's delete() method:

vm$delete()

AzureVM will prompt you for confirmation that you really want to delete the VM. By default, this will also free up all the individual Azure resources used by the VM, such as its storage, network interface, security group, and so on.

If you created the VM using the create_vm() method of the az_subscription class, the deletion process is very simple: it simply removes the resource group containing the VM. This is possible because the resource group was created as part of deploying the VM. Be aware that any other resources you may have created in this resource group will also be deleted.

Alternatively, you can use the delete_vm() method of the az_subscription and az_resource_group classes. These will retrieve the VM of the given name and then call its delete() method.

GPU enabled VMs

If you are running deep learning workloads, you’ll want to ensure that your VM is GPU-enabled. In Azure, the various NC- and ND-series VMs are designed for these workloads. You can create a GPU-enabled VM by setting the size argument appropriately, for example

myGpuVM <- sub$create_vm(size="Standard_NC12s_v3", ...)

However, the following caveats apply to GPU-enabled VMs:

  • Not all regions have GPUs available. To check on availability, use the az_subscription$list_vm_sizes() method and provide your region.
  • Currently, the supply of GPUs is limited. You must apply for a quota increase if you want to deploy a GPU-enabled VM.

VM clusters

You can work with VM clusters (a collection of VMs sharing the same configuration) by using the get_vm_cluster, create_vm_cluster and delete_vm_cluster methods. These are almost identical to get_vm, create_vm and delete_vm with the addition of a clust_size argument that sets the size of the cluster.

sub <- az_rm$
    new(tenant="{tenant_id}", app="{app_id}", password="{password}")$
    get_subscription("{subscription_id}")

# create a cluster of 5 Ubuntu VMs
vmCluster <- sub$create_vm_cluster("vmCluster",
    location="australiaeast",
    os="Ubuntu",
    username="datascience",
    passkey=readLines("~/id_rsa.pub"),
    userauth_type="key",
    clust_size=5)

Most things that you can do with a single VM, you can also do with a VM cluster. For example, running a script with the run_script() method will run the script on all the VMs in the cluster. Starting, stopping and restarting a cluster similarly carries out the given action on all the VMs.



Present more inclusively with live captions & subtitles in PowerPoint

Programming, 03.12.2018 at 15:00 | Source: microsoft.com

Live presentations can be thought-provoking, inspirational, and powerful. A great presentation can inspire us to think about something in an entirely different way or bring a group together around a common idea or project. But not everyone experiences presentations in the same way. We may speak a different language from the presenter, or be a native speaker in another language, and some of us are deaf and hard of hearing. So, what if speakers could make their presentations better understood by everyone in the room? Now they can with live captions & subtitles in PowerPoint.

In honor of the United Nations International Day of Persons with Disabilities, we’re announcing this new feature—powered by artificial intelligence (AI)—which provides captions and subtitles for presentations in real-time. Live captions & subtitles in PowerPoint supports the deaf and hard of hearing community by giving them the ability to read what is being spoken in real-time. In addition, captions and subtitles can be displayed in the same language or in a different one, allowing non-native speakers to get a translation of a presentation. At launch, live captions & subtitles will support 12 spoken languages and display on-screen captions or subtitles in one of 60+ languages.

Live captions & subtitles in PowerPoint brings:

  • The power of AI to presenters, so they can convey simple and complex information across subjects and topics.
  • Speech recognition that automatically adapts based on the presented content for more accurate recognition of names and specialized terminology.
  • The ability for presenters to easily customize the size, position, and appearance of subtitles. Customizations may vary by platform.
  • Peace of mind with security and compliance, knowing that the feature meets many industry standards for compliance certifications.

The feature joins other accessible features in Office 365, like automatic suggestions for alt-text in Word and PowerPoint, expanded availability of automatic closed captions and searchable transcripts for videos in Microsoft Stream, enhancements to the Office 365 Accessibility Checker, and more.

Here’s what one of our customers had to say:

“We are constantly looking for new ways of ensuring that the Government of Canada sets the highest possible standards as an accessible and inclusive workplace. We welcome such positive advances in technology, like this feature, that allows everyone, and notably those with disabilities, to better communicate ideas. They help break down barriers and lead to greater inclusiveness to the benefit of individuals and society as a whole.”
—Yazmine Laroche, deputy minister responsible for Public Service Accessibility

Live captions & subtitles in PowerPoint will begin rolling out in late January 2019 and will be available for Office 365 subscribers worldwide for PowerPoint on Windows 10, PowerPoint for Mac, and PowerPoint Online.

The post Present more inclusively with live captions & subtitles in PowerPoint appeared first on Microsoft 365 Blog.




F# Tooling Updates for Visual Studio 2017 | On .NET

Programming, 03.12.2018 at 14:30 | Source: channel9.msdn.com

In this episode, Phillip Carter (@_cartermp) joins us again to give us an update on the F# tooling updates in Visual Studio 2017. Because F# is cross-platform, many of the changes made are available in all F# tooling, not just that in Visual Studio.

  • [01:18] - Getting F# support in Visual Studio 2017
  • [03:23] - Diving into the editor features
  • [11:00] - Scripting and brace completion
  • [20:16] - Code navigation
  • [21:35] - Show symbols in unopened namespaces
  • [23:41] - Experimental CodeLens for F# type annotations
  • [26:41] - Providing feedback on features

 

Useful Links: 




Join the Twitter AMA with Azure Integration Services

Programming, 03.12.2018 at 12:00 | Source: azure.microsoft.com

Azure Integration Services will be hosting a joint Twitter Ask Me Anything (AMA), or actually “Ask Us Anything”, session for API Management, Logic Apps, Service Bus, and Event Grid on Thursday, December 6, 2018 from 8:30 AM to 10:00 AM Pacific Time.

Tweet to @AzureSupport using #IntegrationAMA with your questions.

What’s happening?

We’ll have members of the API Management, Logic Apps, Service Bus, and Event Grid product and engineering teams available to answer any and all questions on their services. You can also ask questions about roadmaps, new features, or pretty much anything else.

Why?

We want to learn more about what you’re interested in and how Azure Integration Services may be useful to you. We like gathering feedback from our users and the community, and your questions will help provide insights into how we can build better products and services for you.

What do you need to do?

Post your questions to Twitter using the hashtag #IntegrationAMA in a tweet to @AzureSupport. We’ll start taking questions 24 hours prior to the AMA, beginning at 8:30 AM Pacific Time on Wednesday, December 5, 2018, and then respond to them between 8:30 AM and 10:00 AM Pacific Time on Thursday, December 6, 2018. This is to allow people in different time zones to have their questions addressed, if they can’t attend the virtual event during the hours that we will be online.

If there are follow-ups or additional questions that come up after the AMA, no problem! We’re happy to continue the dialogue afterwards.

Should you ask questions here instead of StackOverflow, GitHub Issues, or MSDN?

An AMA is a great place to ask us about anything more informally, get answers directly from the team, and have a live conversation with the folks that build these products.

This AMA also serves as an interactive forum for questions related to the product roadmap and more general questions on features or scenarios for Azure Integration Services. And yes, you can really ask us about anything.

We will continue to monitor StackOverflow, GitHub, and MSDN, per usual, to help you with issues you are facing.

Thank you and we look forward to "hearing" your questions at the AMA!



Modernize your Java Spring Boot application with Azure Database for MySQL

Programming, 03.12.2018 at 10:00 | Source: azure.microsoft.com

This blog post is co-authored by Parikshit Savjani, Senior Program Manager, Azure OSS Database service.

Spring is a well-known Java-based framework for building web and enterprise applications that address modern business needs. One of the advantages of using the Spring Boot framework is that it simplifies data access from relational and NoSQL data stores. The Spring Boot framework with a MySQL database backend is one of the established patterns for meeting the online transactional processing needs of business applications. Modern business applications are built and deployed on cloud-native microservice platforms such as Azure Kubernetes Service (AKS), moving away from traditional monolithic designs to meet elastic scale and portability needs. Databases, on the other hand, have more stateful requirements, with atomicity, consistency, durability, resiliency, and zero data loss across failures. It is therefore better suited to run databases outside of the Kubernetes environment, on managed database services such as Azure Database for MySQL, which meets these requirements.

Developers and customers can easily build and deploy their Java Spring Boot microservices applications on the Azure platform, improving developer productivity and enabling businesses to achieve more with the following solutions.

The following is a functional architecture sample of a Java Spring Boot microservices application called po-service on Azure. This Spring Boot application demonstrates how to build and deploy a purchase order microservice as a containerized application on Azure Kubernetes Service (AKS). The deployed microservice supports all CRUD operations on purchase orders.

Functional architecture sample of a Java Spring Boot microservices application called po-service

To integrate the microservices application running on Azure Kubernetes Service with the database running on the Azure Database for MySQL service, developers can use Open Service Broker for Azure together with the Kubernetes Service Catalog.

We have published detailed step-by-step instructions to build and deploy the above architecture in our GitHub repository. The overall goals of this step-by-step guide are:

  • To demonstrate the use of Open Service Broker for Azure to provision, deploy, and integrate Azure Database for MySQL from Azure Kubernetes Service seamlessly using the DevOps pipeline.
  • To demonstrate the use of Helm (CLI) for deploying containerized applications on Kubernetes (AKS). Helm is a package manager for Kubernetes and is a part of CNCF. Helm is used for managing Kubernetes packages called Charts.
  • To demonstrate how to secure a microservice (REST API) end-point using SSL/TLS (HTTPS transport) and expose it through the Ingress Controller addon on AKS.
  • To demonstrate the serverless container solution by deploying the microservice on Azure Container Instances (ACI).

Next steps

Get started with building and deploying your microservices application with managed database services on Azure today. You can leverage the instructions from the GitHub repository to build and deploy any microservices application on Azure Kubernetes service or Azure Container Instances (ACI). You can do this with databases running on fully managed Azure database services (PaaS) with end-to-end CI/CD platform running on Azure DevOps. We encourage the developer community to reuse it and raise issues or contribute back by sending a pull request.

Please continue to provide feedback on the features and functionality that you want to see next. If you need any help or have questions, please check out the Azure Database for MySQL documentation. Follow us on Twitter @AzureDBMySQL for the latest news and announcements.



Visual Studio App Center: What's new in the November release

Programming, 03.12.2018 at 10:00 | Source: microsoft.com
Visual Studio App Center arrives in November 2018 with several new features, largely based on community feedback. The most important new capabilities are: Unity SDK support: Visual Studio App Center can now be installed and managed directly in the Unity editor; you no longer have to navigate to GitHub to download and install the packages individually. iOS app exte...

Azure.Source – Volume 60

Programming, 03.12.2018 at 09:00 | Source: azure.microsoft.com

Now in preview

Simplifying security for serverless and web apps with Azure Functions and App Service

New security features for Azure App Service and Azure Functions reduce the amount of code you need to work with identities and secrets under management. Key Vault references for Application Settings, user-assigned managed identities, and managed identities for App Service on Linux/Web App for Containers are available in public preview. In addition, ClaimsPrincipal binding data for Azure Functions and support for Access-Control-Allow-Credentials in CORS configuration are now available. We're also continuing to invest in Azure Security Center as the primary hub for security across your Azure resources, as it offers a fantastic way to catch and resolve configuration vulnerabilities, limit your exposure to threats, and detect attacks so you can respond to them.

Screenshot of Key Vault references for Application Settings (now in public preview)

Python package (PyPI) support for Azure Artifacts now in preview

Python package functionality within Azure Artifacts for publishing and consuming Python packages using Azure DevOps Services is currently in public preview. Now you can: create feeds associated with your project to store your packages; upload Python packages to your feed using twine (flit support is being tested); pull packages from your feed using pip; integrate Python packages into your Azure Pipelines CI/CD using a task that simplifies authentication for you; and include packages from the public index in your feed (upstreams). A tutorial is available for using Azure Artifacts to consume and publish Python packages using Azure DevOps Services, including assigning licenses and setup.

Also in preview

Get the latest updates: In preview

Now generally available

General availability: Zone-redundant SQL databases and elastic pools in additional regions

Azure SQL Database Premium tier supports multiple redundant replicas for each database that are automatically provisioned in the same datacenter within a region. Zone-redundant SQL single databases and elastic pools are now generally available in two additional regions: West Europe and South-East Asia. The full list of supported regions includes France Central, Central US, West Europe, and South-East Asia. The zone-redundant configuration is available to SQL databases and elastic pools in the Premium and Business Critical service tiers.

News and updates

Announcing Azure Dedicated HSM availability

The Microsoft Azure Dedicated Hardware Security Module (HSM) service provides cryptographic key storage in Azure and meets the most stringent customer security and compliance requirements. This service is the ideal solution for customers requiring FIPS 140-2 Level 3 validated devices with complete and exclusive control of the HSM appliance. The Azure Dedicated HSM service uses SafeNet Luna Network HSM 7 devices from Gemalto. This device offers the highest levels of performance and cryptographic integration options and makes it simple for you to migrate HSM-protected applications to Azure. The Azure Dedicated HSM is leased on a single-tenant basis.

Premium Block Blob Storage - a new level of performance

Premium Block Blob Storage, which is currently in limited public preview, unlocks a new level of performance in public cloud object storage. It uses a combination of solid-state drives in our storage clusters and enhancements to our blob storage software to provide high throughput and very fast response times. This blog post takes a closer look at some of these performance enhancements, such as low and consistent latency that was demonstrated to be up to 40 times better than Standard Blob Storage.

Chart comparing latency between Premium and Standard Blob Storage

SQL Server on Azure Virtual Machines resource provider

This post announced a new Resource Provider called Microsoft.SqlVirtualMachine, a management service running internally on Azure clusters to handle SQL Server-specific configurations and deployments on Azure VMs. The SQL VM resource provider enables dynamic updates of SQL Server metadata and orchestrates multi-VM deployments required for SQL Server HADR architectures. The SQL VM resource provider also enables SQL Server-specific browse and monitoring experiences. It introduces three new resource types: Microsoft.SqlVirtualMachine/SqlVirtualMachine, Microsoft.SqlVirtualMachine/SqlVirtualMachineGroup, and Microsoft.SqlVirtualMachine/SqlVirtualMachineGroups/AvailabilityGroupListener.

Azure Hybrid Benefit for SQL Server on Azure Virtual Machines

Azure Hybrid Benefit (AHB) for SQL Server allows you to use on-premises licenses to run SQL Server on Azure Virtual Machines. If you have Software Assurance, you can use AHB when deploying a new SQL VM or activate SQL Server AHB for an existing SQL VM with a pay-as-you-go (PAYG) license. Now you can activate SQL Server AHB on an Azure VM with the SQL VM resource provider described in the post above. With the new Microsoft.SqlVirtualMachine resource provider you can manage SQL Server configurations on Azure VMs dynamically. Flexible SQL Server license type configuration is the first feature we are delivering with the SQL VM resource provider, and it enables instant and significant cost savings for SQL VMs.

The Green Team solves high-risk, systemic security issues for Microsoft Azure

The Assume Breach security strategy assumes security breaches will occur instead of focusing solely on preventing breaches. Since 2009, two groups within Microsoft have put this into practice: the Red Team (attackers) routinely attacks Azure to discover security holes, and the Blue Team (defenders) sets up honeypots and works to detect any attack. The Green Team consists of dedicated resources focusing on remediation and on solving classes of high-risk and systemic security vulnerabilities for the Azure platform. The Green Team works closely with the Red and Blue Teams to understand what high-risk, systemic security issues exist – specifically focusing on those that enable or lead to breaches – and, by performing root cause analysis, identifies and addresses these issues at scale. The team continuously implements the latest best practices to help secure the Azure platform and help protect customer data and workloads. Read this post to learn how the Green Team contributes to Microsoft's Assume Breach evolution while striving for Simply Secure.

Additional news and updates

Azure shows

Episode 256 - Living in a Serverless world | The Azure Podcast

Cynthia, Cale and Evan have a stirring discussion on the use-cases for Serverless computing and Azure Functions. They dive into scenarios when it is a good idea to use them and when it is not.

Azure Container Registry Tasks: Build and deploy to Azure App Service | Azure Friday (500th episode!)

Steve Lasker joins Scott Hanselman to talk about Azure Container Registry (ACR) Tasks and how you can build your container images in Azure for the three phases of development: pre-commit, team commits, and post-development for OS & Framework Patching.

Track my Pizza Cat van with Azure IoT solution accelerators | Internet of Things Show

Oh no! Pizza cat is having a hard time knowing if his pizzas are being delivered purr-fectly. Customers have been complaining about cold pizzas being delivered to the wrong houses! Come see how Pizza Cat uses a Remote Monitoring solution to save his Pizza company.

SmartHotel 360, a demo powered by Azure Digital Twins | Internet of Things Show

Here is an example of a smart hotel solution built on Azure Digital Twins. In this episode of the IoT Show, Lyrana Hughes shows how the core spatial intelligence capabilities of Azure Digital Twins power the Smart Hotel 360 demo and shares where you can access the demo content on GitHub so you can start building your own solution.

Introducing the Azure Blockchain Development Kit | Block Talk

In this episode we introduce the Azure Blockchain Development Kit, highlighting new samples that showcase three key themes: Connect – connect users, organizations, and devices to blockchain solutions, highlighting IoT, SMS, and bots; Integrate – integrate with existing legacy systems and protocols, highlighting legacy (FTP, flat file) and media; and Deploy – DevOps for blockchain using Azure DevOps and OSS tools for Truffle, highlighting dev, test, and build pipelines.

Getting started with Key Management Concepts | Block Talk

This video and demonstration provides a look at the core concepts around cryptographic key and key management, as well as how they apply to blockchain based technology. The topics covered include core key fundamentals (asymmetric) used by Ethereum and a demo showing the technical details around how they apply to blockchain.

Azure ML Data Prep GUI, It's Not Just About The Code | AI Show

While lots of people like to do their data prep in code, some tasks are faster and more easily done in a GUI. What's even better is a set of capabilities that work together, so you can pick and choose when and how to work in code and when to work in a GUI. This show demonstrates how we make Seth's life easier and faster in terms of data prep, allowing him to focus his nerdiness on modelling.

How to build a home automation auto-away assist with Azure IoT Hub | Azure Makers Series

Get more out of your home automation setup with Azure IoT Hub and Azure Functions. See how you can let your smart thermostat know when you’re in another room (not truly away) using motion sensors, Particle.io, and Azure.

Thumbnail from How to build a home automation auto-away assist with Azure IoT Hub from the Azure Makers Series on YouTube

How to edit an existing API Connection with Azure Logic Apps | Azure Tips and Tricks

Learn how to modify an existing API Connection with Azure Logic Apps. If you want to edit an existing API connection, all you have to do is simply type "API Connections" and select the "API Connections" menu item to get started.

Thumbnail from How to edit an existing API Connection with Azure Logic Apps from Azure Tips and Tricks on YouTube

Henry Been on Security with DevOps - Episode 012 | The Azure DevOps Podcast

Jeffrey discusses security in DevOps with his guest, Henry Been. Henry offers advice on how to implement security into your DevOps practice, makes recommendations on how to be more secure at each stage of the software development application lifecycle, highlights possible vulnerabilities that you might want to watch out for, and offers tools you can use to combat them and improve security in your DevOps environment.

Technical content

Running Cognitive Service containers

Recently, we announced a preview of Docker support for Microsoft Azure Cognitive Services with an initial set of containers ranging from Computer Vision and Face, to Text Analytics. This blog post focuses on trying things out, firing up a Cognitive Service container, and seeing what it can do using Docker Desktop. Later blog posts will explore using Azure Kubernetes Service and Azure Service Fabric.

Considering Azure Functions for a serverless data streaming scenario

An earlier blog post, A fast, serverless, big data pipeline powered by a single Azure Function, discussed a fraud detection solution delivered to a banking customer. This solution required complete processing of a streaming pipeline for telemetry data in real-time using a serverless architecture. This blog post describes the evaluation process and the decision to use Azure Functions, which is easy to configure and within minutes can be set up to consume massive volumes of telemetry data from Azure Event Hubs.

Diagram of the workflow that begins with data streaming into a single instance of Event Hubs, which is then consumed by a single Azure Function

Azure Cosmos DB and multi-tenant systems

Learn how to build a multi-tenant system on Azure Cosmos DB, which itself is a multi-tenant PaaS offering on Microsoft Azure. Building a multi-tenant system on another multi-tenant system can be challenging, but Azure provides us all the tools to make our task easy. A key actor in this solution is an Azure Managed Application, which enables you to offer cloud solutions that are easy for consumers to deploy and operate. In a managed application, the resources are provisioned in a resource group that is managed by the publisher of the app. The resource group is present in the consumer's subscription, but an identity in the publisher's tenant has access to the resource group in the customer subscription. The publisher application, which manages the customer data, is hosted in a different Azure Active Directory tenant and subscription, which is separate from that of the customer’s tenant and data.

Flow chart showing front-end service interaction with the customer subscription resources

Improving Azure Virtual Machine resiliency with predictive ML and live migration

Starting earlier this year, Azure has been using live migration in response to a variety of failure scenarios such as hardware faults, as well as regular fleet operations like rack maintenance and software/BIOS updates. Our initial use of live migration to handle failures gracefully allowed us to reduce the impact of failures on availability by 50 percent. We partnered with Microsoft Research (MSR) on building our ML models that predict failures with a high degree of accuracy before they occur. As a result, we’re able to live migrate workloads off “at-risk” machines before they ever show any signs of failing. Read this post to learn more about how this means VMs running on Azure can be more reliable than the underlying hardware.

Time series analysis in Azure Data Explorer

Azure Data Explorer (ADX) is a lightning-fast service optimized for data exploration. It gives users instant visibility into very large raw datasets in near real time to analyze performance, identify trends and anomalies, and diagnose problems. This blog post describes the basics of time series analysis in Azure Data Explorer over telemetry collected on an ongoing basis from cloud services or IoT devices. This data can be analyzed for insights such as monitoring service health, physical production processes, and usage trends. Analysis is done on time series of selected metrics to find deviations from their typical baseline patterns.

Screenshot of chart showing the Top 2 periodic decreasing web service traffic
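
Time series analysis in ADX is typically expressed in Kusto Query Language: raw telemetry is turned into regular series with make-series and then decomposed to detect anomalies against a baseline. The C# sketch below submits such a query through the Kusto client library; the cluster, database, table, and column names are placeholders, and the query is a generic example of the technique rather than one taken from the post.

```csharp
using System;
using Kusto.Data;
using Kusto.Data.Common;
using Kusto.Data.Net.Client;

class AdxTimeSeriesDemo
{
    static void Main()
    {
        // Assumption: placeholder cluster and database names; replace with your own.
        var connection = new KustoConnectionStringBuilder(
                "https://<yourcluster>.kusto.windows.net", "TelemetryDb")
            .WithAadUserPromptAuthentication();

        // Generic KQL pattern: bucket raw telemetry into regular 1-hour series,
        // then flag deviations from the decomposed baseline as anomalies.
        const string query = @"
Telemetry
| make-series avgValue = avg(Value) default = 0
    on Timestamp from ago(7d) to now() step 1h by ServiceId
| extend (anomalies, score, baseline) =
    series_decompose_anomalies(avgValue, 1.5, -1, 'linefit')";

        using (var provider = KustoClientFactory.CreateCslQueryProvider(connection))
        using (var reader = provider.ExecuteQuery("TelemetryDb", query, new ClientRequestProperties()))
        {
            while (reader.Read())
            {
                Console.WriteLine($"{reader["ServiceId"]}: anomaly flags {reader["anomalies"]}");
            }
        }
    }
}
```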

Additional technical content

Events

Microsoft Connect(); 2018

Save the date to tune in online tomorrow, Tuesday, December 4, 2018, for Microsoft Connect(); 2018, a full day of dev-focused delight that includes updates on Azure and Visual Studio, keynotes, demos, and real-time coding with experts. Whether you're just getting started or you've been around the blockchain, you'll find your people here. And it all happens online. Get comfortable, and get inspired.

Save the date for Microsoft Connect(); 2018

Join us on November 28 for our next meetup: Adopting Emerging Tech in Government

At the last Microsoft Azure Government DC meetup, we discussed the leading edge of emerging technology in government, including how agencies are approaching strategy, challenges, use cases, and workforce readiness as they leverage emerging tech such as blockchain, artificial intelligence, machine learning, and augmented reality to innovate for their missions. Check out the Microsoft Azure Government DC YouTube channel later this week for on-demand videos of this meetup and past ones.

Customers and partners

Customers are using Azure Stack to unlock new hybrid cloud innovation

We're seeing strong interest in and adoption of Azure Stack across a number of industries, including manufacturing, financial services, healthcare, and state and local government. This makes perfect sense, as these industries have some of the most stringent regulatory requirements, often require operations in areas with limited or no internet connectivity, and typically have some legacy applications. This post looks at a few ways our customers in these industries are using Azure Stack today to address these real-world challenges. Customers across many industries are realizing the benefits of a truly consistent hybrid cloud with Azure Stack.

Three reasons why Windows Server and SQL Server customers continue to choose Azure

For the past 25 years, companies of every size have trusted Windows Server and SQL Server to run their business-critical workloads. As more customers use the cloud for innovation and digital transformation, the first step is often migrating existing Windows Server and SQL Server applications and data to the cloud. This post looks at the three main reasons we hear for why customers choose to stay with Microsoft when they move to the cloud: you pay less with Azure, Azure delivers unmatched security and compliance, and Azure is the only consistent hybrid cloud.

Using AI and IoT for disaster management

Natural disasters driven by climate change and extreme weather, together with aging and poorly designed infrastructure, pose a significant risk to human life and communities. National, state, and local governments and organizations are grappling with how to update disaster management practices to keep up. In this blog post, learn how the Internet of Things (IoT), artificial intelligence (AI), and machine learning can help. Not every crisis is avoidable, but we now have the technology to predict and prevent catastrophes such as oil spills or building collapses. When unpredictable natural disasters do strike, responders can gain access to real-time data that gets aid where it is needed faster, reducing further loss of life.


Azure This Week – 30 November 2018 | A Cloud Guru

This time on Azure This Week, Lars talks about the on-premises version of Azure DevOps, now available as a Release Candidate. He also discusses the public preview that simplifies confidential computing in Azure IoT Edge, and explains how you can join the online Microsoft Connect(); event tomorrow.

Thumbnail from Azure This Week - 30 November 2018 by A Cloud Guru on YouTube



Reach more users: 4 tips for accessible apps and websites

Programmierung from 03.12.2018 at 01:00 | Source: thinkwithgoogle.com
According to studies, around 15% of the world's population has a disability. That is more than one billion people.

Mobile speed: These 6 German websites beat the user expectation of a 3-second load time

Programmierung from 03.12.2018 at 01:00 | Source: thinkwithgoogle.com
Mobile speed is and remains important: whether users even see a website, interact with it, or quickly leave again depends heavily on how fast it loads. Together with the Mobile Marketing Association (MMA) Germany, Google is publishing leaderboards that compare the mobile load times of well-known providers, now for the third time.

Over a coffee with Ingrid Hochwind and Stefan Hofmann: "Video is the preferred format for beauty inspiration"

Programmierung from 03.12.2018 at 01:00 | Source: thinkwithgoogle.com
In another edition of our "Over a coffee with" series, marketing specialists Ingrid Hochwind and Stefan Hofmann answer questions about the beauty sector during the holiday season. They not only explain what typical search and purchase behavior looks like, but also give retailers helpful tips on how to prepare optimally for this important time of year.

Surprising insights about mobile apps

Programmierung from 03.12.2018 at 01:00 | Source: thinkwithgoogle.com
Mobile apps have become an indispensable part of everyday life; they are used by 92% of smartphone owners. According to a new study by Ipsos, well-crafted advertising in mobile apps can reach decision-makers and encourage users to take action.

Finding the right keywords thanks to data-driven attribution

Programmierung from 03.12.2018 at 01:00 | Source: thinkwithgoogle.com
The customer journey is more complex than ever: customers often switch between online channels before completing a purchase. This also affects how the various touchpoints are measured. Attribution models such as last-click attribution have long been considered outdated. Advertisers gain a much more accurate picture of the customer journey when they instead use holistic models such as data-driven attribution, uncovering additional potential in areas that previously received too little attention (e.g. generic keywords or mobile clicks).

Functional Programming Fundamentals

Programmierung from 03.12.2018 at 00:59 | Source: youtube.com


Can I Has Grammar?

Programmierung from 02.12.2018 at 20:00 | Source: youtube.com


How to Crack the Product Manager Interview

Programmierung from 02.12.2018 at 01:16 | Source: youtube.com


What's New in CSS

Programmierung from 01.12.2018 at 21:02 | Source: youtube.com


Learning to Love Type Systems

Programmierung from 01.12.2018 at 19:35 | Source: youtube.com

