Practice Test Free

AI-100 Dump Free


AI-100 Dump Free – 50 Practice Questions to Sharpen Your Exam Readiness.

Looking for a reliable way to prepare for your AI-100 certification? Our AI-100 Dump Free includes 50 exam-style practice questions designed to reflect real test scenarios—helping you study smarter and pass with confidence.

Using an AI-100 dump free set of questions can give you an edge in your exam prep by helping you:

  • Understand the format and types of questions you’ll face
  • Pinpoint weak areas and focus your study efforts
  • Boost your confidence with realistic question practice

Below, you will find 50 free questions from our AI-100 Dump Free collection. These cover key topics and are structured to simulate the difficulty level of the real exam, making them a valuable tool for review or final prep.

Question 1

You are developing a Microsoft Bot Framework application. The application consumes structured NoSQL data that must be stored in the cloud.
You implement Azure Blob storage for the application. You want access to the blob store to be controlled by using a role.
You implement Azure Active Directory (Azure AD) integration on the storage account.
Does this action accomplish your objective?

A. Yes, it does

B. No, it does not

 


Suggested Answer: A

Azure Active Directory (Azure AD) integration for blobs and queues provides Azure role-based access control (Azure RBAC) over a client’s access to resources in a storage account.
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-auth

Question 2

Your company uses an internal blog to share news with employees.
You use the Translator Text API to translate the text in the blog from English to several other languages used by the employees.
Several employees report that the translations are often inaccurate.
You need to improve the accuracy of the translations.
What should you add to the translation solution?

A. Text Analytics

B. Language Understanding (LUIS)

C. Azure Media Services

D. Custom Translator

 


Suggested Answer: D

Custom Translator is a feature of the Microsoft Translator service. With Custom Translator, enterprises, app developers, and language service providers can build neural translation systems that understand the terminology used in their own business and industry. The customized translation system will then seamlessly integrate into existing applications, workflows and websites.
Custom Translator allows users to customize Microsoft Translator’s advanced neural machine translation for Translator’s supported neural translation languages.
Custom Translator can be used for customizing text when using the Microsoft Translator Text API, and speech translation when using the Microsoft Speech services.
References:
https://www.microsoft.com/en-us/translator/business/customization/

Question 3

You are developing a Microsoft Bot Framework application. The application consumes structured NoSQL data that must be stored in the cloud.
You implement Azure Blob storage for the application. You want access to the blob store to be controlled by using a role.
You implement Shared Access Signatures (SAS) on the storage account.
Does this action accomplish your objective?

A. Yes, it does

B. No, it does not

 


Suggested Answer: B

Shared access signatures (SAS) provide limited delegated access to resources in a storage account. Adding constraints on the time interval for which the signature is valid or on permissions it grants provides flexibility in managing access.
Use Azure Active Directory (Azure AD) instead.
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-auth
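For context on why SAS differs from role-based access: a SAS token's `sig` parameter is an HMAC-SHA256 signature computed with the storage account key over a string-to-sign, so anyone holding the account key can mint one. A minimal sketch using only Python's standard library (the string-to-sign below is simplified and hypothetical, not the full format the storage service defines):

```python
import base64
import hashlib
import hmac

def sign_sas_string(string_to_sign: str, account_key_b64: str) -> str:
    """Produce the HMAC-SHA256 signature placed in a SAS token's `sig` field.

    The storage account key is base64-encoded, and the resulting signature
    is base64-encoded as well.
    """
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")

# Hypothetical inputs: a fake account key and a simplified string-to-sign
# (permissions, start time, expiry time, canonicalized resource).
fake_key = base64.b64encode(b"not-a-real-account-key").decode("utf-8")
string_to_sign = "r\n2024-01-01T00:00:00Z\n2024-01-02T00:00:00Z\n/blob/myaccount/mycontainer"
print(sign_sas_string(string_to_sign, fake_key))
```

Because the signature is derived from a shared key rather than from an identity, a SAS cannot express per-user roles, which is why Azure AD integration is the right answer above.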

Question 4

You have an existing Language Understanding (LUIS) model for an internal bot.
You need to recommend a solution to add a meeting reminder functionality to the bot by using a prebuilt model. The solution must minimize the size of the model.
Which component of LUIS should you recommend?

A. domain

B. intents

C. entities

 


Suggested Answer: C

LUIS includes a set of prebuilt entities for recognizing common types of information, like dates, times, numbers, measurements, and currency. Prebuilt entity support varies by the culture of your LUIS app.
Note: LUIS provides three types of prebuilt models. Each model can be added to your app at any time.
Model types and what each includes:
✑ Domain: Intents, utterances, entities
✑ Intents: Intents, utterances
✑ Entities: Entities only
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-concept-prebuilt-model

Question 5

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You create several API models in Azure Machine Learning Studio.
You deploy the models to a production environment.
You need to monitor the compute performance of the models.
Solution: You create environment files.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: B

You need to enable Model data collection.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-enable-data-collection

Question 6

You have deployed several Azure IoT Edge devices for an AI solution. The Azure IoT Edge devices generate measurement data from temperature sensors.
You need a solution to process the sensor data. Your solution must be able to write configuration changes back to the devices.
You make use of Microsoft Azure IoT Hub.
Does this action accomplish your objective?

A. Yes, it does

B. No, it does not

 


Suggested Answer: A

Reference:
https://azure.microsoft.com/en-us/resources/samples/functions-js-iot-hub-processing/

Question 7

You have an AI application that uses keys in Azure Key Vault.
Recently, a key used by the application was deleted accidentally and was unrecoverable.
You need to ensure that if a key is deleted, it is retained in the key vault for 90 days.
Which two features should you configure? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. The expiration date on the keys

B. Soft delete

C. Purge protection

D. Auditors

E. The activation date on the keys

 


Suggested Answer: BC

Soft delete retains deleted keys for a retention period (90 days by default) so they can be recovered, and purge protection prevents a deleted key from being permanently purged before that retention period ends.
References:
https://docs.microsoft.com/en-us/azure/key-vault/general/soft-delete-overview
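The retention arithmetic is simple but worth making explicit: with soft delete enabled, a deleted key remains recoverable until the deletion date plus the retention period, and purge protection blocks manual purges before that date. A small sketch (the 90-day value mirrors the requirement in the question):

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 90  # soft-delete retention configured on the vault

def scheduled_purge_date(deleted_on: datetime, retention_days: int = RETENTION_DAYS) -> datetime:
    # With soft delete enabled, a deleted key is recoverable until this date;
    # purge protection additionally blocks manual purges before it.
    return deleted_on + timedelta(days=retention_days)

print(scheduled_purge_date(datetime(2024, 1, 15)))  # → 2024-04-14 00:00:00
```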

Question 8

You are designing an AI application that will perform real-time processing by using Microsoft Azure Stream Analytics.
You need to identify the valid outputs of a Stream Analytics job.
What are three possible outputs? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

A. A Hive table in Azure HDInsight

B. Azure SQL Database

C. Azure Cosmos DB

D. Azure Blob storage

E. Azure Redis Cache

 


Suggested Answer: BCD

References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-define-outputs

Question 9

HOTSPOT -
You plan to create an intelligent bot to handle internal user chats to the help desk of your company. The bot has the following requirements:
✑ Must be able to interpret what a user means.
✑ Must be able to perform multiple tasks for a user.
✑ Must be able to answer questions from an existing knowledge base.
 Image
You need to recommend which solutions meet the requirements.
Which solution should you recommend for each requirement? To answer, drag the appropriate solutions to the correct requirements. Each solution may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Hot Area:
 Image

 


Suggested Answer:
Correct Answer Image

Box 1: The Language Understanding (LUIS) service
Language Understanding (LUIS) is a cloud-based API service that applies custom machine-learning intelligence to a user’s conversational, natural language text to predict overall meaning, and pull out relevant, detailed information.
Box 2: Text Analytics API –
The Text Analytics API is a cloud-based service that provides advanced natural language processing over raw text, and includes four main functions: sentiment analysis, key phrase extraction, named entity recognition, and language detection.
Box 3: The QnA Maker service –
QnA Maker is a cloud-based Natural Language Processing (NLP) service that easily creates a natural conversational layer over your data. It can be used to find the most appropriate answer for any given natural language input, from your custom knowledge base (KB) of information.
Incorrect Answers:
Dispatch tool library:
If a bot uses multiple LUIS models and QnA Maker knowledge bases (knowledge bases), you can use Dispatch tool to determine which LUIS model or QnA Maker knowledge base best matches the user input. The dispatch tool does this by creating a single LUIS app to route user input to the correct model.
Reference:
https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-tutorial-dispatch
https://docs.microsoft.com/en-us/azure/cognitive-services/qnamaker/overview/overview

Question 10

You are implementing the Language Understanding (LUIS) API and are building a GDPR-compliant bot by using the Bot Framework.
You need to recommend a solution to ensure that the implementation of LUIS is GDPR-compliant.
What should you include in the recommendation?

A. Enable active learning for the bot.

B. Configure the bot to send the active learning preference of a user.

C. Delete the utterances from Review endpoint utterances.

 


Suggested Answer: C

Deleting personal data from the device or service and can be used to support your obligations under the GDPR.
References:
https://docs.microsoft.com/bs-latn-ba/azure/cognitive-services/luis/luis-user-privacy

Question 11

HOTSPOT -
Your company is building a cinema chatbot by using the Bot Framework and Language Understanding (LUIS).
You are designing the intents and entities for LUIS.
The following are utterances that customers might provide:
✑ Which movies are playing on December 8?
✑ What time is the performance of Movie1?
✑ I would like to purchase two adult tickets in the balcony section for Movie2.
You need to identify which entity types to use. The solution must minimize development effort.
Which entry type should you use for each entity? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
 Image

 


Suggested Answer:
Correct Answer Image

Box 1: Prebuilt –
Datetime is prebuilt.
Language Understanding (LUIS) provides prebuilt entities. When a prebuilt entity is included in your application, LUIS includes the corresponding entity prediction in the endpoint response.
Box 2: Simple –
Box 3: Composite –
A composite entity is made up of other entities, such as prebuilt entities, simple, regular expression, and list entities. The separate entities form a whole entity.
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-reference-prebuilt-entities
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/reference-entity-composite

Question 12

You are developing an app for a conference provider. The app will use speech-to-text to provide transcription at a conference in English. It will also use the
Translator Text API to translate the transcripts to the language preferred by the conference attendees.
You test the translation features on the app and discover that the translations are fairly poor.
You want to improve the quality of the translations.
Which of the following actions should you take?

A. Use Text Analytics to perform the translations.

B. Use the Language Understanding (LUIS) API to perform the translations.

C. Perform the translations by training a custom model using Custom Translator.

D. Use the Computer Vision API to perform the translations.

 


Suggested Answer: C

Custom Translator is a feature of the Microsoft Translator service. With Custom Translator, enterprises, app developers, and language service providers can build neural translation systems that understand the terminology used in their own business and industry. The customized translation system will then seamlessly integrate into existing applications, workflows and websites.
Custom Translator allows users to customize Microsoft Translator’s advanced neural machine translation for Translator’s supported neural translation languages.
Custom Translator can be used for customizing text when using the Microsoft Translator Text API, and speech translation using the Microsoft Speech services.
Reference:
https://www.microsoft.com/en-us/translator/business/customization/

Question 13

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an app named App1 that uses the Face API.
App1 contains several PersonGroup objects.
You discover that a PersonGroup object for an individual named Ben Smith cannot accept additional entries. The PersonGroup object for Ben Smith contains 10,000 entries.
You need to ensure that additional entries can be added to the PersonGroup object for Ben Smith. The solution must ensure that Ben Smith can be identified by all the entries.
Solution: You modify the custom time interval for the training phase of App1.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: B

Instead, use a LargePersonGroup. LargePersonGroup and LargeFaceList are collectively referred to as large-scale operations. LargePersonGroup can contain up to 1 million persons, each with a maximum of 248 faces. LargeFaceList can contain up to 1 million faces. The large-scale operations are similar to the conventional PersonGroup and FaceList but have some differences because of the new architecture.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/face/face-api-how-to-topics/how-to-use-large-scale

Question 14

You are using Azure Cognitive Services to create an interactive AI application that will be deployed for a world-wide audience.
You want the app to support multiple languages, including English, French, Spanish, Portuguese, and German.
Which of the following actions should you take?

A. Make use of Text Analytics.

B. Make use of Content Moderator.

C. Make use of QnA Maker.

D. Make use of Language API.

 


Suggested Answer: A

The Language Detection feature of the Azure Text Analytics REST API evaluates text input for each document and returns language identifiers with a score that indicates the strength of the analysis.
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/how-tos/text-analytics-how-to-language-detection
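As an illustration of the shape of such a call, the sketch below builds a v3-style language-detection request body and reads the detected language from a response in the documented shape. No network call is made; the sample response is hardcoded here for demonstration:

```python
import json

def build_language_detection_payload(texts):
    # Request body shape used by the Text Analytics /languages endpoint (v3).
    return {"documents": [{"id": str(i + 1), "text": t} for i, t in enumerate(texts)]}

# A sample response in the documented v3 shape (hardcoded; nothing is sent).
sample_response = json.loads("""
{"documents": [
  {"id": "1", "detectedLanguage": {"name": "French", "iso6391Name": "fr", "confidenceScore": 1.0}}
]}
""")

def best_language(doc):
    # Each document carries a single detectedLanguage with a confidence score.
    return doc["detectedLanguage"]["iso6391Name"]

payload = build_language_detection_payload(["Bonjour tout le monde"])
print(payload["documents"][0]["text"])                  # Bonjour tout le monde
print(best_language(sample_response["documents"][0]))   # fr
```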

Question 15

You are designing a solution that will integrate the Bing Web Search API and will return a JSON response. The development team at your company uses C# as its primary development language.
You provide developers with the Bing endpoint.
Which additional component do the developers need to prepare and to retrieve data by using an API call?

A. the subscription ID

B. the API key

C. a query

D. the resource group ID

 


Suggested Answer: C

The Bing Web Search SDK makes it easy to integrate Bing Web Search into your C# application. You instantiate a client, send a request, and receive a response.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/bing-web-search/web-search-sdk-quickstart
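To make the moving parts concrete, here is a hedged sketch of assembling such a request by hand: the endpoint and key are placeholders, the query goes in the `q` parameter, and the key travels in the `Ocp-Apim-Subscription-Key` header. The request is built but never sent:

```python
from urllib.parse import urlencode

# Hypothetical endpoint and key; no HTTP request is actually made here.
endpoint = "https://api.bing.microsoft.com/v7.0/search"
subscription_key = "<your-api-key>"

def build_search_request(query: str, market: str = "en-US"):
    # The query string carries the search terms; the key goes in a header.
    params = urlencode({"q": query, "mkt": market})
    headers = {"Ocp-Apim-Subscription-Key": subscription_key}
    return f"{endpoint}?{params}", headers

url, headers = build_search_request("Azure Cognitive Services")
print(url)
```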

Question 16

You deploy an application that performs sentiment analysis on the data stored in Azure Cosmos DB.
Recently, you loaded a large amount of data to the database. The data was for a customer named Contoso, Ltd.
You discover that queries for the Contoso data are slow to complete, and the queries slow the entire application.
You need to reduce the amount of time it takes for the queries to complete. The solution must minimize costs.
What should you do? More than one answer choice may achieve the goal. (Choose two.)

A. Change the request units.

B. Change the partitioning strategy.

C. Change the transaction isolation level.

D. Migrate the data to the Cosmos DB database.

 


Suggested Answer: AB

Increasing request units would improve throughput, but at a cost.
Throughput provisioned for a container is divided evenly among physical partitions.
References:
https://docs.microsoft.com/en-us/azure/architecture/best-practices/data-partitioning
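To see why the partitioning strategy matters, the toy sketch below hashes partition-key values into a small number of buckets (Cosmos DB's real hashing is internal; MD5 is only a stand-in here). Keying by customer alone funnels every Contoso document into one hot partition, while a more granular key spreads the load:

```python
import hashlib
from collections import Counter

def physical_partition(partition_key: str, partition_count: int = 4) -> int:
    # Simplified stand-in for Cosmos DB's internal hash partitioning,
    # used only to illustrate how keys map to partitions.
    digest = hashlib.md5(partition_key.encode()).digest()
    return digest[0] % partition_count

docs = [("Contoso", f"order-{i}") for i in range(8)]

# Keying by customer name: every Contoso document hashes to the same bucket.
by_customer = Counter(physical_partition(c) for c, _ in docs)
# A more granular synthetic key (customer + order id) distributes the load.
by_customer_order = Counter(physical_partition(f"{c}:{o}") for c, o in docs)
print(by_customer)
print(by_customer_order)
```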

Question 17

You have an Azure Machine Learning experiment.
You need to validate that the experiment meets GDPR regulation requirements and stores documentation about the experiment.
What should you use?

A. Compliance Manager

B. an Azure Log Analytics workspace

C. Azure Table storage

D. Azure Security Center

 


Suggested Answer: A

Compliance Manager for Azure helps you assess and manage GDPR compliance. Compliance Manager is a free, Microsoft cloud services solution designed to help organizations meet complex compliance obligations, including the GDPR, ISO 27001, ISO 27018, and NIST 800-53. Generally available today for Azure customers, the Compliance Manager GDPR dashboard enables you to assign, track, and record your GDPR compliance activities so you can collaborate across teams and manage your documents for creating audit reports more easily.
References:
https://azure.microsoft.com/en-us/blog/new-capabilities-to-enable-robust-gdpr-compliance/

Question 18

DRAG DROP -
You need to design the workflow for an Azure Machine Learning solution. The solution must meet the following requirements:
✑ Retrieve data from file shares, Microsoft SQL Server databases, and Oracle databases that are in an on-premises network.
✑ Use an Apache Spark job to process data stored in an Azure SQL Data Warehouse database.
Which service should you use to meet each requirement? To answer, drag the appropriate services to the correct requirements. Each service may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:
 Image

 


Suggested Answer:
Correct Answer Image

References:
https://docs.microsoft.com/en-us/azure/machine-learning/studio/use-data-from-an-on-premises-sql-server
https://docs.microsoft.com/en-in/azure/azure-databricks/what-is-azure-databricks

Question 19

You are designing an AI solution that will provide feedback to teachers who train students over the Internet. The students will be in classrooms located in remote areas. The solution will capture video and audio data of the students in the classrooms.
You need to recommend Azure Cognitive Services for the AI solution to meet the following requirements:
✑ Alert teachers if a student facial expression indicates the student is angry or scared.
✑ Identify each student in the classrooms for attendance purposes.
✑ Allow the teachers to log voice conversations as text.
Which Cognitive Services should you recommend?

A. Face API and Text Analytics

B. Computer Vision and Text Analytics

C. QnA Maker and Computer Vision

D. Speech to Text and Face API

 


Suggested Answer: D

Speech-to-text from Azure Speech Services enables real-time transcription of audio streams into text that your applications, tools, or devices can consume, display, and take action on as command input.
Face detection: Detect one or more human faces in an image and get back face rectangles for where in the image the faces are, along with face attributes which contain machine learning-based predictions of facial features. The face attribute features available are: Age, Emotion, Gender, Pose, Smile, and Facial Hair along with 27 landmarks for each face in the image.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/speech-to-text
https://azure.microsoft.com/en-us/services/cognitive-services/face/

Question 20

You are designing an AI system for your company. Your system will consume several Apache Kafka data streams.
You want your system to be able to process the data streams at scale and in real-time.
Which of the following actions should you take?

A. Make use of Azure HDInsight with Apache HBase

B. Make use of Azure HDInsight with Apache Spark

C. Make use of Azure HDInsight with Apache Storm

D. Make use of Azure HDInsight with Microsoft Machine Learning Server

 


Suggested Answer: C

Apache Storm is a distributed, fault-tolerant, open-source computation system. You can use Storm to process streams of data in real time with Apache Hadoop.
Storm solutions can also provide guaranteed processing of data, with the ability to replay data that wasn’t successfully processed the first time.
Reference:
https://docs.microsoft.com/en-us/azure/hdinsight/hdinsight-streaming-at-scale-overview
https://docs.microsoft.com/en-us/azure/hdinsight/storm/apache-storm-overview

Question 21

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are deploying an Azure Machine Learning model to an Azure Kubernetes Service (AKS) container.
You need to monitor the scoring accuracy of each run of the model.
Solution: You configure Azure Monitor for containers.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: B

 

Question 22

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have Azure IoT Edge devices that generate streaming data.
On the devices, you need to detect anomalies in the data by using Azure Machine Learning models. Once an anomaly is detected, the devices must add information about the anomaly to the Azure IoT Hub stream.
Solution: You deploy Azure Stream Analytics as an IoT Edge module.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: A

Available in both the cloud and Azure IoT Edge, Azure Stream Analytics offers built-in machine learning based anomaly detection capabilities that can be used to monitor the two most commonly occurring anomalies: temporary and persistent.
Stream Analytics supports user-defined functions, via REST API, that call out to Azure Machine Learning endpoints.
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-machine-learning-anomaly-detection

Question 23

Your company has a data team of Transact-SQL experts.
You plan to ingest data from multiple sources into Azure Event Hubs.
You need to recommend which technology the data team should use to move and query data from Event Hubs to Azure Storage. The solution must leverage the data team's existing skills.
What is the best recommendation to achieve the goal? More than one answer choice may achieve the goal.

A. Azure Notification Hubs

B. Azure Event Grid

C. Apache Kafka streams

D. Azure Stream Analytics

 


Suggested Answer: B

Event Hubs Capture is the easiest way to automatically deliver streamed data in Event Hubs to an Azure Blob storage or Azure Data Lake store. You can subsequently process and deliver the data to any other storage destinations of your choice, such as SQL Data Warehouse or Cosmos DB.
You can capture data from your event hub into a SQL data warehouse by using an Azure Function triggered by Event Grid.
Example:
Reference Image
First, you create an event hub with the Capture feature enabled and set an Azure blob storage as the destination. Data generated by WindTurbineGenerator is streamed into the event hub and is automatically captured into Azure Storage as Avro files.
Next, you create an Azure Event Grid subscription with the Event Hubs namespace as its source and the Azure Function endpoint as its destination.
Whenever a new Avro file is delivered to the Azure Storage blob by the Event Hubs Capture feature, Event Grid notifies the Azure Function with the blob URI. The Function then migrates data from the blob to a SQL data warehouse.
References:
https://docs.microsoft.com/en-us/azure/event-hubs/store-captured-data-data-warehouse

Question 24

You have a Bing Search service that is used to query a product catalog.
You need to identify the following information:
✑ The locale of the query
✑ The top 50 query strings
✑ The number of calls to the service
✑ The top geographical regions of the service
What should you implement?

A. Bing Statistics

B. Azure API Management (APIM)

C. Azure Monitor

D. Azure Application Insights

 


Suggested Answer: A

The Bing Statistics add-in provides metrics such as call volume, top queries, API response, code distribution, and market distribution. The rich slicing-and-dicing capability lets you gather deeper understanding of your users and their usage to inform your business strategy.
References:
https://www.bingapistatistics.com/

Question 25

You are developing an app that will perform the following tasks:
Evaluate trends in blue chip stock prices over the past decade.
Identify unusual fluctuations in stock prices.
Produce visual representations of the data.
You need to determine which Azure Cognitive Services APIs would be suitable for the app.
Which of the following actions should you take?

A. Make use of the Anomaly Detector API

B. Make use of the Computer Vision API

C. Make use of the QnA Maker API

D. Make use of the Custom Vision API

 


Suggested Answer: A

The Anomaly Detector API enables you to monitor and detect abnormalities in your time series data with machine learning. The Anomaly Detector API adapts by automatically identifying and applying the best-fitting models to your data, regardless of industry, scenario, or data volume. Using your time series data, the API determines boundaries for anomaly detection, expected values, and which data points are anomalies.
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/anomaly-detector/overview
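The core idea can be illustrated with a toy rolling z-score detector, which is a deliberately simplified stand-in for the API's automatic model selection, not how the service actually works internally:

```python
from statistics import mean, stdev

def detect_anomalies(series, window=5, threshold=3.0):
    """Flag indices whose value deviates from the trailing window by more
    than `threshold` standard deviations."""
    anomalies = []
    for i in range(window, len(series)):
        past = series[i - window:i]
        mu, sigma = mean(past), stdev(past)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# A stable price series with one sudden spike at index 6.
prices = [100, 101, 100, 102, 101, 100, 150, 101, 100, 102]
print(detect_anomalies(prices))  # → [6]
```

The real service adds seasonality handling, expected-value bands, and model selection on top of this basic deviation-from-recent-history idea.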

Question 26

Your company plans to create a mobile app that will be used by employees to query the employee handbook.
You need to ensure that the employees can query the handbook by typing or by using speech.
Which core component should you use for the app?

A. Language Understanding (LUIS)

B. QnA Maker

C. Text Analytics

D. Azure Search

 


Suggested Answer: D

Azure Cognitive Search (formerly known as “Azure Search”) is a search-as-a-service cloud solution that gives developers APIs and tools for adding a rich search experience over private, heterogeneous content in web, mobile, and enterprise applications. Your code or a tool invokes data ingestion (indexing) to create and load an index. Optionally, you can add cognitive skills to apply AI processes during indexing. Doing so can add new information and structures useful for search and other scenarios.
Incorrect Answers:
B: QnA Maker is a cloud-based API service that lets you create a conversational question-and-answer layer over your existing data. Use it to build a knowledge base by extracting questions and answers from your semi-structured content, including FAQs, manuals, and documents. Answer users’ questions with the best answers from the QnAs in your knowledge base, automatically.
References:
https://docs.microsoft.com/en-us/azure/search/search-what-is-azure-search

Question 27

You have Azure IoT Edge devices that generate measurement data from temperature sensors. The data changes very slowly.
You need to analyze the data in a temporal two-minute window. If the temperature rises five degrees above a limit, an alert must be raised. The solution must minimize the development of custom code.
What should you use?

A. A Machine Learning model as a web service

B. an Azure Machine Learning model as an IoT Edge module

C. Azure Stream Analytics as an IoT Edge module

D. Azure Functions as an IoT Edge module

 


Suggested Answer: C

References:
https://docs.microsoft.com/en-us/azure/iot-edge/tutorial-deploy-stream-analytics
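To make the windowing logic concrete, here is a plain-Python sketch of the rule a Stream Analytics TumblingWindow query would express declaratively; the 30-degree limit and the sample readings are hypothetical values:

```python
from collections import defaultdict

LIMIT = 30.0          # hypothetical temperature limit
WINDOW_SECONDS = 120  # the temporal two-minute window from the question

def tumbling_window_alerts(readings, limit=LIMIT, window=WINDOW_SECONDS):
    """Group (timestamp_seconds, temperature) readings into non-overlapping
    tumbling windows and return the start time of every window whose max
    temperature rises five degrees above the limit."""
    windows = defaultdict(list)
    for ts, temp in readings:
        windows[ts // window].append(temp)
    return [w * window for w, temps in sorted(windows.items())
            if max(temps) > limit + 5]

readings = [(0, 28.0), (60, 29.5), (130, 36.2), (170, 29.0), (250, 30.1)]
print(tumbling_window_alerts(readings))  # → [120]
```

In Stream Analytics this is a few lines of SQL-like query over a TumblingWindow, which is why it minimizes custom code compared with the other options.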

Question 28

You are developing an AI application for your company. The application that uses batch processing to analyze data in JSON and PDF documents.
You want to store the JSON and PDF documents in Azure. You want to ensure data persistence while keeping costs at a minimum.
Which of the following actions should you take?

A. Make use of Azure Blob storage

B. Make use of Azure Cosmos DB

C. Make use of Azure Databricks

D. Make use of Azure Table storage

 


Suggested Answer: A

The following technologies are recommended choices for batch processing solutions in Azure.
Azure Storage Blob Containers. Many existing Azure business processes already use Azure blob storage, making this a good choice for a big data store.
Azure Data Lake Store. Azure Data Lake Store offers virtually unlimited storage for any size of file, and extensive security options, making it a good choice for extremely large-scale big data solutions that require a centralized store for data in heterogeneous formats.
Reference:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/big-data/batch-processing
https://docs.microsoft.com/bs-latn-ba/azure/storage/blobs/storage-blobs-introduction

Question 29

Your company plans to monitor Twitter hashtags and then build a graph of connected people and places that contains the associated sentiment.
The monitored hashtags use several languages, but the graph will be displayed in English.
You need to recommend the required Azure Cognitive Services endpoints for the planned graph.
Which Cognitive Services endpoints should you recommend?

A. Language Detection, Content Moderator, and Key Phrase Extraction

B. Translator Text, Content Moderator, and Key Phrase Extraction

C. Language Detection, Sentiment Analysis, and Key Phrase Extraction

D. Translator Text, Sentiment Analysis, and Named Entity Recognition

 


Suggested Answer: C

Sentiment analysis, which is also called opinion mining, uses social media analytics tools to determine attitudes toward a product or idea.
Translator Text: Translate text in real time across more than 60 languages, powered by the latest innovations in machine translation.
The Key Phrase Extraction skill evaluates unstructured text, and for each record, returns a list of key phrases. This skill uses the machine learning models provided by Text Analytics in Cognitive Services.
This capability is useful if you need to quickly identify the main talking points in the record. For example, given input text “The food was delicious and there were wonderful staff”, the service returns “food” and “wonderful staff”.
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/how-tos/text-analytics-how-to-entity-linking
https://docs.microsoft.com/en-us/azure/search/cognitive-search-skill-keyphrases
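To illustrate, the Text Analytics v3 endpoints used in such a pipeline (language detection, sentiment, key phrases) all accept the same `documents` request body shape. This sketch only builds the payload; the endpoint URL and subscription key are left out:

```python
import json

def build_documents_payload(texts, language=None):
    """Shape the request body shared by the Text Analytics v3 routes
    (languages, sentiment, keyPhrases):
    {"documents": [{"id": ..., "text": ..., "language": ...}]}.
    The language field is optional for the language-detection route.
    """
    docs = []
    for i, text in enumerate(texts, start=1):
        doc = {"id": str(i), "text": text}
        if language:
            doc["language"] = language
        docs.append(doc)
    return {"documents": docs}

# Example text taken from the explanation above.
payload = build_documents_payload(
    ["The food was delicious and there were wonderful staff"], language="en")
body = json.dumps(payload)
```

In the planned graph workflow, the same documents would be posted first to language detection, then (with the detected language attached) to sentiment analysis and key phrase extraction.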

Question 30

Your company has an on-premises datacenter.
You plan to publish an app that will recognize a set of individuals by using the Face API. The model is trained.
You need to ensure that all images are processed in the on-premises datacenter.
What should you deploy to host the Face API?

A. a Docker container

B. Azure File Sync

C. Azure Application Gateway

D. Azure Data Box Edge

 


Suggested Answer: A

A container is a standard unit of software that packages up code and all its dependencies so the application runs quickly and reliably from one computing environment to another. A Docker container image is a lightweight, standalone, executable package of software that includes everything needed to run an application: code, runtime, system tools, system libraries and settings.
Incorrect Answers:
D: Azure Data Box Edge is an AI-enabled edge computing device with network data transfer capabilities.
Data Box Edge is a Hardware-as-a-service solution. Microsoft ships you a cloud-managed device with a built-in Field Programmable Gate Array (FPGA) that enables accelerated AI-inferencing and has all the capabilities of a storage gateway.
References:
https://www.docker.com/resources/what-container
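As a rough illustration, Cognitive Services containers are started with the standard `Eula`, `Billing`, and `ApiKey` settings so that usage is metered against your Azure resource even though images are processed locally. The image path and placeholders below are indicative only; check the container documentation for the exact registry name and tag:

```shell
# Sketch only: run a Cognitive Services container on-premises.
# <your-region> and <your-api-key> come from your Azure Face resource.
docker run --rm -it -p 5000:5000 --memory 4g --cpus 1 \
    mcr.microsoft.com/azure-cognitive-services/face \
    Eula=accept \
    Billing=https://<your-region>.api.cognitive.microsoft.com/face/v1.0 \
    ApiKey=<your-api-key>
```

The app then calls the Face endpoint at `http://localhost:5000` instead of the cloud endpoint, keeping all image processing inside the datacenter.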

Question 31

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have Azure IoT Edge devices that generate streaming data.
On the devices, you need to detect anomalies in the data by using Azure Machine Learning models. Once an anomaly is detected, the devices must add information about the anomaly to the Azure IoT Hub stream.
Solution: You deploy Azure Functions as an IoT Edge module.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: B

Instead, use Azure Stream Analytics together with a REST API call to an Azure Machine Learning endpoint.
Note: Available in both the cloud and Azure IoT Edge, Azure Stream Analytics offers built-in machine learning based anomaly detection capabilities that can be used to monitor the two most commonly occurring anomalies: temporary and persistent.
Stream Analytics supports user-defined functions, via REST API, that call out to Azure Machine Learning endpoints.
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-machine-learning-anomaly-detection
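For intuition, here is a toy stand-in (not the Stream Analytics function itself) for the kind of spike detection it provides: a reading that sits far outside the recent window's mean gets flagged as an anomaly:

```python
from collections import deque
from statistics import mean, stdev

def spike_detector(window_size=10, threshold=3.0):
    """Return a closure that flags a reading more than `threshold`
    standard deviations away from the mean of the recent window.
    A conceptual illustration only, not the production algorithm."""
    window = deque(maxlen=window_size)

    def check(value):
        is_spike = False
        if len(window) >= 3:  # need a few readings before judging
            mu, sigma = mean(window), stdev(window)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                is_spike = True
        window.append(value)
        return is_spike

    return check

detect = spike_detector()
readings = [20.1, 20.3, 19.9, 20.2, 20.0, 45.0, 20.1]
flags = [detect(r) for r in readings]   # only the 45.0 reading is flagged
```

On an IoT Edge device, the flagged events would then be written back into the IoT Hub stream.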

Question 32

You are designing an AI solution that will analyze millions of pictures by using Azure HDInsight Hadoop cluster.
You need to recommend a solution for storing the pictures. The solution must minimize costs.
Which storage solution should you recommend?

A. Azure Table storage

B. Azure File Storage

C. Azure Data Lake Storage Gen2

D. Azure Data Lake Storage Gen1

 


Suggested Answer: D

Azure Data Lake Storage Gen1 is adequate for this scenario and less expensive than Gen2.
References:
https://visualbi.com/blogs/microsoft/introduction-azure-data-lake-gen2/

Question 33

You are designing an AI solution that will analyze millions of pictures.
You need to recommend a solution for storing the pictures. The solution must minimize costs.
Which storage solution should you recommend?

A. an Azure Data Lake store

B. Azure File Storage

C. Azure Blob storage

D. Azure Table storage

 


Suggested Answer: C

Data Lake storage would be slightly more expensive, although the two are in a close price range. Blob storage has more pricing options depending on factors such as how frequently you need to access your data (cool vs. hot access tiers).
Reference:
http://blog.pragmaticworks.com/azure-data-lake-vs-azure-blob-storage-in-data-warehousing

Question 34

You design an AI workflow that combines data from multiple data sources for analysis. The data sources are composed of:
✑ JSON files uploaded to an Azure Storage account
✑ On-premises Oracle databases
✑ Azure SQL databases
Which service should you use to ingest the data?

A. Azure Data Factory

B. Azure SQL Data Warehouse

C. Azure Data Lake Storage

D. Azure Databricks

 


Suggested Answer: A

References:
https://docs.microsoft.com/en-us/azure/data-factory/introduction

Question 35

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You need to create an IoT solution that performs the following tasks:
✑ Identifies hazards
✑ Provides a real-time online dashboard
✑ Takes images of an area every minute
✑ Counts the number of people in an area every minute
Solution: You configure the IoT devices to send the images to an Azure IoT hub, and then you configure an Azure Automation call to Azure Cognitive Services that sends the results to an Azure event hub. You configure Microsoft Power BI to connect to the event hub by using Azure Stream Analytics.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: B

Instead, use Cognitive Services containers on the IoT Edge devices.
References:
https://azure.microsoft.com/es-es/blog/running-cognitive-services-on-iot-edge/
https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-live-data-visualization-in-power-bi

Question 36

You have a database that contains sales data.
You plan to process the sales data by using two data streams named Stream1 and Stream2. Stream1 will be used for purchase order data. Stream2 will be used for reference data.
The reference data is stored in CSV files.
You need to recommend an ingestion solution for each data stream.
Which two solutions should you recommend? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

A. an Azure event hub for Stream1 and Azure Blob storage for Stream2

B. Azure Blob storage for Stream1 and Stream2

C. an Azure event hub for Stream1 and Stream2

D. Azure Blob storage for Stream1 and Azure Cosmos DB for Stream2

E. Azure Cosmos DB for Stream1 and an Azure event hub for Stream2

 


Suggested Answer: AB

Stream1: an Azure event hub; Stream2: Azure Blob storage.
Azure Event Hubs is a highly scalable data streaming platform and event ingestion service, capable of receiving and processing millions of events per second.
Event Hubs can process and store events, data, or telemetry produced by distributed software and devices. Data sent to an event hub can be transformed and stored using any real-time analytics provider or batching/storage adapters. Event Hubs provides publish-subscribe capabilities with low latency at massive scale, which makes it appropriate for big data scenarios.
Stream1 and Stream2: Azure Blob storage.
Stream Analytics has first-class integration with Azure data streams as inputs from three kinds of resources: Azure Event Hubs, Azure IoT Hub, and Azure Blob storage.
These input resources can live in the same Azure subscription as your Stream Analytics job or a different subscription.
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/real-time-ingestion

Question 37

You are designing an AI workflow that performs data analysis from multiple data sources. The data sources consist of JSON files that have been uploaded to an
Azure Storage account, on-premises Oracle databases, and Azure SQL databases.
Which service should you recommend to ingest the data?

A. Azure Data Factory

B. Azure Kubernetes Service (AKS)

C. Azure Bot Service

D. Azure Databricks

 


Suggested Answer: A

Reference:
https://docs.microsoft.com/en-us/azure/data-factory/introduction

Question 38

You are configuring data persistence for a Microsoft Bot Framework application. The application requires a structured NoSQL cloud data store.
You need to identify a storage solution for the application. The solution must minimize costs.
What should you identify?

A. Azure Blob storage

B. Azure Cosmos DB

C. Azure HDInsight

D. Azure Table storage

 


Suggested Answer: D

Table storage is a NoSQL key-value store for rapid development using massive semi-structured datasets.
You can develop applications on Cosmos DB using popular NoSQL APIs.
The two services target different scenarios and pricing models.
While Azure Table storage is aimed at high capacity in a single region (with an optional read-only secondary region but no failover), indexing by PartitionKey/RowKey, and storage-optimized pricing, the Azure Cosmos DB Table API aims for high throughput (single-digit millisecond latency), global distribution (multiple failover regions), SLA-backed predictable performance with automatic indexing of every attribute/property, and a pricing model focused on throughput.
References:
https://db-engines.com/en/system/Microsoft+Azure+Cosmos+DB%3BMicrosoft+Azure+Table+Storage
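Concretely, every Table storage entity is addressed by the pair `PartitionKey`/`RowKey`, which together form its unique primary key; all remaining properties are schemaless. A minimal sketch (the key values here are invented):

```python
def make_entity(partition_key: str, row_key: str, **properties):
    """Build a Table storage entity: PartitionKey and RowKey are
    mandatory and uniquely identify the row; everything else is a
    schemaless property bag."""
    entity = {"PartitionKey": partition_key, "RowKey": row_key}
    entity.update(properties)
    return entity

# e.g. persisting bot conversation state keyed by user and conversation
entity = make_entity("user-42", "conversation-2023-06-01",
                     Channel="webchat", TurnCount=7)
```

Partitioning bot state by user keeps related rows together while letting the store scale out across partitions, which fits the low-cost structured NoSQL requirement.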

Question 39

HOTSPOT -
You have an app that uses the Language Understanding (LUIS) API as shown in the following exhibit.
 Image
Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic.
NOTE: Each correct selection is worth one point.
Hot Area:
 Image

 


Suggested Answer:
Correct Answer Image

Box 1: train –
Utterances are input from the user that your app needs to interpret. To train LUIS to extract intents and entities from them, it’s important to capture a variety of different example utterances for each intent. Active learning, or the process of continuing to train on new utterances, is essential to machine-learned intelligence that LUIS provides.
Box 2: creating intents –
Each intent needs to have example utterances, at least 15. If you have an intent that does not have any example utterances, you will not be able to train LUIS. If you have an intent with one or very few example utterances, LUIS will not accurately predict the intent.
Box 3: never published –
In each iteration of the model, do not add a large quantity of utterances. Add utterances in quantities of 15. Train, publish, and test again.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-concept-utterance
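For reference, a labeled example utterance as the LUIS authoring API (v2.0 "add example" operation) expects it has roughly this shape; the intent and entity names below are hypothetical:

```python
def example_utterance(text: str, intent: str, entities=()):
    """Build a labeled example utterance in the shape used by the LUIS
    v2.0 authoring 'add example' operation. `entities` is a sequence of
    (entityName, phrase) pairs, where each phrase must occur in `text`.
    Field names follow that API version; verify against current docs."""
    return {
        "text": text,
        "intentName": intent,
        "entityLabels": [
            {
                "entityName": name,
                # character offsets of the labeled span, end index inclusive
                "startCharIndex": text.index(phrase),
                "endCharIndex": text.index(phrase) + len(phrase) - 1,
            }
            for name, phrase in entities
        ],
    }

ex = example_utterance("Book a flight to Paris", "BookFlight",
                       entities=[("Destination", "Paris")])
```

Adding utterances in batches of around 15 per intent, then training and publishing again, matches the iteration advice in the explanation above.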

Question 40

DRAG DROP -
You plan to use the Microsoft Bot Framework to develop bots that will be deployed by using the Azure Bot Service.
You need to configure the Azure Bot Service to support the following types of bots:
✑ Bots that use Azure Functions
✑ Bots that set a timer
Which template should you use for each bot type? To answer, drag the appropriate templates to the correct bot types. Each template may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:
 Image

 


Suggested Answer:
Correct Answer Image

References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-service-concept-templates?view=azure-bot-service-3.0

Question 41

You are developing an app that will analyze sensitive data from global users.
Your app must adhere to the following compliance policies:
The app must not store data in the cloud.
The app must not use cloud services to process the data.
Which of the following actions should you take?

A. Make use of Azure Machine Learning Studio

B. Make use of Docker containers for the Text Analytics

C. Make use of a Text Analytics container deployed to Azure Kubernetes Service

D. Make use of Microsoft Machine Learning (MML) for Apache Spark

 


Suggested Answer: D

The Microsoft Machine Learning Library for Apache Spark (MMLSpark) assists in provisioning scalable machine learning models for large datasets, especially for building deep learning problems. MMLSpark works with SparkML pipelines, including Microsoft CNTK and the OpenCV library, which provide end-to-end support for the ingress and processing of image input data, categorization of images, and text analytics using pre-trained deep learning algorithms.
Reference:
https://subscription.packtpub.com/book/big_data_and_business_intelligence/9781789131956/10/ch10lvl1sec61/an-overview-of-the-microsoft-machine-learning-
library-for-apache-spark-mmlspark

Question 42

Your company is developing an AI solution that will identify inappropriate text in multiple languages.
You need to implement a Cognitive Services API that meets this requirement.
You use Language Understanding (LUIS) to identify inappropriate text.
Does this action accomplish your objective?

A. Yes, it does

B. No, it does not

 


Suggested Answer: B

Language Understanding (LUIS) is designed to identify valuable information in conversations, LUIS interprets user goals (intents) and distills valuable information from sentences (entities), for a high quality, nuanced language model. LUIS integrates seamlessly with the Azure Bot Service, making it easy to create a sophisticated bot.
Use Content Moderator instead.
Reference:
https://www.luis.ai/home

https://docs.microsoft.com/bs-latn-ba/azure/cognitive-services/content-moderator/overview

Question 43

Your company's developers have created an Azure Data Factory pipeline that moves data from an on-premises server to Azure Storage. The pipeline consumes
Azure Cognitive Services APIs.
You need to deploy the pipeline. Your solution must minimize custom code.
You use Integration Runtime to move data to the cloud and Azure API Management to consume Cognitive Services APIs.
Does this action accomplish your objective?

A. Yes, it does

B. No, it does not

 


Suggested Answer: B

Azure API Management is a turnkey solution for publishing APIs to external and internal customers.
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/concepts-integration-runtime
https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-examples-and-scenarios

Question 44

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have Azure IoT Edge devices that generate streaming data.
On the devices, you need to detect anomalies in the data by using Azure Machine Learning models. Once an anomaly is detected, the devices must add information about the anomaly to the Azure IoT Hub stream.
Solution: You deploy an Azure Machine Learning model as an IoT Edge module.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: A

You can use IoT Edge modules to deploy code that implements your business logic directly to your IoT Edge devices. For example, you can deploy an Azure
Machine Learning module that predicts when a device fails based on simulated machine temperature data.
References:
https://docs.microsoft.com/bs-latn-ba/azure/iot-edge/tutorial-deploy-machine-learning

Question 45

You have deployed 1,000 sensors for an AI application that you are developing. The sensors generate large amounts of data that is ingested on an hourly basis.
You want your application to analyze the data generated by the sensors in real-time.
Which of the following actions should you take?

A. Make use of Azure Kubernetes Service (AKS)

B. Make use of Azure Cosmos DB

C. Make use of an Azure HDInsight Hadoop cluster

D. Make use of Azure Data Factory

 


Suggested Answer: C

Azure HDInsight makes it easy, fast, and cost-effective to process massive amounts of data.
You can use HDInsight to process streaming data that’s received in real time from a variety of devices.
Reference:
https://docs.microsoft.com/en-us/azure/hdinsight/hadoop/apache-hadoop-introduction

Question 46

You design an AI solution that uses an Azure Stream Analytics job to process data from an Azure IoT hub. The IoT hub receives time series data from thousands of IoT devices at a factory.
The job outputs millions of messages per second. Different applications consume the messages as they are available. The messages must be purged.
You need to choose an output type for the job.
What is the best output type to achieve the goal? More than one answer choice may achieve the goal.

A. Azure Event Hubs

B. Azure SQL Database

C. Azure Blob storage

D. Azure Cosmos DB

 


Suggested Answer: D

Stream Analytics can target Azure Cosmos DB for JSON output, enabling data archiving and low-latency queries on unstructured JSON data.
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-documentdb-output
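One way Cosmos DB satisfies the purge requirement is per-item time-to-live: when TTL is enabled on the container, an item carrying a `ttl` property (a value in seconds) is deleted automatically once it elapses. A sketch of such a message document (field names other than `id` and `ttl` are invented):

```python
import time
import uuid

def make_message(payload: dict, ttl_seconds: int = 3600):
    """Build a Cosmos DB item that expires automatically.

    Cosmos DB honors the per-item 'ttl' property (in seconds) when
    time-to-live is enabled on the container, so consumed messages
    are purged without any explicit delete call.
    """
    return {
        "id": str(uuid.uuid4()),          # Cosmos DB requires an 'id'
        "deviceTime": int(time.time()),   # ingestion timestamp (epoch s)
        "ttl": ttl_seconds,               # purged one hour after last write
        **payload,
    }

msg = make_message({"sensorId": "line-3-temp", "value": 71.4})
```

The Stream Analytics job writes items of this shape to the container, consumers query them while they exist, and the TTL handles the purge.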

Question 47

You plan to build an application that will perform predictive analytics. Users will be able to consume the application data by using Microsoft Power BI or a custom website.
You need to ensure that you can audit application usage.
Which auditing solution should you use?

A. Azure Storage Analytics

B. Azure Application Insights

C. Azure diagnostics logs

D. Azure Active Directory (Azure AD) reporting

 


Suggested Answer: D

References:
https://docs.microsoft.com/en-us/azure/active-directory/reports-monitoring/concept-audit-logs

Question 48

Your company's marketing department is creating a social media campaign that will allow users to submit video messages for the company's social media sites.
You are developing an AI app for the campaign. Your app must meet the following requirements:
Add captions to the video messages before they are posted to the social media sites.
Ensure that no negative video messages are posted to the social media sites.
Which of the following actions should you take?

A. Implement Form Recognizer in your app.

B. Implement the Face API in your app.

C. Implement Custom Vision in your app.

D. Implement Video Indexer in your app.

 


Suggested Answer: D

Video Indexer includes Audio transcription: Converts speech to text in 12 languages and allows extensions. Supported languages include English, Spanish,
French, German, Italian, Mandarin Chinese, Japanese, Arabic, Russian, Portuguese, Hindi, and Korean.
When indexing by one channel, partial results for some models are available, such as sentiment analysis, which identifies positive, negative, and neutral sentiment from speech and visual text.
Reference:
https://docs.microsoft.com/en-us/azure/media-services/video-indexer/video-indexer-overview

Question 49

You plan to deploy an AI solution that tracks the behavior of 10 custom mobile apps. Each mobile app has several thousand users.
You need to recommend a solution for real-time data ingestion for the data originating from the mobile app users.
Which Microsoft Azure service should you include in the recommendation?

A. Azure Event Hubs

B. Azure Service Bus queues

C. Azure Service Bus topics and subscriptions

D. Apache Storm on Azure HDInsight

 


Suggested Answer: A

References:
https://docs.microsoft.com/en-in/azure/event-hubs/event-hubs-about

Question 50

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing an application that uses an Azure Kubernetes Service (AKS) cluster.
You are troubleshooting a node issue.
You need to connect to an AKS node by using SSH.
Solution: You add an SSH key to the node, and then you create an SSH connection.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: A

By default, SSH keys are generated when you create an AKS cluster. If you did not specify your own SSH keys when you created your AKS cluster, add your public SSH keys to the AKS nodes.
You also need to create an SSH connection to the AKS node.
References:
https://docs.microsoft.com/en-us/azure/aks/ssh
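A sketch of the documented flow, assuming an RSA key pair and the Azure CLI; the resource and node names are placeholders, and exact commands vary by cluster configuration and CLI version:

```shell
# 1. Create an SSH key pair if you don't already have one.
ssh-keygen -t rsa -b 2048 -f ~/.ssh/id_rsa

# 2. Add the public key to the cluster's underlying node VM
#    (nodes live in the cluster's managed node resource group).
az vm user update \
    --resource-group <node-resource-group> \
    --name <node-name> \
    --username azureuser \
    --ssh-key-value ~/.ssh/id_rsa.pub

# 3. Open an SSH session to the node's private IP
#    (typically via a helper pod or jump host inside the cluster network).
ssh azureuser@<node-private-ip>
```

Because AKS nodes are not exposed publicly, step 3 is normally run from inside the cluster's virtual network.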

Access Full AI-100 Dump Free

Looking for even more practice questions? Click here to access the complete AI-100 Dump Free collection, offering hundreds of questions across all exam objectives.

We regularly update our content to ensure accuracy and relevance—so be sure to check back for new material.

Begin your certification journey today with our AI-100 dump free questions — and get one step closer to exam success!
