100% Real Microsoft 70-243 Exam Questions & Answers, Accurate & Verified By IT Experts
Instant Download, Free Fast Updates, 99.6% Pass Rate
Microsoft 70-243 Practice Test Questions in VCE Format
| File | Votes | Size | Date |
|---|---|---|---|
| Microsoft.braindumps2go.70-243.v2015-08-03.by.JohnTheMan.103q.vce | 12 | 3.11 MB | Aug 11, 2015 |
| Microsoft.Actualtests.70-243.v2014-11-13.by.Alvin.79q.vce | 17 | 4.23 MB | Nov 13, 2014 |
| Microsoft.Exactquestions.70-243.v2014-09-30.by.KATHERINA.80q.vce | 5 | 4.21 MB | Sep 30, 2014 |
| Microsoft.Passguide.70-243.v2013-06-24.by.Rajeev.80q.vce | 427 | 4.18 MB | Jun 25, 2013 |
Archived VCE files
Microsoft 70-243 Practice Test Questions, Exam Dumps
Microsoft 70-243 (Administering and Deploying System Center 2012 Configuration Manager (SCCM)) exam dumps, practice test questions, study guide & video training course to help you study and pass quickly and easily. You need the Avanset VCE Exam Simulator in order to open the Microsoft 70-243 exam dumps & practice test questions in VCE format.
The 70-243 Exam, formally known as Administering and Deploying System Center 2012 Configuration Manager, was a benchmark certification for IT professionals. It validated the skills required to manage devices and users within an enterprise environment. While this specific exam is retired, the underlying principles and technologies have evolved into what is now Microsoft Endpoint Configuration Manager. Understanding the concepts from the 70-243 Exam provides a powerful foundation for modern endpoint management, making this knowledge incredibly relevant for anyone working with large-scale device administration today. This series will explore the core competencies once tested by this exam.
The primary purpose of Configuration Manager is to provide a unified management console for a wide array of administrative tasks. This includes deploying operating systems, distributing software applications, managing software updates, and monitoring the health and compliance of devices across a network. For the 70-243 Exam, a deep understanding of this centralized control philosophy was essential. The platform allows administrators to automate repetitive tasks, enforce corporate policies, and maintain a secure and efficient IT infrastructure. This capability reduces manual effort, minimizes errors, and ensures that all managed devices adhere to organizational standards.
A significant focus of the curriculum for the 70-243 Exam was on increasing IT efficiency and productivity. By automating software deployments and updates, administrators could ensure that thousands of machines received necessary patches and applications without manual intervention on each device. This not only saves an immense amount of time but also improves the security posture of the organization by closing vulnerabilities promptly. Furthermore, its detailed inventory and reporting features give organizations clear visibility into their hardware and software assets, which is crucial for license compliance, budgeting, and future planning. These benefits remain central to its modern successor.
The knowledge required for the 70-243 Exam encompassed a broad spectrum of skills, from initial infrastructure design and deployment to the daily operational tasks of managing clients and resources. It required candidates to think strategically about how to structure the environment for optimal performance and scalability, whether for a single office or a global enterprise with numerous locations. This included planning site hierarchies, configuring network boundaries, and establishing roles for different administrative functions. This strategic planning is a skill that transcends the specific software version and is a hallmark of a senior administrator.
Finally, while the 70-243 Exam was specific to the 2012 version of System Center Configuration Manager, the evolution to Microsoft Endpoint Configuration Manager has been an iterative one. Many of the core features and architectural concepts have been carried forward and enhanced. Therefore, studying the domains of the original exam is not an exercise in outdated technology but rather a deep dive into the foundational architecture that powers modern endpoint management. It equips professionals with the context and understanding to master current tools more effectively, recognizing the lineage of features like collections, deployments, and task sequences.
A fundamental component of the 70-243 Exam was the ability to design and plan a Configuration Manager hierarchy. This process begins with understanding the different types of sites and their roles. The top-level site is the Central Administration Site (CAS), which is used to manage and oversee multiple primary sites in a very large, geographically dispersed organization. Below the CAS are Primary Sites, which are where clients are directly assigned and managed. Lastly, Secondary Sites are used at remote locations with limited bandwidth to control the flow of network traffic back to a parent primary site.
Choosing the correct hierarchy is critical for performance and scalability. For most organizations, a standalone primary site is sufficient to manage all devices. A CAS is only necessary for very large-scale deployments, typically exceeding 100,000 clients, and adds complexity to the infrastructure. The 70-243 Exam required candidates to evaluate an organization's needs, including the number of clients, geographical distribution, and administrative model, to recommend the most appropriate design. A poorly designed hierarchy can lead to slow data replication, administrative bottlenecks, and difficulties in managing clients effectively.
Another key design element is the configuration of boundaries and boundary groups. A boundary represents a network location, such as an IP subnet, Active Directory site, or IP address range. Boundaries are then organized into boundary groups. These groups are used to control client content location and site assignment. For example, when a client needs to download an application, it will look for a distribution point that is associated with its current boundary group. Properly configured boundaries ensure that clients download content from the nearest server, which minimizes network traffic over slow WAN links.
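The content-location behavior described above can be sketched as a small simulation. This is purely illustrative, not the actual ConfigMgr client logic; the boundary group names, subnets, and distribution point server names are invented:

```python
# Conceptual sketch: a client looks up the distribution points that
# serve its current network boundary, so content stays on the local LAN.
# Boundary subnets and DP names below are illustrative assumptions.
import ipaddress

# Boundary groups map network locations to the site systems that serve them.
BOUNDARY_GROUPS = [
    {"name": "Branch-Office", "subnet": "10.20.0.0/16",
     "distribution_points": ["DP-BRANCH01"]},
    {"name": "Head-Office", "subnet": "10.10.0.0/16",
     "distribution_points": ["DP-HQ01", "DP-HQ02"]},
]

def content_locations(client_ip):
    """Return the distribution points for the client's boundary group."""
    ip = ipaddress.ip_address(client_ip)
    for group in BOUNDARY_GROUPS:
        if ip in ipaddress.ip_network(group["subnet"]):
            return group["distribution_points"]
    return []  # no match (real clients can use fallback boundary groups)

print(content_locations("10.20.5.17"))  # a branch client stays on its local DP
```

A client at a branch office resolves only its local distribution point, so application content never crosses the WAN link to headquarters.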
The 70-243 Exam also covered the planning for site system roles. Configuration Manager is not a single application but a collection of services running on various servers, known as site system roles. Key roles include the Management Point, which is the primary point of communication for clients, and the Distribution Point, which stores the content for applications and software updates. Other important roles include the Software Update Point, which integrates with Windows Server Update Services (WSUS) to manage patches, and the Reporting Services Point for generating reports. Placing these roles on appropriate servers is crucial for load balancing and fault tolerance.
Finally, planning for security was a major topic. This includes defining administrative users and assigning them to security roles. Configuration Manager uses role-based administration (RBAC) to grant permissions. This allows you to define who can perform certain actions, like creating applications, and which objects they can see, such as a specific collection of devices. For instance, you could create a role for a help desk team that only allows them to initiate remote control sessions for desktops in their assigned office. This granular control, a key topic in the 70-243 Exam, is essential for maintaining a secure and well-managed environment.
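The essence of role-based administration is that a permission check combines a security role (what actions are allowed) with a scope (which objects are visible). The sketch below is an assumed, simplified model, not the ConfigMgr security engine; the role and collection names are invented:

```python
# Conceptual sketch of role-based administration: authorization requires
# both the action (from the role) and the object scope (the collection).
# Role names, actions, and collection names are illustrative.
ROLES = {
    "HelpDesk-Remote": {
        "actions": {"remote_control"},          # what they may do
        "collections": {"Chicago Desktops"},    # what they may see
    },
    "App-Admin": {
        "actions": {"create_application", "deploy_application"},
        "collections": {"All Workstations"},
    },
}

def is_authorized(role, action, collection):
    r = ROLES.get(role)
    return bool(r) and action in r["actions"] and collection in r["collections"]

# Help desk can remote into their own office's desktops...
print(is_authorized("HelpDesk-Remote", "remote_control", "Chicago Desktops"))
# ...but cannot create applications, even within their scope.
print(is_authorized("HelpDesk-Remote", "create_application", "Chicago Desktops"))
```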
Once the design phase is complete, the next step tested in the 70-243 Exam was the actual deployment of the Configuration Manager sites. This involves preparing the prerequisite infrastructure, which includes setting up Windows Servers with the required roles and features, such as the .NET Framework and specific IIS configurations. It also requires a supported version of SQL Server to host the site database. The installation process itself is wizard-driven but requires careful input of details determined during the planning phase, such as site codes, site names, and the installation directory. A successful installation is the first step toward a healthy environment.
After the initial site installation, the focus shifts to configuring the core components. This includes setting up discovery methods. Discovery is the process by which Configuration Manager identifies resources, such as users, groups, and computers, from the network. Common methods include Active Directory Forest Discovery, which can discover site and subnet information, and Active Directory System Discovery, which finds computer resources. Configuring discovery correctly is vital because you cannot manage a resource that has not been discovered. The 70-243 Exam emphasized configuring these methods to run on a schedule to keep the resource information up to date.
Following discovery, the configuration of boundaries and boundary groups is a critical task. As planned in the design phase, you would create boundaries based on your network segments, such as IP subnets or Active Directory sites. These boundaries are then added to boundary groups. Within each boundary group, you must configure the relationships, assigning specific site systems like management points and distribution points to serve clients within those network locations. This ensures that clients communicate with the correct servers and download content from the closest source, optimizing network traffic and client performance.
The deployment of site system roles is another practical skill covered in the 70-243 Exam. Using the Configuration Manager console, an administrator adds roles to designated servers. For example, to enable software distribution, you would install the Distribution Point role on one or more servers. To manage software updates, you would install the Software Update Point role. Each role has its own set of prerequisites and configuration options that must be managed. Proper placement and configuration of these roles are essential for the functionality of the entire system, ensuring services like application deployment and patching work as expected.
Finally, configuring intersite data replication is essential in a multi-site hierarchy. If you have a Central Administration Site and multiple primary sites, or primary sites with secondary sites, you need to manage how data is transferred between them. Configuration Manager uses a mix of file-based and database replication to keep all sites synchronized. While much of this is configured automatically, an administrator must monitor the replication links for health and troubleshoot any issues that arise. The 70-243 Exam required an understanding of these replication mechanisms to ensure a distributed environment remains consistent and functional.
A core function of Configuration Manager, and a major topic of the 70-243 Exam, is the deployment and management of the client agent. To manage a device, the Configuration Manager client software must be installed on it. There are several methods for deploying the client. Client Push Installation is a popular method where the site server automatically pushes the client to discovered computers. Other methods include using a Group Policy Object (GPO) in Active Directory, manual installation by a technician, or including the client as part of an operating system deployment task sequence.
Once the client is installed, it must be monitored to ensure it remains healthy. The 70-243 Exam covered the tools and techniques for monitoring client health. The console provides a dashboard that shows the status of clients, identifying those that are inactive or have failed their health checks. Configuration Manager can also be configured to automatically remediate common client issues, such as a stopped service. Maintaining a high percentage of healthy, active clients is crucial for the effectiveness of all other functions, from software deployment to security compliance reporting. Without a healthy client, the device cannot be managed.
Organizing discovered resources is another essential administrative task. This is accomplished through the use of collections. A collection is a grouping of users or devices. Collections can be created based on a wide range of criteria using queries. For example, you could create a collection of all computers with less than 8GB of RAM, or a collection of all users in the Finance department. These collections are the targets for deployments. When you deploy an application or a software update, you deploy it to a collection, not to individual devices. Mastery of the query language (WQL) was a key skill for the 70-243 Exam.
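The "computers with less than 8 GB of RAM" example above maps directly to a WQL query rule. In the inventory schema, `SMS_R_System` is the discovered-system class and `SMS_G_System_X86_PC_MEMORY.TotalPhysicalMemory` is reported in kilobytes, so 8 GB is 8,388,608. The Python around the query is only a toy evaluation against mock inventory rows to show how a query rule defines membership:

```python
# A WQL membership rule of the kind the 70-243 Exam expected:
# join discovered systems to their hardware inventory and filter on RAM.
LOW_MEMORY_WQL = """
select SMS_R_System.ResourceId, SMS_R_System.Name
from SMS_R_System
inner join SMS_G_System_X86_PC_MEMORY
    on SMS_G_System_X86_PC_MEMORY.ResourceId = SMS_R_System.ResourceId
where SMS_G_System_X86_PC_MEMORY.TotalPhysicalMemory < 8388608
"""

# Toy evaluation of the same predicate against mock inventory data,
# illustrating that a collection is just the set of rows the rule matches.
inventory = [
    {"Name": "PC-001", "TotalPhysicalMemoryKB": 4_194_304},   # 4 GB
    {"Name": "PC-002", "TotalPhysicalMemoryKB": 16_777_216},  # 16 GB
]
members = [d["Name"] for d in inventory
           if d["TotalPhysicalMemoryKB"] < 8 * 1024 * 1024]
print(members)  # ['PC-001']
```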
Power management is an often-overlooked but important capability. Configuration Manager allows administrators to apply power plans to collections of computers. This can help an organization reduce energy costs by ensuring that computers are put into a low-power state or shut down when not in use. You can define peak and non-peak business hours and apply different power settings for each. For the 70-243 Exam, candidates were expected to know how to configure these settings and report on the power savings achieved, demonstrating the platform's ability to contribute to business goals beyond just IT management.
Remote Control is a feature that allows administrators to connect to and take control of a client computer's desktop. This is invaluable for troubleshooting user issues and providing remote support. The 70-243 Exam covered the configuration of Remote Control settings, including security and privacy options. For example, you can require the user's permission before starting a remote session, and you can configure which administrators or support staff are permitted to initiate these connections. This provides a secure and integrated tool for help desk personnel, eliminating the need for separate remote assistance software.
A foundational capability tested in the 70-243 Exam is hardware and software inventory. Configuration Manager can collect detailed information about the hardware components of every managed device. This includes data about the CPU, memory, disk space, network adapters, and much more. This information is collected by the client agent on a configurable schedule and sent to the site server, where it is stored in the database. Administrators can then use this data to run reports, create collections based on specific hardware attributes, and plan for hardware upgrades.
Software inventory provides a detailed catalog of the software installed on client computers. The client agent can scan for installed applications by looking at registry information or by scanning for specific file headers. This allows administrators to track software installations across the enterprise. The data gathered is essential for managing software license compliance, identifying unauthorized software installations, and planning for application upgrades or retirements. The 70-243 Exam required a thorough understanding of how to configure the scope and schedule of these inventory cycles to balance data freshness with network and client performance impact.
Beyond simple inventory, Asset Intelligence adds another layer of data classification. Asset Intelligence takes the raw software inventory data and normalizes it against a Microsoft-maintained catalog. This catalog contains information about millions of software titles, categorizing them by product name, vendor, version, and category (e.g., "Web Browser" or "Productivity"). This process cleans up inconsistent software titles and provides a much clearer picture of the software landscape. For example, it can group various update and patch versions under a single parent product, simplifying reporting.
Using Asset Intelligence, administrators can gain deeper insights. They can easily report on the number of deployed licenses for major software products like Microsoft Office or Adobe Acrobat. It also helps in tracking license usage. The 70-243 Exam curriculum included topics on how to use Asset Intelligence reports to reconcile purchased software licenses against installed software. This is a critical function for internal audits and ensuring the organization is not over-licensed, which wastes money, or under-licensed, which carries legal and financial risks.
Finally, both hardware and software inventory data are the backbone of the query and reporting system. The ability to create custom reports was a key skill for the 70-243 Exam. An administrator might be asked to generate a report of all machines that are ready for a Windows upgrade, based on criteria like available disk space and memory. Or they might need to create a collection of all devices running an outdated version of a specific application so that it can be targeted for an upgrade. This ability to query the vast amount of collected data transforms Configuration Manager from a deployment tool into a powerful strategic asset for IT planning.
The model for application management in System Center 2012 Configuration Manager was a significant evolution and a central topic of the 70-243 Exam. The platform introduced a new, more intelligent application model while still supporting the traditional package model. The application model is state-based and focuses on user-centric delivery. Instead of just running a command line, it allows an administrator to define an application with multiple deployment types. For example, a single application can have a Windows Installer (MSI) deployment type for desktops and a different App-V package for virtual environments.
This new model is highly flexible. The system can automatically choose the best deployment type for a given device based on rules and requirements. An administrator can set rules such as requiring a minimum amount of RAM, a specific operating system, or even the absence of another conflicting application. This intelligence ensures that software is only installed on devices that meet the necessary criteria, reducing installation failures and support calls. The 70-243 Exam required a deep understanding of how to create and configure these requirement rules to ensure successful and targeted application delivery across a diverse enterprise environment.
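How the client picks among deployment types can be sketched as a rule evaluation: the first deployment type whose requirement rules all pass on the device wins. This is an assumed simplification of the real evaluation order; the deployment type names and requirement values are invented:

```python
# Sketch: requirement rules gate which deployment type (if any) applies.
# Names and thresholds are illustrative, not from a real application.
DEPLOYMENT_TYPES = [
    {"name": "MSI - Full Install",
     "requirements": {"min_ram_gb": 8, "os": "Windows 10"}},
    {"name": "App-V - Virtual",
     "requirements": {"min_ram_gb": 2, "os": "Windows 10"}},
]

def select_deployment_type(device):
    for dt in DEPLOYMENT_TYPES:
        req = dt["requirements"]
        if device["ram_gb"] >= req["min_ram_gb"] and device["os"] == req["os"]:
            return dt["name"]
    return None  # no requirements met: nothing is installed, no failure logged

print(select_deployment_type({"ram_gb": 4, "os": "Windows 10"}))   # App-V - Virtual
print(select_deployment_type({"ram_gb": 16, "os": "Windows 10"}))  # MSI - Full Install
```

A low-memory device silently falls through to the virtualized deployment type, which is exactly the "reduce installation failures" behavior the model was designed for.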
Another key concept is detection methods. After an installation is attempted, Configuration Manager needs to know if it was successful. A detection method is a rule that verifies the presence of the application. This could be checking for the existence of a specific MSI product code, a file version, or a registry key. If the detection method rule is met, the application is reported as successfully installed. This state-based approach means the system constantly evaluates compliance. If a user uninstalls a required application, Configuration Manager will detect its absence at the next evaluation cycle and automatically reinstall it.
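The state-based loop described above can be reduced to a very small sketch: re-evaluate the detection rule, and if a required application has drifted out of its detected state, schedule a reinstall. The registry path is a stand-in; real detection methods can also check MSI product codes or file versions:

```python
# Sketch of a state-based detection cycle. The "machine state" dict
# stands in for the client's registry; the key path is illustrative.
def evaluate(machine_state, detection_key):
    if detection_key in machine_state:
        return "installed"   # detection rule met: compliant, do nothing
    return "reinstall"       # rule not met: client re-runs the install

state = {"HKLM\\Software\\Contoso\\App": "1.0"}
print(evaluate(state, "HKLM\\Software\\Contoso\\App"))   # installed
state.pop("HKLM\\Software\\Contoso\\App")                # user uninstalls
print(evaluate(state, "HKLM\\Software\\Contoso\\App"))   # reinstall
```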
In contrast to the application model, the classic package and program model remains available for simpler deployments or for running scripts. A package is simply a collection of source files and a program defines a command line to be run on the client. It is not state-based and does not use detection methods or requirement rules. The 70-243 Exam tested the candidate's ability to know when to use each model. For instance, deploying a simple utility or a batch script might be better suited for the package model, while deploying a complex business application with various dependencies would be a job for the more robust application model.
User-centric application delivery was a major philosophical shift covered in the 70-243 Exam. Administrators can deploy applications to user collections, not just device collections. When an application is deployed to a user, Configuration Manager determines all the devices that the user logs into (their primary devices) and makes the software available there. This allows users to access their necessary tools on any of their designated machines. It also enables a self-service model through the Software Center, where users can browse a catalog of approved applications and install them on-demand without needing to contact the help desk.
The process of deploying an application using the model tested in the 70-243 Exam involves several distinct stages. The first stage is creating the application in the Configuration Manager console. This involves providing metadata such as the application name, version, and publisher. More importantly, this is where you create the deployment types. For a standard Windows application, you would typically select the Windows Installer (MSI) file. The console will automatically parse the MSI to populate information like the installation command line and the detection method based on the product code.
Once the application and its deployment types are defined, the next step is to distribute the content. The source files for the application, such as the MSI and any supporting files, must be copied to distribution points. Distribution points are the site system roles that act as file servers for clients. An administrator selects the application content and distributes it to a single distribution point or, more commonly, to a distribution point group. This ensures that the content is available in the necessary network locations so that clients can download it efficiently without pulling it across slow WAN links.
After the content is successfully distributed, the application can be deployed. The deployment process involves using a wizard to define the terms of the deployment. Here, the administrator selects the target collection, which can be a collection of users or devices. They also specify the action, which is either to install or uninstall the application. A critical choice is the purpose of the deployment: either "Required" or "Available". A required deployment means the application will be installed automatically on all clients in the collection according to the configured schedule. An available deployment makes the application visible in the Software Center for users to install voluntarily.
Scheduling is a powerful aspect of the deployment configuration. For required deployments, you can set an installation deadline. The application will be installed automatically at some point after it becomes available, but no later than the deadline. This allows for staggering installations to manage network bandwidth. For example, a new application might be made available for a week, and any machine that has not installed it by the end of the week will have it forcibly installed at the deadline. The 70-243 Exam required knowledge of these scheduling options to control the rollout of new software effectively.
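The available-time/deadline window can be modeled as three states. This is a simplified sketch (maintenance windows and random enforcement offsets are not modeled), with invented dates:

```python
# Sketch of required-deployment timing: between "available" and the
# deadline the install is user-discretionary; past the deadline the
# client enforces it. Dates below are illustrative.
from datetime import datetime

def deployment_state(now, available_time, deadline):
    if now < available_time:
        return "not yet offered"
    if now < deadline:
        return "available - user may install early"
    return "enforced - install runs automatically"

available = datetime(2015, 8, 3, 9, 0)
deadline = datetime(2015, 8, 10, 22, 0)
print(deployment_state(datetime(2015, 8, 5, 12, 0), available, deadline))
print(deployment_state(datetime(2015, 8, 11, 1, 0), available, deadline))
```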
Finally, monitoring the deployment is a crucial ongoing task for an administrator. The Configuration Manager console provides a dedicated monitoring workspace where you can track the status of all deployments. You can see statistics on success, failure, and in-progress installations. You can also drill down to see the status for individual devices and troubleshoot any errors that have occurred. This detailed feedback loop is essential for verifying the success of a software rollout and for quickly identifying and resolving any issues that prevent clients from receiving their required applications, a skill heavily emphasized in the 70-243 Exam curriculum.
Managing software updates is one of the most critical security functions of Configuration Manager and a significant domain of the 70-243 Exam. The process is handled by integrating with Windows Server Update Services (WSUS). The Software Update Point (SUP) site system role is installed on a server that has WSUS. Configuration Manager then takes control of the WSUS instance, using it as a catalog to synchronize software update metadata from Microsoft Update. Administrators use the Configuration Manager console, not the WSUS console, to manage the entire update process from that point forward.
The first step in the process is configuring the SUP properties. This is where you define which products and classifications of updates you want to synchronize. For example, you might choose to synchronize updates for Windows 10 and Microsoft Office, and you might select classifications like "Critical Updates" and "Security Updates." You also configure the synchronization schedule, which determines how often Configuration Manager checks with Microsoft for new updates. Proper configuration here is key to ensuring you get the updates you need without downloading metadata for products that are not used in your environment.
Once the update metadata is synchronized into the site database, administrators can search and filter through all the available updates. A common practice is to group relevant updates together into a Software Update Group. For example, all the security updates released by Microsoft on a particular Patch Tuesday could be added to a single group. This makes the updates much easier to manage and deploy. Instead of deploying hundreds of individual updates, you deploy the single Software Update Group to your target collections.
The deployment process for software updates is very similar to deploying an application. You use a wizard to deploy a Software Update Group to a collection. You set a deadline and choose whether to make it available or required. A key difference, covered in the 70-243 Exam, is the user experience and restart behavior. You can configure deployment settings to suppress system restarts on servers but allow them on workstations, or define specific maintenance windows during which all installations and restarts must occur. These maintenance windows are crucial for preventing unexpected reboots during business hours.
Automatic Deployment Rules (ADRs) provide a powerful way to automate the entire monthly patching process. An ADR can be configured to run automatically on a schedule, such as the day after Patch Tuesday. The rule can be defined to automatically search for updates based on criteria like product and severity, create a new Software Update Group, add the found updates to it, distribute the content to distribution points, and deploy the group to a target collection. Mastering ADRs, a key objective for the 70-243 Exam, allows administrators to create a highly automated and consistent patching process that requires minimal manual intervention each month.
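The monthly ADR pipeline (filter the catalog, build a Software Update Group, deploy it to a collection) can be sketched as follows. The update titles, products, and target collection are mock data, and this is a conceptual simulation rather than the ADR engine itself:

```python
# Sketch of what an Automatic Deployment Rule automates each month:
# filter synchronized update metadata by criteria, collect the matches
# into a Software Update Group, and deploy it. All data is mock.
catalog = [
    {"title": "KB3001 Security Update for Windows", "product": "Windows 10",
     "classification": "Security Updates", "severity": "Critical"},
    {"title": "KB3002 Feature Pack", "product": "Windows 10",
     "classification": "Feature Packs", "severity": "None"},
    {"title": "KB3003 Security Update for Office", "product": "Office 2013",
     "classification": "Security Updates", "severity": "Important"},
]

def run_adr(products, classifications):
    group = [u["title"] for u in catalog
             if u["product"] in products
             and u["classification"] in classifications]
    return {"software_update_group": group,
            "deployed_to": "All Workstations"}  # illustrative target collection

result = run_adr({"Windows 10", "Office 2013"}, {"Security Updates"})
print(result["software_update_group"])  # the two security updates, not the feature pack
```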
System Center Endpoint Protection (SCEP), a feature managed by Configuration Manager, was an important security topic in the 70-243 Exam. SCEP provides a centralized solution for managing antimalware and firewall policies for client computers. It uses its own client agent, which is deployed and managed by the Configuration Manager client. This integration allows for a single pane of glass for all aspects of endpoint management, from application deployment to security policy enforcement. The administrator can manage everything from the familiar Configuration Manager console.
The process begins with deploying the Endpoint Protection client. This is typically done through client settings. You can enable the management of the Endpoint Protection agent in a custom client setting policy and deploy that policy to a collection of devices. When a device in that collection receives the policy, the Configuration Manager client will automatically install the SCEP agent and take over its management. This ensures that all designated devices are protected and prevents users from disabling or misconfiguring their antimalware software.
Antimalware policies are used to configure the behavior of the SCEP agent. Within a policy, an administrator can define settings for scheduled scans, real-time protection, items to exclude from scans, and what action to take when malware is detected (e.g., quarantine, remove, or allow). Different policies can be created for different types of devices. For example, servers might have a less aggressive scanning schedule and more specific folder exclusions to avoid impacting the performance of business applications. The 70-243 Exam required candidates to know how to create, deploy, and prioritize these policies.
Managing definition updates is another critical function. The SCEP clients need to have the latest malware definition files to be effective against new threats. Configuration Manager provides several ways to distribute these definitions. The most common method is to use the Software Updates feature. SCEP definitions are published to WSUS, and they can be synchronized and deployed using the same process as security patches, often with an Automatic Deployment Rule to ensure they are approved and distributed multiple times per day. This leverages the existing content distribution infrastructure for efficient delivery.
Finally, monitoring and reporting on the security status of the environment is essential. The Configuration Manager console includes dashboards and reports specifically for Endpoint Protection. Administrators can quickly see the overall malware activity in the organization, view which computers have outdated definitions, and identify machines where malware has been detected. This centralized visibility allows for rapid response to threats. The ability to configure alerts, for example, to notify an administrator via email when a malware outbreak is detected, was a key operational skill assessed in the 70-243 Exam.
Compliance Settings, formerly known as Desired Configuration Management, is a powerful feature for auditing and enforcing configuration standards, and it was a key topic for the 70-243 Exam. It allows administrators to define a "baseline" for how a device should be configured. This baseline can include a wide variety of settings, such as required registry key values, specific file versions, or whether a certain Windows feature is enabled. The system then evaluates clients against this baseline and reports on their compliance status.
The building blocks of compliance settings are Configuration Items (CIs). A CI is a specific setting that you want to check. For example, you could create a CI to verify that the Windows Remote Registry service is disabled on all client workstations. You can define CIs for a variety of sources, including the registry, WMI, file system, and scripts. The 70-243 Exam required the ability to create CIs for different data types to check for very specific conditions on a managed device. Each CI includes the setting to check and the rule that defines what the compliant state is.
Multiple Configuration Items are then grouped together into a Configuration Baseline. The baseline represents the complete desired state for a group of devices. For instance, a "Standard Workstation Security" baseline might include CIs that check for the state of the firewall, the presence of specific software, registry settings related to security, and password policy requirements. This baseline becomes the unit of assignment. You don't deploy individual CIs; you deploy the entire baseline to a collection of devices to evaluate them as a whole.
Once a Configuration Baseline is deployed to a collection, the clients in that collection will evaluate their compliance against all the CIs within it on a schedule. The results are then reported back to the site server. The administrator can view the compliance status in the console, seeing which devices are compliant, which are non-compliant, and which specific settings are out of compliance on each device. This provides detailed and actionable information for auditing purposes. You can quickly identify all machines that are drifting from the corporate standard.
Beyond simply reporting on non-compliance, Configuration Baselines can also be configured to automatically remediate certain issues. When creating a rule within a Configuration Item, if the setting supports it (like a registry value or a script), you can enable remediation. If a device is found to be non-compliant for that setting, the client agent will automatically take action to bring the setting back into compliance. For example, if the Remote Registry service is found to be enabled, the agent can automatically disable it. The 70-243 Exam tested this ability to create self-healing configurations to proactively maintain a secure and standardized environment.
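The CI, baseline, and remediation model described above can be illustrated with a small, self-contained sketch. This is plain Python standing in for the real client agent; all class and setting names here are invented for illustration and are not Configuration Manager APIs.

```python
from dataclasses import dataclass, field
from typing import Callable, Optional

# A Configuration Item: one setting, the rule for its compliant value,
# and an optional remediation action (mirroring the console's option
# to remediate noncompliant rules).
@dataclass
class ConfigurationItem:
    name: str
    read_setting: Callable[[dict], object]   # how to read the current value
    compliant_value: object                  # the rule: expected value
    remediate: Optional[Callable[[dict], None]] = None

# A Configuration Baseline groups CIs and is the unit of deployment.
@dataclass
class ConfigurationBaseline:
    name: str
    items: list = field(default_factory=list)

    def evaluate(self, device: dict, auto_remediate: bool = True) -> dict:
        results = {}
        for ci in self.items:
            compliant = ci.read_setting(device) == ci.compliant_value
            if not compliant and auto_remediate and ci.remediate:
                ci.remediate(device)  # the "self-healing" step
                compliant = ci.read_setting(device) == ci.compliant_value
            results[ci.name] = "Compliant" if compliant else "Non-compliant"
        return results

# Simulated device state (stands in for registry/WMI/service queries).
device = {"RemoteRegistry": "Running", "FirewallEnabled": True}

baseline = ConfigurationBaseline("Standard Workstation Security", [
    ConfigurationItem(
        "Remote Registry service disabled",
        read_setting=lambda d: d["RemoteRegistry"],
        compliant_value="Stopped",
        remediate=lambda d: d.update(RemoteRegistry="Stopped"),
    ),
    ConfigurationItem(
        "Windows Firewall enabled",
        read_setting=lambda d: d["FirewallEnabled"],
        compliant_value=True,
    ),
])

print(baseline.evaluate(device))
# Both CIs end up compliant: the firewall already was, and the
# Remote Registry CI was auto-remediated before reporting.
```

Note how the baseline, not the individual CI, is what gets evaluated against a device, and how a CI only self-heals if a remediation action was defined for it.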
Operating System Deployment, or OSD, is one of the most powerful and complex features within Configuration Manager, making it a critical area of study for the 70-243 Exam. OSD provides a framework for automating the installation of Windows operating systems on new computers (bare metal), refreshing existing computers with a new OS, or upgrading an existing OS. This capability is essential for managing the entire lifecycle of a device, from its initial setup to its eventual retirement. It enables organizations to create a standardized, repeatable process for building workstations and servers.
The core of OSD is the concept of a task sequence. A task sequence is a series of steps that are executed in order on a client computer. The Configuration Manager console provides a rich task sequence editor that allows administrators to build a complete deployment process from start to finish. A typical task sequence for a bare-metal build would include steps to partition the hard disk, apply an operating system image, install device drivers, install the Configuration Manager client, and deploy a standard set of applications. The 70-243 Exam required a thorough understanding of the available task sequence steps and how to logically order them.
To support OSD, several key infrastructure components must be in place. First, you need an operating system image, which is a WIM (Windows Imaging Format) file containing the OS you want to deploy. You also need boot images, which are lightweight Windows PE (Preinstallation Environment) images used to start a computer so it can communicate with the Configuration Manager environment and run the task sequence. Both of these images must be added to the console and distributed to distribution points. Device drivers for the various hardware models in your environment must also be imported and managed.
The deployment process is often initiated via a network boot using the Preboot Execution Environment (PXE). To enable this, an administrator must configure the PXE service point role on one or more distribution points. When a new computer is configured to boot from the network, it sends out a broadcast request. The PXE service point responds and provides the computer with the necessary boot image. Once the computer loads the Windows PE environment from the boot image, it can then execute the assigned task sequence to install the full operating system. Understanding the PXE boot process was a key technical requirement for the 70-243 Exam.
OSD is not just for new computers. It can also be used to refresh an existing machine, for example, when an employee receives a new computer or when a machine needs to be rebuilt due to a problem. In a refresh scenario, the task sequence can use the User State Migration Tool (USMT) to capture the user's files and settings, store them on a state migration point or locally, install the new operating system, and then restore the user's data. This ensures a seamless transition for the end-user, a process that was an important part of the 70-243 Exam curriculum.
A central component of any OSD solution is the operating system image. For the 70-243 Exam, candidates needed to understand how to prepare and manage these images. The standard approach is to use a WIM file. This process typically starts with building a reference computer. The reference machine is installed with the desired Windows operating system, updated with the latest patches, and configured with any standard settings or core software. After this machine is built to specification, a capture process is run to create the WIM image file from its hard drive.
This captured image is then imported into the Configuration Manager console. Once imported, it becomes an object that can be managed and deployed. It must be distributed to distribution points so that clients can access it during a task sequence. A key consideration is image servicing. Instead of rebuilding the reference computer every month to add new patches, Configuration Manager provides a feature for offline servicing. This allows an administrator to apply the latest Windows updates directly to the WIM image file stored in the console, which is a much more efficient way to keep the base image up to date.
Driver management is another significant challenge in enterprise environments with diverse hardware models. The 70-243 Exam covered the methods for handling device drivers within OSD. The first step is to obtain the necessary drivers from the hardware vendors and import them into the Configuration Manager console. The imported drivers are stored in a driver catalog and can be organized into folders for easier management. It is a best practice to organize drivers by hardware model and operating system version.
Once drivers are imported, they are typically grouped into driver packages. A driver package is a collection of drivers that is distributed to distribution points, just like other content. In the task sequence, there are steps available to install these drivers. The "Apply Driver Package" step allows you to install a specific set of drivers, which is useful if you create a dedicated task sequence for each hardware model. A more dynamic method is the "Auto Apply Drivers" step, which instructs the client to evaluate its hardware and then download and install only the specific drivers that match its detected devices from the catalog.
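The dynamic matching that "Auto Apply Drivers" performs can be sketched as a simple intersection between the hardware IDs a client detects and the IDs each catalog driver supports. The catalog entries and hardware IDs below are invented examples, and the real step does considerably more (ranking, categories, package lookup).

```python
# Invented driver catalog: each entry lists the PnP hardware IDs it supports.
driver_catalog = [
    {"driver": "Intel I219-LM NIC", "hardware_ids": {"PCI\\VEN_8086&DEV_15B7"}},
    {"driver": "Realtek Audio",     "hardware_ids": {"HDAUDIO\\FUNC_01&VEN_10EC"}},
    {"driver": "NVMe Storage",      "hardware_ids": {"PCI\\VEN_144D&DEV_A808"}},
]

def auto_apply_drivers(detected_ids, catalog):
    """Return only the catalog drivers whose hardware IDs match a detected device."""
    return [entry["driver"] for entry in catalog
            if entry["hardware_ids"] & detected_ids]

# This machine reports an Intel NIC and an NVMe controller,
# but no Realtek audio device, so only two drivers are pulled down.
detected = {"PCI\\VEN_8086&DEV_15B7", "PCI\\VEN_144D&DEV_A808"}
print(auto_apply_drivers(detected, driver_catalog))
# ['Intel I219-LM NIC', 'NVMe Storage']
```

This is why the dynamic method scales well across mixed hardware: one task sequence serves every model, with each client downloading only what it needs, whereas "Apply Driver Package" pins a fixed driver set to a specific model's task sequence.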
Boot images also require driver management. A boot image is a minimal version of Windows PE, and it needs network and storage drivers to function. If you introduce a new hardware model, its network adapter or disk controller might not be recognized by the default boot image. In this case, you must inject the necessary drivers, matching the boot image's architecture, directly into the boot image file. The 70-243 Exam required knowledge of this process, which involves adding the drivers to the boot image properties in the console and then updating the distribution points with the new version of the boot image.
The task sequence is the heart of the OSD process, and a deep understanding of its creation and management was essential for the 70-243 Exam. A task sequence is created using a wizard that provides several templates for common scenarios like installing a new operating system or upgrading one. After the initial creation, the task sequence can be extensively customized using the editor. The editor presents a list of steps, which are organized into groups. An administrator can add, remove, and reorder steps, and configure the properties for each one.
A standard task sequence for a new computer build contains several logical phases. It begins with steps to prepare the computer, such as restarting into the Windows PE environment. The next phase involves partitioning and formatting the local hard disk. Following that, the core steps apply the operating system image and the Windows settings, such as the product key and local administrator password. Then, it configures the network and installs the Configuration Manager client itself, which is a crucial step that allows the device to be managed after the build is complete.
After the base OS and the client are installed, the task sequence typically moves on to installing drivers. As discussed, this can be done using a specific driver package or by having the client dynamically detect and install the necessary drivers. The next phase is often application installation. You can add "Install Application" or "Install Package" steps to the task sequence to deploy a standard suite of software, ensuring that the computer is ready for the user immediately after the build process is finished. This level of automation is a key benefit of using OSD.
Task sequences also have robust logic and control flow options, a topic covered in the 70-243 Exam. You can add conditions to any step or group of steps. For example, you could have a group of steps for installing laptop-specific software, and set a condition so that this group only runs if the task sequence detects that the computer is a laptop (based on a WMI query). You can also set variables during the task sequence, which can be used to control later steps. This allows for the creation of highly dynamic and flexible task sequences that can handle multiple hardware models and deployment scenarios.
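The ordered-steps, conditions, and variables model can be made concrete with a toy engine. This is an illustrative sketch only, not a Configuration Manager API; the step names and variables are invented, and in practice the chassis type would come from a WMI query condition on the step.

```python
# A toy task sequence engine: run steps in order, skip any step whose
# condition evaluates false, and let steps set variables that later
# steps' conditions can read.
def run_task_sequence(steps, variables):
    log = []
    for step in steps:
        condition = step.get("condition")
        if condition and not condition(variables):
            log.append(f"SKIPPED: {step['name']}")
            continue
        step["action"](variables)
        log.append(f"RAN: {step['name']}")
    return log

steps = [
    {"name": "Partition Disk",
     "action": lambda v: v.update(DiskReady=True)},
    {"name": "Apply OS Image",
     "action": lambda v: v.update(OSApplied=True),
     "condition": lambda v: v.get("DiskReady")},       # depends on an earlier step
    {"name": "Install Laptop Software",
     "action": lambda v: v.update(VPNClient=True),
     "condition": lambda v: v["Chassis"] == "Laptop"}, # hardware-specific group
]

for line in run_task_sequence(steps, {"Chassis": "Desktop"}):
    print(line)
# RAN: Partition Disk
# RAN: Apply OS Image
# SKIPPED: Install Laptop Software
```

Because the laptop-only group is gated by a condition rather than baked into a separate sequence, the same task sequence handles desktops and laptops, which is exactly the flexibility the exam expected candidates to understand.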
Once a task sequence is built, it must be deployed to a collection. When deploying, you specify how the task sequence should be made available. You can make it available to PXE boot requests, bootable media (like a USB drive or DVD), or to clients within the full operating system through the Software Center. The deployment is what links a specific task sequence to a group of computers, allowing them to execute it. Monitoring task sequence deployments is crucial for troubleshooting failures, as the progress and error codes for each step are logged and reported to the site server.
When performing a computer refresh or replacement, preserving the user's data and settings is paramount. The 70-243 Exam covered the integration of the User State Migration Tool (USMT) with OSD to automate this process. USMT is a command-line tool from Microsoft that captures user profiles, files, and operating system settings from a source computer and allows them to be restored to a new computer or a new operating system installation on the same computer. Configuration Manager provides built-in task sequence steps to manage the entire USMT process.
The process requires a State Migration Point (SMP) site system role. The SMP is a network share where the captured user state data is stored temporarily. In a refresh task sequence, the first set of steps will run the USMT capture process on the client computer. The USMT tools, along with configuration files that define what data to capture, are used to gather the user's state and save it to a secure, encrypted location on the SMP. The task sequence then proceeds to wipe the hard drive and install the new operating system.
The capture process is controlled by XML files. USMT comes with default XML files that specify the capture of standard user profile data, such as desktop settings, documents, and common application settings. However, organizations can create custom XML files to extend the migration. For example, you could write a custom XML file to capture specific line-of-business application settings or to migrate files with specific extensions from non-standard locations. The 70-243 Exam required an understanding of how to use and customize these files to meet specific business requirements for data migration.
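The include/exclude pattern logic that USMT's migration XML expresses can be approximated with glob matching. This is only a sketch of the idea under simplified assumptions; the patterns and paths are invented, and real USMT rules are far richer (component scoping, registry rules, profile detection).

```python
import fnmatch

# Invented rules: migrate user documents plus a line-of-business app's
# .cfg files, but exclude scratch .tmp files from Documents.
include_patterns = ["C:/Users/*/Documents/*", "C:/LOBApp/*.cfg"]
exclude_patterns = ["*/Documents/*.tmp"]

def should_migrate(path):
    """A file migrates if it matches an include rule and no exclude rule."""
    included = any(fnmatch.fnmatch(path, p) for p in include_patterns)
    excluded = any(fnmatch.fnmatch(path, p) for p in exclude_patterns)
    return included and not excluded

files = [
    "C:/Users/alice/Documents/report.docx",  # standard rule: migrated
    "C:/Users/alice/Documents/scratch.tmp",  # matched, then excluded
    "C:/LOBApp/settings.cfg",                # custom rule: migrated
    "C:/Windows/notepad.exe",                # never included
]
print([f for f in files if should_migrate(f)])
# ['C:/Users/alice/Documents/report.docx', 'C:/LOBApp/settings.cfg']
```

The second pattern plays the role a custom XML file would: extending the default capture to application data in a non-standard location, while the exclude rule trims data the business does not need to carry forward.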
After the new operating system, drivers, and applications are installed, the final phase of the refresh task sequence involves restoring the captured user state. Task sequence steps connect back to the State Migration Point, retrieve the stored data for that specific computer, and use the USMT tools to restore it to the new operating system environment. When the user logs in for the first time, their desktop background, documents, and other familiar settings are present, creating a much smoother transition.
For computer replacement scenarios, the process is slightly different. The capture can be run as a standalone task sequence on the old computer, storing the data on the SMP. Then, when the new computer is being built, a different task sequence is used that includes the steps to restore the user state. Configuration Manager manages the association between the computers, ensuring that the correct user data is applied to the new machine. This robust capability significantly reduces the manual effort and risk associated with migrating users to new hardware, a common and important task for any IT department.
For network-based operating system deployments, Configuration Manager relies on a Windows Server feature called Windows Deployment Services (WDS). While Configuration Manager manages the entire OSD process, it uses WDS as the underlying engine for handling the PXE boot requests from clients. Understanding this relationship was a key part of the 70-243 Exam. It is important to note that you do not configure WDS directly; Configuration Manager takes ownership of the WDS instance on the server where you install the PXE-enabled distribution point.
When you enable the PXE service point role on a distribution point, the Configuration Manager components will automatically install WDS if it is not already present. It will then configure the WDS service to respond to client requests. Specifically, it configures WDS to point network-booting clients to the appropriate Configuration Manager boot image. An administrator should not use the WDS console to make changes, as they will be overwritten by Configuration Manager, which manages the WDS configuration to ensure the OSD process works correctly.
The PXE boot process involves a series of network communications. A client computer, when set to boot from its network adapter, sends out a DHCP broadcast. The DHCP server responds with an IP address for the client. As part of this exchange, the client also learns the location of the PXE server (the WDS/Configuration Manager server), either because that server responds to the same broadcast or because DHCP options direct the client to it. The client then contacts the PXE server, which offers it a Network Boot Program (NBP). The client downloads and executes the NBP, which in turn downloads the Windows PE boot image.
A common challenge, and a frequent troubleshooting topic for the 70-243 Exam, is when the DHCP server and the PXE server are on different subnets. In this case, the initial DHCP broadcast from the client will not reach the PXE server. To solve this, IP Helpers (sometimes called DHCP Relay Agents) must be configured on the network routers. The IP Helpers are configured to forward the DHCP broadcasts from the client subnet to the IP addresses of both the DHCP server and the PXE server. This allows the client to get its IP address and discover the PXE server.
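The cross-subnet problem can be modeled in a few lines: broadcasts do not cross routers, so a client only reaches servers on its own subnet unless IP helpers forward the request. This is a deliberately simplified simulation of the discovery step, not real DHCP/PXE packet handling, and the addresses are invented.

```python
# Both servers live on the 10.0.1.x subnet in this toy topology.
servers = {
    "DHCP": "10.0.1.10",
    "PXE":  "10.0.1.20",   # WDS / PXE-enabled distribution point
}

def broadcast(client_subnet, ip_helpers):
    """Return which server roles receive the client's DHCP broadcast."""
    reached = set()
    for role, ip in servers.items():
        same_subnet = ip.startswith(client_subnet)          # broadcast arrives directly
        forwarded = ip in ip_helpers.get(client_subnet, []) # relayed by the router
        if same_subnet or forwarded:
            reached.add(role)
    return reached

# Client on a remote subnet with no IP helpers: the broadcast dies at
# the router, so neither DHCP nor PXE ever hears the request.
print(sorted(broadcast("10.0.2.", {})))   # []

# Routers configured to forward the broadcast to both servers: the
# client gets an address and discovers the PXE server.
helpers = {"10.0.2.": ["10.0.1.10", "10.0.1.20"]}
print(sorted(broadcast("10.0.2.", helpers)))   # ['DHCP', 'PXE']
```

The key operational takeaway matches the text: the IP helper list on the router must include both the DHCP server and the PXE server, or the client will obtain an address but never find a boot server (or vice versa).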
Configuration Manager provides options for controlling which computers can respond to PXE boot requests. For security, you can configure the PXE service point to respond only to known computers, meaning computers that already exist as objects in the Configuration Manager database. You can also require a password for PXE booting. Furthermore, you can deploy a task sequence to the "All Unknown Computers" collection, which allows you to build brand new, out-of-the-box machines that have never been on the network before. Managing these settings is crucial for both functionality and security in an OSD environment.
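The response decision described above reduces to a short gate: check the optional PXE password, check whether the requesting machine is already known to the site database, and otherwise fall back to whether unknown-computer support is enabled. The following is an illustrative sketch only; the names and logic are invented, not the actual Configuration Manager implementation.

```python
# Invented lookup standing in for the site database: MAC -> device record.
known_computers = {"00-11-22-33-44-55": "WKS-001"}

def should_respond(mac, allow_unknown, password_ok=True):
    """Decide whether the PXE service point offers a boot image to this request."""
    if not password_ok:         # a required PXE password was not supplied correctly
        return False
    if mac in known_computers:  # the computer already exists in the database
        return True
    # Otherwise respond only if a task sequence is deployed to
    # the "All Unknown Computers" collection.
    return allow_unknown

print(should_respond("00-11-22-33-44-55", allow_unknown=False))  # True
print(should_respond("AA-BB-CC-DD-EE-FF", allow_unknown=False))  # False
print(should_respond("AA-BB-CC-DD-EE-FF", allow_unknown=True))   # True
```

Locking down all three levers (known computers only, PXE password, no unknown-computer deployment) trades convenience for security: brand-new machines then require a database record, such as a manual computer import, before they can network boot.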