Title of test: TBW 50 (2013)

Creation Date: 27/11/2013

Number of questions: 80
Layers of the Layered, Scalable Architecture (LSA):
- Enterprise Data Mart, Data Propagation, Quality & Harmonization, Business Transformation, Reporting, Virtualization
- Data Acquisition, Data Cleansing and Transformation, Quality & Harmonization, Reporting, Virtualization
- Data Propagation, Data Acquisition, Data Quality & Harmonization, Reporting
- Data Acquisition, Data Propagation, Data Cleansing and Transformation, Reporting, Virtualization
- Virtualization, Reporting, Data Propagation, Data Acquisition, Data Quality & Harmonization, Business Transformation
Match the technologies used to load data into BW with their respective source:
- BW Service API
- DB Connect
- UD Connect
- File
- Web Service
- Staging BAPI
InfoSources should be used in the following scenario:
- When executing two transformations before data is written to the InfoProvider
- If you want to upload data from SAP in several 7.x DataSources
- Whenever you receive data from Data Services
- Before joining two DSOs into an InfoCube
Process Chains are used to deal with dependencies between objects in a BW flow. True / False.
How many transformation rules does a transformation consist of?
- At least 1
- Maximum 99
- 12
- 8
The ETL process is a list of steps that raw data must follow to be extracted, transformed, and loaded into BW targets. True / False.
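To illustrate the concept this question tests, here is a minimal ETL sketch in plain Python. It is illustrative only: the field names (`customer`, `amount`) and the cleansing rules are invented, not taken from any SAP system.

```python
# Minimal extract/transform/load pipeline sketch (illustrative; field names
# and cleansing rules are invented, not SAP-specific).

def extract(raw_rows):
    """Extract: pull raw records from a source (here, a plain list)."""
    return list(raw_rows)

def transform(rows):
    """Transform: cleanse and harmonize records before loading."""
    out = []
    for row in rows:
        if row.get("amount") is None:      # drop incomplete records
            continue
        out.append({
            "customer": row["customer"].strip().upper(),  # harmonize the key
            "amount": round(float(row["amount"]), 2),
        })
    return out

def load(rows, target):
    """Load: append the cleansed records to the target (a stand-in for an InfoProvider)."""
    target.extend(rows)
    return target

target = []
raw = [{"customer": " acme ", "amount": "10.50"},
       {"customer": "Beta", "amount": None}]
load(transform(extract(raw)), target)
# target now holds one cleansed record for ACME
```

The incomplete "Beta" record is filtered out in the transform step, mirroring the idea that raw data passes through extraction, transformation, and loading before reaching a BW target.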
The following is NOT true about a DataSource (DaSo):
- A DaSo is a BW object used to extract and stage data from source systems
- Can operate several types of routines
- Subdivides the data provided into business areas
- Contains a number of logically related fields and data to be transferred into SAP BW
The following are true about the start process of a Process Chain (PC):
- Only the start process can be scheduled without a predecessor process
- It can be the successor of another process in the PC
- At least one start process is necessary in a PC
- The same start process can only be used in one PC
Match the process feature with its respective example:
- Process Type
- Process Variant
- Process Instance
What can we access from transaction RSPC?
- Planning View
- Check View
- InfoProvider View
- DataSource/InfoPackage Menu
- Log View
Process Chains may be generated for:
- Real-time Data Acquisition
- Hybrid Providers
- Semantically Partitioned Objects
- Data lineage
- Multi-Providers
What is the transaction for maintaining a Process Chain?
The maintenance of a process variant is specific to every process type. True / False.
Process Chains cannot be transported within a transport landscape. True / False.
When the data to be edited is attached to an instance (e.g. a DTP), you can only:
- Repair a process
- Repeat a process
- Delete a process
- Restart a process
Match the concept with its definition, on the right:
- Transformation Rule
- Rule Type
- Transformation Type
- Rule Group
Example of a use of a Start Routine:
- Deletion of records not required for updating
- After determining 'Material Category' for a particular material, material of type 'Refund' is not updated
- Validation checks of records after transformations
- When you want to place an operation where there are not sufficient functions available
Example of a use of an End Routine:
- To change the date format from yyyy/mm/dd to name_of_day/name_of_month/yyyy
- Buffering tables into internal tables that can be used for transformation rules
- After determining "Country of birth" of an employee, every country of type "Large" is not updated
- Deletion of records not required for updating after the PSA is filled
Currency translation...
- ...allows you to translate data records from the source currency to the target currency in the data target
- ...is always available for transformation purposes
- ...is normally performed using predefined translation types
- ...exists when both the source and the target key figure currency are fixed
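As a sketch of what a currency translation step does during a transformation, consider the following. The fixed rates and field names below are invented for illustration; real translation types in BW carry far more configuration (rate types, time references, and so on).

```python
# Sketch of translating a data record from its source currency to a target
# currency (rates and field names are invented, not real translation types).

RATES = {("USD", "EUR"): 0.9, ("GBP", "EUR"): 1.15}

def translate(record, target_currency="EUR"):
    """Translate a record's amount from the source to the target currency."""
    src = record["currency"]
    if src == target_currency:
        return dict(record)                       # nothing to translate
    rate = RATES[(src, target_currency)]          # look up predefined rate
    return {"amount": round(record["amount"] * rate, 2),
            "currency": target_currency}

row = translate({"amount": 100.0, "currency": "USD"})
# row == {"amount": 90.0, "currency": "EUR"}
```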
You should use the Central Units of Measure (T006) to get conversion factors when...
- ...you want conversions within the same dimension
- ...you have InfoObject-specific conversions between units of different dimensions
- ...most of the source and target units of measure belong to the same dimension
- ...most of the source and target units of measure don't belong to the same dimension
You should use a reference InfoObject to get conversion factors when...
- ...you want conversions within the same dimension
- ...you have InfoObject-specific conversions between units of different dimensions
- ...most of the source and target units of measure belong to the same dimension
- ...most of the source and target units of measure don't belong to the same dimension
When creating a Unit Conversion Type, entering source and target quantities is mandatory. True / False.
In BW 3.X, the InfoPackage can be used to:
- Request data from the source system
- Distribute data within BI
- Define the fields to be extracted
- Separate data before going into an InfoSource
The following is true about migrating a 3.X DataSource to 7.X:
- It is mandatory nowadays
- In the original system, a DataSource is generated
- The 3.X DataSource, the metadata objects, and the transfer structure are deleted
- Any InfoPackage or PSA existing in 3.X is replaced
- After migration, only the information about how data is loaded into the PSA is used in the InfoPackage
In releases prior to SAP NetWeaver 2004s:
- There was more than one data flow approach between two persistent objects
- A Transfer Rule, a DSO, and an Update Rule stood between a source and a target object
- There was just one data flow approach between two persistent objects
- A Transfer Rule, an InfoSource, and an Update Rule stood between a source and a target object
The following tasks are recommended, if not mandatory, before migration to 7.X takes place:
- Activate the 3.X DataSource
- Populate the PSA of the 3.X DataSource
- Perform emulation
- Transport the migrated DataSource
You can perform migration of a 3.X DataSource in the standard way for XML DataSources. True / False.
Virtual Providers, used in the "Direct Access" functionality, are:
- InfoObjects with master data that is stored in the object itself
- InfoObjects with master data that is not stored in the object itself
- InfoProviders with transactional data that is not stored in the object itself
- InfoProviders with transactional data that is stored in the object itself
For Direct Access, the following "types" of Virtual Providers (VP) are available:
- VP based on Data Transfer Processes
- VP with BAPIs
- VP with function modules
- VP using Web Service
- VP with Data Services
In direct access, use a Virtual Provider based on a DTP if:
- You only need historical data from the SAP Source System
- You only access small amounts of data from time to time
- Few users will execute queries simultaneously on the database
- You have a DSO and an IC in the dataflow
In direct access, if you want to access data from non-SAP source systems, use:
- a Virtual Provider with function modules
- a Virtual Provider with UD Connect or DB Connect
- a Virtual Provider with DTP
- a Virtual Provider with BAPIs
If you have small hierarchies that do not usually change, in BEx Queries you can use direct access. True / False.
Prerequisites you have to fulfill to model dataflows for hierarchies with direct access:
- Place hierarchies in the respective InfoObject
- Enhance the Source System to version 7.02 or higher
- Release 7.X DataSources for Direct Access
- Assign the InfoObject to an InfoArea
In regard to DataSources, if you want to use Real-time Data Acquisition (RDA), you need to:
- Ensure that BI Content DataSources support RDA
- Select the 'Real-Time Enabl.' indicator in the case of generic DataSources
- Migrate all 3.X DataSources
- Enable transformations to incorporate RDA
Real-time Data Acquisition (RDA) can be used in two primary scenarios:
- via Service API
- via Web Service
- via Data Services
- using a Virtual Provider to access data faster
In Real-time Data Acquisition, the navigational attributes of a characteristic in aggregates cannot be used...
- ...because they cannot react to real-time updates
- ...because characteristics do not support navigational attributes
- ...because navigational attributes can only be used in a 3.x dataflow
- ...because drill-down is not necessary for navigational attributes
Which one is true about daemons?
- A daemon is a background process that processes the IPs and DTPs assigned to it at regular intervals
- Extracts data from the source system and transfers it to the PSA table and DSO
- Informs the Service API in the Source System when the data for the Source System has been successfully updated
- Works as a trigger for initializing the Real-time Data Acquisition process
With Real-time Data Acquisition, the PSA, DSO, and change log requests remain open across several load processes. True / False.
Use transaction RSRDA to:
- Create a Real-time Data Acquisition flow
- Get an overview of the status of each daemon
- Monitor loading of data to the PSA and the DSO
- Create the daemon used in Real-time Data Acquisition
Can you connect an SAP BW system with another SAP BW system?
S-API is a technology package in the ______ that enables tight integration of data transfer from ___________ into ______.
- Oracle database (...) SAP Source System (...) BW
- Web service system (...) Oracle database (...) BW
- SAP Source System (...) SAP Source System (...) BW
All objects in BI Content are available in version: A / M / D / G.
What happens when you activate an object in M version?
- An object of G version is generated and "replaces" the older, A version of that object
- The older, A version of that object is "replaced"
- The corresponding objects are generated in the ABAP Dictionary and programs
- The corresponding objects are generated in the SAP Library
Link each step with its correct order in the process of transferring BI Content objects: 1st / 2nd / 3rd.
BI Content DataSources can be transferred in two ways:
- Transfer and activate via SBIW in the Source System
- Activate DataSources remotely from BW (subject to authorization check)
- Activate a daemon to transfer field DataSources from the Source System
- Use an RFC connection in BW to transfer
Reasons for using generic data extraction:
- BI Content includes the DataSource needed
- There is a DataSource in BI Content that requires enhancement, but BI provides everything needed
- The application features its own generic extraction method
- You use standard SAP-created programs to fill tables in the SAP system
Which components can source data for the generic DataSource?
- SAP source system extractors
- Transparent tables / database views
- Daemon, in the case of Real-time Data Acquisition
- SAP Query InfoSet
What is usually used when enhancing a DataSource?.
Delta Management:
- Refers to the ability to extract modified data records to BW in a new request
- Ensures no delay in staging data in Data Warehousing
- Ensures that only relevant data is transferred to BW
- Can be applied to any connection and source system linked to the BW system
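The core idea behind delta management, extracting only records that changed since the last request, can be sketched as follows. The record keys and `changed_at` timestamps are invented for illustration; real delta extraction relies on the delta queue and the DataSource's delta process, not on an application-level timestamp filter.

```python
# Sketch of delta extraction: only records modified since the last load are
# transferred (keys and timestamps are invented for illustration).

source = {
    1: {"name": "Order A", "changed_at": 5},
    2: {"name": "Order B", "changed_at": 12},
    3: {"name": "Order C", "changed_at": 20},
}

def delta_request(last_loaded_at):
    """Return only the records modified after the previous extraction."""
    return {k: v for k, v in source.items()
            if v["changed_at"] > last_loaded_at}

delta = delta_request(last_loaded_at=10)   # transfers records 2 and 3 only
```

A full update, by contrast, would transfer all three records on every request.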
The Delta Queue is an S-API function, so it is only used in SAP or SAP BW Source Systems. True / False.
Where can you check if a DataSource supports a delta process?
- RSA6 / SBIW (post-processing DataSource)
- In the execution tab of the InfoPackage of the DataSource
- In the update mode chosen in an InfoPackage
- In the maintenance screen of the DataSource, in BW
Connect the technical name with the respective delta process: ABR / ADD / AIM / AIMD.
Use table ___________ to check how data is extracted using a DataSource.
What are the two tables where the metadata of a DataSource is stored when it is replicated?
- ROOSOURCE
- RSOLTSOURCE
- RDSD
- RSDS
Link the update mode (defined in an InfoPackage) with its description:
- Full Update
- Initialization of the Delta Process
- Delta Update
- Repeat Delta Update
- Early Delta Initialization
- Switch InfoPackage in Process Chain to Delta
The initialization run of a delta process includes:
- Creation of a request regarding the data selected in the InfoPackage
- The option selected regarding the initialization is saved
- The Delta Queue is generated in the Source System
- The Delta Queue is generated in BW
What is the use of the "Initialize without Data Transfer" option?
- To begin Real-time Data Acquisition without transferring data to an InfoCube directly
- To test the Direct Access connection to the Source System, with display-only data
- To initialize a delta process even without having a dataset to transfer
- To begin the Process Chain without any new changes in the Source System
Why would you use Early Delta Initialization?
- To begin the Delta Process at sunrise
- To avoid a period when no updates can be made in the Source System
- To begin the delta process before scheduled
- To transfer data even if there was no document processing in the Source System
To which feature of Delta Management does this process belong? 1. The selection for the delta initialization is saved and the Delta Queue is generated in the Source System. 2. All the data that corresponds to the selection criteria determined is requested.
- Delta Process
- Early Delta Initialization
- Initialization without transfer
- Delta Update
How can you reduce the volume of data in the delta process initialization?
- By transferring historical data beforehand by delta update
- By transferring historical data beforehand by full update
- By not processing further documents in the Source System
- By reducing the criteria chosen in a full update in the InfoPackage
In which table can you check the properties of a delta process? RSADMIN / PCMON / RODELTAM / DELTAMON.
Connect the property of a delta process with its definition: Delta Type / Record Modes / Serialization.
Which one of these is not a possible delta type value? A / D / N / F.
In the 'D' delta type, the DataSource determines the delta through the extractor on request. True / False.
In the 'F' delta type, the delta records are loaded by flat file, and the Delta Queue is not used. True / False.
Connect the record mode value with its meaning: ' ' / 'X' / 'D' / 'R' / 'N'.
If a DataSource only "sends" an after image, this must first be updated to a DSO that is in 'overwrite' mode. True / False.
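The overwrite behavior this question refers to can be sketched like this: each after image replaces the stored record with the same key, so only the latest version survives. The key and field names (`doc_no`, `amount`) are invented for illustration; a real DSO is a BW object, not a Python dictionary.

```python
# Sketch of updating after images to a DSO-like store in 'overwrite' mode
# (key and field names are invented for illustration).

dso = {}

def update_overwrite(store, after_images):
    """Apply after images: each new record version overwrites the old one."""
    for rec in after_images:
        store[rec["doc_no"]] = rec["amount"]
    return store

update_overwrite(dso, [{"doc_no": "4711", "amount": 100}])
update_overwrite(dso, [{"doc_no": "4711", "amount": 80}])  # document changed
# dso["4711"] is now 80: only the latest image survives
```

An additive (summation) update would instead have produced 180 for the same two loads, which is why after-image-only DataSources need an overwrite-mode DSO first.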
What would you use if you want to transfer line items in FI-AR?
- An AIE delta process with delta type 'E' (Pull)
- An ABR delta process with delta type 'D' (Push)
- An AIE delta process with delta type 'A'
- An ADD delta process with delta type 'D'
Which formats are supported in the transfer of flat files? ASCII / XML / CSV / XLS.
Which of these are functions in Flat File DataSources that ensure stable data extraction?
- Include header rows
- Routine for naming the physical file
- Extensive preview functions
- Space separator adjustment functionality
How is data from a flat file filtered before entering BW?
- By selecting the corresponding values from the selection list for the desired filter fields
- By going through a filter included in BO Data Services
- By adjusting the selection criteria in the InfoPackage maintenance
- By using an RFC connection that ensures that data is filtered in the selected way
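A minimal sketch of the filtering idea, keeping only flat-file rows that match selection criteria before they are staged, is shown below. The CSV columns and the `region` criterion are invented; InfoPackage selections in BW are configured in the maintenance screen, not in code.

```python
# Sketch of filtering flat-file records by selection criteria before loading
# (columns and criteria are invented for illustration).
import csv
import io

flat_file = "doc_no,region,amount\n1,EMEA,10\n2,APAC,20\n3,EMEA,30\n"
selection = {"region": "EMEA"}   # stand-in for InfoPackage selection criteria

def filtered_rows(text, criteria):
    """Keep only rows matching every selection criterion."""
    reader = csv.DictReader(io.StringIO(text))
    return [row for row in reader
            if all(row[field] == value for field, value in criteria.items())]

rows = filtered_rows(flat_file, selection)   # keeps rows 1 and 3
```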
Connect the dots correctly: FIL0 Delta Data / FIL1 Delta Data.
The UD Connect component is based on SAP BW Java Connectors. Both persistent and staging data are supported here. True / False.
Into how many components is the UD Connect architecture divided? (Use a number.)
You must do the following before using the DB Connect functions:
- Enable the DB connection in BW
- Install the SAP-specific DBSL
- Make metadata known to BW using DataSources
- Open a specialized RFC connection
- Install the DB Client
In the context of XML data acquisition...
- ...data is generally transferred into BW by means of a data request
- ...the data transfer is controlled externally by sending data to BW
- ...data is immediately sent to a DSO
- ...real-time data acquisition is supported
Which objects are generated when a Web Services DataSource is activated?
- An RFC-capable function module
- An underlying InfoPackage
- A SOAP-compatible Web Service
- A delta queue for changed data
Which of these are features you can find using transaction WSADMIN?
- Setting features for the SOAP runtime
- The proxy that links BW with the internet
- Generation of the WSDL for released Web Services
- Calling the Web Services Homepage for testing purposes
To reach "Quality of Service", what do you have to guarantee?
- On-time delivery
- 100% satisfaction
- Serialization
- Transactional Integrity
Which of these are benefits of using Data Services as a Source System?
- Extend BW's connectivity to virtually any DataSource
- Improve Process Chain performance
- Ensure trustworthiness of data loaded into BW
- Identify the flow from the source to the target in a transparent way