Difference Between Contact Hours & PDUs

Many PMP aspirants are still confused about whether they need Contact Hours or PDUs in order to apply for the PMP exam. So, is it 35 Contact Hours you need, and not 35 PDUs? Let me explain, in case these terms still confuse you.
In simple language:
Contact Hours are hours you must earn before becoming PMP certified. They can only be earned through formal course time that focuses on project management. If you are a PMP exam aspirant, you will need a certificate of at least 35 Contact Hours, which is a mandatory credential for the PMP exam conducted by PMI.
People are also confused about whether it is necessary to earn the 35 Contact Hours from a PMI R.E.P only. The answer is "NO". PMI has made clear in its handbook the different ways to earn the 35 Contact Hours.
How to earn 35 Contact Hours?

  • Contact Hours are earned by attending instruction relevant to project management.
  • Any class you have taken, at any point, can be counted toward Contact Hours if it included project management instruction.
  • If your organization provided training on the software it uses, you can claim that as Contact Hours.
  • Attend PMP training from a Registered Education Provider (R.E.P), which is eligible to award you Contact Hours.
  • You can easily earn this with Trainings24x7.

PDUs (Professional Development Units) – PDUs are earned after you become PMP certified; you no longer need Contact Hours at that point. There are several ways to earn PDUs to maintain your PMP. You may refer to the PMI Handbook for the PDU requirements of your respective certification; it provides detailed information on each activity.
The Project Management Professional (PMP)® is a globally recognized certification of project management knowledge, skills and, of course, experience. So it is not surprising that PMI takes great care to ensure the quality of its certification process, from application through post-certification.
PDUs must be earned only from a PMI R.E.P (Registered Education Provider), as clearly stated in the PMI Handbook.

Hope this article has helped you and cleared up all your confusion about both terms. Best of luck on the PMP exam!

Related Blogs:

Earning PDUs Are Changing From Dec, 2015 – What You Need To Know

Is It Necessary To Memorize ITTO To Pass The PMP Exam?

Top 5 IT Certifications in 2015 That Pay Higher Salary


List Of Free PMP Sample Questions Websites | Trainings24x7

Are you PMP exam ready and looking for free PMP sample questions or simulators for practice? If yes, then in this article I am going to give you a list of free PMP sample question websites and resources where you can get more than 1,000 practice PMP questions for free. Below is a list of the best websites to help you practice PMP questions with answers.
I will also list the best reference books for preparing for the exam.

It is universally true that to obtain the PMP certification, you need lots of practice on realistic questions. The usual recommendation is to attempt at least 1,000 such questions before actually sitting the exam. That is why I have listed here the websites where you can get more than 1,000 FREE questions with their answers. Practice them until you consistently answer more than 80 percent correctly.

Golden tip before practicing questions: read the PMBOK 5th Edition at least 5 times to familiarize yourself with PMI's vision and terminology.
Websites providing free PMP certification exam questions:
FREE-PMP-SAMPLE

(For more link)

BIG DATA HADOOP TRAINING | TRAININGS24x7

BIG DATA HADOOP CERTIFICATION TRAINING

Course Overview
This big data course is designed to enhance your skills and knowledge to become a successful Hadoop developer. Core concepts are covered along with implementation on varied industry cases.

Course Objectives
By the end of the course, you will:
1. Master the concepts of HDFS and the MapReduce framework
2. Understand Hadoop 2.x architecture
3. Set up a Hadoop cluster and write complex MapReduce programs
4. Learn data loading techniques using Sqoop and Flume
5. Perform data analytics using Pig, Hive and YARN
6. Implement HBase and MapReduce integration
7. Implement advanced usage and indexing
8. Schedule jobs using Oozie
9. Implement best practices for Hadoop development
10. Work on a real-life project on big data analytics
Course Curriculum of Big data and Hadoop

  1. Understanding Big Data and Hadoop
    ·        Understanding of Big Data.
    ·        Limitations and solutions of existing data analytics architecture.
    ·        Hadoop features and the Hadoop ecosystem.
    ·        Hadoop 2.x core components.
    ·        Hadoop storage (HDFS) and Hadoop processing (MapReduce).
    ·        Anatomy of file write and read, and rack awareness.

    2.    Hadoop Architecture and HDFS
    ·        Hadoop installation and configuration on your system.
    ·        Common Hadoop shell commands.
    ·        Hadoop configuration files.
    ·        Password-less SSH.
    ·        Hadoop copy commands.

    3.    Hadoop MapReduce Framework
    ·        MapReduce use cases.
    ·        Traditional way vs. the MapReduce way.
    ·        Hadoop 2.x MapReduce architecture and components.
    ·        Demo on MapReduce.
    ·        Input splits.
    ·        Relation between InputSplit and HDFS blocks.
    ·        MapReduce job submission flow.
    ·        MapReduce: Combiner and Partitioner.
    ·        Map-side joins.
    ·        Reduce-side joins.
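The mapper, combiner, partitioner, and reducer steps listed above can be sketched as a toy word count in plain Python. This is a single-process illustration of the data flow only, not the actual Hadoop Java API:

```python
from collections import defaultdict

def map_phase(line):
    # Mapper: emit a (word, 1) pair for every word in the input split.
    return [(word.lower(), 1) for word in line.split()]

def combine(pairs):
    # Combiner: pre-aggregate map output locally to cut shuffle traffic.
    local = defaultdict(int)
    for key, value in pairs:
        local[key] += value
    return list(local.items())

def partition(key, num_reducers):
    # Partitioner: decide which reducer receives this key.
    return hash(key) % num_reducers

def reduce_phase(key, values):
    # Reducer: final aggregation of all values seen for one key.
    return key, sum(values)

def run_job(lines, num_reducers=2):
    # Shuffle: group combiner output by reducer, then by key.
    shuffled = [defaultdict(list) for _ in range(num_reducers)]
    for line in lines:  # each line stands in for one input split
        for key, value in combine(map_phase(line)):
            shuffled[partition(key, num_reducers)][key].append(value)
    results = {}
    for bucket in shuffled:
        for key, values in bucket.items():
            k, total = reduce_phase(key, values)
            results[k] = total
    return results

print(run_job(["big data big", "data big deal"]))
# {'big': 3, 'data': 2, 'deal': 1} (key order may vary between runs)
```

Real Hadoop runs mappers and reducers as distributed tasks over HDFS blocks; the point here is only the shape of the flow, including how the combiner shrinks the data before the shuffle.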

    4.    Pig
    ·        Pig installation and configuration.
    ·        About Pig.
    ·        MapReduce vs. Pig.
    ·        Use cases.
    ·        Programming structure in Pig.
    ·        Pig running modes.
    ·        Pig components and Pig execution.
    ·        Pig Latin programs.
    ·        Data models and data types in Pig.
    ·        Relational operators and file loaders.
    ·        GROUP and COGROUP operators.
    ·        Joins and COGROUP in Pig.
    ·        Union.
    ·        Diagnostic operators.

    5.    Hive
    ·        Hive installation and configuration.
    ·        About Hive and use cases.
    ·        Hive vs. Pig.
    ·        Hive architecture and components.
    ·        Metastore in Hive and limitations of Hive.
    ·        Comparison with traditional databases.
    ·        Hive data types and data models.
    ·        Partitions and buckets.
    ·        Hive tables (managed tables and external tables).
    ·        Importing data.
    ·        Querying data.
    ·        Hive scripts.
    ·        HiveQL: joining tables.
    ·        Hive dynamic partitioning.
    ·        Bucketing.
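The partitioning and bucketing topics above come down to one idea: Hive assigns each row of a table declared with CLUSTERED BY ... INTO n BUCKETS to a bucket by hashing the bucketing column modulo the bucket count. A minimal Python sketch of that assignment rule, using a stand-in hash (Hive's real function is based on Java hashCode) and made-up column values:

```python
def bucket_for(key: str, num_buckets: int) -> int:
    # Deterministic stand-in hash so the example is reproducible;
    # not Hive's actual hashing scheme.
    return sum(ord(c) for c in key) % num_buckets

NUM_BUCKETS = 4
user_ids = ["alice", "bob", "carol", "dave", "erin"]  # hypothetical keys

buckets = {}
for uid in user_ids:
    buckets.setdefault(bucket_for(uid, NUM_BUCKETS), []).append(uid)

# Every row with the same key always lands in the same bucket, which is
# what makes bucket map joins and efficient table sampling possible.
```

Partitions, by contrast, split a table into separate HDFS directories by column *value* (one directory per date, region, etc.), while buckets split each partition into a fixed number of files by hash.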

    6.    HBase
    ·        Introduction to NoSQL databases and HBase.
    ·        HBase installation and configuration.
    ·        HBase vs. RDBMS.
    ·        HBase components and architecture.
    ·        HBase data models and shell.
    ·        Data loading techniques and filters in HBase.
    ·        Introduction to ZooKeeper and its data model.
    ·        ZooKeeper service.
    ·        Hands-on bulk data loading.
    ·        Introduction to Oozie.

    7.    Data Loading Techniques Using Sqoop
    ·        Load a table into MySQL.
    ·        Import from MySQL to HDFS.
    ·        Export a table from HDFS to MySQL.
    ·        Import part of a table from MySQL to HDFS.
    ·        Import selected data from a table to HDFS.
    ·        Import data from HDFS to Hive.
    ·        Move Cloudera data to Windows.
    ·        Move Windows data to Cloudera.
    ·        Move local system data to Cloudera.
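The Sqoop flows listed above all boil down to moving rows between a relational table and delimited files in HDFS. A toy Python stand-in of the "import part of a table" flow, using sqlite3 in place of MySQL and a local directory in place of the HDFS target directory; the table, column, and file names are all made up for illustration:

```python
import os
import sqlite3
import tempfile

# Stand-in source database (Sqoop would connect to MySQL over JDBC).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
db.executemany("INSERT INTO customers VALUES (?, ?)",
               [(1, "alice"), (2, "bob"), (3, "carol")])

target_dir = tempfile.mkdtemp()  # stand-in for the HDFS target directory

# "Import part of a table": a row filter, like Sqoop's --where option.
rows = db.execute("SELECT id, name FROM customers WHERE id > 1").fetchall()

# Write one part-file of comma-delimited records, the way Sqoop map
# tasks write part-m-NNNNN files into the target directory.
part = os.path.join(target_dir, "part-m-00000")
with open(part, "w") as f:
    for row in rows:
        f.write(",".join(map(str, row)) + "\n")

print(open(part).read())
# 2,bob
# 3,carol
```

The export direction is the mirror image: read delimited part-files and INSERT each record back into the relational table. Real Sqoop parallelizes this across map tasks, splitting the table on a key column.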