Big Data Testing Best Practices

Big Data Testing Strategy

As the world becomes increasingly interconnected and digitized, big data continues to grow as one of the most critical components in organizations both big and small. In fact, according to recent studies, enterprise data is set to grow by more than 600 percent over the next five years, and the vast majority of Fortune 500 companies are already using Big Data development as a key element of their competitive advantage.

At the same time, the management and development approach needed to understand and incorporate Big Data into a business is still being fleshed out. Processing vast amounts of data in a meaningful way requires an altogether new approach for many companies.

Why Big Data Testing Is So Important

Big Data is different from traditional data sets, and it requires a substantially distinct method of testing. As the amount of data grows and becomes more complex, big data testing becomes critical for making use of an otherwise overwhelming amount of data.

At the same time, testing big data requires an IT team that understands the ever-changing nuances and complexities in a way that is directly applicable to your organization. Successful big data testing results in dramatically improved efficiency and a better return on the investment in that data.

Key Objectives When Testing Big Data Applications

To fully understand why Big Data testing is so important, it is worth breaking down the key objectives. The following are five key objectives to consider when testing big data.

Accumulate and Consolidate Data

Data can come from a variety of sources, and having a strategy to accumulate and consolidate that data is an important first step for many companies, including those building data-driven products (see, for example, https://jatapp.com/services/mobile-app-development/). Sources such as blogs, social media, internal programs, systems, and databases need to be vetted and consolidated.
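
To make this concrete, here is a minimal Python/pandas sketch of the accumulation step, assuming the sources arrive as flat CSV and JSON exports. The file paths and the `SOURCES` mapping are hypothetical placeholders, not a prescribed layout.

```python
import pandas as pd

# Hypothetical source files; real pipelines would also pull from APIs, databases, etc.
SOURCES = {
    "blog_events": "exports/blog_events.csv",
    "social_mentions": "exports/social_mentions.csv",
    "crm_contacts": "exports/crm_contacts.json",
}

def load_source(name: str, path: str) -> pd.DataFrame:
    """Load one source and tag every row with its origin for later vetting."""
    df = pd.read_json(path) if path.endswith(".json") else pd.read_csv(path)
    df["source"] = name
    return df

# Consolidate all sources into a single frame with a shared "source" column.
frames = [load_source(name, path) for name, path in SOURCES.items()]
consolidated = pd.concat(frames, ignore_index=True, sort=False)
print(consolidated.groupby("source").size())  # quick per-source sanity check
```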

Verify the Architecture

Architecture testing is another essential component of big data testing. Because Big Data architecture contains so many moving parts, each object within the architecture must be verified as a legitimate and integral part of the system.

Eliminate Unnecessary Data

As the name suggests, Big Data consists of a very large amount of data. However, not all of that data is actually important to an organization in every case. That is why eliminating unnecessary data points is a critical step in building an effective big data application.

Examples of unnecessary data include duplicate or redundant data sets, corrupted or unreliable data, and data that does not directly relate to an organization's particular strategy or objectives.
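
A minimal pandas sketch of this pruning step is shown below. The column names (`event_time`, `amount`) and the corruption checks are illustrative assumptions; a real pipeline would swap in rules that match its own schema.

```python
import pandas as pd

def prune(df: pd.DataFrame, relevant_columns: list[str]) -> pd.DataFrame:
    """Drop duplicate, corrupted, and strategy-irrelevant data before further testing."""
    before = len(df)
    df = df.drop_duplicates()                    # redundant records
    df = df.dropna(subset=relevant_columns)      # rows missing critical fields
    # Hypothetical corruption checks: timestamps must parse, amounts must be non-negative.
    df = df[pd.to_datetime(df["event_time"], errors="coerce").notna()]
    df = df[df["amount"] >= 0]
    df = df[relevant_columns]                    # keep only strategy-relevant columns
    print(f"Removed {before - len(df)} of {before} rows as unnecessary or unreliable")
    return df
```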

Test the Historic and Current Performance of Data

Understanding how data has performed in the past, and how it is likely to perform moving forward, is another key element of effective big data testing. This includes verifying that a big data application can accumulate data from a data source, that the data can be processed effectively, and that the data is stored and cached efficiently within the program.
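
One simple way to track this over time is to wrap each pipeline stage in a timer and log its throughput. The sketch below assumes a pandas-based pipeline with a hypothetical `transactions.csv` extract; it is a measurement harness for comparing runs, not a benchmark of any particular system.

```python
import time
import pandas as pd

def measure_stage(label: str, fn, *args, **kwargs):
    """Time one pipeline stage and report how much data it handled."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    rows = len(result) if hasattr(result, "__len__") else 0
    print(f"{label}: {elapsed:.2f}s ({rows:,} rows)")
    return result

# Hypothetical stages: accumulate raw data, then process it into daily aggregates.
raw = measure_stage("accumulate", pd.read_csv, "exports/transactions.csv")
daily = measure_stage("process", lambda df: df.groupby("day").sum(numeric_only=True), raw)
# Comparing these timings run over run shows how performance trends as the data grows.
```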

Test Data Transformation

Often, before data reaches the target system, multiple transformations, aggregations, and calculations are applied. During testing it is critical to verify the source-to-target transformation: take raw data from a source file, perform all of the necessary aggregations and calculations manually, and confirm that the manually derived results match what the system produces when transforming from source to target. If an issue is found, it is equally important to isolate it to the specific module where the defect occurs.
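
This comparison can be automated: rebuild the expected output from the raw source and assert that it matches the system's target output. The sketch below assumes a pandas workflow with hypothetical `orders_source.csv` and `orders_target.csv` extracts and an aggregation keyed on `customer_id`; the actual transformation logic would come from the pipeline's own specification.

```python
import pandas as pd

# Hypothetical extracts: the raw source data and the system's transformed output.
source = pd.read_csv("extracts/orders_source.csv")
target = pd.read_csv("extracts/orders_target.csv")

# Re-derive the expected result independently: the same aggregation the pipeline claims to do.
expected = (
    source.groupby("customer_id", as_index=False)
          .agg(total_amount=("amount", "sum"), order_count=("order_id", "count"))
          .sort_values("customer_id")
          .reset_index(drop=True)
)

actual = target.sort_values("customer_id").reset_index(drop=True)

# Fails loudly with the mismatching rows, which helps isolate the faulty module.
pd.testing.assert_frame_equal(expected, actual[expected.columns], check_dtype=False)
print("Source-to-target transformation verified")
```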

Big Data Testing Best Practices

To effectively achieve the key objectives described above, as well as any other objectives an organization requires, there are several best practices, all performed routinely by SQA Solution, that ensure the most reliable and efficient results.

Create a Special Test Environment for Big Data Testing

When dealing with a wide range of distinct components, having a dedicated test environment can help prevent core data from being corrupted by the testing process.
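
One lightweight way to enforce that isolation is to make the environment explicit in configuration, so test runs can never point at production stores by accident. The endpoints and the `BIGDATA_ENV` variable below are hypothetical; this is only a sketch of the idea.

```python
import os

# Hypothetical connection settings; the point is that tests default to the isolated sandbox.
ENVIRONMENTS = {
    "production": {"hdfs_namenode": "hdfs://prod-nn:8020", "db_schema": "core"},
    "test":       {"hdfs_namenode": "hdfs://test-nn:8020", "db_schema": "qa_sandbox"},
}

def get_config() -> dict:
    """Resolve the active environment; fall back to the test sandbox, never production."""
    env = os.environ.get("BIGDATA_ENV", "test")
    if env not in ENVIRONMENTS:
        raise ValueError(f"Unknown environment: {env}")
    return ENVIRONMENTS[env]

config = get_config()
print(f"Running against {config['hdfs_namenode']} / schema {config['db_schema']}")
```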

Create a Specialized Validation Tool

With older databases, it was sometimes feasible to use a “one size fits all” solution for data testing. However, big data involves so many variables that creating a specialized validation tool is essential.
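
Such a tool usually boils down to a set of data-set-specific rules applied uniformly. The sketch below shows one possible shape for it in Python; the rules, column names, and allowed values are illustrative assumptions rather than a fixed rule set.

```python
from dataclasses import dataclass
from typing import Callable
import pandas as pd

@dataclass
class Rule:
    name: str
    check: Callable[[pd.DataFrame], pd.Series]  # returns a boolean mask of valid rows

# Hypothetical rules tailored to one data set; a real tool would load these per source.
RULES = [
    Rule("non_negative_amount", lambda df: df["amount"] >= 0),
    Rule("known_country", lambda df: df["country"].isin(["US", "CA", "MX"])),
    Rule("id_present", lambda df: df["customer_id"].notna()),
]

def validate(df: pd.DataFrame, rules: list[Rule]) -> dict[str, int]:
    """Apply every rule and report how many rows violate each one."""
    return {rule.name: int((~rule.check(df)).sum()) for rule in rules}

# Example: violations = validate(pd.read_csv("extracts/orders_target.csv"), RULES)
```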

Main Components Involved in Big Data Testing

When testing big data systems and applications, there are a few primary components that must be a part of any comprehensive testing process.

Upload and Verify Data

The first component of any comprehensive test is uploading data from all of the various sources into HDFS (the Hadoop Distributed File System) and verifying it on arrival. That data is then vetted for corruption and partitioned into separate data units.
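
Below is a rough sketch of that upload-and-verify step, assuming the standard `hdfs dfs` command-line client is available and the test fixtures are small enough to read back whole; for production-sized files, a cluster-side checksum comparison would replace the read-back.

```python
import subprocess
import hashlib
from pathlib import Path

def md5(path: Path) -> str:
    """Checksum a local file so the copy in HDFS can be compared against it."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def upload_and_verify(local: Path, hdfs_dir: str) -> None:
    """Upload one source file to HDFS and verify the round trip byte-for-byte."""
    subprocess.run(["hdfs", "dfs", "-put", "-f", str(local), hdfs_dir], check=True)
    # Reading the copy back only suits small test files; it keeps the sketch simple.
    copied = subprocess.run(
        ["hdfs", "dfs", "-cat", f"{hdfs_dir}/{local.name}"],
        check=True, capture_output=True,
    ).stdout
    assert hashlib.md5(copied).hexdigest() == md5(local), f"Corruption detected in {local.name}"

# Hypothetical landing directory; partitioning into data units happens in a later step.
for source_file in Path("exports").glob("*.csv"):
    upload_and_verify(source_file, "/data/landing")
```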

Condense and Consolidate Data

Once the data is uploaded and partitioned, the next key component is to consolidate the data and eliminate redundancies. This ensures that the data is processed efficiently and accurately.

Output Data and Generate Reports

Once all of the data has been vetted and verified, it can be uploaded into a downstream system, which in turn can be used for the generation of reports and other key insights.
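
As a simple illustration, the sketch below rolls the vetted data up into a daily summary that a downstream reporting system could consume. The parquet input, column names, and output path are assumptions (reading parquet also requires pyarrow or fastparquet to be installed).

```python
import pandas as pd

# Hypothetical input: the vetted, consolidated data produced by the earlier steps.
vetted = pd.read_parquet("warehouse/vetted_orders.parquet")

# Downstream report: daily totals and record counts per source system.
report = (
    vetted.groupby(["day", "source"], as_index=False)
          .agg(total_amount=("amount", "sum"), records=("order_id", "count"))
)
report.to_csv("reports/daily_summary.csv", index=False)
print(report.head())
```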

The Importance of Expert-Level Execution

With a growing number of companies using Big Data as one of their core competitive advantages, anything less than expert-level execution can be the difference between success and failure. The team at SQA Solution goes through extensive big data testing training to ensure that they are fully equipped with all of the best practices needed to properly test big data applications. This level of training is particularly important because both the volume of records and their complexity grow with each passing year. For additional information about big data testing, or for any other questions, please contact us at CONTACT INFORMATION.
