Channel: Tech Tutorials
Browsing all 892 articles

Speculative Execution in Hadoop

Speculative execution in Hadoop is an option to run a duplicate map or reduce task for the same input data on an alternative node. This is done so that a single slow-running task doesn’t slow down the whole...
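Whether those duplicate (speculative) tasks are launched at all is controlled per task type. A minimal mapred-site.xml sketch using the MRv2 property names (both default to true):

```xml
<!-- mapred-site.xml: toggle speculative execution per task type -->
<property>
  <name>mapreduce.map.speculative</name>
  <value>true</value>
</property>
<property>
  <name>mapreduce.reduce.speculative</name>
  <value>true</value>
</property>
```

Setting either value to false disables the duplicate-task behavior for that phase.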

MapReduce Flow in YARN

How a MapReduce job runs in YARN is different from how it used to run in MRv1. This post shows how MapReduce v2 runs internally in YARN Hadoop. Submitting the job: the first step is of course submitting the...

Data Compression in Hadoop

When we think about Hadoop, we think about very large files stored in HDFS and lots of data transfer among nodes in the Hadoop cluster, while storing HDFS blocks or while running MapReduce...

Compressing File in bzip2 Format in Hadoop - Java Program

This post shows how to compress an input file in bzip2 format in Hadoop. The Java program will read the input file from the local file system and copy it to HDFS in compressed bzip2 format. Input file is...

Data Locality in Hadoop

In this post we’ll see what data locality in Hadoop is and how it helps minimize network traffic and increase the overall throughput of the cluster. Data locality optimization: to understand...

Compressing File in snappy Format in Hadoop - Java Program

This post shows how to compress an input file in snappy format in Hadoop. The Java program will read the input file from the local file system and copy it to HDFS in compressed snappy format. Input file...

How to Compress Intermediate Map Output in Hadoop

In order to speed up a MapReduce job it is helpful to compress the intermediate map output in Hadoop, since output of the map phase is stored to disk, and mapper output is transferred to the reducers on...
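The two MRv2 properties that switch this on can be sketched as below; SnappyCodec is just one common codec choice for intermediate data:

```xml
<!-- enable compression of intermediate map output -->
<property>
  <name>mapreduce.map.output.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapreduce.map.output.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
```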

How to Compress MapReduce Job Output in Hadoop

You can choose to compress the output of a MapReduce job in Hadoop. You can configure it for all the jobs in a cluster, or you can set properties for specific jobs. Configuration parameters for...
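For the final job output the analogous MRv2 properties look like this (the codec shown is only an example):

```xml
<!-- compress the final MapReduce job output -->
<property>
  <name>mapreduce.output.fileoutputformat.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapreduce.output.fileoutputformat.compress.codec</name>
  <value>org.apache.hadoop.io.compress.GzipCodec</value>
</property>
```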

Input Splits in Hadoop

When a MapReduce job is run to process input data, one of the things the Hadoop framework does is divide the input data into smaller chunks; these chunks are referred to as input splits in Hadoop. For each...
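The split size can be nudged through job configuration; a sketch with the MRv2 property names (the byte values here are purely illustrative):

```xml
<!-- lower and upper bounds on input split size, in bytes -->
<property>
  <name>mapreduce.input.fileinputformat.split.minsize</name>
  <value>134217728</value>
</property>
<property>
  <name>mapreduce.input.fileinputformat.split.maxsize</name>
  <value>268435456</value>
</property>
```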

How to Configure And Use LZO Compression in Hadoop

In this post we’ll see how to configure and use LZO compression in Hadoop. Since LZO is GPL licensed it doesn't come bundled with the Hadoop installation, so you will have to install it separately. By...
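Once the LZO library is installed, registration typically means listing the codec classes in core-site.xml; a sketch using the class names shipped by the hadoop-lzo project:

```xml
<!-- core-site.xml: register the LZO codecs (hadoop-lzo must be installed) -->
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.DefaultCodec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec</value>
</property>
<property>
  <name>io.compression.codec.lzo.class</name>
  <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>
```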

Lazy Initializing Spring Beans

By default, in Spring framework, all the singleton beans are eagerly created and configured by ApplicationContext as part of the initialization process. Though this behavior of pre-instantiation is...
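In XML configuration this switch is the lazy-init attribute (the bean id and class below are made-up examples):

```xml
<!-- created only when first requested, not at context startup -->
<bean id="reportService" class="com.example.ReportService" lazy-init="true"/>
```

There is also a default-lazy-init attribute on the enclosing &lt;beans&gt; element to apply this file-wide.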

How to Inject Null And Empty String Values in Spring

This post shows how you can inject null or an empty String as a value for any property in the Spring framework. Injecting an empty string: if you are trying to inject an empty string as a value for a property in...
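The distinction in XML configuration looks like this (bean ids, class, and property name are illustrative):

```xml
<bean id="exampleBean" class="com.example.ExampleBean">
  <!-- injects the empty String "" -->
  <property name="email" value=""/>
</bean>

<bean id="otherBean" class="com.example.ExampleBean">
  <!-- injects a Java null -->
  <property name="email">
    <null/>
  </property>
</bean>
```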

Circular Dependency in Spring Framework

In the Spring framework, if you are predominantly using constructor injection, it is possible to create a circular dependency scenario. What is circular dependency: circular dependency in Spring happens...
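A minimal XML sketch of the scenario (classes invented for illustration): beanA needs beanB in its constructor and vice versa, so neither can be fully created first, and Spring reports a BeanCurrentlyInCreationException:

```xml
<bean id="beanA" class="com.example.ClassA">
  <constructor-arg ref="beanB"/>
</bean>
<bean id="beanB" class="com.example.ClassB">
  <constructor-arg ref="beanA"/>
</bean>
```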

BeanPostProcessor in Spring Framework

The BeanPostProcessor interface in the Spring framework defines callback methods that you can implement to perform some logic on the bean instances. If you want to implement some custom logic after the...

registerShutdownHook() Method in Spring Framework

In the post Spring Bean Life Cycle we have already seen that you can provide destroy methods for your beans to do clean up. One problem is that these destroy callback methods are not executed...
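registerShutdownHook() fixes that by registering a hook with the JVM, so the context is closed, and destroy callbacks run, on normal JVM exit. The underlying JDK mechanism can be sketched without Spring at all (class name and messages below are made up for illustration):

```java
public class ShutdownHookDemo {
    public static void main(String[] args) {
        // registerShutdownHook() boils down to Runtime.addShutdownHook():
        // the hook thread runs when the JVM exits normally, which is the
        // point where Spring would invoke the beans' destroy callbacks
        Runtime.getRuntime().addShutdownHook(new Thread(
                () -> System.out.println("destroy callbacks run here")));
        System.out.println("application running");
    } // main returns, the JVM begins shutdown, and the hook fires
}
```

Without such a hook (or an explicit context.close()), a non-web application would exit without the destroy callbacks ever running.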

@Required Annotation in Spring Framework

In your Spring bean you may have some fields for which you want to ensure that those fields have been populated. The Spring framework has a @Required annotation to ensure exactly that. @Required annotation in...

ApplicationContextAware And BeanNameAware Interfaces in Spring Framework

If a class implements org.springframework.context.ApplicationContextAware interface, then the object instance of that class is provided with a reference to the ApplicationContext that it runs...

Benefits, Disadvantages And Limitations of Autowiring in Spring

Though autowiring in Spring makes life easy by wiring dependencies automatically, thus requiring less code to write, it also has a few limitations and disadvantages, and that...

Excluding Bean From Autowiring in Spring

You can exclude a bean from autowiring on a per-bean basis in the Spring framework. If you are using Spring XML configuration, you can exclude a bean from autowiring by setting the autowire-candidate...
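In XML that attribute looks like this (bean id and class invented for the example); the bean can still be referenced explicitly, it is just never offered as a candidate when other beans autowire:

```xml
<!-- never considered when other beans are autowired by type -->
<bean id="utilityBean" class="com.example.UtilityBean" autowire-candidate="false"/>
```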

BeanFactoryAware Interface in Spring Framework

If a class implements the org.springframework.beans.factory.BeanFactoryAware interface, then that bean is provided with a reference to its owning BeanFactory. The BeanFactoryAware interface has a single...
