Niraj Singhal

The World Wide Web is a huge source of hyperlinked information contained in hypertext documents. Search engines use web crawlers to collect these documents from the web for storage and indexing. However, many of these documents contain dynamic information that changes on a daily, weekly, monthly, or yearly basis, and hence we need to refresh…
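As a rough illustration of how a refresh interval might be derived from a page's observed change history, here is a minimal sketch; the function names and the simple rate estimate are assumptions for illustration, not the paper's actual method:

    from datetime import timedelta

    def estimate_change_rate(change_timestamps):
        """Estimate changes per day from a sorted list of observed change times."""
        if len(change_timestamps) < 2:
            return 0.0
        days = (change_timestamps[-1] - change_timestamps[0]).total_seconds() / 86400
        return (len(change_timestamps) - 1) / days if days > 0 else 0.0

    def refresh_interval(change_timestamps, min_days=1.0, max_days=365.0):
        """Revisit frequently changing pages often and stable pages rarely."""
        rate = estimate_change_rate(change_timestamps)
        if rate == 0.0:
            return timedelta(days=max_days)   # no observed changes: back off
        return timedelta(days=max(min_days, min(max_days, 1.0 / rate)))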
Load balancing in a grid-based distributed computing environment increases the availability and scalability of the entire system. Dynamic load balancing has the potential to perform better than static load balancing, but it is inevitably more complex. The overhead involved is greater, but its benefits cannot be negated. Load balancing strategies try to…
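A minimal sketch of the dynamic idea, assuming a simple "send each job to the currently least-loaded node" policy; this is one of many possible strategies, not necessarily the one studied here:

    import heapq

    class LeastLoadedBalancer:
        """Dynamic load balancing: assign each job to the least-loaded node."""
        def __init__(self, node_names):
            # min-heap of (current_load, node_name)
            self.heap = [(0.0, name) for name in node_names]
            heapq.heapify(self.heap)

        def assign(self, job_cost):
            load, name = heapq.heappop(self.heap)
            heapq.heappush(self.heap, (load + job_cost, name))
            return name

    balancer = LeastLoadedBalancer(["node-a", "node-b", "node-c"])
    for cost in [3.0, 1.0, 2.0, 5.0]:
        print(cost, "->", balancer.assign(cost))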
Due to the lack of efficient refresh techniques, current crawlers add unnecessary traffic to the already overloaded Internet. The frequency of visits to sites can be optimized by calculating refresh times dynamically. This improves the effectiveness of the crawling system by efficiently managing the revisiting frequency of a website, and appropriate…
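One common way to manage revisit frequency dynamically is to shorten the interval when a change is detected and lengthen it otherwise; a sketch under that assumption, where the factors 0.5 and 1.5 and the bounds are illustrative, not taken from the paper:

    def next_interval(current_days, page_changed,
                      min_days=0.5, max_days=180.0):
        """Adapt the revisit interval to observed page behaviour."""
        if page_changed:
            current_days *= 0.5   # page is volatile: visit sooner
        else:
            current_days *= 1.5   # page is stable: back off
        return max(min_days, min(max_days, current_days))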
As the size of the web continues to grow, searching it for useful information has become increasingly difficult. Studies also report that a significant share of current Internet traffic and bandwidth consumption is due to web crawlers retrieving pages for indexing by the different search engines. Moreover, due to the dynamic nature of the web, it becomes very…
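One standard way a crawler can avoid re-downloading unchanged pages, and thus reduce the bandwidth cost described above, is the HTTP conditional request; a sketch using the requests library, where the caching of ETag and Last-Modified values is assumed to happen elsewhere:

    import requests

    def fetch_if_changed(url, etag=None, last_modified=None):
        """Conditional GET: unchanged pages cost almost no bandwidth."""
        headers = {}
        if etag:
            headers["If-None-Match"] = etag
        if last_modified:
            headers["If-Modified-Since"] = last_modified
        resp = requests.get(url, headers=headers, timeout=10)
        if resp.status_code == 304:   # not modified: body is not resent
            return None
        return resp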
In mobile ad-hoc networks, congestion occurs because resources are limited. The standard TCP congestion control mechanism cannot handle the special properties of a shared wireless channel. TCP congestion control works very well on the Internet, but mobile ad-hoc networks exhibit some unique properties that greatly affect the design of appropriate protocols…
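The standard TCP mechanism referred to here is additive-increase/multiplicative-decrease of a congestion window; a toy sketch of that core update, simplified to one step per round trip and not modelling the ad-hoc variants the paper concerns:

    def aimd_step(cwnd, loss_detected, ssthresh):
        """One AIMD update: grow the window, halve it on loss."""
        if loss_detected:
            ssthresh = max(cwnd / 2.0, 2.0)
            return ssthresh, ssthresh        # multiplicative decrease
        if cwnd < ssthresh:
            return cwnd * 2.0, ssthresh      # slow start: exponential growth
        return cwnd + 1.0, ssthresh          # congestion avoidance: additive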
Mobile ad-hoc networks have limited bandwidth and are more prone to errors than wired networks, which further limits the amount of data that can be sent. To conserve the limited resources, it is highly desirable that transmission be as efficient as possible, with minimal loss. The objective of congestion control is to limit the…
With the tremendous growth of the Internet, the World Wide Web has become a huge source of hyperlinked information contained in hypertext documents. Search engines use web crawlers to collect these documents from the web for storage and indexing. An incremental crawler revisits the web to update its collection. There is a need to regulate the…
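An incremental crawler typically keeps its frontier ordered by when each page is next due for a revisit; a minimal sketch of that bookkeeping, where the scheduling policy is an assumption rather than the paper's method:

    import heapq
    import time

    class IncrementalFrontier:
        """Frontier ordered by each URL's next due revisit time."""
        def __init__(self):
            self.heap = []

        def schedule(self, url, delay_seconds):
            heapq.heappush(self.heap, (time.time() + delay_seconds, url))

        def pop_due(self):
            """Return the next URL whose revisit time has arrived, else None."""
            if self.heap and self.heap[0][0] <= time.time():
                return heapq.heappop(self.heap)[1]
            return None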
The web can be viewed as the largest database available, and it presents a challenging task for effective design and access. With the tremendous growth of the Web, the main objective is to provide relevant information to users to fulfil their needs. Data Mining applied to the Web has the potential to be quite beneficial. Web Mining is the mining of data related to…
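As a tiny example of Web usage mining, page popularity can be counted directly from server access logs; a sketch assuming common-log-format lines, where the quoted field holds the request:

    from collections import Counter

    def popular_pages(log_lines, top_n=5):
        """Web usage mining in miniature: rank pages by request count."""
        counts = Counter()
        for line in log_lines:
            parts = line.split('"')
            if len(parts) > 1:                 # the quoted request field
                request = parts[1].split()
                if len(request) >= 2:
                    counts[request[1]] += 1    # the requested path
        return counts.most_common(top_n)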
A biometric is a unique, measurable physiological or behavioural characteristic of a person and finds extensive application in authentication and authorization. Fingerprints, palm prints, iris, and voice are some of the most widely used biometrics for personal identification. To reduce error rates and enhance the usability of biometric systems, multimodal…
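A common multimodal approach is score-level fusion, combining the normalized scores of several matchers; a minimal sketch in which the weights and decision threshold are illustrative assumptions:

    def fuse_scores(scores, weights):
        """Score-level fusion: weighted mean of normalized matcher scores."""
        assert len(scores) == len(weights)
        return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

    # e.g. fingerprint, iris, and voice matchers, each scoring in [0, 1]
    fused = fuse_scores([0.91, 0.78, 0.60], weights=[0.5, 0.3, 0.2])
    accepted = fused >= 0.75   # the threshold here is an assumption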
This paper presents a proposed expert system to solve the problems of rice growers from seed to seed, i.e., from the selection and sowing of varieties until the harvesting of the crop. An expert system is a collection of computer programs, developed for a particular domain, that are capable of offering solutions or advice related to specific problems. An expert system is a…
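Expert systems of this kind are usually rule-based; a toy forward-chaining sketch, where the rice-growing rules shown are invented placeholders, not the system's actual knowledge base:

    # Each rule: (set of required facts, conclusion)
    RULES = [
        ({"soil:clay-loam", "season:kharif"}, "variety:recommend-long-duration"),
        ({"leaf:yellowing", "growth:stunted"}, "diagnosis:nitrogen-deficiency"),
        ({"diagnosis:nitrogen-deficiency"}, "advice:apply-urea-in-split-doses"),
    ]

    def infer(facts):
        """Forward chaining: fire rules until no new fact is derived."""
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for conditions, conclusion in RULES:
                if conditions <= facts and conclusion not in facts:
                    facts.add(conclusion)
                    changed = True
        return facts

    print(infer({"leaf:yellowing", "growth:stunted"}))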