26-02-2010, 04:20 PM
ONLINE IMMIGRATION CONSULTANCY
MINI PROJECT REPORT
AKHIL K SREEJA C A SOUMI C G SHALIN BABY PAUL
1.1 ABOUT THE PROJECT
Online Immigration Consultancy is a firm specializing in the design, development and integration of custom-built business systems and solutions using advanced internet, electronic commerce and client-server technology. The system is developed for people who are interested in going abroad. Currently there are many immigration consultancies all over the country, working as intermediaries between the candidates and the foreign embassies. There are many steps required for a person to migrate from one country to another, and currently every consultancy performs these steps manually.
The main modules of the system are:
• User module
• Admin module
User module: Users can register their details online using the system. The points they acquire determine their eligibility to go abroad or to register. These points are generated by the system from the details the users enter at the time of registration. Users can also update the details they have entered and view the status of their applications. The users of the system belong to three categories:
• Business Class (people interested in doing business abroad)
• Skilled Class (people interested in going abroad for a job)
• Educational Class (people interested in studying abroad)
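The points-based eligibility check described above can be sketched as follows. The report does not specify the actual weights or pass mark used by the system, so the categories, weights and the threshold below are purely illustrative assumptions:

```python
# Illustrative sketch of a points-based eligibility check.
# All weights and the PASS_MARK threshold are hypothetical; the report
# does not state the real scoring rules.

POINT_WEIGHTS = {
    "education": {"phd": 25, "masters": 23, "bachelors": 21, "diploma": 15},
    "experience_per_year": 3,
    "language": {"fluent": 24, "moderate": 16, "basic": 8},
}
EXPERIENCE_CAP = 21   # experience points are capped
PASS_MARK = 67        # hypothetical eligibility threshold

def calculate_points(education, years_experience, language_level):
    """Total the points generated from the details entered at registration."""
    points = POINT_WEIGHTS["education"].get(education, 0)
    points += min(years_experience * POINT_WEIGHTS["experience_per_year"],
                  EXPERIENCE_CAP)
    points += POINT_WEIGHTS["language"].get(language_level, 0)
    return points

def is_eligible(education, years_experience, language_level):
    """An applicant is eligible when the acquired points reach the pass mark."""
    return calculate_points(education, years_experience, language_level) >= PASS_MARK
```

For example, an applicant with a master's degree, eight years of experience and fluent language skills would score 23 + 21 + 24 = 68 points under these assumed weights.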
Administrator Module: The administrator of the system can create packages and payment types, modify the status of users, and cancel the registration of users if necessary. The administrator can also enter career details from various companies and forward resumes to companies.
CHAPTER 2 PROBLEM DEFINITION
Currently there are many immigration consultancies all over the country. The consultancies work as intermediaries between the candidates and the foreign embassies. There are many steps required for a person to migrate from one country to another. The following are some of the steps involved in a typical consultancy:
• Candidate Registration
• Selecting an Immigration Package
• Registration Payment
• Submission of all certificates
• Points Calculation
• Sending the details to the Head Office
• Certificate Scrutiny
• Sending FSN (File Serial Number)
• Visa Processing Payment
• Sending the application and certificates to the Embassy
• Certificate Scrutiny
• Sending Reference Number
• ERC Course
• Submission of Medical Certificates
• Sending Visa to the candidates
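The steps above form an ordered pipeline, so a candidate's application status can be modelled as a position in that sequence. The following sketch is illustrative; the step names follow the list in the report, while the transition logic is an assumption:

```python
# Sketch of the consultancy workflow as an ordered status pipeline.
# Step names are taken from the report; the Application class is illustrative.

WORKFLOW = [
    "Candidate Registration",
    "Selecting an Immigration Package",
    "Registration Payment",
    "Submission of all certificates",
    "Points Calculation",
    "Sending the details to the Head Office",
    "Certificate Scrutiny (Head Office)",
    "Sending FSN (File Serial Number)",
    "Visa Processing Payment",
    "Sending the application and certificates to the Embassy",
    "Certificate Scrutiny (Embassy)",
    "Sending Reference Number",
    "ERC Course",
    "Submission of Medical Certificates",
    "Sending Visa to the candidate",
]

class Application:
    """Tracks how far a candidate has progressed through the pipeline."""

    def __init__(self):
        self.step = 0

    @property
    def status(self):
        return WORKFLOW[self.step]

    def advance(self):
        """Move to the next step, stopping at the final one."""
        if self.step < len(WORKFLOW) - 1:
            self.step += 1
        return self.status
```

Keeping the status as an index into one list is what later lets the proposed system answer "what is the current status of my application?" with a single lookup.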
Currently every consultancy performs these steps manually. They need a lot of paperwork, which consumes time, human effort and money. Searching is also difficult when records are processed manually. Recovery of data lost through accidental damage to stored papers is not possible in the present system, and taking hard-copy backups consumes extra time and money. It is also difficult for users to know the status of their immigration process.
2.1 EXISTING SYSTEM
The existing system is subjected to close study and the problem areas are identified. The designer now functions as a problem solver and tries to sort out the difficulties that the enterprise faces. The solutions are given as proposals. Each proposal is weighed against the existing system analytically, the best one is selected, and it is presented to the user for endorsement. The current system does all the steps manually. For candidate registration, candidates must come to the office and register themselves with the consultancy, which takes a fair amount of time. After registering their details, the candidates need to submit all their certificates, followed by the initial payment for the immigration process. The office then prepares an application for each candidate and sends it, along with the certificates, to the head office, where the certificates are scrutinized. The candidate is then given an FSN (File Serial Number). All these steps require a large amount of time. If a candidate wants to view the current status of his application, he needs to come to the office and check. If a candidate wants to cancel the immigration process, he must come to the office to complete the formalities. These are some of the disadvantages of the current system.
2.2 PROPOSED SYSTEM
The proposed system overcomes all the above-mentioned problems. All the paperwork needed in the existing system can be avoided. Searching and ranking of applicants are made easy as they are done automatically, and automatic backup and recovery are added features. Through online registration, the applicant's registration is made easier. Procedures that previously required many manual steps can be completed very easily with the proposed system. Candidates can register their details with a particular consultancy from anywhere using this system; since the system is a website, it is accessible to any user from anywhere. Candidates are given a unique ID and password at the time of registration, so the system provides high security for the data of each and every candidate. In addition, a candidate can check the current status of his application, send questions to the authority, and cancel the immigration process using this system.
The system study phase involves the initial investigation of the structure of the system currently in use, with the objective of identifying the problems and difficulties with the existing system. The major steps involved in this phase included defining the user requirements and studying the present system to verify the problem. The performance expected of the new system was also defined in this phase in order to meet the user requirements. The information gathered from various documents was analyzed and evaluated, and the findings were reviewed in order to establish specific system objectives.
3.1 SYSTEM ANALYSIS
System analysis is the way of studying a system with an eye on solving its problems using a computer. It is the most essential part of the development of a project. System analysis consists of system elements, processes and technology.
To analyze a system, the analyst has to study it in detail. The analyst has to understand the functioning and concept of the system before designing an appropriate computer-based system that will meet all the requirements of the existing system. The system analyst has to carry out a customary approach to using the computer for problem solving.
System analysis includes the following basic concepts
• Preliminary investigation
• Requirements specification
• Feasibility study
• Detailed investigation
• Drawing up of strategies
• Design and coding
• Testing and training
The above steps constitute the logical framework for system analysis. After the preliminary investigation and feasibility study, the scope of the project and the comparable items are set forth, and the detailed investigation is executed. This allows the system analyst to comprehend the full scope of the project. The system analysis concludes soon after the implementation of the newly developed system, followed by the training of the users.
A request to receive assistance from an information system can be made for many reasons, but in most cases a manager, employee or system specialist initiates the request. When the request is made, the first system activity, preliminary investigation, begins. This activity has three parts:
> Request clarification: the request from an employee may not be well stated or well defined. Therefore, before any system investigation can be considered, the project request must be examined to determine precisely the actual requirements of the organization.
> Feasibility study: the basic idea of the feasibility study is to determine whether the requested project is feasible.
> Request approval: not all requested projects are desirable or feasible. Some organizations receive so many project requests from employees that only a few of them can be pursued. However, those projects that are feasible and desirable should be put into a schedule, and the management decides which requests are most important. After a project request is approved, its cost, priority, completion time and the personnel required are estimated. Once the request is approved, the collection of data and the determination of requirements can be started.
CHAPTER 5 REQUIREMENT SPECIFICATION
The primary goal of the system analyst is to improve the efficiency of the existing system. For that, the study and specification of the requirements is essential. For the development of the new system, a preliminary survey of the existing system is conducted, and an investigation is done to determine whether upgrading the system into an application program could solve the problems and eradicate the inefficiency of the existing system.
5.1 FEASIBILITY STUDY
The initial investigation points to the question of whether the project is feasible. A feasibility study is conducted to identify the best system that meets all the requirements. This includes an identification description, an evaluation of the proposed systems and selection of the best system for the job.
The requirements of the system are specified with a set of constraints such as system objectives and the description of the outputs. It is then duty of the analyst to evaluate the feasibility of the proposed system to generate the above results. Three key factors are to be considered during the feasibility study.
5.1.1 Operation Feasibility
An estimate should be made of how much effort and care will go into developing the system, including the training to be given to the users. Usually, people are reluctant to accept changes that affect their routine. Computerization will certainly affect turnover, transfer and employee job status. Hence an additional effort is to be made to train and educate the users on the new way of working with the system.
5.1.2 Technical Feasibility
The main consideration is given to the study of the available resources of the organization where the software is to be implemented. Here the system analyst evaluates the technical merits of the system, giving emphasis to performance, reliability and maintainability.
Before developing the proposed system, the resource availability of the organization was studied. The organization has immense computer facilities equipped with sophisticated machines and software; hence the system is technically feasible.
5.1.3 Economic Feasibility
Economic feasibility is the most important and frequently used method for evaluating the effectiveness of the proposed system. It is essential because the main goal of the proposed system is to achieve economically better results along with increased efficiency. Cost-benefit analysis is usually performed for this purpose: a comparative study of the cost versus the benefits and savings expected from the proposed system. Since the organization is well equipped with the required hardware, the project was found to be economically feasible.
5.2 HARDWARE REQUIREMENTS
PROCESSOR : PENTIUM II
CLOCK SPEED : 800 MHZ
SYSTEM BUS : 32 BIT
RAM : 128 MB
HDD : 5GB
MONITOR : SVGA COLOR
KEY BOARD : 108 KEYS
MODEM : 56 KBPS
MOUSE : SERIAL
FDD : 1.44 MB
5.3 SOFTWARE REQUIREMENTS
OPERATING SYSTEM : WINDOWS XP
BROWSER : INTERNET EXPLORER 5.5 OR ANY HTTP BROWSER
DATABASE LAYER : MS SQL SERVER 2000
SERVER SIDE SCRIPTING : ASP.NET
CLIENT SIDE SCRIPTING : HTML
PROTOCOL : TCP / IP
5.4 TECHNOLOGY SPECIFICATION
o Client-Server Architecture
Typical client-server systems are based on the 2-tiered architecture, whereby there is a clear separation between the data and the presentation/business logic. These are generally data driven, with the application existing entirely on the client machine while the database server is deployed somewhere in the organization.
o 2-Tier Architecture
In a traditional 2-tier application, the processing load is given to the client PC while the server simply acts as a traffic controller between the application and the data. As a result, not only does application performance suffer due to the limited resources of the PC, but network traffic tends to increase as well.
o 3- Tier Architecture
In 3-tier architecture an application is broken into three separate logical layers, each with a well-defined set of interfaces. The first tier, the presentation layer, typically consists of a graphical user interface of some kind. The middle tier, or business layer, consists of the application or business logic, and the third tier, the data layer, contains the data needed by the application. The middle tier is basically the code that the user calls upon to retrieve the desired data; the presentation layer then receives the data and formats it for display. This separation of application logic from the user interface adds enormous flexibility to the design of the application.
o n- Tier Architecture
In an n - tier architecture the application logic is divided by function rather than physically. N - Tier architecture then breaks down like this:
> A user interface that handles the user's interaction with the application; this can be a web browser running through a firewall, a heavier desktop application or even a wireless device.
> Presentation logic that defines what the user interface displays and how a user's requests are handled. Depending on which user interfaces are supported, we may need slightly different versions of the presentation logic to handle each client appropriately.
> Business logic that models the application's business rules, often through the interaction with the application's data.
> Interface services that provide additional functionality required by the application components, such as messaging, transactional support etc.
> The Data layer where the enterprise's data resides.
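The layering described above can be sketched in miniature. The report's system is built with ASP.NET and SQL Server; plain Python and an in-memory store are used here only to illustrate the separation of presentation, business and data layers, and all class and method names are hypothetical:

```python
# Minimal 3-tier sketch: each layer talks only to the layer below it
# through a small, well-defined interface. Names are illustrative.

# --- Data layer: owns storage and exposes simple load/save calls
class UserStore:
    def __init__(self):
        self._users = {}  # in-memory stand-in for the database server

    def save(self, user_id, details):
        self._users[user_id] = details

    def load(self, user_id):
        return self._users.get(user_id)

# --- Business layer: the rules live here, not in the UI or the database
class RegistrationService:
    CATEGORIES = {"business", "skilled", "educational"}

    def __init__(self, store):
        self._store = store

    def register(self, user_id, category, details):
        # Business rule: only the three user categories are allowed
        if category not in self.CATEGORIES:
            raise ValueError("unknown category: " + category)
        self._store.save(user_id, {"category": category, **details})
        return user_id

    def profile(self, user_id):
        return self._store.load(user_id)

# --- Presentation layer: formats data for display, holds no business rules
def render_profile(service, user_id):
    details = service.profile(user_id)
    return "not found" if details is None else f"{user_id}: {details['category']}"
```

Because the presentation function only calls the service, the data layer could be swapped for a real database without touching the user interface, which is exactly the flexibility the tiered design aims for.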
CHAPTER 6 SYSTEM DESIGN
System design is the solution to the creation of a new system. This phase is composed of several steps and focuses on the detailed implementation of the feasible system, with emphasis on translating design specifications into performance specifications. System design has two phases of development: logical and physical design.
During the logical design phase the analyst describes inputs (sources), outputs (destinations), databases (data stores) and procedures (data flows), all in a format that meets the user's requirements. The analyst also specifies the user needs at a level that virtually determines the information flow into and out of the system and the data resources. Here the logical design is done through data flow diagrams and database design.
The logical design is followed by physical design, or coding. Physical design produces the working system by defining the design specifications, which tell the programmers exactly what the candidate system must do. The programmers write the necessary programs that accept input from the user, perform the necessary processing on the accepted data, and produce the required report on hard copy or display it on the screen.
6.1 LOGICAL DESIGN
The logical design of an information system shows the major features and how they relate to one another. The first step of system design is to design the logical design elements. This is the most creative, challenging and important phase. The design of the proposed system details how the system will meet the requirements identified during system analysis; that is, in the design phase we have to find how to solve the difficulties faced by the existing system. The logical design of the proposed system should include the details of how the solutions can be implemented. It also specifies how the database is to be built for storing and retrieving data, what kinds of reports are to be created and what inputs are to be given to the system. The logical design includes input design, output design, database design and physical design.
6.2 INPUT DESIGN
The input design is the link between the information system and the user. It comprises developing the specifications and procedures for data preparation and the steps necessary to put transaction data into a usable form for processing. The activity of putting data into the computer for processing can be achieved either by instructing the computer to read data from a written or printed document or by having people key the data directly into the system.
The system needs data regarding the asset items, depreciation rates, asset transfers and physical verification for various validations, checks, calculations and report generation. An error-raising method is also included in the software, which raises an error message when a wrong input entry is made. So in input design the following things are considered:
• What data should be given as input
• How the data should be arranged or coded
• The dialogue to guide the operating personnel in providing input
• Methods for preparing input validations and steps to follow when errors occur

The samples of screen layout are given in the appendix.
6.3 OUTPUT DESIGN
Computer output is the most important and direct information source for the user. Output design is the process of designing the necessary outputs, in the form of reports, that should be given to the users according to their requirements. Efficient, intelligible output design improves the system's relationship with the user and helps in decision making. Since the reports are directly referred to by the management for taking decisions and drawing conclusions, they must be designed with utmost care, and the details in the reports must be simple, descriptive and clear to the user.
• Determine what information to present
• Arrange the presentation of information in an acceptable format
• Decide how to distribute the output to the intended recipients
Depending on the nature and future use of output required, they can be displayed on the monitor for immediate need and for obtaining the hardcopy. The options for the output reports are given in the appendix.
6.4 PHYSICAL DESIGN
The process of developing the program software is referred to as physical design. We have to design the process by identifying reports and the other outputs the system will produce. Coding the program for each module with its logic is performed in this step. Proper software specification is also done in this step.
6.5 MODULAR DESIGN
A software system is always divided into several subsystems, which makes development and testing easier. The different subsystems are known as modules, and the process of dividing an entire system into subsystems is known as modularization or decomposition.
A system cannot be decomposed into subsystems in an arbitrary way. There must be some logical barrier that facilitates the separation of each module. The separation must be simple yet effective, so that development is not affected.
The system under consideration has been divided into several modules, taking into consideration the above-mentioned criteria. The different modules are:
1) Login Module
2) Registration Module.
3) Administration Module
4) User Module
The overall objective in the development of database technology has been to treat data as an organizational resource and as an integrated whole. A DBMS allows data to be protected and organized separately from other resources. A database is an integrated collection of data. The most significant form of data, as seen by the programmers, is data as stored on direct-access storage devices. This is the difference between logical and physical data.
Database files are the key source of information for the system. Database design is the process of designing these files, which should be properly planned for the collection, accumulation, editing and retrieval of the required information.
The organization of data in database aims to achieve three major objectives: -
• Data integration
• Data integrity
• Data independence
The proposed system stores the information relevant for processing in an MS SQL Server database. This database contains tables, where each table corresponds to one particular type of information. Each piece of information in a table is called a field or column. A table also contains records, each of which is a set of fields; all records in a table have the same set of fields with different information. Primary key fields uniquely identify a record in a table, and fields that contain the primary key of another table are called foreign keys.
Normalization is a technique of separating redundant fields and breaking up a large table into smaller ones. It is also used to avoid insertion, deletion and update anomalies. All the tables have been normalized up to the third normal form. In short, the rules for each of the three normal forms are as follows.
• First normal form
A relation is said to be in 1NF if all the underlying domains of its attributes contain simple, atomic values.
• Second normal form
2NF is based on the concept of full functional dependency. A relation is said to be in 2NF if and only if it is in 1NF and every non-key attribute is fully functionally dependent on a candidate key of the table.
• Third normal form
3NF is based on the concept of transitive dependency. A relation in 2NF is said to be in 3NF if every non-key attribute is non-transitively dependent on the primary key.
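The effect of normalizing to 3NF can be illustrated with a small sketch. The field names below are hypothetical (the report's actual schemas are in Chapter 12); the point is that a value which depends only on a non-key attribute, such as a country name depending on a country id, is moved into its own table and referenced by a foreign key:

```python
# Illustrative normalization sketch: a denormalized registration record
# is split into two 3NF relations. Field names are hypothetical.

denormalized = [
    # (candidate_id, name, country_id, country_name) - country_name is
    # repeated for every candidate, a transitive dependency on country_id
    (1, "Akhil", 10, "Canada"),
    (2, "Sreeja", 10, "Canada"),
    (3, "Soumi", 20, "Australia"),
]

# After 3NF decomposition: country_name lives in its own table,
# and CANDIDATE keeps country_id as a foreign key.
country = {}    # country_id -> country_name
candidate = {}  # candidate_id -> (name, country_id)

for cid, name, ctry_id, ctry_name in denormalized:
    country[ctry_id] = ctry_name        # stored exactly once
    candidate[cid] = (name, ctry_id)

# An update anomaly disappears: renaming a country is one write,
# not one write per candidate row.
country[10] = "CANADA"
assert all(candidate[c][1] == 10 for c in (1, 2))
```

This is the same reasoning applied to the project's own tables, where the COUNTRY table holds country details once and other tables refer to it by CTRYID.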
CHAPTER 8 SYSTEM IMPLEMENTATION
Implementation includes all those activities that take place to convert from the old system to the new. The old system consists of manual operations, which is operated in a very different manner from the proposed new system. A proper implementation is essential to provide a reliable system to meet the requirements of the organizations. An improper installation may affect the success of the computerized system.
8.1 IMPLEMENTATION METHODS:
There are several methods for handling the implementation and the consequent conversion from the old to the new computerized system.
The most secure method of conversion from the old system to the new system is to run the old and new systems in parallel. In this approach, personnel operate the older manual processing system as well as the new computerized system. This method offers high security, because even if there is a flaw in the computerized system, we can depend upon the manual system. However, the cost of maintaining two systems in parallel is very high, and this outweighs the benefits.
Another common method is a direct cutover from the existing manual system to the computerized system. The change may happen within a week or within a day. There are no parallel activities, but there is also no remedy in case of a problem, so this strategy requires careful planning.
A working version of the system can also be implemented in one part of the organization, with the personnel piloting the system so that changes can be made as and when required. But this method is less preferable because the system is not exercised in its entirety.
8.2 IMPLEMENTATION PLAN:
The implementation plan includes a description of all the activities that must occur to implement the new system and to put it into operation. It identifies the personnel responsible for the activities and prepares a time chart for implementing the system. The implementation plan consists of the following steps.
o List all files required for implementation.
o Identify all data required to build new files during the implementation.
o List all new documents and procedures that go into the new system.
The implementation plan should anticipate possible problems and must be able to deal with them. The usual problems may be missing documents, mixed data formats between current and new files, errors in data translation, missing data, etc.
EDUCATION AND TRAINING
The implementation of the proposed system includes the training of system operators. Training the system operators includes not only instruction in how to use the equipment, but also in how to diagnose malfunctions and what steps to take when they occur, so proper training should be provided. No training is complete without familiarizing users with simple system maintenance activities. Since the proposed system is developed with a GUI, training will be comparatively easier than for systems developed without one. There are different types of training; off-site training can be selected to give in-depth knowledge to the system operators.
Success of the system depends on the way in which it is operated and used. Therefore the quality of training given to the operating person affects the successful implementation of the system. The training must ensure that the person can handle all the possible operations.
Training must also include data entry personnel. They must be given training on the installation of new hardware and terminals, how to power the system up and down, how to detect malfunctions and how to solve problems. The operators must also be provided with troubleshooting knowledge, which involves determining the cause of a problem.
The proposed system requires trained personnel for operating the system. Data entry jobs must be done utmost carefully to avoid errors. This will reduce the data entry errors considerably. It is preferable to provide the person with some kind of operating manuals that will explain all the details of the system.
9.1 POST IMPLEMENTATION REVIEW
After the system is implemented, a review should be conducted to determine whether the system is meeting expectations and where improvements are needed. System quality, user confidence and operating statistics are assessed through such techniques as event logging, impact evaluation and attitude surveys. The review not only assesses how well the proposed system is designed and implemented, but is also a valuable source of information that can be applied to a critical evaluation of the system.
The reviews are conducted by the operating personnel as well as the software developers in order to determine how well the system is working, how it has been accepted and whether adjustments are needed. The review of the system is essential to determine the future enhancements the system requires. The system can be considered successful only if the information system has met its objectives. The review analyses the opinions of the employees and identifies their attitudes towards the new computerized system. Only when the merits and demerits of the implemented system are known can one determine what additional features it requires. The following are the issues to be considered in the evaluation of the system.
System testing is a critical aspect of software quality assurance and represents the ultimate review of specification, design and coding. Testing is the process of executing a program with the intent of finding errors; a good test is one that has a high probability of finding an as-yet-undiscovered error. The purpose of testing is to identify and correct bugs in the developed system. Nothing is complete without testing, which is vital to the success of the system.
In code testing, the logic of the developed system is tested: every module of the program is executed to find errors. Specification testing examines the specifications stating what the program should do and how it should perform under various conditions.
Unit testing focuses first on the modules in the proposed system to locate errors. This enables to detect errors in the coding and logic that are contained within that module alone. Those resulting from the interaction between modules are initially avoided. In unit testing step each module has to be checked separately.
System testing does not test the software module by module, but rather the integration of the modules in the system. The primary concern is the compatibility of individual modules: one has to find areas where modules have been designed with different specifications for data lengths, types or data element names.
Testing and validation are the most important steps after the implementation of the developed system. The system testing is performed to ensure that there are no errors in the implemented system. The software must be executed several times in order to find out the errors in the different modules of the system.
Validation refers to the process of using the new software in a live environment, i.e., inside the organization, in order to find errors. The validation phase reveals the failures and bugs in the developed system and brings to light the practical difficulties the system faces when operated in the real environment. By testing the code of the implemented software, the logic of the program can be examined. A specification test is conducted to check whether the specifications stating what the program should do are met under various conditions. Apart from these, some special tests are conducted, as given below:
Peak Load Tests: This determines whether the new system will handle the volume of activities when the system is at the peak of its processing demand. The test has revealed that the new software for the agency is capable of handling the demands at the peak time.
Storage Testing: This determines the capacity of the new system to store transaction data on a disk or on other files. The proposed software has the required storage space available, because of the use of a number of hard disks.
Performance Time Testing: This test determines the length of the time used by the system to process transaction data.
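A performance time test of the kind described above can be sketched as a simple timing harness. The `process_transaction` function below is a hypothetical stand-in for the system's real transaction processing; the harness itself just measures elapsed wall time against a budget:

```python
# Sketch of a performance time test: measure how long the system takes
# to process a batch of transactions. process_transaction is a
# hypothetical stand-in for the real logic.

import time

def process_transaction(record):
    # stand-in for the real transaction processing step
    return sum(record)

def performance_time_test(records, budget_seconds):
    """Return (elapsed, passed): wall time taken and whether it met the budget."""
    start = time.perf_counter()
    for record in records:
        process_transaction(record)
    elapsed = time.perf_counter() - start
    return elapsed, elapsed <= budget_seconds
```

The same harness, fed a peak-sized batch, doubles as a crude peak load test: the system passes if the elapsed time stays within the budget at the highest expected volume.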
In this phase the developed software is tested. Testing means exercising the software to uncover errors and ensure the system meets the defined requirements. Testing may be done at four levels:
• Unit Level
• Module Level
• Integration & System Level
10.1 UNIT TESTING
A unit corresponds to a screen/form in the package. Unit testing focuses on verification of the corresponding class or screen. This testing includes testing of control paths, interfaces, local data structures, logical decisions, boundary conditions and error handling. Unit testing may use test drivers, which are control programs to coordinate test case inputs and outputs, and test stubs, which replace low-level modules. A stub is a dummy subprogram.
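The driver-and-stub idea above can be sketched as follows. Here a hypothetical points-summing unit is tested in isolation: the stub replaces the low-level database lookup, and the test function acts as the driver supplying inputs and checking outputs. All names are illustrative:

```python
# Sketch of unit testing with a test stub and a test driver.
# The unit under test is isolated from the real data-access module.

class UserDetailsStub:
    """Test stub: a dummy subprogram replacing the low-level data access."""

    def fetch(self, user_id):
        # returns fixed, known data so the unit's logic can be checked
        return {"qualification_points": 20, "experience_points": 15}

def total_points(store, user_id):
    """Unit under test: sums the points recorded for a user."""
    details = store.fetch(user_id)
    return details["qualification_points"] + details["experience_points"]

def test_total_points():
    # Test driver: coordinates the test case input and checks the output
    assert total_points(UserDetailsStub(), "u1") == 35

test_total_points()
```

Because the stub returns fixed data, any failure here points at the unit's own logic rather than at the database or other modules, which is exactly the isolation unit testing aims for.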
10.2 MODULE LEVEL TESTING
Module Testing is done using the test cases prepared earlier. Module is defined during the time of design.
10.3 INTEGRATION & SYSTEM TESTING
Integration testing is used to verify the combining of the software modules. Integration testing addresses the issues associated with the dual problems of verification and program construction. System testing is used to verify, whether the developed system meets the requirements.
10.4 REGRESSION TESTING
Each modification in software impacts unmodified areas, which can introduce serious defects into the software. The process of re-testing after modifications to rectify such errors is known as regression testing.

INSTALLATION AND DELIVERY
Installation and delivery is the process of delivering the developed and tested software to the customer, following the defined support procedures.

ACCEPTANCE AND PROJECT CLOSURE
Acceptance is the part of the project by which the customer accepts the product. Once the customer accepts the product, closure of the project is started; this includes metrics collection, PCD, etc.
Maintenance is the adaptation of the software to external changes (requirement changes or enhancements) and internal changes (fixing bugs). When changes are made during the maintenance phase, all preceding steps of the model must be revisited.
There are three types of maintenance:
1. Corrective (Fixing bugs/errors)
2. Adaptive (Updates due to environment changes)
3. Perfective (Enhancements, requirements changes)
Table 12.1 ADMINLOGIN
FIELD TYPE CONSTRAINTS
EMPID INT PRIMARY KEY
The above table stores administrator login details.
Table 12.2 COUNTRY
FIELD TYPE CONSTRAINTS
CTRYID INT PRIMARY KEY
The above table stores country details.
FIELD TYPE CONSTRAINTS
The above table stores payment details.
FIELD TYPE CONSTRAINTS
SID INT PRIMARY KEY
The above table stores the details of memory action
FIELD TYPE CONSTRAINTS
The above table stores the details of payment
FIELD TYPE CONSTRAINTS
SID INT PRIMARY KEY
The above table stores the details of registration
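The report lists only the key column of each table. As a minimal illustration of how two of these tables might be defined in SQL Server 2000, the sketch below uses the EMPID and CTRYID keys from the report; all other column names and types are assumptions added for illustration, not taken from the project.

```sql
-- Hypothetical DDL for the ADMINLOGIN and COUNTRY tables.
-- Only EMPID and CTRYID appear in the report; the remaining
-- columns are illustrative assumptions.
CREATE TABLE ADMINLOGIN (
    EMPID    INT          PRIMARY KEY,
    USERNAME VARCHAR(50)  NOT NULL,   -- assumed column
    PASSWORD VARCHAR(50)  NOT NULL    -- assumed column
);

CREATE TABLE COUNTRY (
    CTRYID   INT          PRIMARY KEY,
    CTRYNAME VARCHAR(100) NOT NULL    -- assumed column
);
```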
Fig 13.1 LEVEL 0 DFD
Fig 13.2 LEVEL 1 DFD
Fig 13.3 LEVEL 2 DFD
Fig 13.4 LOGIN FORM
Users can login through the above form.
Fig 13.5 USER EDIT PAGE
Users can make changes to their registered data through the above form; there is provision to add or change qualifications, work experience, marital status, etc.
Fig 13.6 USER HOME
User operations are performed through the above form.
Fig 13.7 CHECK PAYMENT
User can check their payment here.
Fig 13.8 CHECK STATUS
User can check their status here.
Fig 13.9 ADMIN HOME
Administrator can enter details here.
Fig 13.10 ADMIN CHECK PAYMENT
Administrator can check user payments.
Fig 13.11 CHANGE STATUS
Administrator can change user status.
The project entitled "ONLINE IMMIGRATION CONSULTANCY" has reached its final stage. The system has been developed with much care so that it is free of errors while remaining efficient and less time consuming. Importantly, the system is robust, and provision has been made for future developments. The entire system is secured. This online system will be approved and implemented soon.
MICROSOFT .NET FRAMEWORK
Microsoft designed VB.NET from the ground up to take advantage of its new .NET Framework. The .NET Framework is a multi-language environment for building, deploying, and running XML Web services and applications.
The .NET Framework is made up of four parts: the Common Language Runtime, a set of class libraries, a set of programming languages, and the ASP.NET environment. The .NET Framework was designed with three goals in mind. First, it was intended to make Windows applications much more reliable, while also providing applications with a greater degree of security. Second, it was intended to simplify the development of Web applications and services that work not only in the traditional sense but on mobile devices as well. Lastly, the framework was designed to provide a single set of libraries that would work with multiple languages. .NET has been designed with multiple platform support as a key feature; code written using the .NET Framework can run on all versions of Windows, rather than the class libraries being restricted to functionality that is available on every platform.
It is expected that .NET will run on other platforms such as UNIX, though it is unlikely that the whole of the .NET Framework will be supported; probably just the languages and the base class libraries. Even if Microsoft does deliver .NET on a non-Windows platform, the functionality available to .NET applications on that platform is likely to be reduced.
The four components of the .NET Framework
• Common Language Runtime
One of the design goals of the .NET Framework was to unify the runtime engines so that all developers could work with a single set of runtime services. The .NET Framework's solution is called the Common Language Runtime (CLR). The CLR provides capabilities such as memory management, security, and robust error handling to any language that works with the .NET Framework. The CLR enables languages to interoperate with one another: memory can be allocated by code written in one language and freed by code written in another, and, similarly, errors can be raised in one language and processed in another.
The CLR provides many core services for applications. The CLR can provide these services due to the way it manages code execution.
• Garbage Collection: a CLR feature that automatically manages memory on behalf of an application.
• Code Verification: a process that ensures all code is safe to run prior to execution.
• Code Access Security: allows code to be granted or denied permissions to do things, depending on the security configuration for a given machine, the origins of the code, and the metadata associated with the types that the code is trying to use. The primary purpose of this feature is to protect users from malicious code that attempts to access other code residing on a machine.
> .NET FRAMEWORK CLASS LIBRARY
The .NET Framework provides many classes that help developers re-use code. The .NET class libraries contain code for programming topics such as threading, file I/O, database support, XML parsing, and data structures such as stacks and queues. This entire class library is available to any programming language that supports the .NET Framework. Because all languages now support the same runtime, they can re-use any class that works with the .NET Framework. This means that any functionality available to one language will also be available to any other .NET language.
> .NET PROGRAMMING LANGUAGES
The .NET Framework provides a set of tools that help to build code that works with the .NET Framework. Microsoft provides a set of languages that are already .NET compatible. VB .NET is one of those languages.
ASP.NET provides a powerful server-side control architecture. ASP.NET builds on the programming classes of the .NET Framework, providing a Web application model with a set of controls and infrastructure that make it simple to build ASP Web applications. ASP.NET includes a set of controls that encapsulate common HTML user interface elements, such as text boxes and drop-down menus. These controls run on the Web server, however, and push their user interface as HTML to the browser. On the server, the controls expose an object-oriented programming model that brings the richness of object-oriented programming to the Web developer. ASP.NET also provides infrastructure services, such as session state management and process recycling, that further reduce the amount of code a developer must write and increase application reliability. Using the XML Web services features of ASP.NET, developers can write their business logic and expose it as XML Web services.
To get great performance and remove the active scripting dependency, ASP.NET pages are compiled into .NET assemblies (DLLs). The basic process is shown in the figure. When a page is first requested, ASP.NET compiles the page into an assembly. The assembly contains a single generated class that derives from the System.Web.UI.Page class. It contains all the code needed to generate the page, and is instantiated by the framework to process a request each time the .aspx page is requested.
The page compilation process isn't cheap, and can take a few seconds for complex pages. However, the compilation is only ever done once for each .aspx file. All subsequent requests for the page, even after IIS has been restarted, are satisfied by instantiating the generated class and asking it to render the page. The result is great performance; the only cost is a little disk space on the web servers.
> BENEFITS OF ASP.NET
The main goals of ASP.NET are to:
• Make code cleaner
• Improve deployment, scalability, security, and reliability
• Provide better support for different browsers and devices
• Enable a new breed of web applications
> SERVER CONTROLS
ASP.NET is designed around the concept of server controls. This stems from fundamental changes in the philosophy of creating interactive pages, in particular the increasing power of servers and the ease of building multi-server web farms.
> SERVER CONTROL HIERARCHY
The server controls are logically broken down into a set of families:
• HTML Server Controls: The server equivalents of the HTML controls. They create output that is the same as the definition of the control within the page, and they use the same attributes as the standard HTML element.
• Web Form Controls: A set of controls that are the equivalent of the normal HTML <form> controls, such as a textbox, a hyperlink, and various buttons. They have a standardized set of property names that make life easier at design time, and easier for graphical page-creation tools to build the page.
• List Controls: These controls provide a range of ways to build lists. These lists can also be data-bound; in other words, the content of the list can come from a data source such as an array, a hash table, or a range of other data sources. The range of controls provides many different options, and some include special features for formatting the output and even editing the data in the list.
• Rich Controls: Produce rich content and encapsulate complex functionality, and will output pure HTML or HTML and script.
• Validation Controls: A set of special controls designed to make it easy to check and validate the values entered into other controls on a page. They perform the validation client side, server side, or both, depending on the type of client device that requests the page.
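As a brief illustration of the control families described above, the sketch below attaches a standard ASP.NET 1.x validation control to a Web Form textbox. The control types and attributes are standard ASP.NET, but the IDs and error message are assumptions, not markup taken from the project.

```aspx
<%-- Sketch: a Web Form textbox validated by a RequiredFieldValidator.
     Control IDs (txtEmail, valEmail) are illustrative. --%>
<asp:TextBox id="txtEmail" runat="server" />
<asp:RequiredFieldValidator
    id="valEmail"
    runat="server"
    ControlToValidate="txtEmail"
    ErrorMessage="E-mail address is required." />
```

At render time the control emits plain HTML (and, for up-level browsers, client-side script) so that validation can run on the client, the server, or both.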
> ASP.NET SECURITY OPTIONS
ASP.NET provides a range of different options for implementing security and restricting user access in a web application. All these options are configured within the web.config file located in the root folder of the application.
ASP.NET itself provides three types of authentication and authorization, though the first of these options (Windows) relies on IIS to do all the work for us.
• Windows built-in authentication: The initial authentication is performed by IIS through Basic, Digest, or Integrated Windows authentication. The web.config file can specify the accounts that are valid for the whole or parts of the application.
• Passport-based authentication: This option uses a centralized Web-based authentication service provided by Microsoft, which offers single sign-on (SSO) and a core profile service for member sites.
• Forms-based authentication: Unauthenticated requests are automatically redirected to an HTML form page using HTTP client-side redirection. The client browser sends the cookie with all subsequent requests, and the user can access the application while they retain this cookie.
• Default (IIS) authentication: The default impersonation can still be used, but access control is limited to that specified within IIS. Resources are accessed under the context of the ASP.NET process account, or the IUSR account if impersonation is enabled.
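For example, forms-based authentication of the kind described above is enabled in web.config roughly as follows. The element names are standard ASP.NET configuration; the login page name and cookie name are assumptions for illustration, not taken from the project.

```xml
<!-- Sketch of a web.config fragment enabling forms authentication.
     Unauthenticated requests are redirected to login.aspx (assumed page). -->
<configuration>
  <system.web>
    <authentication mode="Forms">
      <forms loginUrl="login.aspx" name="AUTHCOOKIE" />
    </authentication>
    <authorization>
      <!-- deny anonymous (unauthenticated) users -->
      <deny users="?" />
    </authorization>
  </system.web>
</configuration>
```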
MS SQL 2000
The Microsoft SQL Server 2000 database has been selected as the database of choice for use with Internet Information Server and Active Server Pages. When you store data in SQL Server, you store it in tables. Tables, in turn, are stored in databases, and databases are stored in database devices. When SQL Server is installed, it creates a master database device. The master database device contains all the system databases used internally by SQL Server, such as the master and tempdb databases; for our own use, we can create database devices and databases of our own.
> FEATURES OF SQL SERVER 2000
Microsoft SQL Server 2000 features include:
> Internet Integration
The SQL Server 2000 database engine includes integrated XML support. It also has the scalability, availability, and security features required to operate as the data storage component of the largest Web sites. The SQL Server 2000 programming model is integrated with the Windows DNA architecture for developing Web applications, and SQL Server 2000 supports features such as English Query and the Microsoft Search Service to incorporate user-friendly queries and powerful search capabilities in Web applications.
> Scalability and Availability
The same database engine can be used across platforms ranging from laptop computers running Microsoft Windows® 98 through large, multiprocessor servers running Microsoft Windows 2000 Datacenter Edition. SQL Server 2000 Enterprise Edition supports features such as federated servers, indexed views, and large memory support that allow it to scale to the performance levels required by the largest Web sites.
> Enterprise-Level Database Features
The SQL Server 2000 relational database engine supports the features required by demanding data processing environments. The database engine protects data integrity while minimizing the overhead of managing thousands of users concurrently modifying the database. SQL Server 2000 distributed queries allow you to reference data from multiple sources as if it were part of a SQL Server 2000 database, while at the same time the distributed transaction support protects the integrity of any updates of the distributed data. Replication also allows you to maintain multiple copies of data, while ensuring that the separate copies remain synchronized. You can replicate a set of data to multiple, mobile, disconnected users, have them work autonomously, and then merge their modifications back to the publisher.
> Ease of installation, deployment, and use
SQL Server 2000 includes a set of administrative and development tools that improve upon the process of installing, deploying, managing, and using SQL Server across several sites. SQL Server 2000 also supports a standards-based programming model integrated with the Windows DNA, making the use of SQL Server databases and data warehouses a seamless part of building powerful and scalable systems. These features allow you to rapidly deliver SQL Server applications that customers can implement with a minimum of installation and administrative overhead.
> Data warehousing
SQL Server 2000 includes tools for extracting and analyzing summary data for online analytical processing. SQL Server also includes tools for visually designing databases and analyzing data using English-based questions.
The developed system is flexible and changes can be made easily. The system is developed with an insight into the necessary modification that may be required in the future. Hence the system can be maintained successfully without much rework.
One of the main future enhancements of the system is to add a mail response from the administrator to the user, so that the user can learn directly from the administrator whether he or she is eligible for migration.