International Journal of Engineering and Technology

Published by Engg Journals Publications
Online ISSN: 0975-4024
Remote sensing is a technology for acquiring data about distant objects, from which models can be built for applications such as classification. Hyperspectral Images (HSI) have recently become a powerful tool whose main goal is to classify the points of a region. An HSI consists of more than a hundred measurements, called bands (or simply images), of the same region, for which a Ground Truth Map (GT) gives the reference classes. But some bands are not relevant because they are affected by atmospheric effects; others contain redundant information; and the high dimensionality of HSI features lowers classification accuracy. All of these bands may matter for some applications, but for classification only a small subset is relevant. The central problem for HSI is therefore dimensionality reduction. Many studies use mutual information (MI) to select the relevant bands; others use normalized forms of MI, such as Symmetric Uncertainty, in medical imaging applications. In this paper we introduce an algorithm, also based on MI, that selects relevant bands and applies the Symmetric Uncertainty coefficient to control redundancy and increase classification accuracy. The algorithm is a feature selection tool following a filter strategy. We evaluate it on the AVIRIS 92AV3C HSI. It proves to be an effective and fast scheme for controlling redundancy.
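The selection scheme described above can be sketched in a few lines: rank bands by their mutual information with the ground truth, then use Symmetric Uncertainty, SU(X,Y) = 2·I(X;Y)/(H(X)+H(Y)), to reject bands redundant with those already kept. This is a minimal illustration on synthetic bands, not the paper's exact algorithm; the histogram bin count and the SU threshold are assumed parameters.

```python
import numpy as np

def entropy(x, bins=16):
    """Shannon entropy (bits) of a band, estimated by histogram."""
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    return float(-np.sum(p * np.log2(p)))

def mutual_info(x, y, bins=16):
    """I(X;Y) in bits, estimated from a joint histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

def symmetric_uncertainty(x, y, bins=16):
    """SU(X,Y) = 2 I(X;Y) / (H(X) + H(Y)), in [0, 1]."""
    return 2 * mutual_info(x, y, bins) / (entropy(x, bins) + entropy(y, bins))

def select_bands(bands, gt, n_select, su_max=0.8):
    """Greedy filter: rank bands by MI with the ground truth, then accept a
    band only if its SU with every already-selected band is below su_max."""
    ranked = np.argsort([-mutual_info(b, gt) for b in bands])
    chosen = []
    for i in ranked:
        if all(symmetric_uncertainty(bands[i], bands[j]) < su_max for j in chosen):
            chosen.append(int(i))
        if len(chosen) == n_select:
            break
    return chosen

# Synthetic demo: one informative band, one redundant near-duplicate, one noise band.
rng = np.random.default_rng(0)
gt = rng.integers(0, 4, 2000).astype(float)        # class labels of the GT map
band_a = gt + 0.05 * rng.normal(size=2000)         # informative band
band_b = band_a + 0.001 * rng.normal(size=2000)    # near-duplicate (redundant)
band_c = rng.normal(size=2000)                     # irrelevant band
print(select_bands([band_a, band_b, band_c], gt, n_select=2))
```

The redundant twin of the top-ranked band is rejected by the SU test, so the second slot goes to the next non-redundant band rather than the duplicate.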
Effect of annealing on anisotropy 
Rolling mill machine 
Influence of hot rolling range
Rate of horns according to the final thickness.
Yield stress at 0.1% of plastic strain for the proposed procedure 
This work proposes a method for optimising the process parameters during the rolling of aluminium sheets in order to reduce the anisotropy index, which is harmful during aluminium stamping. The method takes into account the evolution of the anisotropy index of the metal according to the hardening rate before annealing, and its dependence on the alloy type, the mode of reheating, the rolling ranges and the annealing conditions, as well as the variability of these parameters. The results show that the anisotropy index of the products was reduced from 8% to less than 4% for aluminium sheets with thicknesses from 1.2 mm to 2.8 mm.
In this paper, a 16-bit Digitally Controlled Oscillator (DCO) is designed in 0.35 µm technology. The 16-bit DCO consists of a 4-stage differential delay cell in a ring structure, and a digital control scheme is used to improve its noise characteristics. The DCO uses dual delay path techniques to achieve a high oscillation frequency and a wide tuning range. The circuit is simulated in SPICE at a supply voltage of 3.5 V. As the control word varies from 'FFFFH' to '0000H', the DCO achieves a controllable frequency range of 1.9512-4.4239 GHz, i.e. a tuning range of 2.4727 GHz (56%), and the measured output noise varies from –176.562 to –181.979 dB/Hz.
Computer-generated images and animations are becoming more and more common. They are used in many different contexts such as movies, mobiles, medical visualization, architectural visualization and CAD. Advanced ways of describing surface and light source properties are important to ensure that artists can create realistic and stylish-looking images. Even with advanced rendering algorithms such as ray tracing, shading may account for a large part of the image creation time. Both performance and flexibility are therefore important in a rendering system. This paper gives a comparative study of various 3D rendering techniques and their challenges in a complete and systematic manner.
RESULTS (VIBRATION AMPLITUDE VS FREQUENCY)
VI. CONCLUSION
A smart beam was constructed using a Lucite beam, a PZT actuator and a PVDF sensor. A dSPACE controller card was installed and integrated with the related electronics to create an active control setup. Experiments were conducted to control the vibration response to a broadband disturbance. A 30% reduction in
The vibration of a smart beam is controlled. The smart beam setup comprises actuators and sensors placed at the root of a cantilever beam. Vibrations can be caused by various sources, including human activity and nearby motorized equipment. In this case, the disturbance is produced by feeding a white noise signal to the actuator. Piezoelectric sensors are used to detect the vibration, while a feedback controller sends correction information to the actuator to minimize it. To optimize the results, controllers were designed using Linear Quadratic Gaussian (LQG) theory. This theory generally yields high-order controllers. Additionally, optimal control theory is used to directly optimize low-order controllers.
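The LQG design mentioned above has two halves: a Kalman filter and an LQR state-feedback gain. The state-feedback half can be sketched with SciPy's continuous algebraic Riccati solver; the single-mode beam model and all numbers below are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# One vibration mode of the beam modeled as a lightly damped oscillator:
# state x = [modal displacement, modal velocity]; numbers are illustrative.
wn, zeta = 2 * np.pi * 10.0, 0.005            # 10 Hz mode, very light damping
A = np.array([[0.0, 1.0],
              [-wn**2, -2 * zeta * wn]])
B = np.array([[0.0], [1.0]])                  # actuator enters the velocity equation

Q = np.diag([wn**2, 1.0])                     # penalize modal energy
R = np.array([[1e-3]])                        # control-effort weight

P = solve_continuous_are(A, B, Q, R)          # solve the algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)               # optimal state feedback u = -K x

closed_loop = A - B @ K
print(np.linalg.eigvals(closed_loop))         # should be strictly in the left half-plane
```

Pairing this gain with a Kalman filter for state estimation gives the full LQG controller, whose order equals that of the plant model — which is why LQG controllers for flexible structures tend to be high-order.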
Bypass recovery: when there is no coverage, the dead end is overcome by the Focus phase of our algorithm. Figures 4 and 5 illustrate Spray Select Focus routing with and without coverage and dead ends.
Spray Select Focus Without Deadends.
Spray Select Focus With Deadends.
Routing packets efficiently is difficult in a mobile ad hoc network that has no end-to-end paths, so multiple copies are forwarded from the source to the destination. To deal with such networks, researchers introduced flooding-based routing schemes, which achieve a high probability of delivery but suffer from contention and large delays. The proposed protocol, "Spray Select Focus", sprays a few message copies into the network; neighbors receive a copy, and through these relay nodes the shortest route is chosen and the copy is routed towards the destination. Previous works assume that there is no contention and there are no dead ends. We argue that contention and dead ends must be considered when evaluating routing efficiency, so we applied the proposed protocol to a network that has both. The results show that the protocol works well for a contention-based network.
1 System architecture
2 Module 1-Block diagram
3 Module 2-Block diagram
With the passage of time, technology has merged itself with the daily life of humans. We have seen much progress in the field of science and technology, but we are not able to make full use of it. One such area for improvement is the electricity board billing system. The existing electricity board billing system in India is obsolete and time-consuming. We propose a system through which electricity billing becomes fully automated and communication is carried out via wireless networks. The existing manual system in India has major drawbacks: it is prone to errors, can be easily manipulated and requires a large human workforce. Its major disadvantage is that the meter cannot be accessed by the meter reader if the customer is not at home. In our system the central EB office has immediate access to all consumer homes in a locality with the help of an RF system. The EB meter in each house is connected by a wireless network to the EB office, which periodically gets updates from the meter. The EB office, using a backend database, calculates the amount to be paid according to the number of units consumed and sends it back to the meter for display, and also to the user's mobile phone. The advantages of the proposed system make the existing system incompetent. It is possible to connect to remote areas even during a power failure, as the system employs wireless technology. The new system is user-friendly, easy to access and far more efficient than the existing one.
Flow chart of PSO algorithm
Effect of Information Sharing of PSO and Modified PSO (APSO)
This research paper presents a new evolutionary optimization model, based on the particle swarm optimization (PSO) algorithm, that incorporates the flocking behavior of a spider. The search space is divided into several segments, like the net of a spider, and the social information sharing within the swarm is made strong and adaptive. The main focus is on the fitness of the swarm adjusting the learning factors of the PSO. The traditional PSO algorithm converges rapidly during the initial stage of a search, but over time slows considerably and can get trapped in a local optimum. In the proposed model, by contrast, the particles are provided with the intelligence of a spider, which enables them to avoid premature convergence and helps them escape from local optima. The proposed approach has been validated on a series of high-dimensional benchmark test functions. Comparative analysis with the traditional PSO algorithm suggests that the new algorithm significantly improves performance on multimodal functions.
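For reference, the baseline global-best PSO that the proposed spider-inspired variant modifies can be written compactly as below; the parameter values (inertia w, learning factors c1 and c2) are conventional choices, not taken from the paper.

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over R^dim with standard global-best PSO."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))      # positions
    v = np.zeros((n_particles, dim))                # velocities
    pbest = x.copy()                                # personal bests
    pbest_f = np.apply_along_axis(f, 1, x)
    g = pbest[pbest_f.argmin()].copy()              # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Inertia + cognitive pull (personal best) + social pull (global best):
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest[improved] = x[improved]
        pbest_f[improved] = fx[improved]
        g = pbest[pbest_f.argmin()].copy()
    return g, float(pbest_f.min())

best_x, best_f = pso(lambda z: float(np.sum(z * z)), dim=5)  # sphere function
print(best_x, best_f)
```

The c1/c2 terms are exactly the "learning factors" the abstract refers to; the proposed variant makes the information sharing behind the g term adaptive instead of fixed.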
The objective of this paper is to study the character recognition capability of the feed-forward back-propagation algorithm using more than one hidden layer. The analysis was conducted on 182 different letters from the English alphabet. After binarization, these characters were clubbed together to form training patterns for the neural network. The network was trained to learn its behavior by adjusting the connection strengths on every iteration. The conjugate gradient descent of each presented training pattern was calculated to identify the minimum on the error surface for that pattern. Experiments were performed using one and two hidden layers; the results revealed that as the number of hidden layers is increased, a lower final mean square error is achieved over a larger number of epochs, and the performance of the neural network was observed to be more accurate.
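The training loop described above can be illustrated with a small two-hidden-layer network trained by back-propagation; for brevity this sketch uses plain gradient descent rather than the conjugate-gradient step the paper employs, and the XOR problem stands in for the binarized character patterns.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)   # toy input patterns
y = np.array([[0], [1], [1], [0]], float)               # XOR targets

sizes = [2, 8, 8, 1]                                    # two hidden layers
W = [rng.normal(0, 1, (a, b)) for a, b in zip(sizes[:-1], sizes[1:])]
b = [np.zeros((1, n)) for n in sizes[1:]]

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def forward(x):
    """Return the activations of every layer, input included."""
    acts = [x]
    for Wi, bi in zip(W, b):
        acts.append(sigmoid(acts[-1] @ Wi + bi))
    return acts

lr, losses = 1.0, []
for epoch in range(10000):
    acts = forward(X)
    err = acts[-1] - y
    losses.append(float(np.mean(err ** 2)))             # mean square error
    delta = err * acts[-1] * (1 - acts[-1])             # output-layer delta
    for i in reversed(range(len(W))):
        gW = acts[i].T @ delta                          # weight gradient
        gb = delta.sum(axis=0, keepdims=True)           # bias gradient
        if i > 0:                                       # back-propagate the delta
            delta = (delta @ W[i].T) * acts[i] * (1 - acts[i])
        W[i] -= lr * gW
        b[i] -= lr * gb

print(losses[0], losses[-1])
```

Replacing the constant learning rate with a conjugate-gradient line search along the same gradients recovers the training scheme the abstract describes.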
Variation of split tensile strength with temperature
The present study investigates the benefits of stabilizing the stone mastic asphalt (SMA) mixture in flexible pavement with shredded waste plastic. Conventional (without plastic) and stabilized SMA mixtures were subjected to performance tests including Marshall stability, tensile strength and compressive strength tests. Triaxial tests were also conducted, varying the bitumen content by weight of mineral aggregate (6% to 8%) and the plastic content by weight of mix (6% to 12% in increments of 1%). A plastic content of 10% by weight of bitumen is recommended for improving the performance of SMA mixtures: it gives increases in stability, split tensile strength and compressive strength of about 64%, 18% and 75% respectively compared to the conventional SMA mix. Triaxial test results show a 44% increase in cohesion and a 3% decrease in the angle of shearing resistance, indicating an increase in shear strength. The drain-down value decreases with increasing plastic content, falling to only 0.09% at 10% plastic content, which proves plastic to be an effective stabilizing additive in SMA mixtures.
Framework of the combined web mining model.
Phases of the CRISP-DM process model.  
Knowledge Management Platform
A Flow Diagram of Electronic/Manual Submission of Tax Forms.
This paper reports the development of a model for taxation that serves both tax payers and the administrator. It uses web mining, text mining, data mining and human expert knowledge to create a knowledge base for taxation. Knowledge from each part is saved in the knowledge base through a knowledge management platform, by which the administrator and the tax payer can retrieve knowledge and send feedback on the basis of the actions suggested. The model also facilitates monitoring of the knowledge management platform. Its application demonstrates the model's utility for tax administration: using it, the administrator can improve the quality of decisions.
In this study, pixel-based and object-oriented image classification approaches were used to identify different land use types in Karnal district. Landsat-7 ETM imagery with 6 spectral bands was used for the classification. Ground truth data were collected from available maps, personal knowledge and communication with the local people. To prepare the land use map, two approaches were used: Artificial Neural Network (ANN) and Support Vector Machine (SVM). Object-oriented classification was performed with eCognition software: in the first step several different sets of parameters were used for image segmentation, and in the second step a nearest neighbor classifier was used for classification. The results show that the object-oriented approach gave more accurate results (including higher producer's and user's accuracy for most of the land cover classes) than the pixel-based classification algorithms. It was also observed that ANN performed better than the SVM classification approach.
First we start with design and development. Let's take the military as an example. Suppose the military wants to design new weapon systems that will be more accurate than the systems they currently have. The first thing they have to do is assemble a team of scientists and whatever other personnel are needed to design the system. Then, after they come up with a design they are pleased with, they have to get the engineers to develop it. All these people either have to be hired or contracted, and some of the work may be subcontracted. Then there is the acquisition of the materials needed. Some of the materials may already be on hand while others have to be purchased, some domestically and others from overseas. In many cases bids will be put out to various companies to see who can supply the materials at the cheapest cost. After the materials are acquired there is the matter of storage. In many cases the materials are of so large a quantity, or so large in size, that storage facilities need to be specially built in order to provide the needed space. This, of course, has to be figured into the equation. Then there is the matter of movement and distribution to the various military bases around the country or even overseas. Trucks or planes need to be acquired to distribute the materials if there are not enough ready-made transportation vehicles. Transportation costs alone can be astronomical.
Scanning electron micrograph of polyurethane microcapsule containing stannous octoate.  
Scanning electron micrograph of polyurethane microcapsule containing dibutyl tin dilaurate  
The self-healing property is the ability of a material to heal damage automatically and autonomously. It has a wide range of applications, from paint coatings, anti-corrosion coatings and space-shuttle materials to construction (concrete) and automotive uses. Microcapsules containing a reactive compound for use in self-healing polymers were successfully fabricated via interfacial polymerization of polyurethane (PU). The possibility of using glycerol as the polyol monomer for the polyurethane microcapsule shell in the preparation of the PU prepolymer was studied. We also studied encapsulated self-healing agents using IPDI, stannous octoate and dibutyl tin dilaurate. FTIR analysis showed that the obtained polyurethane prepolymer still had unreacted isocyanate groups, necessary for the interfacial polymerization of polyurethane. The morphology of the polyurethane microcapsules containing IPDI, observed by scanning electron microscopy, showed spherical microcapsules with a wrinkled surface but no agglomeration. The microcapsules containing stannous octoate were also spherical but had a tendency to agglomerate, as did the microcapsules containing dibutyl tin dilaurate. The average microcapsule sizes were 12.33, 28.59 and 25.65 μm for the microcapsules containing IPDI, stannous octoate and dibutyl tin dilaurate respectively. The smallest average particle size (12.33 μm) was observed for the microcapsules containing IPDI, with a narrow particle size distribution, so these particles were more homogeneous than the others.
Heat treatments were designed to vary the matrix microstructure in P/M-processed SiC-reinforced 7xxx aluminum alloys, in order to determine the effects of matrix microstructure and interface behavior on the mechanical properties. Smooth tensile and notched bend tests were performed. The results show that clustered regions were the preferred initiation sites in both tensile and notched bend experiments on the SiC/Al-alloy composites. Despite the relative similarity in macroscopic tensile properties between the under-aged and over-aged composites, quantitative fractography revealed a preference for SiC fracture in the under-aged composite and a preference for interface or near-interface failure in the over-aged composite.
Multi-level analytical structure  
Evaluating web sites is a significant task for web services, because evaluated web sites provide useful information for users to estimate a site's validity and popularity. We therefore develop a method, "Evaluating web sites based on grey clustering theory combined with AHP", to evaluate web sites and make the evaluation practical, so that users can easily find a good web site on the WWW and gain access to high-quality data and good services. Evaluation methods for the effectiveness of web sites are thus a critical issue in both practice and research. The study has investigated web log data with the proposed method; we also conducted experiments that confirmed the effectiveness of our approach and its potential for high-quality web site evaluation.
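The AHP side of the method computes criterion weights as the principal eigenvector of a pairwise comparison matrix. A minimal sketch follows; the three criteria and the comparison values are invented for illustration.

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical criteria
# (e.g. content quality vs. popularity vs. responsiveness).
# Entry A[i, j] answers "how much more important is criterion i than j?".
A = np.array([[1.0, 5/3, 5/2],
              [3/5, 1.0, 3/2],
              [2/5, 2/3, 1.0]])

def ahp_weights(A):
    """Priority weights = normalized principal eigenvector of the
    pairwise comparison matrix (the classic AHP prioritization step)."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)                 # principal eigenvalue
    w = np.abs(vecs[:, k].real)              # Perron eigenvector is one-signed
    return w / w.sum()

w = ahp_weights(A)
lam_max = np.max(np.linalg.eigvals(A).real)
ci = (lam_max - len(A)) / (len(A) - 1)       # consistency index; 0 if perfectly consistent
print(w, ci)
```

These criterion weights would then weight the grey clustering scores of each candidate site; the matrix above is perfectly consistent (every A[i, j] = w[i]/w[j]), so its consistency index is zero.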
Total turbine power output vs vane angle at injection angle (Ø) = 30° and speed of rotation 2500 rpm
Total turbine power output vs vane angle at injection angle (Ø) = 45° and speed of rotation 2500 rpm
Total turbine power output vs vane angle at injection angle (Ø) = 60° and speed of rotation 2500 rpm
Globally, the ever-faster consumption of hydrocarbon fuel in the transport sector is threatening environmental and ecological balance, and the resulting depletion of hydrocarbon fuel poses a further challenge to oil reserves. In view of these issues, extensive research is being carried out to explore alternative energy sources and to find appropriate energy conversion systems. Atmospheric air, once compressed, is a potential working fluid for producing shaft work in an air turbine, and releases almost zero pollution into the environment. This paper details the mathematical modeling of a small-capacity, compressed-air-driven, novel multi-vane air turbine. The effect of the expansion of high-pressure air collected between two consecutive vanes is analyzed for different vane angles and varying inlet pressure. The study shows that, for an injection pressure of 6 bar and a speed of rotation of 2500 rpm, the total shaft power is optimum at an injection angle of 60° with vane angle θ = 36° (10 vanes); it is lower at an injection angle of 45° with θ = 51.4° (7 vanes), and lower still at an injection angle of 30° with θ = 60°-72° (6-5 vanes).
To simulate the gross weight of an aircraft and optimise its performance for a specified mission profile using various historical data, implemented in "C".
A new quadrature formula is proposed which uses modified weight functions derived from those of the 'Bernstein polynomial' by a 'two-phase modification'. The quadrature formula is compared empirically with the simple method of numerical integration using the well-known 'Bernstein operator'. The percentage absolute relative errors of the proposed quadrature formula and of the Bernstein-operator rule are computed for selected functions, with different numbers of the usual equidistant node points in the interval of integration [0, 1]. Both of the proposed modified quadrature formulae, after 'Phase I' and after 'Phases I & II' of the modification respectively, produce significantly better results than simply using the 'Bernstein operator'. Inasmuch as the proposed 'two-phase improvement' can be applied again and again at the end of each iteration, the proposed improvement algorithm, which is computerizable, is an iterative algorithm, leading to an ever more efficient quadrature operator, until we are satisfied.
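For comparison, the unmodified Bernstein-operator rule is easy to state: integrating B_n(f) over [0, 1] term by term, each basis polynomial contributes ∫₀¹ C(n,k) x^k (1−x)^(n−k) dx = 1/(n+1), so the rule collapses to an equal-weight average of f at the equidistant nodes k/n. The paper's two-phase modified weights are not reproduced here.

```python
def bernstein_quadrature(f, n):
    """Approximate the integral of f over [0, 1] by integrating its
    Bernstein polynomial B_n(f) exactly: since every Bernstein basis
    polynomial integrates to 1/(n+1), the rule is simply the average
    of f over the n+1 equidistant nodes k/n."""
    return sum(f(k / n) for k in range(n + 1)) / (n + 1)

# Example: for f(x) = x^2 the exact integral is 1/3, and the rule's
# error can be shown to be exactly 1/(6n), shrinking as n grows.
approx = bernstein_quadrature(lambda x: x * x, n=200)
print(approx)
```

The slow O(1/n) convergence of this baseline is precisely what motivates modifying the weights; the two-phase scheme in the paper improves on this equal-weight rule.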
Energy meter reading is a monotonous and expensive task. At present, meter readers go to each meter and take the reading manually to issue a bill, which is later entered into billing software for billing and payment automation. If the manual meter reading and bill data entry process were automated, it would reduce this laborious task and the associated financial wastage. An "Automatic Electric Meter Reading (AMR) system" collects data from the meter and processes it for billing and other decision purposes. In this paper we propose an automatic meter reading system that is low-cost and high-performance, with a high data rate and wide coverage area, and is well suited to the Bangladesh context. The AMR system has four basic units: a reading unit, a communication unit, a data receiving and processing unit, and a billing system. For the reading unit, we detect the disk rotation of the energy meter and store the data in a microcontroller, so the current analog energy meter does not need to be replaced; an external module is simply added to it. In the communication unit, a WiMAX transceiver is used for wireless communication between the meter end and the server end because of its wide coverage area. In the data receiving and processing unit, meter readings are collected from the transceiver, which is controlled by another microcontroller, and a computer application takes the data from this microcontroller. The system also helps to detect any tampering with, or breakdown of, the energy meter. Various AMR systems exist around the world; we analyzed them and found that they are not feasible for Bangladesh.
Biogas production from cattle manure using rumen fluid inoculum was determined in batch anaerobic digesters at mesophilic temperatures (room temperature and 38.5 °C). The aim of this paper is to analyze the influence of rumen fluid content on biogas yield from cattle manure. A series of laboratory experiments using 400 ml biodigesters was performed in batch operation mode. 100 grams of fresh cattle manure (M) was fed to each biodigester and mixed with rumen fluid (R) and tap water (W) in several ratios, giving six different M:W:R contents, i.e. 1:1:0; 1:0.75:0.25; 1:0.5:0.5; 1:0.25:0.75; and 1:0:1 (corresponding to 0, 12.5, 25, 37.5, 50 and 100% rumen, respectively). The research showed that, both at room temperature and at 38.5 °C, the best biogas production performance was obtained with a rumen fluid content in the range of 25-50%; increasing the rumen content also increases biogas production. This suggests that, since the optimum total solid (TS) content for biogas production is between 7 and 9% (corresponding to a manure-to-total-liquid ratio of roughly 1:1), a rumen fluid content of 50% gives the best performance for biogas production. However, further intensive research is needed to study the interaction effect of TS and rumen content on biogas production.
Block Diagram of Convolutional Encoder  
State Diagram for convolutional encoder
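A convolutional encoder like the one in the block and state diagrams can be sketched as a shift register with generator polynomials; the rate-1/2, constraint-length-3 generators (7, 5) in octal below are a common textbook choice, not necessarily the ones used in this paper.

```python
def conv_encode(bits, g1=0b111, g2=0b101, K=3):
    """Rate-1/2 convolutional encoder: for each input bit, shift it into
    a K-bit register and emit one parity bit per generator polynomial."""
    state = 0                                  # the K-1 most recent bits
    out = []
    for b in bits:
        reg = (b << (K - 1)) | state           # current bit + stored state
        out.append(bin(reg & g1).count("1") % 2)   # parity under generator g1
        out.append(bin(reg & g2).count("1") % 2)   # parity under generator g2
        state = reg >> 1                       # shift: drop the oldest bit
    return out

print(conv_encode([1, 0, 1, 1]))
```

The state variable here is exactly the node visited in the state diagram, and each (input, state) pair selects one labeled transition edge.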
Capacity as a function of Rake fingers (M)

5.4. CAPACITY AS A FUNCTION OF DIRECTIVITY OF BASE STATION ANTENNA (D_a)
In Fig. 6, at a BER value of 10⁻² there are 7 users with a base station antenna directivity of 4 dB, 11 users at 6 dB and 17 users at 8 dB. Thus the capacity of a communication system with a Rake receiver increases with increasing base station antenna directivity.
Capacity as a function of antenna directivity (D_a).
One of the biggest drawbacks of the wireless environment is its limited bandwidth. However, the number of users sharing this limited bandwidth can be increased considerably by using the SDMA technique, which enhances the capacity of the communication system. Several techniques can increase the capacity of a cellular system: spreading, error control coding, multipath diversity (i.e. the Rake receiver) and smart antennas. In this paper we use all of these techniques and examine how the capacity of a cellular system varies with different parameters: the value of the spreading factor, the number of Rake fingers, the number of interfering cells, and the directivity of the adaptive antenna at the base station. The results show that the capacity of a cellular system varies with all of these parameters.
Engine specifications
Engine operating conditions
Properties of European diesel
This research focuses on the effects of spray cone angle, using split injection, on combustion in a direct-injection diesel engine. Simulations were carried out for two cone angles, 150 and 70 degrees. The duration of injection, start of injection, dwell, spray cone angle and nozzle hole diameter are some of the important parameters that improve engine performance and consequently decrease pollution levels. The cone angle and nozzle diameter should be optimized so as to achieve maximum efficiency at an optimum pressure. Although EGR plays a major role in reducing NOx and, to some extent, soot, in this work EGR was set to zero percent, because increasing EGR increases particulate matter and dilutes the fresh charge taken into the engine. Split injection reduces particulate matter without increasing NOx levels. The simulations for the two cone angles show good agreement with experimental results.
Water interaction with tyre 
Skid resistance/ temperature correction relationship 
Skid resistance is the force developed when a tyre that is prevented from rotating slides along the pavement surface. Skid resistance is thought of as a pavement property; it is the antonym of slipperiness. Among road surface conditions, slippery pavement during precipitation is of great concern to road safety authorities: some statistics indicate that the number of accidents increases by up to twofold during rainy conditions. Loss of skid resistance affects the driver's ability to control the vehicle. In addition to increasing the stopping distance while braking, lower skid resistance reduces steering controllability, since both braking and steering depend on tyre-pavement friction. This paper mainly discusses methods for the proper laying of pavements, various materials used to improve skid resistance, and the measurement of anti-skid values such as PSV.
A Mobile Ad hoc Network (MANET) consists of a collection of wireless mobile hosts operating without any existing infrastructure or centralized access point such as a base station. The dynamic topology of a MANET allows nodes to join and leave the network at any point of time. Wireless MANETs are particularly vulnerable due to their fundamental characteristics: open medium, dynamic topology, distributed cooperation and constrained capability. Security in MANETs is therefore a complex issue. Many routing protocols establish routes between the nodes in the network, and control over the management of the nodes is distributed. This feature gives no assurance about the security of the network, and the lack of security enables many routing attacks. In this paper, therefore, we attempt to analyze and improve the security of one of the popular routing protocols for MANETs, the Ad hoc On-demand Distance Vector (AODV) routing protocol. Our specific focus is on security against the blackhole attack. The proposed solution is capable of detecting and removing blackhole nodes in the MANET at the initial stage itself, without any delay.
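One common way to detect a blackhole node at route-discovery time (a sketch of the general idea, not necessarily this paper's exact mechanism) is to discard route replies whose destination sequence number is implausibly far ahead of anything the node has seen, since a blackhole advertises an artificially "fresh" route to attract traffic.

```python
def filter_rreps(rreps, known_seq, threshold=50):
    """Keep only AODV route replies (RREPs) whose destination sequence
    number is not suspiciously far ahead of the freshest value this node
    has seen; `threshold` is an assumed tuning parameter, not a value
    from the paper. Each RREP is modeled as a dict for illustration."""
    return [r for r in rreps if r["dest_seq"] - known_seq <= threshold]

# Node B replies honestly; node M claims an absurdly fresh route (blackhole).
rreps = [{"src": "B", "dest_seq": 12}, {"src": "M", "dest_seq": 5000}]
print(filter_rreps(rreps, known_seq=10))
```

A real implementation would also blacklist the offending node and rebroadcast the route request, which corresponds to the "removing black hole nodes" step the abstract mentions.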
Fingerprinting process, attack models, and detector focused on user m.
Decision threshold ι and norm r of the noise vector E.
We study the effect of the noise distribution on the error probability of the detection test when a class of randomly rotated spherical fingerprints is used. The detection test is performed by a focused correlation detector, and the spherical codes studied here form a randomized orthogonal constellation. The colluders create a noise-free forgery by uniform averaging of their individual copies, and then add a noise sequence to form the actual forgery. We derive the noise distribution that maximizes the error probability of the detector under average and almost-sure distortion constraints. Moreover, we characterize the noise distribution that minimizes the decoder's error exponent under a large-deviations distortion constraint. Our fingerprints form a randomized orthogonal code, where the randomization parameter is a rotation. The noise-free forgery is obtained by uniform linear averaging of the colluders' copies. The detector has access to the host signal and performs a binary hypothesis test to verify whether a user of interest is colluding.
This paper constructs a two-phase, iterative, computerizable numerical algorithm for an improved approximation by the 'Modified Lupas' operator. The algorithm uses 'statistical perspectives' to exploit more fully the information about the unknown function 'f' available in terms of its known values at the equidistant knots in C[0,1]. The improvement, achieved by an a posteriori use of this information, happens iteratively. Each iteration uses the concepts of 'Mean Square Error (MSE)' and 'Bias', with the latter applied before the former: at any iteration, 'Bias' is used in 'Phase I' and 'MSE' in 'Phase II'. Like a sandwich, the top and bottom breads are the 'Bias-Reduction' operations of Phase I, and the 'MSE-Reduction' operation of Phase II is the stuffing. The algorithm iterates to a desired-height 'docked pile' of sandwiches, with the bottom bread of the first iteration serving as the top bread of the second-iteration sandwich, and so on. The potential of the achievable improvements through the proposed computerizable numerical iterative algorithm is illustrated with an 'empirical study' in which the function 'f' is assumed known, in the sense of simulation. The illustration is confined to three iterations only, for simplicity.
Location and Base Map of Sungai Kayu Ara 
Input HEC-HMS Data (a) and Location of Rainfall and Water 
Flood Extent and Water Depth Distribution Generated by HECRAS for Events with 20 (a), 50 (b) and 100 (c) year ARI in Existing Development Condition in Sungai Kayu
Flow Velocity Distribution Generated by HEC-RAS for Event with 
Formulas of the Lines between Low, Medium and High Hazard Categories [51] 
In the past decades, thousands of lives have been lost, directly or indirectly, to flooding. In fact, of all natural hazards, floods are the most widely distributed threat to life today. The Sungai Kayu Ara river basin, located in the western part of Kuala Lumpur in Malaysia, was the case study of this research. To perform river flood hazard mapping, HEC-HMS and HEC-RAS were used as the hydrologic and hydraulic models, respectively. The river flood hazard maps were based on water depth and flow velocity maps prepared from the hydraulic model results in a GIS environment. The results show that the magnitude of the rainfall event (ARI) and the land-use development condition of the river basin have significant influences on the pattern of the river flood hazard maps; for the Sungai Kayu Ara river basin, the rainfall event magnitude has more influence than the land-use development condition.
Web Service Security Architecture 
Initial Page of the mygoogle_search.html 
Encryption of data using JavaScript (MD5 algorithm) 
System security architecture, from a software engineering viewpoint, imposes that strong security must be a guiding principle of the entire software development process. It describes a way to weave security into systems architecture, and it identifies common patterns of implementation found in most security products. The security and software engineering communities must find ways to develop software correctly in a timely and cost-effective fashion. There is no substitute for working software security as deeply into the development process as possible. System designers and developers must take a more proactive role in building secure software. The root of most security problems is software that fails in unexpected ways when under attack. Enforcing security at the design phase can reduce the cost and effort associated with introducing security during implementation. At the architecture level, a system must be coherent and present a unified security architecture that takes into account security principles (such as least privilege). In this paper we discuss different facets of security as applicable to Service Oriented Architecture (SOA) security architecture implementations. First we examine the security requirements and their solution mechanisms. In this context Web Services, the predominant SOA implementation standard, has a crucial role to play. The Web Services architecture is expected to play a prominent role in developing next-generation distributed systems, and building dependable systems based on the Web Services architecture is a major research issue under discussion. Finally, we provide a case study of a Web Services security architecture, enhancing its security with respect to Web 2.0 AJAX (Asynchronous JavaScript and XML) and securing data with the MD5 algorithm.
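The case study above applies MD5 to data on the client side via JavaScript. Strictly speaking, MD5 is a one-way message digest rather than an encryption cipher: the server stores and compares digests, never recovering the original value. A minimal sketch of computing such a digest is shown below in Python's standard hashlib (the case study itself uses a JavaScript MD5 routine; this is an illustration of the same operation, not the paper's code):

```python
import hashlib

def md5_digest(value: str) -> str:
    """Hex MD5 digest of a string, as a client might compute
    before transmitting a credential over the wire."""
    return hashlib.md5(value.encode("utf-8")).hexdigest()

# The digest is always 32 hex characters, regardless of input length.
digest = md5_digest("query string or credential")
```

Note that MD5 is considered cryptographically broken for collision resistance; it survives in legacy architectures like the one described, but newer designs would use SHA-256 or a dedicated password hash.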
Dynamic domino logic circuits are widely used in modern digital VLSI circuits. These dynamic circuits are often favoured in high-performance designs because of the speed advantage they offer over static CMOS logic circuits. The main drawbacks of dynamic logic are a lack of design automation, decreased noise tolerance and increased power dissipation: domino gates typically consume higher dynamic switching and leakage power and display weaker noise immunity than static CMOS logic circuits. In this work, a new low-voltage-swing circuit technique based on a dual threshold voltage CMOS technology is presented for simultaneously reducing active and standby mode power consumption while enhancing evaluation speed and noise immunity in domino logic circuits in 65 nm deep submicron (DSM) technology. The proposed technique modifies both the upper and lower boundaries of the voltage swing at the dynamic node. Ground, power supply and threshold voltages are simultaneously optimized to minimize the power-delay product (PDP). The proposed techniques are compared by performing detailed transistor simulations on benchmark circuits such as a 1-bit half adder, 16-bit adder, 16-bit comparator, D-latch and 4-bit LFSR, using the Microwind 3 and DSCH3 CMOS layout CAD tools.
The scientific method has been traditionally used to describe and analyze reality. However, the complete reality has been postulated to consist of three worlds: physical, mental and mathematical. The conscious response to music and the arts is correlated with EEG patterns. The EEG patterns and other synchronous neural activity may be a result of the consciousness field generated in the brain due to quantum computation in the microtubules. Furthermore, the microtubules themselves may be arranged in the form of quantum cellular automata or quantum Hopfield networks. The continuity of quantum cellular automata or quantum Hopfield networks may be maintained by many structures, including the gap junctions. Since the conscious response to arts and music is correlated with EEG patterns, which may in turn be related to quantum computation, these responses are a result of activity in the real world. Hence these responses must complement the scientific description and analysis of reality in order to arrive at a comprehensive picture.
This research was carried out to study the characteristics of bamboo leaf ash stabilization of lateritic soil in highway construction. Preliminary tests were performed on three samples, A, B, and C, for identification and classification purposes, followed by the consistency limit tests. Geotechnical property tests (compaction, California bearing ratio (CBR), and triaxial) were also performed on the samples, both in the stabilized and unstabilized states, by adding 2, 4, 6, 8 and 10% bamboo leaf ash (BLA) by weight of sample to the soils. The results showed that the addition of BLA improved the strengths of the samples. Optimum moisture contents reduced to 20.20, 19.60 and 9.32% at 8, 4 and 6% BLA additions in samples A, B and C respectively, while MDD increased to 1400, 1676 and 1941 kg/m3 respectively at 8, 2 and 4% BLA additions in samples A, B, and C. The unsoaked CBR values of samples A and B increased from 5.44 to 38.21% and from 11.42 to 34.99% respectively. The shear strengths of samples A and B also increased from 181.31 to 199.00 kN/m2 and from 144.81 to 155.90 kN/m2 respectively. It was therefore concluded that bamboo leaf ash has good potential for stabilizing lateritic soils in highway construction.
Several billion tons of fillers and reinforcements are used annually in the plastics industry, and there is a huge potential market for recyclable, energy-efficient and more environmentally friendly composite materials. Medium- and low-strength fibers available in nature have enough potential for applications where high strength is not critical, and they can provide a feasible range of alternatives to conventional materials. A systematic experimental study, using a developed mould-punch set-up and testing aids, was carried out on the effect of the volume fraction of reinforcement on the longitudinal elastic modulus of unidirectional cotton fiber reinforced polyester composites. The testing was carried out as per ASTM D3039/D3039M-08. The micromechanics assessment of the obtained experimental results against models available in the literature for longitudinal elastic modulus forms an equally important constituent of the present work.
Hard cake formation
Conveyer Belt
Output of Tech. Audit
This paper aims at finding quality and profitability improvement by technical audit, through a case study, and further establishing the relationship between product quality, profitability and technical audit. A quality audit generates a report of non-conformance which basically represents the deviation from the committed quality of products; in short, it may be called a post-mortem of product quality. By virtue of the quality audit, the commitment, implementation and follow-up for product quality are aligned. This delivers good product quality to the customers, and thus the customer is benefited. In industries, quality inspectors give their decision on product quality in two categories, "ACCEPTED" or "REJECTED". The accepted products go to the customers, and the rejected products become a burden and a problem for the manufacturers. If the accepted product quantity is within the "NORMS", no one cares about the rejected product quantities whatsoever. When the rejected product quantity increases beyond the "NORMS", the analysis process starts to find out the reasons for the rejections. Sometimes it becomes too late to search out the reasons for rejections, and the survival of the industry becomes a problem. Through a technical audit and implementation of the audit report, such conditions can be avoided and controlled. Basically, quality is a function of Man, Machine, Materials, Methods, Movement, Manufacturing Processes, Monitoring and Management (8 M's). If the technicality of the 8 M's is corrected by a technical audit, the product quality will improve automatically, and so will the profitability of the organization. In short, if the 8 M's are all right, the product quality and profitability will automatically be set right. This may become an important aspect in the scenario of Indian industries. The findings are supported by a case study of a process plant (slag dryer) of a reputed Indian industry.
Bifurcation of logistic map 
Logistic map key distribution 
M-Logistic Keys (IDL Output) 
A major concern nowadays for any biometric credential management system is its potential vulnerability in protecting its information sources, i.e., protecting a genuine user's template from both internal and external threats. Today's biometric authentication systems face various risks. One of the most serious threats is the vulnerability of the template database. An attacker with access to a reference template could try to impersonate a legitimate user by reconstructing the biometric sample and creating a physical spoof. Susceptibility of the database can have a disastrous impact on the whole authentication system. The potential disclosure of digitally stored biometric data raises serious concerns about privacy and data protection. Therefore, we propose a method which integrates conventional cryptography techniques with biometrics. In this work, we present a biometric cryptosystem which encrypts the biometric template; the encryption is done by generating pseudo-random numbers based on non-linear dynamics.
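The abstract does not spell out the generator, but the accompanying figures refer to the logistic map. As an illustration only (the map parameter, seed handling, and byte extraction below are assumptions, not the paper's scheme), pseudo-random key material can be derived from the chaotic logistic map x_{k+1} = r·x_k·(1 − x_k) with r close to 4, and then combined with the template:

```python
def logistic_keystream(x0: float, r: float = 3.99, n: int = 16,
                       burn_in: int = 100) -> bytes:
    """Derive n pseudo-random key bytes from the logistic map
    x_{k+1} = r * x_k * (1 - x_k).  x0 must lie in (0, 1); r near 4
    puts the map in its chaotic regime.  burn_in iterations discard
    the initial transient so the stream starts inside the attractor."""
    x = x0
    for _ in range(burn_in):
        x = r * x * (1.0 - x)
    key = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        key.append(int(x * 256) % 256)  # quantize the state to a byte
    return bytes(key)

def xor_encrypt(template: bytes, key: bytes) -> bytes:
    """Toy stream cipher: XOR the biometric template with the keystream.
    XOR is an involution, so applying it twice restores the template."""
    return bytes(t ^ key[i % len(key)] for i, t in enumerate(template))

key = logistic_keystream(0.31415926)
cipher = xor_encrypt(b"feature-vector", key)
```

Because the map is deterministic, the same seed regenerates the same key at verification time; the seed then plays the role of the secret.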
Block diagram of speech signal and palmprint multimodal biometric system  
Components of a speaker recognition system
Mel filter bank
Extraction of MFCC and LFCC parameters
There are several analytic formulae for the Mel scale used to compute the center frequencies f_c(m). In this study we use the following common mapping:
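The mapping itself is not reproduced in this excerpt. The sketch below assumes the widely used HTK-style Mel formula m = 2595·log10(1 + f/700) (an assumption about which "common mapping" is meant) and shows how the filter-bank center frequencies f_c(m) are then obtained by spacing filters uniformly on the Mel scale:

```python
import math

def hz_to_mel(f: float) -> float:
    """HTK-style Mel mapping (assumed here): m = 2595 * log10(1 + f/700)."""
    return 2595.0 * math.log10(1.0 + f / 700.0)

def mel_to_hz(m: float) -> float:
    """Inverse of hz_to_mel."""
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_center_frequencies(num_filters: int, f_low: float, f_high: float):
    """Center frequencies f_c(m) of a Mel filter bank: the band
    [f_low, f_high] is divided uniformly on the Mel scale, giving
    num_filters centers strictly inside the band."""
    mel_low, mel_high = hz_to_mel(f_low), hz_to_mel(f_high)
    step = (mel_high - mel_low) / (num_filters + 1)
    return [mel_to_hz(mel_low + (i + 1) * step) for i in range(num_filters)]

# e.g. a 26-filter bank over 0-8000 Hz (16 kHz sampling rate assumed)
centers = mel_center_frequencies(26, 0.0, 8000.0)
```

Because the Mel scale is compressive, the resulting centers are densely packed at low frequencies and sparse at high frequencies, which is the point of MFCC analysis.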
Unimodal vs Multimodal
Biometrics-based personal identification is regarded as an effective method for automatically recognizing, with high confidence, a person's identity. This paper proposes a multimodal biometric system for identity verification using two traits, i.e., speech signal and palmprint. The proposed system is designed for applications where the training data contains a speech signal and a palmprint. It is well known that the performance of person authentication using only the speech signal or the palmprint is deteriorated by feature changes over time. Integrating the palmprint and speech information increases the robustness of person authentication. The final decision is made by fusion at the matching score level, an architecture in which feature vectors are created independently for query measures and are then compared to the enrolment templates stored during database preparation. The multimodal system is developed through fusion of speech signal and palmprint recognition.
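Fusion at the matching score level, as described above, amounts to normalizing each matcher's raw score to a common range and combining them before thresholding. A minimal sketch follows; the score ranges, the equal 0.5 weights, and the 0.6 threshold are illustrative assumptions, not values from the paper:

```python
def min_max(score: float, lo: float, hi: float) -> float:
    """Min-max normalization of a raw matcher score to [0, 1]."""
    return (score - lo) / (hi - lo)

def fuse(speech_score: float, palm_score: float,
         speech_range: tuple, palm_range: tuple,
         w_speech: float = 0.5) -> float:
    """Score-level fusion by weighted sum of normalized scores.
    Each matcher runs independently; only the scalar scores meet here."""
    s = min_max(speech_score, *speech_range)
    p = min_max(palm_score, *palm_range)
    return w_speech * s + (1.0 - w_speech) * p

# Hypothetical matcher outputs: speech scored on 0-100, palmprint on 0-1.
score = fuse(42.0, 0.8, speech_range=(0.0, 100.0), palm_range=(0.0, 1.0))
decision = "accept" if score >= 0.6 else "reject"
```

The appeal of this architecture is that it needs no access to either matcher's feature vectors, so heterogeneous traits such as speech and palmprint combine cleanly.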
This paper concerns the design and realization of a storage tank for raw palm oil. The main objective is to solve the problem of raw palm oil supply shortages and stock-outs for Small and Medium Size Enterprises in Cameroon. We first of all review the state of the art of the design and realization of storage tanks; then, for the case of raw palm oil storage, we design an open cylindrical vertical tank with a conical roof supported by a framework. We then proceed with the sizing of all the structural elements based on specially chosen mechanical criteria. We put in place software, DimCuve, to be used for the automatic sizing of storage tanks. The interest of this work is conclusive for enterprises of this sector.
The automotive drive shaft is a very important component of a vehicle. The present paper focuses on the design of such an automotive drive shaft in composite materials. Nowadays two-piece steel shafts are used as drive shafts. However, the main advantage of the present design is that a single piece of composite drive shaft can fulfil all the requirements of a drive shaft. Two different designs are proposed: one purely from Graphite/Epoxy laminae and the other using aluminum with Graphite/Epoxy. The basic requirements considered here are torsional strength, torsional buckling and bending natural frequency. An optimum design of the drive shaft is carried out, which is the cheapest and lightest but meets all of the above load requirements. A progressive failure analysis of the selected design is also done.
Consistent data availability and query processing are the two major issues to be considered in order to successfully design and implement scalable and cost-efficient peer-to-peer database systems. These aims in turn depend on how we deal with the inherent dynamics of peer-to-peer systems. Though replication has traditionally been considered a solution for providing resilient data availability and query processing in these types of systems, the cost efficiency of this measure is not optimal. In this paper, we propose a novel, more cost-efficient method of providing data availability and query processing in which we employ a Distributed Hash Table (DHT) algorithm that flattens the structure of the peer-to-peer database system optimally to satisfy these needs, thereby eliminating the need for replication.
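The abstract names DHTs without fixing a particular algorithm, so as an illustration only, the core DHT idea can be sketched as a consistent-hashing ring: every peer and every data key hash into the same circular space, and each key is owned by the first peer clockwise from it. The peer names and key format below are hypothetical:

```python
import hashlib
from bisect import bisect_right

def _h(key: str) -> int:
    """Hash a node name or data key into the shared identifier space."""
    return int(hashlib.sha1(key.encode("utf-8")).hexdigest(), 16)

class ConsistentHashRing:
    """Minimal consistent-hashing ring: each node owns the arc of the
    identifier circle ending at its own hash, so a key is routed to the
    first node whose hash follows the key's hash (wrapping around)."""
    def __init__(self, nodes):
        self.ring = sorted((_h(n), n) for n in nodes)
    def lookup(self, key: str) -> str:
        hashes = [h for h, _ in self.ring]
        i = bisect_right(hashes, _h(key)) % len(self.ring)
        return self.ring[i][1]

ring = ConsistentHashRing(["peer-a", "peer-b", "peer-c"])
owner = ring.lookup("some/table/row-42")
```

When a peer joins or leaves, only the keys on its arc move, which is what makes DHT routing tolerate the churn the abstract refers to without wholesale replication.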
A computing cluster is a collection of off-the-shelf commodity computers and resources integrated through hardware, networks and software so as to behave as a single computer. In parallel applications, some processes need to execute simultaneously, and because of the communication behavior of some processes, we cannot assume that all the processes are independent. Many of the processes need to be co-scheduled with each other, and various types of co-scheduling are available. This paper focuses mainly on bandwidth and memory. It argues for efficient utilization of cluster resources under the parallel execution of jobs, using the newer bandwidth-aware co-scheduling concept put forth here.
Specification of proposed PMLOM
PMLOM Design Parameters
Several well-known analytical techniques exist for the force profile analysis of permanent-magnet linear oscillating motors (PMLOMs). These techniques, however, make significant simplifications in order to obtain the magnetic field distribution in the air gap, from which the force profile can be found. These widely used techniques provide a reasonable approximation for force profile analysis, but fail to give truly accurate results for the exact shape of the force profile because of effects that the simplifications do not fully include. To obtain the exact shape of the force profile in these cases, the computationally expensive finite-element method (FEM) is often applied; from the resulting field distribution, the force profile is calculated by means of the Maxwell stress tensor. The objective of this paper is to determine the forces for an aluminium mover embedded with Nd-Fe-B rare-earth permanent magnets, experimentally and analytically through FEMLAB 6.2 with MathWorks software, and to develop a microcontroller-based IGBT inverter for its control. The development, finite-element analysis of the magnetic field distribution, performance, control and testing of a new axial-flux permanent-magnet linear oscillating motor (PMLOM), along with a suitable speed and thrust control technique, are described.
Geometry of the problem 
The problem of reflection of Love waves at a rigid barrier is studied in this paper for barriers of different sizes. The barrier is present in a homogeneous, isotropic and slightly dissipative surface layer. The reflected waves are obtained by the Wiener-Hopf technique and Fourier transformations. Numerical computation has been done, and conclusions have been drawn from graphs of amplitude versus wave number of the reflected Love waves. The amplitude of the reflected waves decreases rapidly with increasing wave number, then decreases at a slower rate and ultimately becomes saturated, which shows that Love waves take a long time to dissipate and keep moving around the Earth's surface for a long time. The comparison of the graphs also shows that barriers of larger sizes result in reflected Love waves with larger amplitudes.
Current Park’s vector for ideal condition. 
Experimental set up 
Current Park's vector pattern for a healthy motor 
Current Park's vector pattern for a faulty bearing with an inner race fault 
Current Park's vector pattern for a faulty bearing with an outer race fault 
The reliability of an induction motor is of paramount importance in industrial, commercial, aerospace and military applications. Bearings play an important role in the reliability and performance of all motor systems. Due to the close relationship between motor system development and bearing assembly performance, it is difficult to imagine the progress of modern rotating machinery without considering the wide application of bearings. Most faults arising in motors are often linked to bearing faults. This paper presents an experimental study to diagnose bearing faults with the help of the Park's vector approach. The experiment is conducted on a 0.5 hp three-phase induction motor. The bearing faults are replicated in the laboratory by drilling the outer and inner races of a ball bearing with the help of electric discharge machining. The LabVIEW software is used in the experiment to acquire the signal, and the acquired signal is analyzed with the Park's vector approach; the current Park's vector presentation is generated by programming in LabVIEW. The practical results show that the Park's vector approach is an effective technique for diagnosing bearing faults at an early stage.
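The Park's vector approach maps the three stator phase currents onto a two-dimensional (i_D, i_Q) plane; for a healthy machine with balanced sinusoidal currents the locus is a circle, and bearing faults distort it. A sketch of the transformation commonly used in motor current signature analysis follows (the paper's own LabVIEW implementation is not shown here; this is the standard textbook form):

```python
import math

def park_vector(ia: float, ib: float, ic: float):
    """Current Park's vector (i_D, i_Q) from the three phase currents,
    using the standard transformation of motor current signature analysis:
      i_D = sqrt(2/3)*ia - ib/sqrt(6) - ic/sqrt(6)
      i_Q = (ib - ic) / sqrt(2)
    """
    i_d = math.sqrt(2.0 / 3.0) * ia - ib / math.sqrt(6.0) - ic / math.sqrt(6.0)
    i_q = (ib - ic) / math.sqrt(2.0)
    return i_d, i_q

# For balanced sinusoidal currents of unit amplitude, the (i_D, i_Q)
# locus is a circle of radius sqrt(6)/2; fault harmonics thicken or
# deform this circle, which is what the patterns in the figures show.
theta = 0.7
ia = math.cos(theta)
ib = math.cos(theta - 2.0 * math.pi / 3.0)
ic = math.cos(theta + 2.0 * math.pi / 3.0)
radius = math.hypot(*park_vector(ia, ib, ic))
```

In practice the acquired current samples are fed through this transformation point by point, and the diagnosis is read off the shape of the resulting pattern.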
Many urban centers of the country are located on the coastal tract, apart from thousands of villages and industrial settlements. Water resources in coastal areas assume a special significance, since any developmental activity will largely depend upon the availability of fresh water to meet domestic, industrial and agricultural requirements. This increases the dependency upon groundwater for meeting the freshwater demand. As the region is close to the coast, variations in the water table levels due to excess withdrawals from wells and bore wells will cause the intrusion of seawater into the groundwater. The present paper deals with the study of saltwater intrusion in the coastal tract of Srikakulam district on an areal basis. From the results obtained, the variation in the effect of contamination with respect to distance from the shore is studied, and a comparison of the contamination in open wells and bore wells is also carried out.
Face detection result  
ROC curve  
Face detection plays a major role in biometrics, and feature selection is a problem of formidable complexity. This paper proposes a novel approach to extracting face features for face detection. The LBP features can be extracted quickly in a single scan through the raw image and lie in a lower-dimensional space, whilst still retaining facial information efficiently; they are also robust to low-resolution images. The dominant local binary pattern (DLBP) is used to extract features accurately. A number of trainable methods are emerging in empirical practice due to their effectiveness. The proposed method is a trainable system for selecting face features from over-complete dictionaries of image measurements. After the feature selection procedure is completed, an SVM classifier is used for face detection. The main advantage of this proposal is that it is trained on a very small training set; the classifier is used to increase the selection accuracy. This is advantageous not only for facilitating the data-gathering stage but, more importantly, for limiting the training time. The CBCL frontal faces dataset is used for training and validation.
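The single-scan extraction mentioned above refers to the basic LBP operator: each pixel is encoded by thresholding its 8 neighbours against it and packing the results into a byte. The paper's DLBP builds on this basic operator; the sketch below shows only the basic form, with a plain list-of-lists image to stay self-contained:

```python
def lbp_image(img):
    """Basic 8-neighbour LBP in a single scan. img is a 2D list of
    grayscale values; each interior pixel gets an 8-bit code whose bits
    record whether each neighbour is >= the center value.  Border pixels
    are left as 0 for simplicity."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    # Neighbour offsets in a fixed clockwise order; the order determines
    # which neighbour maps to which bit of the code.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = img[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                if img[y + dy][x + dx] >= c:
                    code |= 1 << bit
            out[y][x] = code
    return out

codes = lbp_image([[1, 2, 3],
                   [4, 5, 6],
                   [7, 8, 9]])
```

A histogram of these codes over an image region is the LBP feature vector; DLBP then keeps only the most frequently occurring patterns, which is where the dimensionality reduction claimed in the abstract comes from.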
The rapid depletion of world petroleum reserves and uncertainty in petroleum supply due to political and economic reasons, as well as sharp escalations in petroleum prices, have stimulated the search for alternatives to petroleum fuels. The situation is very grave in developing countries like India, which imports 70% of its required fuel, spending 30% of its total foreign exchange earnings on oil imports. Petroleum fuels are consumed by the agriculture and transport sectors, for which the diesel engine happens to be the prime mover. Diesel-fuelled vehicles discharge significant amounts of pollutants like CO, HC, NOx, soot and lead compounds, which are harmful to the environment. Though a wide variety of alternative fuels is available, research has not yet provided the right renewable fuel to replace diesel. Vegetable oils, with properties close to those of diesel fuel, may be a promising alternative for use in diesel engines; their high viscosity and low volatility are the major drawbacks. India is the second largest cotton-producing country in the world today, and cotton seeds are available in India at a cheap price. Experiments were conducted on a 5.2 BHP single-cylinder four-stroke water-cooled variable compression diesel engine. Methyl ester of cottonseed oil was blended with commercially available Xtramile diesel. Cottonseed oil methyl ester (CSOME) was blended in four different compositions varying from 10% to 40% in steps of 10 vol%. Using these four blends and Xtramile diesel, brake thermal efficiency (BTE) and brake specific fuel consumption (BSFC) were determined at a compression ratio of 17.5.
The relationship between operating pressure and BOG
Relationship between BOG and percentage of Methane
This paper focuses on the effect of pressure and heat leakage on boil-off gas (BOG) in liquefied natural gas (LNG) tanks. The Lee-Kesler-Plocker (LKP) and the Starling modified Benedict-Webb-Rubin (BWRS) empirical models were used to simulate the compressibility factor, enthalpy and hence heat leakage at various pressures, in order to determine the factors that affect the BOG in typical LNG tanks of different capacities. Using case study data, the heat leakage of 140,000 kl, 160,000 kl, 180,000 kl and 200,000 kl LNG tanks was analyzed using the LKP and BWRS models. The heat leakage of LNG tanks depends on the structure of the tanks, and small tanks lose heat to the environment due to their large surface-area-to-volume ratio. As the operating pressure was dropped to 200 mbar, all four of the LNG tanks' BOG levels reached 0.05 vol%/day. In order to satisfy the BOG design requirement, the operating pressure of the four large LNG tanks in the case study was maintained above 200 mbar. Thus, the operating pressure affects BOG in LNG tanks, but this effect is limited under extremely high operating pressure. An attempt was made to determine the relationship between the compositions of LNG and BOG, one being combustible and the other non-combustible gases; the main component of the combustible gas was methane, and nitrogen was the main non-combustible gas. The relationship between BOG and methane composition was that, as the methane fraction in the LNG increases, the BOG volume also increases. In general, the results showed a direct correlation between BOG and operating pressure. The study also found that larger LNG tanks have less BOG; however, as the operating pressure is increased, the differences in the quantity of BOG among the four tanks decrease.
Sample network 
Distributed systems in which nodes and/or edges may fail with certain probabilities have been modelled by a probabilistic network, or a graph G. Computing the residual connectedness reliability (RCR), denoted by R(G), of probabilistic networks under a fault model with both node and edge faults is very useful, but it is an NP-hard problem. Since computing the exact value of R(G) may need time exponential in the network size, it is important to calculate a tight approximate value. In this paper, we present a new approach, with an efficient algorithm, for evaluating the upper bound of R(G) for distributed systems with unreliable nodes and edges. We also apply our algorithm to some typical classes of networks to evaluate the upper bounds and show the effectiveness and efficiency of the new algorithm. Numerical results are presented.
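To make the quantity concrete: R(G) is the probability that the nodes that survive (each node up with some probability, each edge likewise) induce a connected subgraph. Since exact computation is NP-hard, a common baseline against which bounds like the paper's are checked is Monte Carlo estimation. The sketch below is such a baseline, not the paper's bounding algorithm; the convention that an empty surviving set counts as failure is an assumption:

```python
import random

def rcr_monte_carlo(nodes, edges, p_node, p_edge, trials=20000, seed=0):
    """Monte Carlo estimate of residual connectedness reliability:
    the probability that the surviving nodes (each up with prob. p_node)
    are mutually connected via surviving edges (each up with prob. p_edge).
    An empty surviving set is counted as failure (a convention choice)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        up = {v for v in nodes if rng.random() < p_node}
        if not up:
            continue
        live = [(u, v) for (u, v) in edges
                if u in up and v in up and rng.random() < p_edge]
        # Depth-first search for connectivity over the surviving subgraph.
        adj = {v: [] for v in up}
        for u, v in live:
            adj[u].append(v)
            adj[v].append(u)
        start = next(iter(up))
        seen, stack = {start}, [start]
        while stack:
            for w in adj[stack.pop()]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        if seen == up:
            hits += 1
    return hits / trials

# A 3-node path graph with fairly reliable components.
estimate = rcr_monte_carlo([0, 1, 2], [(0, 1), (1, 2)], 0.9, 0.9)
```

The estimator's error shrinks like 1/sqrt(trials), which is exactly why tight analytical bounds of the kind the paper proposes are valuable for large networks.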
Flowchart of FCPMA  
Project network  
Network matrix  
Correct scheduling of a project is a necessary condition for the project's success. In traditional models, the activity duration times are deterministic and known. In the real world, however, accurate calculation of the time for performing each activity is not possible and is always faced with uncertainty. In this paper, the duration of each activity is estimated by experts as linguistic variables, and these variables are represented as fuzzy numbers using fuzzy theory. Estimating the project completion duration and determining the project's critical path is made possible by solving a fuzzy linear programming model. To solve the model, a Fuzzy Critical Path Method Algorithm (FCPMA) is introduced that uses fuzzy number ranking. In none of the steps of this method are the fuzzy numbers defuzzified, and the project completion duration is obtained as a trapezoidal fuzzy number. Finally, the performance of the introduced algorithm is shown using an application example.
Variation of Output power and Frequency with Cap diameter (Cavity tuning) 
Variation of output power and frequency with cap height (cavity tuning)
For any given frequency, the dominant mode has m = n = 1, and it corresponds to the minimum radius of the disc for resonance, given by the root of J_n(kr) = 0 for the dominant mode.
A coherent study of the tuning properties of a resonant-cap IMPATT oscillator at Ka-band has been carried out by mechanical and electronic means. It is experimentally observed that the output power of the IMPATT oscillator passes through a maximum at an optimum combination of cap diameter and cap height. An empirical relation is obtained between the cap diameter and the wavelength of the optimised resonant-cap oscillator, which agrees well with the theoretical relation. Electronic tuning is carried out by varying the dc bias current with the optimized cavity parameters. It is observed that the variations of oscillation frequency and power output with dc bias current are characterised by three ranges of bias current. The effect of the sliding short tuner position on the performance of the oscillator under optimised conditions has also been studied. Finally, the injection locking of the free-running oscillator has been studied with a Ka-band signal generator as a reference source. The injection-locked IMPATT oscillator shows good phase noise performance.