Search results for: hybrid energy storage
1532 Analysis of Performance of 3T1D Dynamic Random-Access Memory Cell
Authors: Nawang Chhunid, Gagnesh Kumar
Abstract:
On-chip memories consume a significant portion of the overall die space and power in modern microprocessors. On-chip caches rely on Static Random-Access Memory (SRAM) cells, and technology scaling continues as per Moore's law. Unfortunately, this scaling affects stability, performance, and leakage power, which will become major problems for future SRAMs in aggressive nanoscale technologies due to increasing device mismatch and variations. The 3T1D Dynamic Random-Access Memory (DRAM) cell is a non-destructive-read DRAM cell with three transistors and a gated diode. In the 3T1D DRAM cell, the gated diode (D1) acts as a storage device and also as an amplifier, which leads to fast read access. Due to its high tolerance to process variation, high density, and low memory cost compared to the 6T SRAM cell, it is widely used in advanced microprocessors for on-chip data and program memory. In the present paper, it is shown that the 3T1D DRAM cell can perform better in terms of read access time than 6T, 4T, and 3T SRAM cells.
Keywords: DRAM cell, read access time, write access time, retention time, average power dissipation, Tanner EDA tool.
1531 Novel Hybrid Method for Gene Selection and Cancer Prediction
Authors: Liping Jing, Michael K. Ng, Tieyong Zeng
Abstract:
Microarray data profiles gene expression on a whole-genome scale; therefore, it provides a good way to study associations between gene expression and the occurrence or progression of cancer. More and more researchers have realized that microarray data are helpful for predicting cancer samples. However, the dimension of the gene expression data is much larger than the sample size, which makes this task very difficult. Therefore, identifying the significant genes associated with cancer has become an urgent and challenging research topic. Many feature selection algorithms have been proposed in the past that focus on improving cancer predictive accuracy at the expense of ignoring the correlations between the features. In this work, a novel framework (named SGS) is presented for stable gene selection and efficient cancer prediction. The proposed framework first performs a clustering algorithm to find gene groups in which genes have high correlation coefficients, then selects the significant genes in each group with the Bayesian Lasso and the important gene groups with the group Lasso, and finally builds a prediction model on the shrunken gene space with an efficient classification algorithm (such as SVM, 1NN, or regression). Experimental results on real-world data show that the proposed framework often outperforms existing feature selection and prediction methods such as SAM, IG, and Lasso-type prediction models.
Keywords: Gene selection, cancer prediction, Lasso, clustering, classification.
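A minimal sketch of an SGS-style pipeline is given below: cluster the genes, select representatives within each cluster, then classify on the reduced space. Plain Lasso and SVM stand in for the Bayesian Lasso / group Lasso steps described in the abstract, and the data, cluster count, and alpha are synthetic assumptions, not the authors' settings.

```python
# Illustrative SGS-style sketch: group correlated genes, pick genes per group, classify.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Lasso
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))                   # 60 samples, 500 "genes"
y = (X[:, :5].sum(axis=1) > 0).astype(int)       # labels driven by a few genes

# Step 1: group correlated genes (clustering on the gene axis).
groups = KMeans(n_clusters=20, n_init=10, random_state=0).fit_predict(X.T)

# Step 2: within each group, keep genes with non-zero Lasso coefficients.
selected = []
for g in np.unique(groups):
    idx = np.where(groups == g)[0]
    coef = Lasso(alpha=0.05).fit(X[:, idx], y).coef_
    selected.extend(idx[np.abs(coef) > 1e-6])

# Step 3: build a classifier on the shrunken gene space.
X_sel = X[:, selected] if selected else X
score = cross_val_score(SVC(kernel="linear"), X_sel, y, cv=5).mean()
print(f"{len(selected)} genes selected, CV accuracy = {score:.2f}")
```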
1530 Frequency-Variation Based Method for Parameter Estimation of Transistor Amplifier
Authors: Akash Rathee, Harish Parthasarathy
Abstract:
In this paper, a frequency-variation based method is proposed for transistor parameter estimation in a common-emitter transistor amplifier circuit. We design an algorithm to estimate the transistor parameters based on noisy measurements of the output voltage when the input voltage is a sine wave of variable frequency and constant amplitude. The common-emitter amplifier circuit has been modelled using the transistor Ebers-Moll equations, and the perturbation technique has been used to separate the linear and nonlinear parts of the Ebers-Moll equations. This model of the amplifier is used to determine the amplitude of the output sinusoid as a function of the frequency and the parameter vector. Then, applying the proposed method to the frequency components, the transistor parameters are estimated. Compared to the conventional time-domain least squares method, the proposed method requires much less data storage and results in more accurate parameter estimation, as it exploits information in the time and frequency domains simultaneously. The proposed method can be utilized for parameter estimation of an analog device over its operating range of frequencies, as it uses output data collected at different frequencies.
Keywords: Perturbation technique, parameter estimation, frequency-variation based method.
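A minimal sketch of the frequency-variation idea follows: measure the output amplitude at several frequencies and fit model parameters by least squares. A generic first-order amplitude response stands in for the Ebers-Moll-based expression derived in the paper; the gain and corner frequency are hypothetical parameters.

```python
# Fit amplitude-vs-frequency measurements to a simple response model (illustrative only).
import numpy as np
from scipy.optimize import least_squares

def amplitude(params, f):
    g, fc = params
    return g / np.sqrt(1.0 + (f / fc) ** 2)

rng = np.random.default_rng(1)
freqs = np.logspace(2, 6, 40)                       # 100 Hz .. 1 MHz sweep
true = (120.0, 5e4)                                 # "unknown" gain and corner frequency
measured = amplitude(true, freqs) + rng.normal(0, 1.0, freqs.size)  # noisy output amplitudes

fit = least_squares(lambda p: amplitude(p, freqs) - measured, x0=[50.0, 1e4])
print("estimated gain and corner frequency:", fit.x)
```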
1529 An Agent Based Dynamic Resource Scheduling Model with FCFS-Job Grouping Strategy in Grid Computing
Authors: Raksha Sharma, Vishnu Kant Soni, Manoj Kumar Mishra, Prachet Bhuyan, Utpal Chandra Dey
Abstract:
Grid computing is a group of clusters connected over high-speed networks that involves coordinating and sharing computational power, data storage, and network resources operating across dynamic and geographically dispersed locations. Resource management and job scheduling are critical tasks in grid computing. Resource selection becomes challenging due to the heterogeneity and dynamic availability of resources. Job scheduling is an NP-complete problem, and different heuristics may be used to reach an optimal or near-optimal solution. This paper proposes a model for resource and job scheduling in a dynamic grid environment. The main focus is to maximize resource utilization and minimize the processing time of jobs. The grid resource selection strategy is based on a Max Heap Tree (MHT), which suits large-scale applications; the root node of the MHT is selected for job submission. The job grouping concept is used to maximize resource utilization when scheduling jobs in grid computing. The proposed resource selection model and job grouping concept are used to enhance the scalability, robustness, efficiency, and load balancing ability of the grid.
Keywords: Agent, grid computing, job grouping, Max Heap Tree (MHT), resource scheduling.
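A rough sketch of the scheduling idea is shown below: resources are kept in a max-heap keyed by available capacity, and FCFS jobs are grouped until a group fills the root resource. The capacities and job lengths are hypothetical, and the single-capacity model is a simplification of the paper's agent-based scheme.

```python
# Max-heap resource selection with FCFS job grouping (illustrative sketch).
import heapq

# (negated capacity in MI, name); negation makes heapq behave as a max-heap
resources = [(-400, "R1"), (-900, "R2"), (-650, "R3")]
heapq.heapify(resources)

jobs = [120, 300, 250, 80, 500, 150, 90]           # FCFS job lengths in MI (hypothetical)

while jobs:
    neg_cap, name = heapq.heappop(resources)       # root = resource with the most capacity
    cap = -neg_cap
    group = []
    while jobs and sum(group) + jobs[0] <= cap:    # grow the group up to the resource capacity
        group.append(jobs.pop(0))
    if not group:                                  # a single job larger than any resource: send alone
        group.append(jobs.pop(0))
    print(f"group {group} -> {name}")
    heapq.heappush(resources, (neg_cap, name))     # resource becomes available again
```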
1528 Shifting Paradigms of Culture: Rise of Secular Sensibility in Indian Literature
Authors: Nidhi Chouhan
Abstract:
The burgeoning demand for 'secularism' has shaken the pillars of cultural studies in contemporary literature. The perplexity of the culturally estranged term 'secular' gives rise to temporal ideologies across the world. Hence, it is high time to examine this concept in the context of the Indian lifestyle, which is a blend of assimilated cultures woven into multiple religious fabrics. The infliction of such secular taste is depicted in literary productions like 'Satanic Verses' and 'An Area of Darkness'. The paper conceptually makes a cross-cultural analysis of anti-religious Indian literary texts, assessing their revitalization in current times. Further, this paper studies the increasing popularity of secular sensibility in contemporary times. The mushrooming elements of secularism, such as abstraction, spirituality, liberation, and individualism, give rise to a seemingly newer idea, 'plurality', making the literature highly hybrid. This approach has been used to study Indian modernity as reflected in its literature. Seminal works of stalwarts are used to understand the consequences of this cultural synthesis. Conclusively, this theoretical research inspects the efficiency of secular culture, intertwined with internal coherence, and throws light on the plurality of texts in Indian literature.
Keywords: Culture, Indian, literature, plurality, religion, secular, secularism.
1527 Quad Tree Decomposition Based Analysis of Compressed Image Data Communication for Lossy and Lossless Using WSN
Authors: N. Muthukumaran, R. Ravi
Abstract:
A Quad Tree Decomposition (QTD) based performance analysis of compressed image data communication, for lossy and lossless compression over a wireless sensor network, is presented. Images have a considerably higher storage requirement than text. While transmitting multimedia content, there is a chance of packets being dropped due to noise and interference. At the receiver end, the packets that carry valuable information might be damaged or lost due to noise, interference, and congestion. To prevent valuable information from being lost, various retransmission schemes have been proposed. In the proposed scheme, QTD is used. QTD is an image segmentation method that divides the image into homogeneous areas. The proposed scheme involves analysis of parameters such as compression ratio, peak signal-to-noise ratio, mean square error, and bits per pixel of the compressed image, as well as analysis of the difficulties encountered during data packet communication in wireless sensor networks. Considering the above, this paper uses QTD to improve the compression ratio as well as visual quality, and the algorithm is implemented in MATLAB 7.1 and the NS2 simulator.
Keywords: Image compression, Compression Ratio, Quad tree decomposition, Wireless sensor networks, NS2 simulator.
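The quality metrics named in the abstract (MSE, PSNR, compression ratio, bits per pixel) can be computed as in the sketch below for an original/reconstructed image pair. The random arrays and the compressed byte count are placeholders for real image data, not results from the paper.

```python
# Compute MSE, PSNR, compression ratio and bits per pixel for an image pair (placeholder data).
import numpy as np

def mse(a, b):
    return float(np.mean((a.astype(float) - b.astype(float)) ** 2))

def psnr(a, b, peak=255.0):
    m = mse(a, b)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)

rng = np.random.default_rng(0)
original = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
reconstructed = np.clip(original + rng.integers(-4, 5, original.shape), 0, 255).astype(np.uint8)

original_bytes, compressed_bytes = original.size, 9830          # hypothetical compressed size
print("MSE :", round(mse(original, reconstructed), 2))
print("PSNR:", round(psnr(original, reconstructed), 2), "dB")
print("CR  :", round(original_bytes / compressed_bytes, 2))
print("bpp :", round(8.0 * compressed_bytes / original.size, 3))
```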
1526 Analytical Prediction of Seismic Response of Steel Frames with Superelastic Shape Memory Alloy
Authors: Mohamed Omar
Abstract:
Superelastic Shape Memory Alloy (SMA) has gained acceptance for use in connections in steel structures. The seismic behaviour of steel frames with SMA is assessed in this study. Three eight-storey steel frames with different SMA systems are suggested: the first is braced with a diagonal bracing system, the second is braced with a knee bracing system, while in the last one the SMA is used as connections at the plastic hinge regions of the beams. Nonlinear time history analyses of the steel frames with SMA subjected to two different ground motion records have been performed using the SeismoStruct software. To evaluate the efficiency of the suggested systems, the dynamic responses of the frames were compared. From the comparison results, it can be concluded that using SMA elements is an effective way to improve the dynamic response of structures subjected to earthquake excitations. Implementing SMA braces can lead to a reduction in residual roof displacement. The shape memory alloy is effective in reducing the maximum displacement at the frame top, and it provides a large elastic deformation range. SMA connections are very effective in dissipating energy and reducing the total input energy of the whole frame under severe seismic ground motion. The SMA connection system is more effective in controlling the reaction forces at the frame base than the other bracing systems. Using SMA as bracing is more effective in reducing displacements. The efficiency of SMA depends on the input wave motions and the construction system as well.
Keywords: Finite element analysis, seismic response, shape memory alloy, steel frame, superelasticity.
1525 A Model for Estimation of Efforts in Development of Software Systems
Authors: Parvinder S. Sandhu, Manisha Prashar, Pourush Bassi, Atul Bisht
Abstract:
Software effort estimation is the process of predicting the most realistic amount of effort required to develop or maintain software based on incomplete, uncertain, and/or noisy input. Effort estimates may be used as input to project plans, iteration plans, and budgets. Various models, such as the Halstead, Walston-Felix, Bailey-Basili, Doty, and GA based models, have already been used to estimate software effort for projects. In this study, statistical models, a Fuzzy-GA inference system, and a Neuro-Fuzzy (NF) inference system are experimented with to estimate software effort for projects. The performance of the developed models was tested on NASA software project datasets, and the results are compared with the Halstead, Walston-Felix, Bailey-Basili, Doty, and Genetic Algorithm based models mentioned in the literature. The results show that the NF model has the lowest MMRE and RMSE values, performing best compared with the Fuzzy-GA based hybrid inference system and the other existing models used for effort prediction.
Keywords: Neuro-Fuzzy model, Halstead model, Walston-Felix model, Bailey-Basili model, Doty model, GA based model, genetic algorithm.
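The comparison criteria used in the paper can be illustrated with a short sketch: compute MMRE and RMSE for a candidate model's predictions against actual efforts. The Walston-Felix form E = 5.2 * KLOC^0.91 is the commonly cited version and is used here only as an example model; the project sizes and actual efforts are hypothetical, not the NASA dataset used in the paper.

```python
# MMRE and RMSE evaluation of a candidate effort model (illustrative data).
import numpy as np

def mmre(actual, predicted):
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean(np.abs(actual - predicted) / actual))

def rmse(actual, predicted):
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.sqrt(np.mean((actual - predicted) ** 2)))

kloc = np.array([2.1, 5.0, 10.3, 24.0, 46.2])                  # project sizes (KLOC)
actual_effort = np.array([9.0, 22.0, 48.0, 110.0, 170.0])      # person-months (hypothetical)

walston_felix = 5.2 * kloc ** 0.91                             # commonly cited coefficients
print("Walston-Felix  MMRE =", round(mmre(actual_effort, walston_felix), 3),
      " RMSE =", round(rmse(actual_effort, walston_felix), 2))
```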
1524 Surrogate based Evolutionary Algorithm for Design Optimization
Authors: Maumita Bhattacharya
Abstract:
Optimization is often a critical issue for most system design problems. Evolutionary algorithms are population-based, stochastic search techniques widely used as efficient global optimizers. However, finding the optimal solution to complex, high-dimensional, multimodal problems often requires highly computationally expensive function evaluations and hence is practically prohibitive. The Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model presented in our earlier work [14] reduced computation time by controlled use of meta-models to partially replace actual function evaluations with approximate function evaluations. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model. Situations such as model formation involving variable input dimensions and noisy data certainly cannot be covered by this assumption. In this paper, we present an enhanced version of DAFHEA that incorporates a multiple-model based learning approach for the SVM approximator. DAFHEA-II (the enhanced version of the DAFHEA framework) also overcomes the high computational expense involved with the additional clustering requirements of the original DAFHEA framework. The proposed framework has been tested on several benchmark functions, and the empirical results illustrate the advantages of the proposed technique.
Keywords: Evolutionary algorithm, fitness function, optimization, meta-model, stochastic method.
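A minimal sketch of the surrogate idea behind this family of methods is given below: evaluate a fraction of each generation exactly, train an SVM regressor on the archive of exact evaluations, and score the rest of the population with the surrogate. The sphere objective, population sizes, and mutation scheme are illustrative choices, not DAFHEA's actual settings.

```python
# Surrogate-assisted evolutionary loop with an SVM meta-model (illustrative sketch).
import numpy as np
from sklearn.svm import SVR

def expensive_objective(x):                 # stand-in for a costly simulation
    return float(np.sum(x ** 2))

rng = np.random.default_rng(0)
dim, pop_size, generations = 5, 30, 20
pop = rng.uniform(-5, 5, size=(pop_size, dim))
archive_X, archive_y = [], []

for g in range(generations):
    exact_idx = rng.choice(pop_size, size=pop_size // 3, replace=False)
    fitness = np.empty(pop_size)
    for i in exact_idx:                                   # controlled exact evaluations
        fitness[i] = expensive_objective(pop[i])
        archive_X.append(pop[i])
        archive_y.append(fitness[i])
    surrogate = SVR(kernel="rbf").fit(np.array(archive_X), np.array(archive_y))
    rest = np.setdiff1d(np.arange(pop_size), exact_idx)
    fitness[rest] = surrogate.predict(pop[rest])          # approximate the remainder
    parents = pop[np.argsort(fitness)[: pop_size // 2]]   # truncation selection
    children = parents + rng.normal(0, 0.3, parents.shape)
    pop = np.vstack([parents, children])

print("best exactly evaluated value found:", min(archive_y))
```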
1523 Neural Networks-Based Acoustic Annoyance Model for Laptop Hard Disk Drive
Authors: Yi Chao Ma, Cheng Siong Chin, Wai Lok Woo
Abstract:
Over the last decade, there has been rapid growth in digital multimedia, such as high-resolution media files and three-dimensional movies. Hence, there is a need for large digital storage such as the Hard Disk Drive (HDD). As such, users expect to have a quieter HDD in their laptops. In this paper, a jury test has been conducted on a group of 34 people, 17 of whom are students representing potential consumers, while the remainder are engineers familiar with HDDs. A total of 13 HDD sound samples were selected from over a hundred HDD noise recordings. These samples were selected based on an agreed subjective rating. The samples were played to the participants using a head acoustics playback system, which enabled them to experience an environment as similar as possible to the one in which the recordings were made. Analysis of the results indicated that the different groups have different perceptions of the noises. Two neural network-based acoustic annoyance models are established based on a backpropagation neural network. Four psychoacoustic metrics (loudness, sharpness, roughness, and fluctuation strength) are used as the inputs of the model, and the subjective evaluation results are taken as the output. The developed models are reasonably accurate in simulating both training and test samples.
Keywords: Hard disk drive noise, jury test, neural network model, psychoacoustic annoyance.
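The type of model described can be sketched as a small backpropagation-trained network mapping the four psychoacoustic metrics to a subjective annoyance score. The 13-sample data here are random placeholders with assumed metric ranges, not the jury-test results.

```python
# Tiny backpropagation model: four psychoacoustic inputs -> annoyance score (placeholder data).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# columns: loudness, sharpness, roughness, fluctuation strength (assumed ranges)
X = rng.uniform([2.0, 1.0, 0.5, 0.2], [6.0, 3.0, 2.5, 1.5], size=(13, 4))
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2] + 0.1 * X[:, 3]   # surrogate jury score

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X, y)
print("predicted annoyance for a new sample:", model.predict([[4.2, 1.8, 1.1, 0.6]])[0])
```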
1522 Comprehensive Regional Drought Assessment Index
Authors: A. Zeynolabedin, M. A. Olyaei, B. Ghiasi
Abstract:
Drought is an inevitable part of the earth's climate. It occurs regularly with no clear warning and without recognizing borders. In addition, its impact is cumulative and not immediately discernible. Iran is located in a semi-arid region where droughts occur periodically as a natural hazard. The Standardized Precipitation Index (SPI), Surface Water Supply Index (SWSI), and Palmer Drought Severity Index (PDSI) are three well-known indices that describe drought severity; each has its own advantages and disadvantages and can be used for specific types of drought. These indices take into account factors such as precipitation, reservoir storage and discharge, temperature, and potential evapotranspiration in determining drought severity. In this paper, all three indices are first calculated for the Aharchay river watershed, located in East Azarbaijan province in the northwestern part of Iran. Next, based on two other important parameters, groundwater level and solar radiation, two new indices are defined. Finally, considering all five aforementioned indices, a combined drought index (CDI) is presented and calculated for the region. This combined index is based on the meteorological, hydrological, and agricultural features of the region. The results show that the most severe drought condition in the Aharchay watershed occurred in June 2004. The results of this study can be used for drought monitoring and for drought mitigation planning.
Keywords: Drought, index variation, regional assessment, monitoring.
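A heavily simplified sketch of combining several indices into one score is shown below: each index is standardized over its record and averaged. The paper's actual CDI weighting of SPI, SWSI, PDSI, groundwater level, and solar radiation is not specified in the abstract, so the equal weights, sign convention, and monthly values here are assumptions for illustration only.

```python
# Equal-weight combination of standardized drought indices (illustrative assumptions).
import numpy as np

monthly = {                    # hypothetical monthly index values for one station
    "SPI":  np.array([0.3, -0.8, -1.5, -2.1, -1.2, 0.1]),
    "SWSI": np.array([0.5, -0.4, -1.1, -1.8, -0.9, 0.2]),
    "PDSI": np.array([0.2, -1.0, -1.9, -2.5, -1.4, 0.0]),
    "GWL":  np.array([0.1, -0.5, -1.2, -2.0, -1.1, -0.2]),
    "SRAD": np.array([-0.2, 0.4, 0.9, 1.6, 0.8, 0.1]),   # higher radiation assumed to worsen drought
}
weights = {k: 0.2 for k in monthly}                       # equal weights (assumed)

def zscore(x):
    return (x - x.mean()) / x.std()

# flip the sign of the radiation term so that a more negative CDI means more severe drought
cdi = sum(w * zscore(monthly[k]) * (-1 if k == "SRAD" else 1) for k, w in weights.items())
print("most severe month index:", int(np.argmin(cdi)), "CDI =", round(float(cdi.min()), 2))
```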
1521 Characterization of InGaAsP/InP Quantum Well Lasers
Authors: K. Melouk, M. Dellakrachai
Abstract:
An analytical formula for the optical gain based on a simple parabolic-band model, obtained by introducing theoretical expressions for the quantized energy, is presented. The model used in this treatment takes into account the effects of intraband relaxation. It is shown, as a result, that the gain for the TE mode is larger than that for the TM mode and that the presence of acceptor impurities increases the peak gain.
Keywords: Laser, quantum well, semiconductor, InGaAsP.
1520 Single Phase 13-Level D-STATCOM Inverter with Distributed System
Authors: R. Kamalakannan, N. Ravi Kumar
Abstract:
Global energy consumption is increasing persistently, and distributed power generation through renewable energy is essential. To meet the power requirements of consumers without voltage fluctuations and losses, the modeling and design of a multilevel inverter with Flexible AC Transmission System (FACTS) capability is presented. The presented inverter uses a 13-level cascaded H-bridge topology of Insulated Gate Bipolar Transistors (IGBTs) with an inbuilt Distributed Static Synchronous Compensator (DSTATCOM). The DSTATCOM device provides power factor control and stability at local feeder lines, and the inverter reduces Total Harmonic Distortion (THD). The 13-level inverter utilizes 52 switches; each H-bridge is fed separately with a single DC source, and the Pulse Width Modulation (PWM) technique is used for switching the IGBTs. The control strategy implemented for the inverter transmits active power to the grid and maintains a stable power factor while achieving steady-state power transmission. A significant outcome of this project is the improvement of output voltage quality with steady-state power transmission and low THD. Simulation of the inverter with the DSTATCOM is performed using the MATLAB/Simulink environment. A scaled prototype model of the proposed inverter was built, and its results were validated against the simulated results.
Keywords: FACTS devices, distributed static synchronous compensators, DSTATCOM, total harmonics elimination, modular multilevel converter.
1519 Application of GIS-Based Construction Engineering: An Electronic Document Management System
Authors: Mansour N. Jadid
Abstract:
This paper describes the implementation of a GIS to provide decision support for successfully monitoring the movement and storage of materials, hence ensuring that finished products travel from the point of origin to the destination construction site through the supply-chain management (SCM) system. This system ensures the efficient operation of suppliers, manufacturers, and distributors by determining the shortest path from the point of origin to the final destination to reduce construction costs, minimize time, and enhance productivity. These systems are essential to the construction industry because they reduce costs and save time, thereby improving productivity and effectiveness. This study describes a typical supply-chain model and a geographical information system (GIS)-based SCM that focuses on implementing an electronic document management system, which maps the application framework to integrate geodetic support with the supply-chain system. This process provides guidance for locating the nearest suppliers to fill the information needs of project members in different locations. Moreover, this study illustrates the use of a GIS-based SCM as a collaborative tool in innovative methods for implementing Web mapping services, as well as aspects of their integration, by generating an interactive GIS platform for the construction industry.
Keywords: Construction, coordinate, engineering, GIS, management, map.
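The shortest-path component of such a GIS-based SCM can be sketched with a small weighted road graph and Dijkstra's algorithm to find the nearest supplier to a construction site. The node names and edge distances below are hypothetical.

```python
# Dijkstra's algorithm over a toy road graph to locate the nearest supplier (illustrative).
import heapq

graph = {                                   # adjacency list with distances in km
    "site":      [("junction1", 4), ("junction2", 7)],
    "junction1": [("supplierA", 6), ("junction2", 2)],
    "junction2": [("supplierB", 3)],
    "supplierA": [],
    "supplierB": [],
}

def dijkstra(graph, start):
    dist = {start: 0.0}
    queue = [(0.0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if d > dist.get(node, float("inf")):
            continue
        for nxt, w in graph[node]:
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(queue, (nd, nxt))
    return dist

dist = dijkstra(graph, "site")
nearest = min((n for n in dist if n.startswith("supplier")), key=dist.get)
print("nearest supplier:", nearest, "at", dist[nearest], "km")
```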
1518 A TIPSO-SVM Expert System for Efficient Classification of TSTO Surrogates
Authors: Ali Sarosh, Dong Yun-Feng, Muhammad Umer
Abstract:
Fully reusable spaceplanes do not exist as yet. This implies that design qualification for an optimized, highly integrated forebody-inlet configuration of a booster-stage vehicle cannot be based on archival data of other spaceplanes. Therefore, this paper proposes a novel TIPSO-SVM expert system methodology. A non-trivial problem related to the optimization and classification of the hypersonic forebody-inlet configuration, in conjunction with the mass model of the two-stage-to-orbit (TSTO) vehicle, is solved. The hybrid-heuristic machine learning methodology is based on a two-step improved particle swarm optimizer (TIPSO) algorithm and a two-step support vector machine (SVM) data classification method. The efficacy of the method is tested by first evolving an optimal configuration for the hypersonic compression system using the TIPSO algorithm and thereafter classifying the results using the two-step SVM method. In the first step, extensive but non-classified mass-model training data for multiple optimized configurations are segregated and pre-classified for learning of the SVM algorithm. In the second step, the TIPSO-optimized mass-model data are classified using the SVM classification. Results showed remarkable improvement in the configuration and mass model along with the sizing parameters.
Keywords: TIPSO-SVM expert system, TIPSO algorithm, two-step SVM method, aerothermodynamics, mass-modeling, TSTO vehicle.
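A two-phase sketch in the spirit of this methodology is given below: a basic particle swarm optimizer minimizes a placeholder objective standing in for the aerothermodynamic configuration score, and an SVM then classifies labeled design samples. Neither the two-step PSO improvements nor the two-step SVM scheme of the paper is reproduced; objective, bounds, and labels are synthetic.

```python
# Phase 1: basic PSO; Phase 2: SVM classification of design samples (illustrative only).
import numpy as np
from sklearn.svm import SVC

def objective(x):                       # placeholder for the configuration cost
    return np.sum(x ** 2, axis=1)

rng = np.random.default_rng(0)
n, dim, iters = 30, 4, 100
pos = rng.uniform(-5, 5, (n, dim))
vel = np.zeros((n, dim))
pbest, pbest_val = pos.copy(), objective(pos)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(iters):                  # standard PSO velocity/position update
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    val = objective(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("optimized design vector:", np.round(gbest, 3))

# Classification phase: label synthetic designs as feasible/infeasible and train an SVM.
designs = rng.uniform(-5, 5, (200, dim))
labels = (objective(designs) < 20).astype(int)
clf = SVC(kernel="rbf").fit(designs, labels)
print("classifier accuracy on training designs:", clf.score(designs, labels))
```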
1517 Full Potential Study of Electronic and Optical Properties of NdF3
Authors: Sapan Mohan Saini
Abstract:
We report the electronic structure and optical properties of the NdF3 compound. Our calculations are based on density functional theory (DFT) using the full potential linearized augmented plane wave (FPLAPW) method with the inclusion of spin-orbit coupling. We employed the local spin density approximation (LSDA) and the Coulomb-corrected local spin density approximation (LSDA + U). We find that the standard LSDA approach is incapable of correctly describing the electronic properties of such materials, since it positions the f-bands incorrectly, resulting in an incorrect metallic ground state. On the other hand, the LSDA + U approximation, known for treating the highly correlated 4f electrons properly, is able to reproduce the correct insulating ground state. Interestingly, however, we do not find any significant differences between the optical properties calculated using LSDA and LSDA + U, suggesting that the 4f electrons do not play a decisive role in the optical properties of these compounds. The reflectivity of the NdF3 compound stays low up to 7 eV, which is consistent with its large energy gap. The calculated energy gaps are in good agreement with experiments. Our calculated reflectivity compares well with the experimental data, and the results are analyzed in the light of band-to-band transitions.
Keywords: FPLAPW method, optical properties, rare earth trifluorides, LSDA+U.
1516 Tidal Data Analysis using ANN
Authors: Ritu Vijay, Rekha Govil
Abstract:
The design of a complete expansion that allows for compact representation of certain relevant classes of signals is a central problem in signal processing applications. Achieving such a representation means knowing the signal features for the purposes of denoising, classification, interpolation, and forecasting. Multilayer neural networks are a relatively new class of techniques that are mathematically proven to approximate any continuous function arbitrarily well. Radial Basis Function (RBF) networks, which make use of a Gaussian activation function, have also been shown to be universal approximators. In this age of ever-increasing digitization in the storage, processing, analysis, and communication of information, there are numerous examples of applications where one needs to construct a continuously defined function or numerical algorithm to approximate, represent, and reconstruct the given discrete data of a signal. Often one wishes to manipulate the data in a way that requires information not included explicitly in the data, which is done through interpolation and/or extrapolation. Tidal data are a perfect example of a time series, and many statistical techniques have been applied for tidal data analysis and representation; ANNs are a recent addition to such techniques. In the present paper, we describe the time series representation capabilities of a special type of ANN, the Radial Basis Function network, and present the results of tidal data representation using RBF. Tidal data analysis and representation is one of the important requirements for forecasting in marine science.
Keywords: ANN, RBF, tidal data.
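A minimal sketch of representing a tide-like time series with a Gaussian RBF network follows: place centres along the time axis, build the design matrix, and solve the output weights by least squares. The two-constituent synthetic tide, centre count, and width are illustrative choices, not the paper's data or settings.

```python
# Gaussian RBF representation of a synthetic tidal series solved by linear least squares.
import numpy as np

t = np.linspace(0, 96, 400)                                    # 4 days of samples (hours)
tide = 1.2 * np.sin(2 * np.pi * t / 12.42) + 0.4 * np.sin(2 * np.pi * t / 24.0)
tide += np.random.default_rng(0).normal(0, 0.05, t.size)       # measurement noise

centres = np.linspace(0, 96, 40)                               # RBF centres along the time axis
width = 3.0
design = np.exp(-((t[:, None] - centres[None, :]) ** 2) / (2 * width ** 2))
weights, *_ = np.linalg.lstsq(design, tide, rcond=None)

reconstruction = design @ weights
rms_error = float(np.sqrt(np.mean((reconstruction - tide) ** 2)))
print("RMS representation error:", round(rms_error, 4))
```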
1515 Large Scale Production of Polyhydroxyalkanoates (PHAs) from Wastewater: A Study of Techno-Economics, Energy Use and Greenhouse Gas Emissions
Authors: Cora Fernandez Dacosta, John A. Posada, Andrea Ramirez
Abstract:
The biodegradable polymer family of polyhydroxyalkanoates is an interesting substitute for conventional fossil-based plastics. However, the manufacturing and environmental impacts associated with their production via intracellular bacterial fermentation are strongly dependent on the raw material used and on the energy consumption during the extraction process, limiting their potential for commercialization. Industrial wastewater is studied in this paper as a promising alternative feedstock for waste valorization. Based on results from laboratory and pilot-scale experiments, a conceptual process design, techno-economic analysis, and life cycle assessment are developed for the large-scale production of the most common type of polyhydroxyalkanoate, polyhydroxybutyrate (PHB). Intracellular polyhydroxybutyrate is obtained via fermentation of the microbial community present in industrial wastewater, and the downstream processing is based on chemical digestion with surfactant and hypochlorite. The economic potential and environmental performance results help identify bottlenecks and the best opportunities to scale up the process prior to industrial implementation. The outcome of this research indicates that the fermentation of wastewater towards PHB presents advantages compared to traditional PHA production from sugars, because of the null environmental burdens and financial costs of the raw material in the bioplastic production process. Nevertheless, process optimization is still required to compete with the petrochemical counterparts.
Keywords: Circular economy, life cycle assessment, polyhydroxyalkanoates, waste valorization.
1514 Formulation of Mortars with Marine Sediments
Authors: Nor-Edine Abriak, Mouhamadou Amar, Mahfoud Benzerzour
Abstract:
The transition to a more sustainable economy is driven by a reduction in the consumption of raw materials for equivalent production. The recovery of byproducts, and especially of dredged sediment, as a mineral addition in cement matrices represents an alternative for reducing raw material consumption and the construction sector's carbon footprint. However, the efficient use of sediment requires adequate and optimal treatment. Several processing techniques have so far been applied in order to improve some physicochemical properties. Heat treatment by calcination was effective in removing the organic fraction and activating the pozzolanic properties. In this article, the effects of an optimized heat treatment of marine sediments on the physico-mechanical and environmental properties of mortars are shown. A key finding is that the optimal substitution of a portion of cement by sediments treated by calcination at 750 °C helps to maintain or improve the mechanical properties of the cement matrix in comparison with a standard reference mortar. The use of calcined sediment enhances mortar behavior in terms of mechanical strength and durability. From an environmental and life cycle point of view, mortars formulated with treated sediments are considered inert with respect to the French inert waste storage facility reference (ISDI).
Keywords: Sediment, calcination, cement, reuse.
1513 Human Face Detection and Segmentation using Eigenvalues of Covariance Matrix, Hough Transform and Raster Scan Algorithms
Authors: J. Prakash, K. Rajesh
Abstract:
In this paper, we propose a novel method for human face segmentation using the elliptical structure of the human head. It makes use of the information present in the edge map of the image. In this approach, we use the fact that the eigenvalues of the covariance matrix represent the elliptical structure: the large and small eigenvalues of the covariance matrix are associated with the major and minor axial lengths of an ellipse. The other elliptical parameters are used to identify the centre and orientation of the face. Since an elliptical Hough transform requires a 5D Hough space, the Circular Hough Transform (CHT) is used to evaluate the elliptical parameters. A sparse matrix technique is used to perform the CHT, as it squeezes out zero elements and retains only a small number of non-zero elements, thereby offering the advantage of less storage space and computational time. A neighborhood suppression scheme is used to identify the valid Hough peaks. The accurate positions of the circumference pixels for occluded and distorted ellipses are identified using Bresenham's raster scan algorithm, which uses geometrical symmetry properties. This method does not require the evaluation of tangents for curvature contours, which are very sensitive to noise. The method has been evaluated on several images with different face orientations.
Keywords: Circular Hough Transform, covariance matrix, eigenvalues, elliptical Hough transform, face segmentation, raster scan algorithm.
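The covariance-eigenvalue idea can be sketched on a synthetic elliptical contour: the eigenvalues of the covariance matrix of the edge-point coordinates give the major and minor axial lengths, and the eigenvectors give the orientation. The factor of two in the axis recovery assumes points sampled uniformly in the ellipse parameter; the contour below is synthetic, not an edge map from the paper.

```python
# Estimate ellipse centre, axes and orientation from covariance eigenvalues (synthetic contour).
import numpy as np

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 400)
a, b, tilt = 60.0, 40.0, np.deg2rad(25)                 # true semi-axes and orientation
x, y = a * np.cos(theta), b * np.sin(theta)
pts = np.column_stack([x * np.cos(tilt) - y * np.sin(tilt) + 120,   # rotate and shift the centre
                       x * np.sin(tilt) + y * np.cos(tilt) + 150])

centre = pts.mean(axis=0)
evals, evecs = np.linalg.eigh(np.cov((pts - centre).T))
minor, major = np.sqrt(2 * evals)                       # var = (semi-axis)^2 / 2 for uniform sampling
orientation = np.degrees(np.arctan2(evecs[1, 1], evecs[0, 1])) % 180.0
print("centre:", np.round(centre, 1), "semi-axes:", round(major, 1), round(minor, 1),
      "orientation (deg):", round(orientation, 1))
```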
1512 Comparison and Characterization of Dyneema™ HB-210 and HB-212 for Accelerated UV Aging
Authors: Jonmichael A. Weaver, David A. Miller
Abstract:
Ultra High Molecular Weight Polyethylene (UHMWPE) presents several distinct advantages as a material, including a high strength-to-weight ratio, durability, and neutron stability. Understanding the change in the mechanical performance of UHMWPE due to environmental exposure is key to safety for future applications. Dyneema® HB-210, a 15 µm diameter UHMWPE multi-filament fiber laid up in a polyurethane matrix in a [0/90]2 configuration with a thickness of 0.17 mm, is compared to the same fiber and orientation system, HB-212, with a rubber-based matrix, under UV aging conditions. UV aging tests according to ASTM G154 were performed on both HB-210 and HB-212 to interrogate the change in mechanical properties, as measured through dynamic mechanical analysis and imaged using a scanning electron microscope. These results showed a decrease in both the storage modulus and loss modulus of the aged material compared to the unaged, even though the tan δ slightly increased. Material degradation occurred at a higher rate in Dyneema® HB-212 compared to HB-210. The HB-210 was characterized for the effects of 100 hours of UV aging via dynamic mechanical analysis. Scanning electron microscope images were taken of the HB-210 and HB-212 to identify the primary damage mechanisms in the matrix. Embrittlement and matrix spall were the products of prolonged UV exposure and erosion, resulting in decreased mechanical properties.
Keywords: Composite materials, material characterization, UV aging, UHMWPE.
1511 Simulation and Analysis of Control System for a Solar Desalination System
Authors: R. Prakash, B. Meenakshipriya, R. Kumaravelan
Abstract:
Fresh water is one of the resources that is getting depleted day by day. A wise way to address this issue is through the application of renewable solar irradiation: by means of decentralized, cheap, energetically self-sufficient, robust, and simple-to-operate plants, distillates can be obtained from sea, river, or even sewage water. Solar desalination is a technique used to desalinate water using solar energy. The present work deals with the comprehensive design and simulation of a solar tracking system using LabVIEW, temperature and mass flow rate control of the solar desalination plant using LabVIEW, and the analysis of a single-phase inverter circuit with LC filters for the solar pumping system in MATLAB. The main objective of this work is to improve the performance of the solar desalination system using an automatic tracking system and output control based on temperature and mass flow rate, and also to reduce the harmonic distortion in the solar pumping system by means of LC filters. The simulation of the single-phase inverter was carried out using MATLAB, and the output waveforms were analyzed. Simulations were performed for optimum output temperature control, which in turn controls the mass flow rate of water in the thermal collectors. The solar tracking system was implemented using LabVIEW and was tested successfully. The thermal collectors are tracked in accordance with the sun's irradiance levels, thereby increasing the efficiency of the thermal collectors.
Keywords: Desalination, electrodialysis, LabVIEW, MATLAB, PWM inverter, reverse osmosis.
1510 A Two-Step Approach for Tree-structured XPath Query Reduction
Authors: Minsoo Lee, Yun-mi Kim, Yoon-kyung Lee
Abstract:
XML data consist of a very flexible tree structure, which makes it difficult to support the storing and retrieving of XML data. The node numbering scheme is one of the most popular approaches to storing XML in relational databases. Together with the node numbering storage scheme, structural joins can be used to efficiently process the hierarchical relationships in XML. However, in order to process a tree-structured XPath query containing several hierarchical relationships and conditional sentences on XML data, many structural joins need to be carried out, which results in a high query execution cost. This paper introduces mechanisms to reduce XPath queries containing branch nodes into a much more efficient form with fewer structural joins. A two-step approach is proposed: the first step merges duplicate nodes in the tree-structured query, and the second step divides the query into sub-queries, shortens the paths, and then merges the sub-queries back together. The proposed approach can contribute greatly to the efficient execution of XML queries. Experimental results show that the proposed scheme can reduce the query execution cost by up to an order of magnitude of the original execution cost.
Keywords: XML, XPath, tree-structured query, query reduction.
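A minimal sketch of the first reduction step is shown below: merging duplicate child nodes in a tree-structured query so that repeated branches are evaluated once. The dictionary-based query tree and the sample query are illustrative, not the paper's internal representation.

```python
# Merge duplicate sibling nodes in a tree-structured query (illustrative data structure).
def merge_duplicates(node):
    """node = (label, [children]); children with the same label are merged recursively."""
    label, children = node
    merged = {}
    for child_label, child_children in children:
        if child_label in merged:
            merged[child_label][1].extend(child_children)   # fold the duplicate branch in
        else:
            merged[child_label] = [child_label, list(child_children)]
    return (label, [merge_duplicates((l, c)) for l, c in merged.values()])

# //book[author][author/name]/title  -- the 'author' branch appears twice
query = ("book", [("author", []), ("author", [("name", [])]), ("title", [])])
print(merge_duplicates(query))
# -> ('book', [('author', [('name', [])]), ('title', [])])
```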
1509 Sedimentation and its Challenges for Operation and Maintenance of Hydraulic Structures using SHARC Software: A Case Study of Eastern Intake in Dez Diversion Dam in Iran
Authors: M.R. Mansoujian, N. Hedayat, M. Mashal, H, Kiamanesh
Abstract:
Analytical investigation of sedimentation processes in river engineering and hydraulic structures is of vital importance, as sedimentation can affect the water supply for the cultivated lands in the command area. This is because gradual sediment accumulation behind the reservoir can reduce the nominal capacity of these dams. The aim of the present paper is to analytically investigate the sedimentation process along the river course and behind storage reservoirs in general, and at the Eastern Intake of the Dez Diversion weir in particular, using the SHARC software. Results of the model indicated a water level of 115.97 m, whereas the real-time measurement from the river cross-section was 115.98 m, which suggests a significantly close agreement between them. The average size of the transported sediment in the river was measured at 0.25 mm, from which it can be concluded that nearly 100% of the suspended load in the river is in motion, which suggests no sediment settling but indicates that almost all of the sediment load enters the intake. It was further shown that the average diameter of the sediment entering the intake is 0.293 mm, which in turn suggests that about 85% of the suspended sediments in the river enter the intake. Comparison of the results from the SHARC model with those obtained from the SSIIM software shows quite similar outputs, but distinguishes the SHARC model as more appropriate for the analysis of simpler problems than the other model.
Keywords: SHARC, Eastern Intake, Dez Diversion Weir.
1508 Static Analysis of Security Issues of the Python Packages Ecosystem
Authors: Adam Gorine, Faten Spondon
Abstract:
Python is considered the most popular programming language and offers its own ecosystem for archiving and maintaining open-source software packages. This system is called the Python Package Index (PyPI), the repository of this programming language. Unfortunately, one-third of these software packages have vulnerabilities that allow attackers to execute code automatically when a vulnerable or malicious package is installed. This paper contributes to large-scale empirical studies investigating security issues in the Python ecosystem by evaluating package vulnerabilities. These provide a series of implications that can help the security of software ecosystems by improving the process of discovering, fixing, and managing package vulnerabilities. The vulnerability dataset is generated using the NVD, the National Vulnerability Database, and the Snyk vulnerability dataset. In addition, we evaluated 807 vulnerability reports in the NVD and 3900 publicly known security vulnerabilities in the Python Package Manager (pip) from the Snyk database, covering 2002 to 2022. As a result, many Python vulnerabilities are of high severity, followed by medium severity. The most problematic areas have been improper input validation and denial-of-service attacks. A hybrid scanning tool that combines three scanners, Bandit, Snyk, and Dlint, and provides a clear report of code vulnerabilities, is also described.
Keywords: Python vulnerabilities, Bandit, Snyk, Dlint, Python Package Index, ecosystem, static analysis, malicious attacks.
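A rough sketch of a hybrid scanning wrapper in the spirit of the paper is shown below: run Bandit, Dlint (via its flake8 plugin, rule prefix DUO), and the Snyk CLI against a target project and pool their findings. It assumes the three tools are installed and on PATH; JSON field names may differ across tool versions, so parsing is kept deliberately minimal and is not the paper's tool.

```python
# Pool findings from Bandit, Dlint (flake8) and Snyk (sketch; assumes the CLIs are installed).
import json
import subprocess
import sys

target = sys.argv[1] if len(sys.argv) > 1 else "."
findings = []

def run(cmd):
    try:
        return subprocess.run(cmd, capture_output=True, text=True).stdout
    except FileNotFoundError:          # tool not installed
        return ""

try:
    bandit_results = json.loads(run(["bandit", "-r", target, "-f", "json", "-q"])).get("results", [])
except json.JSONDecodeError:
    bandit_results = []
for r in bandit_results:
    findings.append(("bandit", r.get("test_id", ""), r.get("issue_text", "")))

for line in run(["flake8", "--select=DUO", target]).splitlines():   # Dlint rules use the DUO prefix
    findings.append(("dlint", "", line))

try:
    snyk = json.loads(run(["snyk", "test", "--json", f"--file={target}/requirements.txt"]))
except json.JSONDecodeError:
    snyk = {}
for v in snyk.get("vulnerabilities", []):
    findings.append(("snyk", v.get("id", ""), v.get("title", "")))

for source, rule, message in findings:
    print(f"[{source}] {rule} {message}".strip())
```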
1507 Integrated Evaluation of Green Design and Green Manufacturing Processes Using a Mathematical Model
Authors: Yuan-Jye Tseng, Shin-Han Lin
Abstract:
In this research, a mathematical model for the integrated evaluation of green design and green manufacturing processes is presented. When designing a product, there can be alternative options for designing the detailed components that fulfill the same product requirement. In these design alternative cases, the components of the product can be designed with different materials and detailed specifications. If several design alternative cases are proposed, the different materials and specifications can affect the manufacturing processes. In this paper, a new concept for integrating green design and green manufacturing processes is presented. A green design can be determined based on the manufacturing processes of the designed product by evaluating green criteria, including energy usage and environmental impact, in addition to the traditional criterion of manufacturing cost. With this concept, a mathematical model is developed to find the green design and the associated green manufacturing processes. In the mathematical model, the cost items include material cost, manufacturing cost, and green-related cost; the green-related cost items include energy cost and environmental cost. The objective is to find the green design and green manufacturing process decisions that achieve the minimum total cost. In practical applications, the decision-making can be used to select a good green design case and its green manufacturing processes. An example product is illustrated, showing that the model is practical and useful for the integrated evaluation of green design and green manufacturing processes.
Keywords: Supply chain management, green supply chain, green design, green manufacturing, mathematical model.
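The selection idea can be sketched as follows: each design alternative carries material and manufacturing costs plus green-related (energy, environmental) costs, and the alternative with the minimum total cost is chosen. All alternatives and cost figures below are hypothetical, not the paper's example product.

```python
# Pick the design alternative with the minimum total (traditional + green) cost.
design_alternatives = {
    "A (aluminium housing)": {"material": 42.0, "manufacturing": 30.0, "energy": 9.5, "environmental": 6.0},
    "B (steel housing)":     {"material": 35.0, "manufacturing": 38.0, "energy": 12.0, "environmental": 8.5},
    "C (recycled polymer)":  {"material": 28.0, "manufacturing": 33.0, "energy": 7.0, "environmental": 4.5},
}

totals = {name: sum(costs.values()) for name, costs in design_alternatives.items()}
for name, total in sorted(totals.items(), key=lambda kv: kv[1]):
    print(f"{name}: total cost = {total:.1f}")
print("selected green design:", min(totals, key=totals.get))
```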
1506 Satellite Sensing for Evaluation of an Irrigation System in Cotton-Wheat Zone
Authors: Sadia Iqbal, Faheem Iqbal, Furqan Iqbal
Abstract:
Efficient utilization of existing water is a pressing need for Pakistan due to a rising population, a reduction in present storage capacity, and a poor canal delivery efficiency of 30 to 40%. A study to evaluate an irrigation system in the cotton-wheat zone of Pakistan after watercourse lining was conducted. The evaluation is made on the basis of cropping pattern and salinity. This study employed an index-based approach using a geographic information system together with field data. Satellite images from different years were used to examine the effective area. Several combinations of the ratio of signals received in different spectral bands were used for the development of this index. The near-infrared and thermal infrared spectral bands proved to be most effective, as this combination helped in the easy detection of salt-affected areas and the cropping pattern of the study area. Results showed that 9.97% of the area was under salinity in 1992 and 9.17% in 2000, leaving 2.29% in 2005. Similarly, in 1992, 45% of the area was under vegetation, improving to 56% and 65% in 2000 and 2005, respectively. On the basis of these results, the evaluation shows a 30% increase in performance after the watercourse improvement.
Keywords: Salinity, remote sensing index, salinity index, cropping pattern.
1505 Numerical Study of Cyclic Behavior of Shallow Foundations on Sand Reinforced with Geogrid and Grid-Anchor
Authors: Alireza Hajiani Boushehrian, Nader Hataf, Arsalan Ghahramani
Abstract:
When the foundations of structures are subjected to cyclic loading with amplitudes less than their permissible load, concern often exists about the amount of uniform and non-uniform settlement of such structures. Storage tank foundations subject to numerous filling and discharging cycles, and railway ballast courses under repeated transportation loads, are examples of such conditions. This paper deals with the effects of using the new generation of reinforcements, Grid-Anchor, for the purpose of reducing the permanent settlement of these foundations under the influence of different proportions of the ultimate load. Other items, such as the type and number of reinforcements as well as the number of loading cycles, are studied numerically. Numerical models were made using the Plaxis3D Tunnel finite element code. The results show that by using grid-anchors and increasing the number of their layers in the same proportion as that of the applied cyclic load, the amount of permanent settlement decreases by up to 42% relative to the unreinforced condition, depending on the number of reinforcement layers, the percentage of applied load, and the number of loading cycles; the number of cycles needed to reach a constant value of dimensionless settlement decreases by up to 20% relative to the unreinforced condition.
Keywords: Shallow foundation, reinforced soil, cyclic loading, Grid-Anchor, numerical analysis.
1504 Kinematic Analysis and Software Development of a Seven Degree of Freedom Inspection Robot
Authors: G. Shanmugasundar, R. Sivaramakrishnan, S. Venugopal
Abstract:
Robots are emerging as an essential substitute in the field of inspection. In hazardous environments such as nuclear waste disposal, robots are a real necessity. With a view to meeting such demands, this paper presents a seven-degree-of-freedom articulated inspection robot. To design such a robot, the kinematic analysis of a seven-degree-of-freedom robot that can inspect hazardous nuclear waste storage tanks is carried out. The effective utilization of universal joints for the arms and screw jack mechanisms at the base gives the newly designed robot its high number of degrees of freedom. The analytical method of deriving the manipulator forward as well as inverse kinematics is explained elaborately using the Denavit-Hartenberg approach for the purpose of calculating the robot joint, link, and end-effector parameters. A comparison of the geometric and the analytical approaches is given. The self-developed kinematic model gives the accurate positions of the end effector. A Graphical User Interface (GUI) is developed in the Visual Basic language for easy manipulation of the kinematic results. This software gives the expected position of the end-effector accurately and in a short time compared to manual calculations.
Keywords: Robot kinematics, screw jack mechanisms, Denavit-Hartenberg approach, universal joints.
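Denavit-Hartenberg forward kinematics for a 7-joint serial arm can be sketched as below: each row of the DH table gives (theta, d, a, alpha), and the end-effector pose is the product of the individual link transforms. The DH parameter values are placeholders, not the inspection robot's actual geometry.

```python
# Denavit-Hartenberg forward kinematics for a 7-joint arm (placeholder DH table).
import numpy as np

def dh_transform(theta, d, a, alpha):
    ct, st, ca, sa = np.cos(theta), np.sin(theta), np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

# (theta, d, a, alpha) for 7 joints, in radians and metres (hypothetical values)
dh_table = [(0.1, 0.30, 0.00,  np.pi / 2),
            (0.4, 0.00, 0.25,  0.0),
            (-0.2, 0.00, 0.25, 0.0),
            (0.3, 0.20, 0.00,  np.pi / 2),
            (0.0, 0.00, 0.15, -np.pi / 2),
            (0.5, 0.00, 0.10,  np.pi / 2),
            (0.0, 0.10, 0.00,  0.0)]

T = np.eye(4)
for params in dh_table:
    T = T @ dh_transform(*params)           # chain the link transforms base -> end-effector

print("end-effector position (m):", np.round(T[:3, 3], 3))
```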
1503 A Study on Flammability of Bio Oil Combustible Vapour Mixtures
Authors: Mohanad El-Harbawi, Nurul Amirah Hanim Bt. Umar, Norizan Ali, Yoshimitsu Uemura
Abstract:
The study of fire and explosion is very important, mainly in the oil and gas industries, due to the several accidents that have been reported in the past and present. In this work, we have investigated the flammability of bio oil vapour mixtures; such mixtures may contribute to fire during storage and transportation. A bio oil sample derived from palm kernel shell was analysed using Gas Chromatography-Mass Spectrometry (GC-MS) to examine the composition of the sample. Mole fractions of 12 selected components in the liquid phase were obtained from the GC-FID data and used to calculate the mole fractions of the components in the gas phase via the modified Raoult's law. Lower Flammability Limits (LFLs) and Upper Flammability Limits (UFLs) for the individual components were obtained from the published literature. However, the stoichiometric concentration method was used to calculate the flammability limits of those components whose flammability limit values are not available in the literature. The LFL and UFL values for the mixture were calculated using the Le Chatelier equation. The LFLmix and UFLmix values were used to construct a flammability diagram and subsequently to determine the flammability of the mixture. The findings of this study can be used to propose suitable inherently safer methods to prevent flammable mixtures from occurring and to minimize the loss of property, business, and life due to fire accidents in bio oil production.
Keywords: Gas chromatography, compositions, lower and upper flammability limits (LFL and UFL), flammability diagram.
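The Le Chatelier mixing rule used for the mixture limits takes the form LFL_mix = 1 / sum(y_i / LFL_i), with y_i the mole fractions of the combustible components normalized to sum to one (and the analogous form for UFL_mix). The sketch below applies it; the component names, mole fractions, and limit values are illustrative placeholders, not the GC-MS results for the palm kernel shell bio oil.

```python
# Le Chatelier mixture flammability limits from per-component limits (illustrative values).
components = {
    # name: (vapour mole fraction among combustibles, LFL vol%, UFL vol%)
    "acetic acid": (0.50, 4.0, 19.9),
    "methanol":    (0.30, 6.7, 36.0),
    "furfural":    (0.20, 2.1, 19.3),
}

total = sum(y for y, _, _ in components.values())        # normalize to a combustible basis
lfl_mix = 1.0 / sum((y / total) / lfl for y, lfl, _ in components.values())
ufl_mix = 1.0 / sum((y / total) / ufl for y, _, ufl in components.values())
print(f"LFL_mix = {lfl_mix:.2f} vol%, UFL_mix = {ufl_mix:.2f} vol%")
```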