Search results for: fuzzy techniques
5504 3D Biomechanical Analysis in Shot Put Techniques of International Throwers
Authors: Satpal Yadav, Ashish Phulkar, Krishna K. Sahu
Abstract:
Aim: The research aims to perform a three-dimensional biomechanical analysis of the shot put techniques of international throwers in order to evaluate performance. Research Method: The researcher adopted the descriptive method, and the data were analyzed using Pearson's product-moment correlation to relate the biomechanical parameters to shot put performance. In all analyses, the 5% critical level (p ≤ 0.05) was taken to indicate statistical significance. Research Sample: Eight (N = 08) international male shot putters using the rotational/glide technique were selected as subjects. The instruments used to obtain reliable measurements were a Tesscorn slow-motion camera, specialized motion-analysis software, a 7.260 kg shot (for male shot putters) and a steel tape. All measurements of the biomechanical variables were taken by the principal investigator so that the data collected for the present study could be considered reliable. Results: The findings showed significant negative relationships between performance and the angular velocity of the right shoulder and the acceleration distance at pre-flight (-0.70 and -0.72, respectively); significant relationships for the angular displacement of the knee, the angular velocity of the right shoulder and the acceleration distance at flight (0.81, 0.75 and 0.71, respectively); for the angular velocity of the right shoulder and the acceleration distance at the transition phase (0.77 and 0.79, respectively); and for the angular displacement of the knee, angular velocity of the right shoulder, release velocity of the shot, angle of release, height of release, projected distance and measured distance (0.76, 0.77, -0.83, -0.79, -0.77, 0.99 and 1.00), all of which exceeded the tabulated value at the 0.05 level of significance. On the other hand, insignificant relationships were found between shot put performance and the acceleration distance [m], the angular displacement of the shot, the centre of gravity at release and the horizontal release distance.
Keywords: biomechanics, analysis, shot put, international throwers
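As an illustration of the statistical test the abstract relies on, the sketch below (not part of the paper; the values are hypothetical placeholders) computes Pearson's product-moment correlation between one biomechanical parameter and the measured distance for N = 8 throwers and checks it against the 5% significance level.

```python
# Hedged sketch: Pearson correlation with a p <= 0.05 significance check,
# using made-up values for eight throwers (not the study's data).
from scipy.stats import pearsonr

release_velocity = [12.8, 13.1, 12.5, 13.4, 12.9, 13.0, 12.6, 13.2]   # m/s, hypothetical
measured_distance = [18.2, 19.0, 17.8, 19.6, 18.5, 18.8, 18.0, 19.2]  # m, hypothetical

r, p_value = pearsonr(release_velocity, measured_distance)
print(f"r = {r:.2f}, p = {p_value:.3f}, significant at 5%: {p_value <= 0.05}")
```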
Procedia PDF Downloads 189
5503 Real-Space Mapping of Surface Trap States in CIGSe Nanocrystals Using 4D Electron Microscopy
Authors: Riya Bose, Ashok Bera, Manas R. Parida, Anirudhha Adhikari, Basamat S. Shaheen, Erkki Alarousu, Jingya Sun, Tom Wu, Osman M. Bakr, Omar F. Mohammed
Abstract:
This work reports the visualization of charge carrier dynamics on the surface of copper indium gallium selenide (CIGSe) nanocrystals in real space and time using four-dimensional scanning ultrafast electron microscopy (4D S-UEM) and correlates it with the optoelectronic properties of the nanocrystals. The surface of the nanocrystals plays a key role in controlling their applicability for light-emitting and light-harvesting purposes. Typically, for quaternary systems like CIGSe, which have many desirable attributes for optoelectronic applications, the relative abundance of surface trap states acting as non-radiative recombination centres for charge carriers remains a major bottleneck preventing further advances and commercial exploitation of these nanocrystal devices. Although ultrafast spectroscopic techniques make it possible to determine the presence of picosecond carrier-trapping channels, the relatively large penetration depth of the laser beam means that the information obtained comes mainly from the bulk of the nanocrystals. Selective mapping of such ultrafast dynamical processes on the surfaces of nanocrystals remains a key challenge, so far out of reach of purely optical time-resolved probing techniques. In S-UEM, the optical pulse generated from a femtosecond (fs) laser system is used to generate electron packets from the tip of the scanning electron microscope, instead of the continuous electron beam used in the conventional setup. This pulse is synchronized with another optical excitation pulse that initiates carrier dynamics in the sample. The principle of S-UEM is to detect the secondary electrons (SEs) generated in the sample, which are emitted from the first few nanometers of the top surface. Constructed at different time delays between the optical and electron pulses, these SE images give direct and precise information about the carrier dynamics on the surface of the material of interest. In this work, we report selective mapping of the surface dynamics of CIGSe nanocrystals in real space and time using 4D S-UEM. We show that the trap states can be considerably passivated by ZnS shelling of the nanocrystals and that the carrier dynamics can be significantly slowed down. We also compare and discuss the S-UEM kinetics with the carrier dynamics obtained from conventional ultrafast time-resolved techniques. Additionally, a direct effect of the trap-state removal can be observed in the enhanced photoresponse of the nanocrystals after shelling. Direct observation of surface dynamics will not only provide a profound understanding of the photo-physical mechanisms on nanocrystals' surfaces but also make it possible to unlock their full potential for light-emitting and light-harvesting applications.
Keywords: 4D scanning ultrafast microscopy, charge carrier dynamics, nanocrystals, optoelectronics, surface passivation, trap states
Procedia PDF Downloads 295
5502 Quality Evaluation of Grape Seed Oils of the Ionian Islands Based on GC-MS and Other Spectroscopic Techniques
Authors: I. Oikonomou, I. Lappa, D. Daferera, C. Kanakis, L. Kiokakis, K. Skordilis, A. Avramouli, E. Kalli, C. Pappas, P. A. Tarantilis, E. Skotti
Abstract:
Grape seeds are a winery waste product, often referred to as an important agricultural and industrial waste with the potential to be used in pharmaceutical, food, and cosmetic applications. In this study, grape seed oil from traditional Ionian varieties was examined to determine the quality and characteristics of each variety. Initially, the fatty acid methyl ester (FAME) profiles were analyzed using Gas Chromatography-Mass Spectrometry after transesterification. Furthermore, other quality parameters of the grape seed oils were determined by spectroscopic techniques, including UV-Vis and Raman. Moreover, the antioxidant capacity of the oils was measured by 2,2'-azino-bis-3-ethylbenzothiazoline-6-sulfonic acid (ABTS) and 2,2-diphenyl-1-picrylhydrazyl (DPPH) assays and expressed in Trolox equivalents. K and ΔK indices were measured at 232, 268, and 270 nm as oil quality indices. The results indicate that the air-dried grape seed total oil content ranged from 5.26 to 8.77% w/w, which is in accordance with other grape seed varieties tested in similar studies. The composition of grape seed oil is dominated by linoleic and oleic fatty acids, with linoleic acid ranging from 53.68 to 69.95% and the two together totaling 78-82% of FAMEs, which is analogous to the fatty acid composition of safflower oil. The ABTS and DPPH antioxidant assays scored high, showing that the oils have potential in the cosmetic and culinary businesses. Beyond that, our results demonstrate that Ionian grape seed oils have prospects that go further than cosmetic or culinary use, into the pharmaceutical industry. Finally, the reclamation of grape seeds from the wineries' waste stream is in accordance with the bio-economy strategic framework and contributes to environmental protection.
Keywords: antioxidant capacity, fatty acid methyl esters, grape seed oil, GC-MS
Procedia PDF Downloads 207
5501 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models
Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg
Abstract:
Storm surge is an abnormal water level caused by a storm. Accurate prediction of a storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques to combine several individual forecasts and produce an overall, presumably better, forecast. Some simple ensemble modeling techniques exist in the literature. For instance, Model Output Statistics (MOS) and running mean-bias removal are widely used techniques in the storm surge prediction domain. However, these methods have some drawbacks. For instance, MOS is based on multiple linear regression and needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced methods. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting. This application creates a better forecast of sea level using a combination of several instances of Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether the ensemble models perform better than any single forecast; to do so, we need to identify the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models, using correlation and standard deviation as weights in combining different forecast models. Third, we use these ensembles and compare them with several existing models from the literature to forecast the storm surge level. We then investigate whether developing a complex ensemble model is indeed needed. To achieve this goal, we use a simple average (one of the simplest and most widely used ensemble models) as a benchmark. Predicting the peak surge level during a storm, as well as the precise time at which this peak occurs, is crucial; thus, we develop a statistical platform to compare the performance of the various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual peak and its time. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, in this study we consider them as a single contiguous hurricane event. The data set used for this study is generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally perform better than the simple average ensemble technique.
Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction
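As a rough illustration of the simple weighting schemes described above (correlation-based weights benchmarked against a plain average), the following sketch uses synthetic data and is not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)
observed = rng.normal(1.0, 0.25, size=100)                   # hypothetical surge levels (m)
forecasts = observed[:, None] + rng.normal(0, [0.1, 0.2, 0.3], size=(100, 3))  # 3 toy models

# Correlation-based weights computed over a training window
corrs = np.array([np.corrcoef(forecasts[:, k], observed)[0, 1] for k in range(3)])
weights = np.clip(corrs, 0, None)
weights /= weights.sum()

def rmse(pred, obs):
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

print("simple average RMSE   :", rmse(forecasts.mean(axis=1), observed))
print("weighted ensemble RMSE:", rmse(forecasts @ weights, observed))
```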
Procedia PDF Downloads 310
5500 Self-Organizing Maps for Credit Card Fraud Detection
Authors: ChunYi Peng, Wei Hsuan Cheng, Shyh Kuang Ueng
Abstract:
This study focuses on the application of self-organizing maps (SOM) technology in analyzing credit card transaction data, aiming to enhance the accuracy and efficiency of fraud detection. SOM, as an artificial neural network, is particularly suited for pattern recognition and data classification, making it highly effective for the complex and variable nature of credit card transaction data. By analyzing transaction characteristics with SOM, the research identifies abnormal transaction patterns that could indicate potentially fraudulent activities. Moreover, this study has developed a specialized visualization tool to intuitively present the relationships between SOM analysis outcomes and transaction data, aiding financial institution personnel in quickly identifying and responding to potential fraud, thereby reducing financial losses. Additionally, the research explores the integration of SOM technology with composite intelligent system technologies (including finite state machines, fuzzy logic, and decision trees) to further improve fraud detection accuracy. This multimodal approach provides a comprehensive perspective for identifying and understanding various types of fraud within credit card transactions. In summary, by integrating SOM technology with visualization tools and composite intelligent system technologies, this research offers a more effective method of fraud detection for the financial industry, not only enhancing detection accuracy but also deepening the overall understanding of fraudulent activities.
Keywords: self-organizing map technology, fraud detection, information visualization, data analysis, composite intelligent system technologies, decision support technologies
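As a hedged illustration of the SOM-based anomaly flagging described above (assumptions: the third-party `minisom` package and a synthetic feature matrix; the abstract gives no code), transactions that map far from their best-matching unit are flagged for review.

```python
import numpy as np
from minisom import MiniSom  # assumption: third-party SOM package

rng = np.random.default_rng(42)
transactions = rng.normal(0, 1, size=(1000, 6))   # hypothetical scaled transaction features

som = MiniSom(10, 10, input_len=6, sigma=1.0, learning_rate=0.5, random_seed=42)
som.train_random(transactions, num_iteration=5000)

weights = som.get_weights()
# Quantization error: distance from each transaction to its best-matching SOM unit
errors = np.array([np.linalg.norm(x - weights[som.winner(x)]) for x in transactions])
suspicious = np.where(errors > np.percentile(errors, 99))[0]  # flag the top 1%
print(f"{suspicious.size} transactions flagged for manual review")
```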
Procedia PDF Downloads 60
5499 Feature Selection of Personal Authentication Based on EEG Signal for K-Means Cluster Analysis Using Silhouettes Score
Authors: Jianfeng Hu
Abstract:
Personal authentication based on electroencephalography (EEG) signals is an important field in biometric technology. More and more researchers have used EEG signals as a data source for biometrics; however, biometrics based on EEG signals also has some disadvantages. The proposed method employs entropy measures for feature extraction from EEG signals. Four types of entropy measures, namely sample entropy (SE), fuzzy entropy (FE), approximate entropy (AE) and spectral entropy (PE), were deployed as the feature set. In a silhouette calculation, the distance from each data point in a cluster to all other points within the same cluster and to all data points in the closest cluster is determined. Thus, silhouettes provide a measure of how well a data point was classified when it was assigned to a cluster and of the separation between clusters. This renders silhouettes potentially well suited for assessing cluster quality in personal authentication methods. In this study, the silhouette score was used to assess the cluster quality of the k-means clustering algorithm and to compare the performance across the EEG datasets. The main goals of this study are: (1) to represent each target as a tuple of multiple feature sets, (2) to assign a suitable measure to each feature set, (3) to combine different feature sets, and (4) to determine the optimal feature weighting. Using precision/recall evaluations, the effectiveness of feature weighting in clustering was analyzed. EEG data from 22 subjects were collected. Results showed that: (1) it is possible to use fewer electrodes (3-4) for personal authentication; (2) there were differences between electrodes for personal authentication (p < 0.01); and (3) there was no significant difference in authentication performance among feature sets (except for PE). Conclusion: the combination of the k-means clustering algorithm and the silhouette approach proved to be an accurate method for personal authentication based on EEG signals.
Keywords: personal authentication, k-means clustering, electroencephalogram, EEG, silhouettes
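A minimal sketch of the clustering-quality measure described above, pairing scikit-learn's k-means with the mean silhouette score; the feature matrix below is synthetic and stands in for the entropy features computed from EEG epochs.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
features = rng.normal(size=(220, 12))  # hypothetical entropy features (SE, FE, AE, PE per channel)
n_subjects = 22

labels = KMeans(n_clusters=n_subjects, n_init=10, random_state=0).fit_predict(features)
print("mean silhouette score:", round(silhouette_score(features, labels), 3))
```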
Procedia PDF Downloads 287
5498 Hyperspectral Imaging and Nonlinear Fukunaga-Koontz Transform Based Food Inspection
Authors: Hamidullah Binol, Abdullah Bal
Abstract:
Nowadays, food safety is a great public concern; therefore, robust and effective techniques are required for assessing the safety status of goods. Hyperspectral imaging (HSI) is an attractive tool for researchers inspecting food quality and safety, with applications such as meat quality assessment, automated poultry carcass inspection, quality evaluation of fish, bruise detection in apples, quality analysis and grading of citrus fruits, bruise detection in strawberries, visualization of the sugar distribution in melons, measuring the ripening of tomatoes, defect detection in pickling cucumbers, and classification of wheat kernels. HSI can be used to concurrently collect large amounts of spatial and spectral data on the objects being observed. This technique yields exceptional detection capabilities, which otherwise cannot be achieved with either imaging or spectroscopy alone. This paper presents a nonlinear technique based on the kernel Fukunaga-Koontz transform (KFKT) for detection of fat content in ground meat using HSI. The KFKT, which is the nonlinear version of the FKT, is one of the most effective techniques for solving problems of a two-pattern nature. The conventional FKT method has been improved with kernel machines to increase nonlinear discrimination ability and capture higher-order statistics of the data. The approach proposed in this paper aims to segment the fat content of the ground meat by regarding fat as the target class, which is to be separated from the remaining classes (treated as clutter). We have applied the KFKT to visible and near-infrared (VNIR) hyperspectral images of ground meat to determine the fat percentage. The experimental studies indicate that the proposed technique produces high detection performance for the fat ratio in ground meat.
Keywords: food (ground meat) inspection, Fukunaga-Koontz transform, hyperspectral imaging, kernel methods
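To illustrate the "two-pattern" property the abstract exploits, the sketch below implements only the linear Fukunaga-Koontz transform on synthetic spectra (the paper's kernelized version and real VNIR data are not reproduced): after whitening the summed class covariances, the directions carrying the most target (fat) energy carry the least clutter energy.

```python
import numpy as np

rng = np.random.default_rng(0)
target_pixels = rng.normal(0.0, 1.0, size=(500, 8))    # hypothetical fat-class spectra
clutter_pixels = rng.normal(0.5, 1.5, size=(2000, 8))  # hypothetical background spectra

R1 = np.cov(target_pixels, rowvar=False)
R2 = np.cov(clutter_pixels, rowvar=False)

vals, vecs = np.linalg.eigh(R1 + R2)                    # whiten the summed covariance
P = np.diag(vals ** -0.5) @ vecs.T

R1_t = P @ R1 @ P.T                                     # transformed target covariance
eigval, eigvec = np.linalg.eigh(R1_t)
# Since R1_t + R2_t = I, a direction with target eigenvalue e has clutter eigenvalue 1 - e.
target_axes = eigvec[:, np.argsort(eigval)[::-1][:3]]   # most target-dominant directions
scores = np.linalg.norm((P @ target_pixels.T).T @ target_axes, axis=1)
print("mean projection energy of target pixels:", round(float(scores.mean()), 3))
```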
Procedia PDF Downloads 433
5497 Elucidating Microstructural Evolution Mechanisms in Tungsten via Layerwise Rolling in Additive Manufacturing: An Integrated Simulation and Experimental Approach
Authors: Sadman Durlov, Aditya Ganesh-Ram, Hamidreza Hekmatjou, Md Najmus Salehin, Nora Shayesteh Ameri
Abstract:
In the field of additive manufacturing, tungsten stands out for its exceptional resistance to high temperatures, making it an ideal candidate for use in extreme conditions. However, its inherent brittleness and vulnerability to thermal cracking pose significant challenges to its manufacturability. This study explores the microstructural evolution of tungsten processed through layer-wise rolling in laser powder bed fusion additive manufacturing, utilizing a comprehensive approach that combines advanced simulation techniques with empirical research. We aim to uncover the complex processes of plastic deformation and microstructural transformations, with a particular focus on the dynamics of grain size, boundary evolution, and phase distribution. Our methodology employs a combination of simulation and experimental data, allowing for a detailed comparison that elucidates the key mechanisms influencing microstructural alterations during the rolling process. This approach facilitates a deeper understanding of the material's behavior under additive manufacturing conditions, specifically in terms of deformation and recrystallization. The insights derived from this research not only deepen our theoretical knowledge but also provide actionable strategies for refining manufacturing parameters to improve the tungsten components' mechanical properties and functional performance. By integrating simulation with practical experimentation, this study significantly enhances the field of materials science, offering a robust framework for the development of durable materials suited for challenging operational environments. Our findings pave the way for optimizing additive manufacturing techniques and expanding the use of tungsten across various demanding sectors.
Keywords: additive manufacturing, layer wise rolling, refractory materials, in-situ microstructure modifications
Procedia PDF Downloads 62
5496 Environmental Decision Making Model for Assessing On-Site Performances of Building Subcontractors
Authors: Buket Metin
Abstract:
Buildings cause a variety of loads on the environment due to activities performed at each stage of the building life cycle. Construction is the first stage that affects both the natural and built environments at different steps of the process, which can be defined as transportation of materials within the construction site, formation and preparation of materials on-site and the application of materials to realize the building subsystems. All of these steps require the use of technology, which varies based on the facilities that contractors and subcontractors have. Hence, environmental consequences of the construction process should be tackled by focusing on construction technology options used in every step of the process. This paper presents an environmental decision-making model for assessing on-site performances of subcontractors based on the construction technology options which they can supply. First, construction technologies, which constitute information, tools and methods, are classified. Then, environmental performance criteria are set forth related to resource consumption, ecosystem quality, and human health issues. Finally, the model is developed based on the relationships between the construction technology components and the environmental performance criteria. The Fuzzy Analytical Hierarchy Process (FAHP) method is used for weighting the environmental performance criteria according to environmental priorities of decision-maker(s), while the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method is used for ranking on-site environmental performances of subcontractors using quantitative data related to the construction technology components. Thus, the model aims to provide an insight to decision-maker(s) about the environmental consequences of the construction process and to provide an opportunity to improve the overall environmental performance of construction sites.
Keywords: construction process, construction technology, decision making, environmental performance, subcontractor
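As an illustrative sketch of the TOPSIS ranking step described above (the weights stand in for FAHP-derived priorities, the scores are hypothetical benefit-type criteria where higher is better, and this is not the authors' model):

```python
import numpy as np

scores = np.array([            # rows: subcontractors, columns: environmental criteria
    [7.0, 5.0, 8.0],
    [6.0, 8.0, 6.0],
    [9.0, 4.0, 7.0],
])
weights = np.array([0.5, 0.3, 0.2])            # placeholder FAHP weights (sum to 1)

weighted = scores / np.linalg.norm(scores, axis=0) * weights   # vector-normalize, then weight
ideal, anti_ideal = weighted.max(axis=0), weighted.min(axis=0) # benefit-type criteria assumed

d_best = np.linalg.norm(weighted - ideal, axis=1)
d_worst = np.linalg.norm(weighted - anti_ideal, axis=1)
closeness = d_worst / (d_best + d_worst)                       # 1 = closest to the ideal
print("subcontractor ranking (best first):", np.argsort(closeness)[::-1], closeness.round(3))
```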
Procedia PDF Downloads 249
5495 Self-Organizing Maps for Credit Card Fraud Detection and Visualization
Authors: Peng Chun-Yi, Chen Wei-Hsuan, Ueng Shyh-Kuang
Abstract:
This study focuses on the application of self-organizing maps (SOM) technology in analyzing credit card transaction data, aiming to enhance the accuracy and efficiency of fraud detection. SOM, as an artificial neural network, is particularly suited for pattern recognition and data classification, making it highly effective for the complex and variable nature of credit card transaction data. By analyzing transaction characteristics with SOM, the research identifies abnormal transaction patterns that could indicate potentially fraudulent activities. Moreover, this study has developed a specialized visualization tool to intuitively present the relationships between SOM analysis outcomes and transaction data, aiding financial institution personnel in quickly identifying and responding to potential fraud, thereby reducing financial losses. Additionally, the research explores the integration of SOM technology with composite intelligent system technologies (including finite state machines, fuzzy logic, and decision trees) to further improve fraud detection accuracy. This multimodal approach provides a comprehensive perspective for identifying and understanding various types of fraud within credit card transactions. In summary, by integrating SOM technology with visualization tools and composite intelligent system technologies, this research offers a more effective method of fraud detection for the financial industry, not only enhancing detection accuracy but also deepening the overall understanding of fraudulent activities.
Keywords: self-organizing map technology, fraud detection, information visualization, data analysis, composite intelligent system technologies, decision support technologies
Procedia PDF Downloads 62
5494 Information Visualization Methods Applied to Nanostructured Biosensors
Authors: Osvaldo N. Oliveira Jr.
Abstract:
The control of molecular architecture inherent in some experimental methods for producing nanostructured films has had a great impact on devices of various types, including sensors and biosensors. The self-assembled monolayer (SAM) and electrostatic layer-by-layer (LbL) techniques, for example, are now routinely used to produce tailored architectures for biosensing in which biomolecules are immobilized with long-lasting preserved activity. Enzymes, antigens, antibodies, peptides and many other molecules serve as the molecular recognition elements for detecting an equally wide variety of analytes. The principles of detection are also varied, including electrochemical methods, fluorescence spectroscopy and impedance spectroscopy. In this presentation, an overview will be provided of biosensors made with nanostructured films to detect antibodies associated with tropical diseases and HIV, in addition to the detection of analytes of medical interest such as cholesterol and triglycerides. Because large amounts of data are generated in the biosensing experiments, computational and statistical methods have been used to optimize performance. Multidimensional projection techniques such as Sammon's mapping have been shown to be more efficient than traditional multivariate statistical analysis in identifying small concentrations of anti-HIV antibodies and in distinguishing between blood serum samples of animals infected with two tropical diseases, namely Chagas disease and Leishmaniasis. Optimization of biosensing may include combining another information visualization method, the parallel coordinates technique, with artificial intelligence methods in order to identify the most suitable frequencies for reaching higher sensitivity using impedance spectroscopy. Also discussed will be the possible convergence of technologies, through which machine learning and other computational methods may be used to treat data from biosensors within an expert system for clinical diagnosis.
Keywords: clinical diagnosis, information visualization, nanostructured films, layer-by-layer technique
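As a small illustration of the parallel-coordinates view mentioned above (hypothetical impedance features and class labels, not the presenter's data or tooling), pandas' built-in helper draws one line per sample across the feature axes:

```python
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

df = pd.DataFrame({
    "Z_100Hz": [1.2, 1.1, 0.7, 0.6],     # hypothetical normalized impedances
    "Z_1kHz":  [0.9, 0.8, 0.5, 0.4],
    "Z_10kHz": [0.6, 0.7, 0.3, 0.2],
    "sample":  ["control", "control", "positive", "positive"],
})
parallel_coordinates(df, class_column="sample")
plt.ylabel("normalized impedance (hypothetical)")
plt.show()
```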
Procedia PDF Downloads 339
5493 Development and Implementation of a Business Technology Program Based on Techniques for Reusing Water in a Colombian Company
Authors: Miguel A. Jimenez Barros, Elyn L. Solano Charris, Luis E. Ramirez, Lauren Castro Bolano, Carlos Torres Barreto, Juliana Morales Cubillo
Abstract:
This project sought to mitigate the high levels of water consumption in industrial processes, in accordance with the water-rationing plans promoted at the national and international levels in response to the water consumption projections published by the United Nations. Water consumption has three main uses: municipal (common use), agricultural, and industrial, with the latter accounting for the smallest share (around 20% of total consumption). Aware of world water scarcity, a Colombian company that manufactures mass-consumption products decided to implement policies and techniques for water treatment, recycling, and reuse. The project consisted of a business technology program that permits better use of the wastewater generated by production operations. This approach reduces potable water consumption, improves the condition of the water discharged to the sewage system, generates a positive environmental impact for the region, and serves as a reference model at the national and international levels. To achieve this objective, a process flow diagram was used to identify the industrial processes that require potable water. This strategy allowed the industry to establish a water reuse plan at the operational level without affecting the requirements associated with the manufacturing process and, furthermore, to support the activities carried out in the administrative buildings. Afterwards, the company evaluated and selected the chemical and biological processes required for water reuse, in compliance with Colombian law. The implementation of the business technology program optimized the water use and raised the recirculation rate up to 70%, accomplishing an important reduction of the regional environmental impact.
Keywords: bio-reactor, potable water, reverse osmosis, water treatment
Procedia PDF Downloads 239
5492 Rejuvenation of Aged Kraft-Cellulose Insulating Paper Used in Transformers
Authors: Y. Jeon, A. Bissessur, J. Lin, P. Ndungu
Abstract:
Most transformers employ cellulose paper that has been chemically modified through the Kraft process and acts as an effective insulator. Cellulose ageing and oil degradation are directly linked to fouling of the transformer and the accumulation of large quantities of waste insulating paper. In addition to the technical difficulties, this proves costly for power utilities to deal with. Currently, no cost-effective method for the rejuvenation of cellulose paper has been documented or proposed, since renewal of used insulating paper is implemented as the best option. This study proposes and contrasts different methods for rejuvenating accelerated-aged cellulose insulating paper by chemical and bio-bleaching processes. Of the three bleaching methods investigated, two are chemical, conventional chlorine-based sodium hypochlorite (m/v) and chlorine-free hydrogen peroxide (v/v), whilst the third is a bio-bleaching technique that uses a bacterial isolate, Acinetobacter strain V2. For chemical bleaching, reagent strengths of 0.3%, 0.6%, 0.9%, 1.2%, 1.5% and 1.8% were analyzed over 4 hours. Bio-bleaching employed the bacterial isolate Acinetobacter strain V2 to bleach the aged Kraft paper over 4 hours. Determination of the alpha-cellulose content, degree of polymerization and viscosity was carried out on the Kraft-cellulose insulating paper before and after bleaching. Overall, the investigated chemical and bio-bleaching techniques were successful and effective in treating degraded and accelerated-aged Kraft-cellulose insulating paper, although to varying extents. Optimum conditions for chemical bleaching were attained at bleaching strengths of 1.2% (m/v) NaOCl and 1.5% (v/v) H2O2, yielding alpha-cellulose contents of 82.4% and 80.7% and degrees of polymerization of 613 and 616, respectively. Bio-bleaching using Acinetobacter strain V2 proved to be the superior technique, with an alpha-cellulose level of 89.0% and a degree of polymerization of 620. The chemical bleaching techniques require careful and controlled clean-up treatments as they are chlorine- and hydrogen peroxide-based, while bio-bleaching is an extremely eco-friendly technique.
Keywords: alpha cellulose, bio-bleaching, degree of polymerization, Kraft-cellulose insulating paper, transformer, viscosity
Procedia PDF Downloads 274
5491 The Prevalence of Organized Retail Crime in Riyadh, Saudi Arabia
Authors: Saleh Dabil
Abstract:
This study investigates the extent of organized retail crime in supermarkets in Riyadh, Saudi Arabia. Store managers, security managers and general employees were asked about the types of retail crime that occur in their stores. Three independent variables were related to the reporting of organized retail theft: (1) the supermarket profile (volume, location, standard and type of the store), (2) the social and physical environment of the store (maintenance, cleanliness and overall organizational cooperation), and (3) the security techniques and loss-prevention electronics used. The theoretical framework of this study is based on social disorganization theory. The study concluded that organized retail theft is moderately apparent in Riyadh stores. The general results showed that the store environment affects the prevalence of organized retail theft with respect to the gender of thieves, age groups, working shift, type of stolen items, and the number of thieves involved in a single case. Among other reasons, factors in organized theft include the economic pressure on customers, depending on the location of the store. How thefts are handled was also investigated to gain a clear picture of how stores deal with organized retail theft. The results showed that thieves are mostly released without any action and sometimes given a written warning; very few cases are dealt with by the police. Other factors examined in the study can be found in the full text. To address the problem of organized theft, this study suggests, first, distributing duties and responsibilities well among employees, especially for security purposes; second, installing a strong security system and designing the store layout well; and third, providing training for general employees and periodic security-skills training. Other suggestions can be found in the full text.
Keywords: organized crime, retail, theft, loss prevention, store environment
Procedia PDF Downloads 200
5490 Lightweight Ceramics from Clay and Ground Corncobs
Authors: N. Quaranta, M. Caligaris, R. Varoli, A. Cristobal, M. Unsen, H. López
Abstract:
Corncobs are agricultural wastes, and they can be used as fuel or as raw material in different industrial processes such as cement manufacture, contaminant adsorption, and chemical compound synthesis. The aim of this work is to characterize this waste and analyze the feasibility of its use as a pore-forming material in the manufacture of lightweight ceramics for the civil construction industry. The characterization of the raw materials is carried out using various techniques: X-ray diffraction analysis, differential and gravimetric thermal analyses, FTIR spectroscopy, and ecotoxicity evaluation, among others. The ground corncobs, with particle size less than 2 mm, are mixed with clay up to 30% by volume and shaped by uniaxial pressing at 25 MPa, with 6% humidity, in moulds of 70 mm x 40 mm x 18 mm. The green bodies are then heat treated at 950 °C for two hours, following the treatment curves used in the ceramic industry. The ceramic probes are characterized by several techniques: density, porosity and water absorption, permanent volumetric variation, loss on ignition, microscopy analysis, and mechanical properties. DTA-TGA analysis of the corncobs shows a small loss in the TGA curve in the range 20-250 °C and exothermic peaks at 250-500 °C. The FTIR spectrum of the corncob sample shows the characteristic pattern of this kind of organic matter, with stretching vibration bands of adsorbed water, methyl groups, C–O and C–C bonds, and the complex form of the cellulose and hemicellulose glycosidic bonds. The ceramic bodies obtained present good external characteristics, without loose edges, and properties adequate for market requirements. The porosity values of the sintered pieces are higher than those of the reference sample without waste addition. The results generally indicate that it is possible to use corncobs as a pore former in ceramic bodies without modifying the sintering temperatures usually employed in the industry.
Keywords: ceramic industry, biomass, recycling, hemicellulose glycosidic bonds
Procedia PDF Downloads 406
5489 Flood Management Plans in Different Flooding Zones of Gujranwala and Rawalpindi Divisions, Punjab, Pakistan
Authors: Muhammad Naveed
Abstract:
In this paper, flood issues in the Gujranwala and Rawalpindi divisions are discussed as a primary concern, since these zones have been affected repeatedly by flooding in recent years; the regional variability of the problem, the present status of ongoing management measures, their adequacy, and future needs in flood management are covered. Flood issues in these zones arise from the Chenab and Jhelum river basins. Some specific problems related to floods in these divisions, namely the lack of major dams on the Chenab and Jhelum rivers and the mismanagement of river and canal water, including dam-break flows and waterlogging in Tal zones, are additionally mentioned. There are also major Nalaas in these regions: Nalaa Lai in Rawalpindi and Nalaa Daik, Nalaa Palkhu and Nalaa Aik in Gujranwala are a major cause of floods in these regions besides the rivers. Proper management of these Nalaas and timely relocation of the nearby population could reduce flood impacts in these regions. The progress of different flood management measures, both structural and non-structural, is discussed. Likewise, future needs for achieving efficient and successful flood management in Pakistan are pointed out. In this paper, we describe different hard and soft engineering techniques to overcome flood situations in these zones, as they are more vulnerable due to the lack of management of canal and river water. Effective management and the use of hard and soft techniques are needed in the coming years to control major flooding in flood-risk zones and to minimize the loss of life and of agricultural and financial resources, as floods and other natural disasters are a major drag on the economic prosperity of the country.
Keywords: flood management, rivers, major dams, agricultural and financial loss, future management and control
Procedia PDF Downloads 200
5488 Research and Application of Multi-Scale Three Dimensional Plant Modeling
Authors: Weiliang Wen, Xinyu Guo, Ying Zhang, Jianjun Du, Boxiang Xiao
Abstract:
Reconstructing and analyzing three-dimensional (3D) models from in situ measured data is important for a number of research topics and applications in plant science, including plant phenotyping, functional-structural plant modeling (FSPM), plant germplasm resource protection, and the popularization of agricultural technology. This work spans many scales, such as cell, tissue, organ, plant and canopy, from the microscopic to the macroscopic. The techniques currently used for data capture, feature analysis, and 3D reconstruction differ considerably between scales. In this context, morphological data acquisition, 3D analysis and modeling of plants at different scales are introduced systematically. The data-capture equipment commonly used at these scales is introduced, and hot issues and difficulties at each scale are then described. Some examples are also given, such as micron-scale phenotyping quantification and 3D microstructure reconstruction of vascular bundles within maize stalks based on micro-CT scanning, 3D reconstruction of leaf surfaces and feature extraction from point clouds acquired with a handheld 3D scanner, and plant modeling by combining parameter-driven 3D organ templates. Several application examples using the 3D models and analysis results are also introduced. A 3D maize canopy was constructed, and the light distribution within the canopy was simulated and used for the design of an ideal plant type. A grape tree model was constructed from 3D digitizing and point cloud data and was used to produce scientific content for the 11th International Conference on Grapevine Breeding and Genetics. Using the tissue models of plants, Google Glass was employed to look around inside the plant visually and understand its internal structure. With the development of information technology, 3D data acquisition and data processing techniques will play a greater role in plant science.
Keywords: plant, three dimensional modeling, multi-scale, plant phenotyping, three dimensional data acquisition
Procedia PDF Downloads 279
5487 Monitoring Deforestation Using Remote Sensing and GIS
Authors: Tejaswi Agarwal, Amritansh Agarwal
Abstract:
The forest ecosystem plays a very important role in the global carbon cycle. It stores about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques over the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km²/yr; recent FAO tropical deforestation estimates for 1990–1995 cite 116,756 km²/yr globally. Remote sensing can be a very useful tool for monitoring forests and associated deforestation to a sufficient level of accuracy without the need to physically survey forest areas, many of which are physically inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured from the Indian Institute of Remote Sensing (IIRS), Dehradun, in digital format. While procuring the satellite data, care was taken to ensure that the data were cloud-free and did not belong to the dry, leafless season. The Normalized Difference Vegetation Index (NDVI), NDVI = (NIR - Red) / (NIR + Red), has been used as a numerical indicator of the reduction in ground biomass. After calculating the NDVI variations and the associated mean, we analysed the change in ground biomass. Through this paper, we have tried to indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals. With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and transforming into various land use/land cover categories.
Keywords: remote sensing, deforestation, supervised classification, NDVI, change detection
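A minimal sketch of the NDVI and change-detection step described above, using two hypothetical co-registered reflectance scenes held as NumPy arrays (the study itself used ERDAS on IIRS imagery):

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), guarding against division by zero."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / np.clip(nir + red, 1e-6, None)

rng = np.random.default_rng(0)
nir_t1, red_t1 = rng.uniform(0.3, 0.6, (100, 100)), rng.uniform(0.05, 0.20, (100, 100))
nir_t2, red_t2 = rng.uniform(0.2, 0.5, (100, 100)), rng.uniform(0.05, 0.25, (100, 100))

change = ndvi(nir_t2, red_t2) - ndvi(nir_t1, red_t1)
print("mean NDVI change:", round(float(change.mean()), 3))
print("share of pixels with NDVI loss > 0.1:", round(float((change < -0.1).mean()), 3))
```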
Procedia PDF Downloads 1210
5486 NDVI as a Measure of Change in Forest Biomass
Authors: Amritansh Agarwal, Tejaswi Agarwal
Abstract:
The forest ecosystem plays a very important role in the global carbon cycle. It stores about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques over the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km²/yr; recent FAO tropical deforestation estimates for 1990–1995 cite 116,756 km²/yr globally. Remote sensing can be a very useful tool for monitoring forests and associated deforestation to a sufficient level of accuracy without the need to physically survey forest areas, many of which are physically inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured from the USGS website in digital format. While procuring the satellite data, care was taken to ensure that the data were cloud- and aerosol-free by making use of the FLAASH atmospheric correction technique. The Normalized Difference Vegetation Index (NDVI), NDVI = (NIR - Red) / (NIR + Red), has been used as a numerical indicator of the reduction in ground biomass. After calculating the NDVI variations and the associated mean, we analysed the change in ground biomass. Through this paper, we have tried to indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals. With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and transforming into various land use/land cover categories.
Keywords: remote sensing, deforestation, supervised classification, NDVI change detection
Procedia PDF Downloads 404
5485 Experimental Study Damage in a Composite Structure by Vibration Analysis- Glass / Polyester
Authors: R. Abdeldjebar, B. Labbaci, L. Missoum, B. Moudden, M. Djermane
Abstract:
The basic components of a composite material make it very sensitive to damage, which calls for reliable and efficient damage-detection techniques. This work focuses on the detection of damage by vibration analysis, whose main objective is to exploit the dynamic response of a structure in order to detect and understand the damage. The experimental results are compared with those predicted by numerical models to confirm the effectiveness of the approach.
Keywords: experimental, composite, vibration analysis, damage
Procedia PDF Downloads 675
5484 Seismic Inversion for Geothermal Exploration
Authors: E. N. Masri, E. Takács
Abstract:
Amplitude-versus-offset (AVO) and simultaneous model-based impedance inversion techniques have not commonly been utilized for geothermal exploration; however, some recent publications have called attention to the fact that they can be very useful in geothermal investigations. In this study, we present rock-physical attributes obtained from 3D pre-stack seismic data and well logs collected in a study area in the NW part of the Pannonian Basin, where the geothermal reservoir is located in the fractured zones of the Triassic basement and was hit by three production-injection well pairs. The holes were planned very successfully based on the conventional 3D migrated stack volume prior to this study. Subsequently, the available geophysical-geological datasets provided a great opportunity to test modern inversion procedures in the same area. In this presentation, we provide a summary of the theory and application of the most promising seismic inversion techniques from the viewpoint of geothermal exploration. We demonstrate the P- and S-wave impedance, velocity (Vp and Vs), density, and Vp/Vs ratio attribute volumes calculated from the seismic and well-logging data sets. After a detailed discussion, we conclude that the P-wave impedance and the Vp/Vs ratio are the most helpful parameters for lithology discrimination in the study area. They detect the hot-water-saturated fracture zone very well; thus, they can be very useful in mapping the investigated reservoir. Integrated interpretation of all the obtained rock-physical parameters is essential. We are extending the pre-stack seismic tools discussed above by studying the possibilities of Elastic Impedance Inversion (EII) for geothermal exploration. That procedure provides two other useful rock-physical properties, the compressibility and the rigidity (Lamé parameters). Results for those newly created elastic parameters will also be demonstrated in the presentation. Geothermal extraction is of great interest nowadays, and we can adopt several methods that have been successfully applied in hydrocarbon exploration for decades to discover new reservoirs and reduce drilling risk and cost.
Keywords: fractured zone, seismic, well-logging, inversion
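As a hedged illustration of the attributes discussed above (the values are made-up log samples, not the study's data), acoustic impedance, Vp/Vs and the Lamé-derived LambdaRho/MuRho pair can be computed directly from density and velocities:

```python
import numpy as np

rho = np.array([2.55, 2.60, 2.48])        # g/cc, hypothetical
vp = np.array([4800.0, 5100.0, 4300.0])   # m/s, hypothetical
vs = np.array([2700.0, 2900.0, 2300.0])   # m/s, hypothetical

ip, is_ = rho * vp, rho * vs              # P- and S-impedance
vpvs = vp / vs
lambda_rho = ip**2 - 2 * is_**2           # incompressibility-related attribute
mu_rho = is_**2                           # rigidity-related attribute
print(np.column_stack([ip, vpvs, lambda_rho, mu_rho]).round(1))
```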
Procedia PDF Downloads 130
5483 A Reading Light That Can Adjust Indoor Light Intensity According to the Activity and Person to Improve Indoor Visual Comfort of Occupants, Tested Using Post-Occupancy Evaluation Techniques for a Sri Lankan Population
Authors: R.T.P. De Silva, T. K. Wijayasiriwardhane, B. Jayawardena
Abstract:
Most people nowadays spend most of their time in indoor environments and therefore need a high-quality indoor environment. This study was conducted to identify how to improve indoor visual comfort using a personalized light system. Light intensity, light color, glare, and contrast are the main factors that affect visual comfort. The light intensity needed to perform a task changes according to the task, and using the appropriate light intensity can improve the visual comfort of occupants. The hue can affect the emotions of occupants, and the preferred light colors and intensities change according to the occupant's age and gender. The research was conducted to identify whether there is a relationship between personalization and visual comfort. To validate this, an Internet of Things-based reading light was designed. This light can work according to both standard light levels and personalized light levels; it can also measure the current light intensity of the environment and maintain a continuous light level according to the task. The test was conducted with 25 undergraduates, 5 school students, and 5 adults. Feedback was gathered using post-occupancy evaluation (POE) techniques in three steps: without any light control, with the standard light level, and with the personalized light level. Users had to spend 10 minutes under each condition, and feedback was collected after each step. According to the results gathered, 94% of participants rated the personalized light system as comfortable for them. The feedback shows that staying under a continuous light level helps them maintain concentration. Future research can examine how the color of indoor light affects the indoor visual comfort of occupants using a personalized light system; the proposed IoT-based light can further be improved to change light colors according to the user's preference.
Keywords: indoor environment quality, internet of things based light system, post occupancy evaluation, visual comfort
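A minimal sketch of the control idea described above, not the authors' firmware: read the ambient lux, compare it with the target for the current task, and nudge the lamp's duty cycle toward the target (task targets, gain and sensor readings below are hypothetical).

```python
TASK_TARGETS_LUX = {"reading": 500, "screen work": 300, "relaxing": 150}  # assumed targets

def adjust_lamp(task, ambient_lux, duty, gain=0.002):
    """Return an updated lamp duty cycle (0..1) for the given task."""
    error = TASK_TARGETS_LUX[task] - ambient_lux
    return min(1.0, max(0.0, duty + gain * error))

duty = 0.4
for ambient in (220, 310, 420, 480):      # successive (hypothetical) sensor readings
    duty = adjust_lamp("reading", ambient, duty)
    print(f"ambient = {ambient} lux -> duty = {duty:.2f}")
```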
Procedia PDF Downloads 158
5482 A New Graph Theoretic Problem with Ample Practical Applications
Authors: Mehmet Hakan Karaata
Abstract:
In this paper, we first coin a new graph-theoretic problem with numerous applications. Second, we provide two algorithms for the problem. The first solution uses a brute-force technique, whereas the second solution is based on an initial identification of the cycles in the given graph. We then provide a correctness proof of the algorithm. The applications of the problem include graph analysis, graph drawing and network structuring.
Keywords: algorithm, cycle, graph algorithm, graph theory, network structuring
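Since the paper gives no code, the sketch below illustrates only the cycle-identification step the second algorithm builds on: a depth-first search over a directed graph, given as an adjacency list, that returns one cycle if any exists.

```python
def find_cycle(graph):
    """Return one directed cycle as a list of vertices, or None if the graph is acyclic."""
    vertices = set(graph) | {v for nbrs in graph.values() for v in nbrs}
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {v: WHITE for v in vertices}
    parent = {}

    def dfs(u):
        color[u] = GRAY
        for v in graph.get(u, []):
            if color[v] == GRAY:          # back edge u -> v closes a cycle
                cycle, node = [u], u
                while node != v:
                    node = parent[node]
                    cycle.append(node)
                return cycle[::-1]        # listed from v around to u
            if color[v] == WHITE:
                parent[v] = u
                found = dfs(v)
                if found:
                    return found
        color[u] = BLACK
        return None

    for v in vertices:
        if color[v] == WHITE:
            found = dfs(v)
            if found:
                return found
    return None

print(find_cycle({"a": ["b"], "b": ["c"], "c": ["a", "d"], "d": []}))  # one cycle, e.g. ['a', 'b', 'c']
```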
Procedia PDF Downloads 391
5481 Buy-and-Hold versus Alternative Strategies: A Comparison of Market-Timing Techniques
Authors: Jonathan J. Burson
Abstract:
With the rise of virtually costless, mobile-based trading platforms, stock market trading activity has increased significantly over the past decade, particularly for the millennial generation. This increased attention to the stock market, combined with the recent market turmoil due to the economic upset caused by COVID-19, makes the topics of market timing and forecasting particularly relevant. While the overall stock market saw an unprecedented, historically long bull market from March 2009 to February 2020, the end of that bull market reignited a search by investors for a way to reduce risk and increase return. Similar searches for outperformance occurred in the early and late 2000s, as the dot-com bubble burst and the Great Recession led to years of negative returns for mean-variance index investors. Extensive research has been conducted on fundamental analysis, technical analysis, macroeconomic indicators, microeconomic indicators, and other techniques, all using different methodologies and investment periods, in pursuit of higher returns with lower risk. The enormous variety of timeframes, data, and methodologies used by the diverse forecasting methods makes it difficult to compare the outcome of each method directly with the others. This paper establishes a process to evaluate market-timing methods in an apples-to-apples manner based on simplicity, performance, and feasibility. Preliminary findings show that certain technical-analysis models provide a higher return with lower risk when compared to the buy-and-hold method and to other market-timing strategies. Furthermore, technical-analysis models tend to be easier for individual investors, both in terms of acquiring the data and in analyzing it, making technical analysis-based market-timing methods the preferred choice for retail investors.
Keywords: buy-and-hold, forecast, market-timing, probit, technical analysis
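For illustration only (simulated prices, not the paper's data or methodology), the sketch below compares buy-and-hold with a 50/200-day moving-average crossover rule, one representative technical-analysis style timing strategy:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0.0003, 0.01, 2500))))  # simulated closes

returns = prices.pct_change().fillna(0)
fast, slow = prices.rolling(50).mean(), prices.rolling(200).mean()
in_market = (fast > slow).shift(1, fill_value=False)     # act on yesterday's signal

buy_and_hold = (1 + returns).prod() - 1
timed = (1 + returns.where(in_market, 0)).prod() - 1     # flat (0% return) when out of market
print(f"buy-and-hold: {buy_and_hold:.1%}, 50/200 crossover: {timed:.1%}")
```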
Procedia PDF Downloads 99
5480 A Comparison of Convolutional Neural Network Architectures for the Classification of Alzheimer’s Disease Patients Using MRI Scans
Authors: Tomas Premoli, Sareh Rowlands
Abstract:
In this study, we investigate the impact of various convolutional neural network (CNN) architectures on the accuracy of diagnosing Alzheimer’s disease (AD) using patient MRI scans. Alzheimer’s disease is a debilitating neurodegenerative disorder that affects millions worldwide. Early, accurate, and non-invasive diagnostic methods are required for providing optimal care and symptom management. Deep learning techniques, particularly CNNs, have shown great promise in enhancing this diagnostic process. We aim to contribute to the ongoing research in this field by comparing the effectiveness of different CNN architectures and providing insights for future studies. Our methodology involved preprocessing MRI data, implementing multiple CNN architectures, and evaluating the performance of each model. We employed intensity normalization, linear registration, and skull stripping for our preprocessing. The selected architectures included VGG, ResNet, and DenseNet models, all implemented using the Keras library. We employed transfer learning and trained models from scratch to compare their effectiveness. Our findings demonstrated significant differences in performance among the tested architectures, with DenseNet201 achieving the highest accuracy of 86.4%. Transfer learning proved to be helpful in improving model performance. We also identified potential areas for future research, such as experimenting with other architectures, optimizing hyperparameters, and employing fine-tuning strategies. By providing a comprehensive analysis of the selected CNN architectures, we offer a solid foundation for future research in Alzheimer’s disease diagnosis using deep learning techniques. Our study highlights the potential of CNNs as a valuable diagnostic tool and emphasizes the importance of ongoing research to develop more accurate and effective models.
Keywords: Alzheimer’s disease, convolutional neural networks, deep learning, medical imaging, MRI
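A minimal Keras transfer-learning sketch with DenseNet201, the best performer reported above. Assumptions not stated in the abstract: MRI volumes reduced to 2D slices replicated to three channels, and a binary AD-versus-control label.

```python
from tensorflow import keras

base = keras.applications.DenseNet201(weights="imagenet", include_top=False,
                                      input_shape=(224, 224, 3))
base.trainable = False                                    # freeze ImageNet features first

inputs = keras.Input(shape=(224, 224, 3))
x = keras.applications.densenet.preprocess_input(inputs)
x = base(x, training=False)
x = keras.layers.GlobalAveragePooling2D()(x)
x = keras.layers.Dropout(0.3)(x)
outputs = keras.layers.Dense(1, activation="sigmoid")(x)  # AD vs. control

model = keras.Model(inputs, outputs)
model.compile(optimizer=keras.optimizers.Adam(1e-4),
              loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=20)  # datasets not shown here
```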
Procedia PDF Downloads 76
5479 Reliability Analysis in Power Distribution System
Authors: R. A. Deshpande, P. Chandhra Sekhar, V. Sankar
Abstract:
In this paper, we discuss the basic reliability evaluation techniques needed to evaluate the reliability of distribution systems, which are applied in distribution system planning and operation. A reliability study can also help predict the reliability performance of the system by quantifying the impact of adding new components. The number and locations of new components needed to improve the reliability indices to certain limits are identified and studied.
Keywords: distribution system, reliability indices, urban feeder, rural feeder
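The paper does not list the indices it evaluates; as a hedged illustration, the sketch below computes SAIFI, SAIDI and CAIDI, the customary IEEE 1366-style distribution reliability indices, from hypothetical interruption records.

```python
customers_total = 5000
interruptions = [          # (customers affected, outage duration in hours), hypothetical
    (1200, 1.5),
    (300, 4.0),
    (2500, 0.5),
]

saifi = sum(n for n, _ in interruptions) / customers_total        # interruptions per customer
saidi = sum(n * h for n, h in interruptions) / customers_total    # outage hours per customer
caidi = saidi / saifi                                             # hours per interruption
print(f"SAIFI = {saifi:.2f}, SAIDI = {saidi:.2f} h, CAIDI = {caidi:.2f} h")
```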
Procedia PDF Downloads 777
5478 Anterior Tooth Misalignment: Orthodontics or Restorative Treatment
Authors: Maryam Firouzmandi, Moosa Miri
Abstract:
A smile is considered to be one of the most effective means of influencing people, and increasing numbers of patients are requesting cosmetic dental procedures to achieve the perfect smile. Based on the patient's age, oral and facial characteristics, and the dentist's expertise, different treatment concepts are available. Orthodontics is the most conservative and the ideal treatment alternative for crowded anterior teeth; however, it may be rejected by patients due to occupational time constraints, physical discomfort including pain and functional limitations, psychological discomfort, and appearance during treatment. In addition, orthodontic treatment will not resolve deficits in the contour and color of the anterior teeth. Consequently, patients may instead demand restorative techniques to resolve their anterior malalignment, often called "instant orthodontics". Since its introduction, however, adhesive dentistry has at times suffered from overuse. Creating short-term attractive smiles at the expense of long-term dental health and optimal tooth biomechanics by using cosmetic techniques should not be considered an ethical approach. The objective of this narrative review was to investigate the literature for guidelines on decision making and treatment planning for anterior tooth malalignment. In this regard, the indications for orthodontic treatment, restorative treatment, a combination of both, and adjunctive periodontal surgery were discussed in clinical cases aimed at achieving a proportional smile. Restorative modalities include disking, cosmetic contouring, veneers, and crowns, and were compared with limited or comprehensive orthodontic options. A rapid review was also presented on the pros and cons of Snap-On Smile for masking malalignments. Diagnostic tools such as the mock-up, wax-up, and digital smile design were also considered to achieve more conservative and functional treatments with respect to biologic factors.
Keywords: crowding, misalignment, veneer, crown, orthodontics
Procedia PDF Downloads 117
5477 Surveying Apps in Dam Excavation
Authors: Ali Mohammadi
Abstract:
Whenever the ground needs to be excavated, a surveyor is required to check the work against the map. In projects such as dams and tunnels, these controls are more important because any mistake can increase the cost. Time is also of great importance in these projects, and one way to reduce excavation time is to use techniques that reduce the surveying time. Nowadays, with the availability of mobile phones, we can design apps that perform the calculations and drawing for us on the phone. Furthermore, if we have a device whose information can only be accessed through a computer, we can design an app that transfers that information to the mobile phone and uses it there, so there is no need to go back to the office.
Keywords: app, tunnel, excavation, dam
Procedia PDF Downloads 72
5476 Sequence Component-Based Adaptive Protection for Microgrids Connected Power Systems
Authors: Isabelle Snyder
Abstract:
Microgrid protection presents challenges to conventional protection techniques due to the low induced fault current. Protection relays in microgrid applications require a combination of settings groups that are adjusted based on the architecture of the microgrid in islanded and grid-connected modes. In a radial system where the microgrid is at the other end of the feeder, directional elements can be used to identify the direction of the fault current and switch settings groups accordingly (grid connected or microgrid connected). However, with multiple microgrid connections, this concept becomes more challenging, and the direction of the current alone is not sufficient to identify the source of the fault current contribution. ORNL has previously developed adaptive relaying schemes through other DOE-funded research projects that will be evaluated and used as a baseline for this research. The four protection techniques in this study are: (1) Adaptive Current-Only Protection System (ACPS), (2) Intentional Unbalanced Control for Protection Control (IUCPC), (3) Adaptive Protection System with Communication Controller (APSCC), and (4) Adaptive Model-Driven Protective Relay (AMDPR). The first two methods focus on identifying the islanded mode without communication by monitoring the current sequence components generated by the system (ACPS) or induced through inverter control during islanded mode (IUCPC), allowing the relay to identify the islanding condition and adjust its settings without communication. These two methods are used as a backup to the APSCC, which relies on a communication network to communicate the islanded configuration to the system components. The fourth method relies on a short-circuit model inside the relay that is used in conjunction with communication to track the system configuration, compute the fault current, and adjust the settings accordingly.
Keywords: adaptive relaying, microgrid protection, sequence components, islanding detection, communication controlled protection, integrated short circuit model
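As a textbook illustration of the quantity these schemes monitor (not the ORNL relay logic; the phasors are invented), the symmetrical-component transform extracts the zero-, positive- and negative-sequence currents from the three phase currents:

```python
import numpy as np

a = np.exp(2j * np.pi / 3)                       # 120-degree rotation operator
A_inv = (1 / 3) * np.array([[1, 1, 1],
                            [1, a, a**2],
                            [1, a**2, a]])

# Hypothetical phase-current phasors during an unbalanced fault (amperes)
I_abc = np.array([1200 * np.exp(1j * 0.0),
                  300 * np.exp(-1j * 2.1),
                  280 * np.exp(1j * 2.2)])

I0, I1, I2 = A_inv @ I_abc
print("|I0|, |I1|, |I2| =", np.abs([I0, I1, I2]).round(1))
# A large negative-sequence current relative to the positive sequence is one
# indicator such adaptive schemes can use to classify unbalanced conditions.
```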
Procedia PDF Downloads 97
5475 Strengthening of Column Using Steel Fiber Reinforced Self-Compacting Concrete
Authors: Sajid Khan, Xu Zhao-Dong
Abstract:
The reinforced concrete members of old structures must be urgently restored and strengthened in order to prolong their service life. Opting for demolition or reconstruction is often impractical and time-consuming. Among the RC members responsible for bearing loads, compression members play a critical role in structural integrity. To increase the load-carrying ability of existing structures, engineers have been employing a variety of strengthening techniques. One promising method involves incorporating micro steel fibers into a cementitious composite. This approach yields high-strength cementitious composites that effectively reinforce existing structures. Specifically, the focus is on developing a self-compacting concrete composite reinforced with steel fibers, commonly known as SFRSCC. The key advantage of SFRSCC is its ability to minimize the additional load imposed during application, thereby eliminating the need for time-consuming and cumbersome vibrators. The major objective of this study is to examine how to produce a strong SFRSCC specifically designed for small-scale columns in order to increase their load-carrying capability. Following the application of a strengthening layer of SFRSCC, the strength of these columns exhibited remarkable improvement compared to the reference concrete columns. Strength gains of 49%, 66%, 81%, and 89% were attained with strengthening layers of 0.5 inch, 1 inch, 1.5 inches, and 2 inches, respectively. The obtained results are highly encouraging, demonstrating the substantial enhancements in strength that can be achieved using SFRSCC. This SFRSCC composite holds great potential for a wide range of strengthening and repair applications in various structures, presenting a cost-effective and efficient solution for enhancing structural performance and extending the service life of aging buildings.
Keywords: column, earthquake, FRSCC, strengthening techniques
Procedia PDF Downloads 10