Search results for: manual areal profiling technique
6591 Efficient Wind Fragility Analysis of Concrete Chimney under Stochastic Extreme Wind Incorporating Temperature Effects
Authors: Soumya Bhattacharjya, Avinandan Sahoo, Gaurav Datta
Abstract:
Wind fragility analysis of chimneys is often carried out disregarding temperature effects. However, the combined effect of wind and temperature governs the most critical limit state for chimney design. Hence, in the present paper, an efficient fragility analysis of a concrete chimney is explored under the combined effect of wind and temperature. Wind time histories are generated from Davenport's power spectral density function using the Weighted Amplitude Wave Superposition technique. Fragility analysis is often carried out in a full Monte Carlo Simulation framework, which requires extensive computational time. Thus, in the present paper, an efficient adaptive metamodelling technique is adopted to judiciously approximate the limit state function, which is subsequently used in the simulation framework. This saves substantial computational time and makes the approach computationally efficient. Uncertainty in wind speed, wind-load-related parameters, and resistance-related parameters is considered. The results of the full simulation approach, the conventional metamodelling approach and the proposed adaptive metamodelling approach are compared, and the effect of disregarding temperature in wind fragility analysis is highlighted.
Keywords: adaptive metamodelling technique, concrete chimney, fragility analysis, stochastic extreme wind load, temperature effect
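A minimal sketch of the wind-record generation step (spectral superposition of harmonics drawn from Davenport's spectrum with random phases) is given below; the spectrum form is the commonly cited Davenport expression, and the drag coefficient, mean wind speed, duration and frequency band are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal sketch: synthesize a fluctuating wind record from Davenport's spectrum
# by superposing harmonics with amplitudes sqrt(2*S(f)*df) and random phases.
# Parameter values (kappa, U10, frequency band) are illustrative only.
kappa, U10 = 0.005, 30.0                   # surface drag coefficient, mean wind speed [m/s]

def davenport_psd(f, U10=U10, kappa=kappa):
    """One-sided Davenport spectrum S_u(f) of along-wind turbulence [m^2/s]."""
    x = 1200.0 * f / U10
    return 4.0 * kappa * U10**2 * x**2 / (f * (1.0 + x**2)**(4.0 / 3.0))

f = np.linspace(1e-3, 2.0, 500)            # frequency grid [Hz]
df = f[1] - f[0]
t = np.arange(0.0, 600.0, 0.2)             # 10-minute record, 5 Hz sampling
phases = np.random.uniform(0.0, 2.0 * np.pi, f.size)
amps = np.sqrt(2.0 * davenport_psd(f) * df)
u_fluct = (amps[:, None] * np.cos(2.0 * np.pi * f[:, None] * t + phases[:, None])).sum(axis=0)
wind = U10 + u_fluct                       # total wind speed time history
print(wind.mean(), wind.std())
```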
Procedia PDF Downloads 214
6590 Microscopic Analysis of Bulk, High-Tc Superconductors by Transmission Kikuchi Diffraction
Authors: Anjela Koblischka-Veneva, Michael R. Koblischka
Abstract:
In this contribution, Transmission Kikuchi Diffraction (TKD, sometimes called t-EBSD) is applied to bulk, melt-grown YBa₂Cu₃O₇ (YBCO) superconductors prepared by the MTMG (melt-textured melt-grown) technique and the infiltration growth (IG) technique. The TEM slices required for the analysis were prepared by means of Focused Ion-Beam (FIB) milling from mechanically polished sample surfaces, which enables proper selection of the regions of interest for investigation. The required electron transparency was reached by an additional polishing step of the resulting surfaces using FIB Ga-ion and Ar-ion milling. The improved spatial resolution of TKD enabled the investigation of the tiny Y₂BaCuO₅ (Y-211) particles, having diameters of about 50-100 nm, embedded within the YBCO matrix, and of other added secondary-phase particles. With the TKD technique, the microstructural properties of the YBCO matrix are studied in detail. It is observed that the matrix shows the effects of stress/strain, depending on the size and distribution of the embedded particles, which are important for providing additional flux pinning centers in such superconducting bulk samples. Using Kernel Average Misorientation (KAM) maps, the strain induced in the superconducting matrix around the particles, which increases the flux pinning effectiveness, can be clearly revealed. This type of analysis of EBSD/TKD data is, therefore, also important for other material systems in which nanoparticles are embedded in a matrix.
Keywords: transmission Kikuchi diffraction, EBSD, TKD, embedded particles, superconductors, YBa₂Cu₃O₇
Procedia PDF Downloads 135
6589 Modelling the Antecedents of Supply Chain Enablers in Online Groceries Using Interpretive Structural Modelling and MICMAC Analysis
Authors: Rose Antony, Vivekanand B. Khanapuri, Karuna Jain
Abstract:
Online groceries have transformed the way supply chains are managed. These chains face numerous challenges in terms of product wastage, low margins, a long time to break even and low market penetration, to mention a few. E-grocery chains need to overcome these challenges in order to survive the competition. The purpose of this paper is to carry out a structural analysis of the enablers in e-grocery chains by applying Interpretive Structural Modelling (ISM) and MICMAC analysis in the Indian context. The research design is descriptive-explanatory in nature. The enablers have been identified from the literature and through semi-structured interviews conducted among managers having relevant experience in e-grocery supply chains. The experts have been contacted through professional/social networks by adopting a purposive snowball sampling technique. The interviews have been transcribed, and manual coding was carried out using the open and axial coding method. The key enablers are categorized into themes, and the contextual relationship between these and the performance measures is sought from industry veterans. Using ISM, a hierarchical model of the enablers is developed, and MICMAC analysis identifies the driving and dependence powers. Based on the driving and dependence powers, the enablers are categorized into four clusters, namely independent, autonomous, dependent and linkage. The analysis found that information technology (IT) and manpower training act as key enablers towards reducing the lead time and enhancing online service quality. Many of the enablers fall under the linkage cluster, viz., frequent software updating, branding, the number of delivery boys, order processing, benchmarking, product freshness and customized applications for different stakeholders, depicting these as critical in online food/grocery supply chains. Considering the perishable nature of the product being handled, the impact of the enablers on product quality is also identified. Hence, the study serves as a tool to identify and prioritize the vital enablers in the e-grocery supply chain. The work is perhaps unique in that it identifies the complex relationships among the supply chain enablers in fresh food for e-groceries and links them to the performance measures. It contributes to the knowledge of supply chain management in general and e-retailing in particular. The approach focuses on fresh food supply chains in the Indian context and hence will be applicable in the context of developing economies, where supply chains are evolving.
Keywords: interpretive structural modelling (ISM), India, online grocery, retail operations, supply chain management
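A minimal sketch of the MICMAC classification step is given below: from an assumed final reachability matrix, driving power is the row sum, dependence power is the column sum, and each enabler falls into one of the four clusters; the matrix and enabler names are illustrative, not the study's data.

```python
import numpy as np

# Minimal MICMAC sketch: from a final reachability matrix (1 = enabler i drives
# enabler j), compute driving power (row sums) and dependence power (column sums)
# and classify each enabler into the four MICMAC clusters.
enablers = ["IT", "Training", "Branding", "Order processing"]
R = np.array([[1, 1, 1, 1],
              [0, 1, 1, 1],
              [0, 0, 1, 1],
              [0, 0, 1, 1]])

driving = R.sum(axis=1)       # how many enablers each one influences
dependence = R.sum(axis=0)    # how many enablers influence it
mid = R.shape[0] / 2.0        # threshold splitting high/low power

for name, drv, dep in zip(enablers, driving, dependence):
    if drv > mid and dep > mid:
        cluster = "linkage"
    elif drv > mid:
        cluster = "independent (driver)"
    elif dep > mid:
        cluster = "dependent"
    else:
        cluster = "autonomous"
    print(f"{name}: driving={drv}, dependence={dep} -> {cluster}")
```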
Procedia PDF Downloads 204
6588 Reversible and Adaptive Watermarking for MRI Medical Images
Authors: Nisar Ahmed Memon
Abstract:
A new medical image watermarking scheme delivering high embedding capacity is presented in this paper. The Integer Wavelet Transform (IWT), a companding technique and adaptive thresholding are used in this scheme. The proposed scheme embeds the hidden information, recovers it, and restores the input image to its pristine state at the receiving end. Magnetic Resonance Imaging (MRI) images are used for experimental purposes. The scheme first segments the MRI image into non-overlapping blocks and then embeds the watermark into the high-frequency wavelet coefficients of each block. The scheme uses block-based watermarking adopting iterative optimization of the companding threshold in order to avoid histogram pre- and post-processing. Results show that the proposed scheme performs better than other reversible medical image watermarking schemes available in the literature for MRI medical images.
Keywords: adaptive thresholding, companding technique, data authentication, reversible watermarking
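A minimal sketch of the reversible integer Haar (S-transform) lifting step that underlies such an Integer Wavelet Transform is shown below; the pixel values are illustrative, and the paper's block segmentation, companding and adaptive-thresholding stages are not reproduced.

```python
import numpy as np

# Minimal sketch of the reversible building block behind an Integer Wavelet
# Transform (IWT): a 1-D integer Haar (S-transform) lifting step applied to
# pixel pairs. Integer arithmetic makes the transform losslessly invertible.
def haar_forward(row):
    x0, x1 = row[0::2].astype(int), row[1::2].astype(int)
    d = x0 - x1                      # detail (high-frequency) coefficients
    s = x1 + (d // 2)                # approximation (low-frequency) coefficients
    return s, d

def haar_inverse(s, d):
    x1 = s - (d // 2)
    x0 = d + x1
    out = np.empty(s.size * 2, dtype=int)
    out[0::2], out[1::2] = x0, x1
    return out

row = np.array([52, 55, 61, 59, 79, 61, 76, 61])    # one image row (8-bit pixels)
s, d = haar_forward(row)
assert np.array_equal(haar_inverse(s, d), row)       # perfect reconstruction
print("approximation:", s, "detail:", d)
```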
Procedia PDF Downloads 296
6587 Prediction of SOC Stock using ROTH-C Model and Mapping in Different Agroclimatic Zones of Tamil Nadu
Authors: R. Rajeswari
Abstract:
An investigation was carried out to determine the SOC stock and its change over time in benchmark soils of different agroclimatic zones of Tamil Nadu. The RothC model was used to assess SOC stock under existing and alternate cropping patterns. A soil map prepared at 1:50,000 scale from the Natural Resources Information System (NRIS) using satellite data (IRS-1C/1D PAN-sharpened LISS-III imagery) was used to estimate SOC stock in the different agroclimatic zones of Tamil Nadu. Fifteen benchmark soils were selected in different agroclimatic zones of Tamil Nadu based on their land use and areal extent to assess the SOC level and its change over time. This revealed that, over the eleven-year period (1997-2007), SOC buildup was higher in soils under the horticulture system, followed by soils under rice cultivation. Among the agroclimatic zones of Tamil Nadu, the hilly zone had the highest SOC stock, followed by the north eastern, southern, western, Cauvery delta, north western, and high rainfall zones. Although the organic carbon content in the soils of the north eastern, southern, western, north western and Cauvery delta zones was less than that of the high rainfall zone, their SOC stock was high. SOC density was higher in the high rainfall and hilly zones than in the other agroclimatic zones of Tamil Nadu. Among the low rainfall regions of Tamil Nadu, the Cauvery delta zone recorded the highest SOC density. The RothC model was used to assess SOC stock under existing and alternate cropping patterns in the Periyanaickenpalayam series (western zone), Peelamedu series (southern zone), Vallam series (north eastern zone), Vannappatti series (north western zone) and Padugai series (Cauvery delta zone). The Padugai series recorded higher TOC, BIO, and HUM, followed by the Periyanaickenpalayam, Peelamedu, Vallam, and Vannappatti series. The Vannappatti and Padugai series develop high TOC, BIO, and HUM under the existing cropping pattern, while the Periyanaickenpalayam, Peelamedu, and Vallam series develop high TOC, BIO, and HUM under the alternate cropping pattern. Among the five selected soil series, the Periyanaickenpalayam, Peelamedu, and Padugai series are projected to reach 0.75 per cent TOC in 2025 and 2018, 2100 and 2035, and 2013 and 2014 under the existing and alternate cropping patterns, respectively.
Keywords: agro climatic zones, benchmark soil, land use, soil organic carbon
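A minimal sketch of the first-order pool decay at the heart of a RothC-type simulation is given below; the rate constants are the commonly cited nominal RothC values, while the pool sizes and rate modifiers are illustrative assumptions, not the study's inputs.

```python
import numpy as np

# Minimal sketch of the first-order pool decay at the core of the RothC model.
# Each active carbon pool decays as C(t) = C0 * exp(-a*b*c*k*t), where k is the
# pool's rate constant and a, b, c are the temperature, moisture and soil-cover
# rate modifiers. Pool sizes and modifiers below are illustrative; partitioning
# of decomposed C into CO2/BIO/HUM is omitted for brevity.
k = {"DPM": 10.0, "RPM": 0.30, "BIO": 0.66, "HUM": 0.02}                 # yr^-1
pools = {"DPM": 0.5, "RPM": 6.0, "BIO": 1.2, "HUM": 20.0, "IOM": 2.5}    # t C/ha
a, b, c = 1.2, 0.8, 0.6      # illustrative monthly-averaged rate modifiers
dt = 1.0 / 12.0              # monthly time step [yr]

def step(pools):
    new = dict(pools)
    for name, rate in k.items():
        new[name] = pools[name] * np.exp(-a * b * c * rate * dt)
    return new                # IOM is inert and unchanged

for month in range(12):
    pools = step(pools)
print("SOC stock after 1 year (t C/ha):", round(sum(pools.values()), 2))
```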
Procedia PDF Downloads 95
6586 Functional Profiling of a Circular RNA from the Huntingtin (HTT) Gene
Authors: Laura Gantley, Vanessa M. Conn, Stuart Webb, Kirsty Kirk, Marta Gabryelska, Duncan Holds, Brett W. Stringer, Simon J. Conn
Abstract:
Trinucleotide repeat disorders comprise ~20 severe, inherited human neuromuscular and neurodegenerative disorders, which result from an abnormal expansion of repetitive sequences in the DNA. The most common of these, Huntington's disease, results from the expansion of the CAG repeat region in exon 1 of the HTT gene via an unknown mechanism. Non-coding RNAs have been implicated in the initiation and progression of many diseases; thus, we focus on one circular RNA (circRNA) molecule arising from non-canonical splicing (back splicing) of the HTT pre-mRNA. This circRNA and its mouse orthologue were transgenically overexpressed in human cells (SH-SY5Y and HEK293T) and mouse cells (Mb1), respectively. High-content imaging and flow cytometry demonstrated that overexpression of this circRNA reduces cell proliferation, reduces nuclear size independent of cellular size, and alters cell cycle progression. Analysis of protein by western blot and immunofluorescence demonstrated no change to HTT protein levels but an altered nuclear-cytoplasmic distribution, without impacting the expansion of the HTT repeat region. As these phenotypic and genotypic changes are found in Huntington's disease patients, these results suggest that this circRNA may play a functional role in the progression of Huntington's disease.
Keywords: cell biology, circular RNAs, Huntington's disease, molecular biology, neurodegenerative disorders
Procedia PDF Downloads 99
6585 Laboratory Evaluation of Geogrids Used for Stabilizing Soft Subgrades
Authors: Magdi M. E. Zumrawi, Nehla Mansour
Abstract:
This paper aims to assess the efficiency of using geogrid reinforcement for subgrade stabilization. The literature on applying the geogrid reinforcement technique to pavements built on soft subgrades and previous experience were reviewed. Laboratory tests were conducted on soil reinforced with geogrids in one or several layers. The soil specimens were compacted in four layers, with or without geogrid sheets. The California Bearing Ratio (CBR) test, in the soaked condition, was performed on the natural soil and the soil-geogrid specimens. The test results revealed that the CBR value is much affected by the geogrid sheet location and the number of sheets used in the soil specimen. When a geogrid sheet was placed at the first layer of the soil, there was an increment of 26% in the CBR value. Moreover, the CBR value was significantly increased, by 62%, when geogrid sheets were placed at all four layers. The high CBR value is attributed to the interface friction and interlock involved in the geogrid/soil interactions. It can be concluded that geogrid reinforcement is a successful and more economical technique.
Keywords: geogrid, reinforcement, stabilization, subgrade
Procedia PDF Downloads 320
6584 A Geoprocessing Tool for Early Civil Work Notification to Optimize Fiber Optic Cable Installation Cost
Authors: Hussain Adnan Alsalman, Khalid Alhajri, Humoud Alrashidi, Abdulkareem Almakrami, Badie Alguwaisem, Said Alshahrani, Abdullah Alrowaished
Abstract:
Most of the cost of installing a new fiber optic cable is attributed to the civil work (trenching) cost. In many cases, information technology departments receive project proposals in their eReview system, but not all projects are visible to everyone. Additionally, if there is no IT scope in the proposed project, it is not likely to be visible to IT. Sometimes it is too late to add IT scope after project budgets have been finalized. Finally, the eReview system is a repository of PDF files for each project, which commits the reviewer to manual work and limits automation potential. This paper details a solution that addresses the late notification in the eReview system by integrating IT site GIS data (site locations) with land use permit (LUP) data (civil work activity); an LUP request is the first step before securing the required land usage authorizations, which means no detailed designs exist for any relevant project before an approved LUP request. To address the manual nature of the eReview system, both the LUP system data and the IT data are handled in ArcGIS Desktop, which enables the creation of a geoprocessing tool with either Python or Model Builder to automate finding and evaluating potentially usable LUP requests to reduce trenching between two sites in need of a new FOC. To achieve this, a weekly dump was taken from the LUP system production data and loaded manually into ArcMap Desktop. Then a custom tool was developed in Model Builder, which takes a two-column table containing all the pairs of sites in need of new fiber connectivity. The tool iterates over all rows of this table, taking one site pair at a time and finding potential LUPs between them that satisfy the provided search radius. If a group of LUPs is found, an iterator goes through each LUP to find the required civil work between the two sites, the LUP polyline feature and the distance along the line, which is counted as cost avoidance if an IT scope is added. Finally, the tool exports an Excel file named after the site pair, containing as many rows as the number of LUPs that met the search radius, with trenching and pulling information and cost. As a result, multiple projects have been identified: historical, missed-opportunity, and proposed projects. For one proposed project, the savings were about 75% ($750,000) for installing a new fiber along the Euclidean distance between the Abqaiq GOSP2 and GOSP3 DCOs. In conclusion, the current tool setup identifies opportunities to bundle civil work on single projects at a time and between two sites. More work is needed to allow the bundling of multiple projects between two sites to achieve even more cost avoidance in both capital cost and carbon footprint.
Keywords: GIS, fiber optic cable installation optimization, eliminate redundant civil work, reduce carbon footprint for fiber optic cable installation
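A minimal sketch of the pair-wise corridor search described above, written with shapely rather than the ArcGIS geoprocessing framework actually used, is shown below; site coordinates, LUP polylines and the search radius are illustrative assumptions.

```python
from itertools import combinations
from shapely.geometry import Point, LineString

# Minimal sketch of the pair-wise search: for every pair of IT sites needing a
# new fiber link, find land-use-permit (LUP) trench polylines that fall within a
# search radius of the straight line between the sites, and report the trench
# length that could be shared. Coordinates and radius are illustrative only.
sites = {"GOSP2_DCO": Point(0.0, 0.0), "GOSP3_DCO": Point(8000.0, 0.0)}
lups = {
    "LUP-1001": LineString([(1000, 150), (4000, 120)]),
    "LUP-1002": LineString([(2000, 9000), (3000, 9500)]),   # far away, ignored
}
search_radius = 500.0   # metres

for (name_a, a), (name_b, b) in combinations(sites.items(), 2):
    corridor = LineString([a, b]).buffer(search_radius)
    print(f"{name_a} <-> {name_b}: direct distance {a.distance(b):.0f} m")
    for lup_id, line in lups.items():
        if line.intersects(corridor):
            shared = line.intersection(corridor).length
            print(f"  {lup_id}: {shared:.0f} m of trench could be shared")
```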
Procedia PDF Downloads 219
6583 Mobile Collaboration Learning Technique on Students in Developing Nations
Authors: Amah Nnachi Lofty, Oyefeso Olufemi, Ibiam Udu Ama
Abstract:
New and more powerful communications technologies continue to emerge at a rapid pace, and their uses in education are widespread, with a remarkable impact in developing societies. This study investigates the Mobile Collaboration Learning Technique (MCLT) and its effect on learners' outcomes among students in tertiary institutions of developing nations (a case of Nigerian students). It examines the significance of the retention achievement scores of students taught using mobile collaboration and the conventional method. The sample consisted of 120 students selected using a stratified random sampling method. Three research questions and hypotheses were formulated and tested at a 0.05 level of significance. A student achievement test (SAT) consisting of 40 multiple-choice objective items was developed and validated by professionals for data collection. The SAT was administered to students as a pre-test and post-test. The data were analyzed using the t-test statistic to test the hypotheses. The result indicated that students taught using MCLT performed significantly better than their counterparts taught using the conventional method of instruction. Also, there was no significant difference in the post-test performance scores of male and female students taught using MCLT. Based on the findings, the following recommendations were made: mobile collaboration systems should be encouraged in institutions to boost knowledge sharing among learners, workshops and training should be organized to train teachers on the use of this technique, and schools and government should formulate policies and procedures towards the responsible use of MCLT.
Keywords: education, communication, learning, mobile collaboration, technology
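A minimal sketch of the hypothesis test described above (an independent-samples t-test on post-test SAT scores at the 0.05 level) is given below; the score distributions are simulated, not the study's data.

```python
import numpy as np
from scipy import stats

# Minimal sketch of the post-test comparison: an independent-samples t-test on
# Student Achievement Test (SAT) scores of the MCLT group vs. the conventional
# group, evaluated at alpha = 0.05. Scores are simulated for illustration.
rng = np.random.default_rng(1)
mclt = rng.normal(28, 5, 60)            # 60 students taught with MCLT
conventional = rng.normal(24, 5, 60)    # 60 students taught conventionally

t_stat, p_value = stats.ttest_ind(mclt, conventional)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject H0: the two groups' post-test scores differ significantly.")
```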
Procedia PDF Downloads 221
6582 Ensuring Consistency under the Snapshot Isolation
Authors: Carlos Roberto Valêncio, Fábio Renato de Almeida, Thatiane Kawabata, Leandro Alves Neves, Julio Cesar Momente, Mario Luiz Tronco, Angelo Cesar Colombini
Abstract:
By running transactions under the Snapshot isolation level we can achieve a good level of concurrency, especially in databases with read-intensive workloads. However, Snapshot is not immune to all the problems that arise from competing transactions, and therefore no serializability guarantee exists. We propose in this paper a technique to obtain data consistency with Snapshot by using some special triggers that we named Daemon Triggers. Besides keeping the benefits of the Snapshot isolation, the technique is especially useful for those database systems that do not have an isolation level that ensures serializability, like Firebird and Oracle. We describe all the anomalies that might arise when using the Snapshot isolation and show how to preclude them with Daemon Triggers. Based on the methodology presented here, we also propose the creation of a new isolation level: Daemon Snapshot.
Keywords: data consistency, serialization, snapshot, isolation
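A minimal sketch of write skew, one of the classic anomalies permitted under Snapshot isolation, is given below; it only illustrates why plain Snapshot lets two non-conflicting commits break an invariant, and it does not reproduce the paper's Daemon Trigger mechanism, which would re-check such invariants at commit time.

```python
# Minimal sketch of write skew under Snapshot isolation. Two "transactions" each
# read the same committed snapshot, see that the invariant (at least one doctor
# on call) would still hold after their own update, and commit writes to
# different rows, jointly breaking the invariant. With disjoint write sets there
# is no write-write conflict, so plain Snapshot isolation lets both commit.
committed = {"alice_on_call": True, "bob_on_call": True}

def run_txn(snapshot, me):
    # Each transaction works only on its private snapshot (Snapshot isolation).
    others_on_call = sum(v for k, v in snapshot.items() if k != me)
    if others_on_call >= 1:            # invariant looks safe from this snapshot
        return {me: False}             # write set: take myself off call
    return {}

snap = dict(committed)                 # both transactions start from the same snapshot
writes_t1 = run_txn(snap, "alice_on_call")
writes_t2 = run_txn(snap, "bob_on_call")
committed.update(writes_t1)
committed.update(writes_t2)
print(committed)                       # both False: the on-call invariant is violated
```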
Procedia PDF Downloads 329
6581 3D Object Model Reconstruction Based on Polywogs Wavelet Network Parametrization
Authors: Mohamed Othmani, Yassine Khlifi
Abstract:
This paper presents a technique for compact three-dimensional (3D) object model reconstruction using wavelet networks. It consists of transforming the input surface vertices into signals and using wavelet network parameters for signal approximation. To this end, we use a wavelet network architecture founded on several mother wavelet families. POLYnomials WindOwed with Gaussians (POLYWOG) wavelet families are used to maximize the probability of selecting the best wavelets, which ensures good generalization of the network. To achieve a better reconstruction, the network is trained over several iterations to optimize the wavelet network parameters until the error criterion is small enough. Experimental results show that the proposed technique can effectively reconstruct irregular 3D object models when using the optimized wavelet network parameters. We also show that an accurate reconstruction depends on the best choice of the mother wavelets.
Keywords: 3d object, optimization, parametrization, polywog wavelets, reconstruction, wavelet networks
Procedia PDF Downloads 284
6580 Training a Neural Network Using Input Dropout with Aggressive Reweighting (IDAR) on Datasets with Many Useless Features
Authors: Stylianos Kampakis
Abstract:
This paper presents a new algorithm for neural networks called "Input Dropout with Aggressive Re-weighting" (IDAR), aimed specifically at datasets with many useless features. IDAR combines two techniques (dropout of input neurons and aggressive re-weighting) in order to eliminate the influence of noisy features. The technique can be seen as a generalization of dropout. The algorithm is tested on two different benchmark data sets: a noisy version of the iris dataset and the MADELON data set. Its performance is compared against three other popular techniques for dealing with useless features: L2 regularization, LASSO and random forests. The results demonstrate that IDAR can be an effective technique for handling data sets with many useless features.
Keywords: neural networks, feature selection, regularization, aggressive reweighting
Procedia PDF Downloads 455
6579 Genetic Algorithm and Multi-Parametric Programming Based Cascade Control System for Unmanned Aerial Vehicles
Authors: Dao Phuong Nam, Do Trong Tan, Pham Tam Thanh, Le Duy Tung, Tran Hoang Anh
Abstract:
This paper considers the problem of a cascade control system for unmanned aerial vehicles (UAVs). Due to the complexity of the UAV model, it is necessary to separate it into two subsystems. The proposed cascade control structure is a hierarchical scheme including a robust controller for the inner subsystem based on H-infinity theory, a trajectory generator using a genetic algorithm (GA), and an outer-loop control law based on the multi-parametric programming (MPP) technique to overcome the disadvantage of a large amount of calculation. Simulation results are presented to show that the equivalent path has been found and obtained by the proposed cascade control scheme.
Keywords: genetic algorithm, GA, H infinity, multi-parametric programming, MPP, unmanned aerial vehicles, UAVs
Procedia PDF Downloads 212
6578 The Relationship of Brand Value and Perceived Brand Quality in the Television Business: A Case Study of Television Viewers in Bangkok
Authors: Natnicha Hasoontree
Abstract:
The purpose of this paper was to study the relationship between brand value and perceived brand quality among television viewers in Bangkok towards the television business in Thailand. The population comprised television viewers in Bangkok, Thailand. A probability sampling technique was used to obtain a sample group of 500 respondents, with the Taro Yamane formula utilized to determine a proper sample size. A five-point Likert scale questionnaire was designed specifically to investigate brand value and perceived brand quality from the perspectives of television viewers in Bangkok. The findings implied that consumers in Bangkok attached high importance to the brand equity of television companies, comprising brand ability, brand reputation, brand credibility, and business ethics. Perceived brand quality received a high rank in all aspects.
Keywords: brand value, perceived brand quality, television business, television viewers
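A minimal sketch of the Taro Yamane sample-size formula n = N / (1 + N·e²) is given below; the population sizes and error margin are illustrative, with the study reporting a final sample of 500 respondents.

```python
import math

# Minimal sketch of the Taro Yamane sample-size formula n = N / (1 + N * e^2),
# where N is the population size and e the allowable error. Population figures
# below are illustrative only.
def yamane(N, e=0.05):
    return math.ceil(N / (1 + N * e**2))

for N in (1_000, 10_000, 1_000_000):
    print(f"population {N:>9,} -> sample size {yamane(N)}")
```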
Procedia PDF Downloads 437
6577 A Secure System for Handling Information from Heterogeneous Sources
Authors: Shoohira Aftab, Hammad Afzal
Abstract:
Information integration is a well-known procedure for providing a consolidated view of sets of heterogeneous information sources. It not only provides better statistical analysis of information but also allows users to query without any knowledge of the underlying heterogeneous information sources. The problem of providing a consolidated view of information can be handled using semantic data (information stored in such a way that it is understandable by machines and can be integrated without manual human intervention). However, integrating information using semantic web technology without enforcing any access management results in increased privacy and confidentiality concerns. In this research, we have designed and developed a framework that allows information from heterogeneous formats to be consolidated, thus resolving the issue of interoperability. We have also devised an access control system for defining explicit privacy constraints. We designed and applied our framework to both semantic and non-semantic data from heterogeneous resources. Our approach is validated using scenario-based testing.
Keywords: information integration, semantic data, interoperability, security, access control system
Procedia PDF Downloads 357
6576 Mitigation of Interference in Satellite Communications Systems via a Cross-Layer Coding Technique
Authors: Mario A. Blanco, Nicholas Burkhardt
Abstract:
An important problem in satellite communication systems operating in the Ka and EHF frequency bands is the overall degradation in link performance of mobile terminals due to various impairments in the link/channel, such as fading, blockage of the link to the satellite (especially in urban environments), and intentional as well as other types of interference. In this paper, we focus primarily on the interference problem, and we develop a very efficient and cost-effective solution based on the use of fountain codes. We first introduce a satellite communications (SATCOM) terminal uplink interference channel model that is classically used against communication systems that use spread-spectrum waveforms. We then consider the use of fountain codes, with a focus on Raptor codes, as our main mitigation technique to combat the degradation in link/receiver performance due to the interference signal. The performance of the receiver is obtained in terms of the average bit and message error probabilities as a function of the bit energy-to-noise density ratio, Eb/N0, and other parameters of interest, via a combination of analysis and computer simulations, and we show that the use of fountain codes is extremely effective in overcoming the effects of intentional interference on the performance of the receiver and the associated communication links. We then show that this technique can be extended to mitigate other types of SATCOM channel degradations, such as those caused by channel fading, shadowing, and hard blockage of the uplink signal.
Keywords: SATCOM, interference mitigation, fountain codes, turbo codes, cross-layer
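A minimal sketch of the fountain-code principle behind Raptor codes (an LT-style encoder that XORs random subsets of source blocks, paired with a peeling decoder) is given below; the degree distribution, block contents and overhead are illustrative and far smaller than any practical SATCOM design.

```python
import random

# Minimal sketch of the fountain-code idea underlying Raptor codes: an LT-style
# encoder XORs a randomly chosen subset of source blocks into each output symbol,
# and a peeling decoder recovers the source once enough symbols arrive.
random.seed(7)
source = [b"SAT", b"COM", b"LNK", b"FEC"]           # k = 4 source blocks
k = len(source)

def xor(blocks):
    out = bytes(len(blocks[0]))
    for b in blocks:
        out = bytes(x ^ y for x, y in zip(out, b))
    return out

def encode(n_symbols):
    symbols = []
    for _ in range(n_symbols):
        degree = random.choice([1, 2, 2, 3])         # toy degree distribution
        idx = random.sample(range(k), degree)
        symbols.append((set(idx), xor([source[i] for i in idx])))
    return symbols

def peel(symbols):
    decoded, progress = {}, True
    while progress:
        progress = False
        for idx, data in symbols:
            unknown = idx - set(decoded)
            if len(unknown) == 1:                    # symbol with one unknown block
                i = unknown.pop()
                decoded[i] = xor([data] + [decoded[j] for j in idx if j in decoded]) if len(idx) > 1 else data
                progress = True
    return decoded

recovered = peel(encode(10))                         # slight overhead over k symbols
print("recovered:", {i: recovered[i] for i in sorted(recovered)})
print("still missing (request more symbols):", [i for i in range(k) if i not in recovered])
```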
Procedia PDF Downloads 361
6575 The Study of Security Techniques on Information System for Decision Making
Authors: Tejinder Singh
Abstract:
An information system (IS) involves the flow of data across different levels and in different directions for decision making and data operations. Data can be compromised in different ways, such as manual or technical errors, data tampering, or loss of integrity. The security system of an IS, such as a firewall, is affected by such violations. The flow of data among the various levels of an information system is carried over a networking system, in the form of packets or frames. To protect these packets from unauthorized access and virus attacks, and to maintain the integrity level, network security is an important factor. To protect the data from being pirated, various security techniques are used. This paper presents the various security techniques and characterizes different harmful attacks with the help of detailed data analysis. This paper will be beneficial for organizations in making their systems more secure, effective, and beneficial for future decision making.
Keywords: information systems, data integrity, TCP/IP network, vulnerability, decision, data
Procedia PDF Downloads 307
6574 Study of Aqueous Solutions: A Dielectric Spectroscopy Approach
Authors: Kumbharkhane Ashok
Abstract:
Time domain dielectric relaxation spectroscopy (TDRS) probes the interaction of a macroscopic sample with a time-dependent electric field. The resulting complex permittivity spectrum characterizes the amplitude (voltage) and time scale of the charge-density fluctuations within the sample. These fluctuations may arise from the reorientation of the permanent dipole moments of individual molecules or from the rotation of dipolar moieties in flexible molecules, like polymers. The time scale of these fluctuations depends on the sample and its relaxation mechanism. Relaxation times range from a few picoseconds in low-viscosity liquids to hours in glasses; therefore, the DRS technique covers an extensive range of dynamical processes, with a corresponding frequency range from 10⁻⁴ Hz to 10¹² Hz. This inherent ability to monitor the cooperative motion of a molecular ensemble distinguishes dielectric relaxation from methods like NMR or Raman spectroscopy, which yield information on the motions of individual molecules. An experimental setup for the Time Domain Reflectometry (TDR) technique covering 10 MHz to 30 GHz has been developed for aqueous solutions. This technique is very simple and covers a wide band of frequencies in a single measurement. Dielectric relaxation spectroscopy is especially sensitive to intermolecular interactions. The complex permittivity spectra of aqueous solutions have been fitted using the Cole-Davidson (CD) model to determine static dielectric constants and relaxation times over the entire concentration range. The heterogeneous molecular interactions in aqueous solutions are discussed through the Kirkwood correlation factor and excess properties.
Keywords: liquid, aqueous solutions, time domain reflectometry
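A minimal sketch of the Cole-Davidson model used for the fits, eps*(w) = eps_inf + (eps_0 - eps_inf)/(1 + j*w*tau)^beta, is given below; the parameter values are illustrative, water-like numbers rather than fitted results from the study.

```python
import numpy as np

# Minimal sketch of the Cole-Davidson model used to fit complex permittivity
# spectra: eps*(w) = eps_inf + (eps_0 - eps_inf) / (1 + j*w*tau)**beta.
eps_0, eps_inf = 78.4, 5.2        # static and high-frequency permittivity (water-like)
tau, beta = 8.3e-12, 0.95         # relaxation time [s] and asymmetry parameter

def cole_davidson(freq_hz):
    w = 2.0 * np.pi * freq_hz
    return eps_inf + (eps_0 - eps_inf) / (1.0 + 1j * w * tau)**beta

freqs = np.logspace(7, 10.5, 8)   # 10 MHz to ~30 GHz, as in the TDR setup
for f, eps in zip(freqs, cole_davidson(freqs)):
    print(f"{f:10.3e} Hz  eps' = {eps.real:6.2f}  eps'' = {-eps.imag:6.2f}")
```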
Procedia PDF Downloads 444
6573 Comparative Ante-Mortem Studies through Electrochemical Impedance Spectroscopy, Differential Voltage Analysis and Incremental Capacity Analysis on Lithium Ion Batteries
Authors: Ana Maria Igual-Munoz, Juan Gilabert, Marta Garcia, Alfredo Quijano-Lopez
Abstract:
Nowadays, several lithium-ion battery technologies are being commercialized. These chemistries present different properties that make them more suitable for different purposes. However, comparative studies showing the advantages and disadvantages of the different chemistries are incomplete or scarce. Different non-destructive techniques are currently being employed to detect how ageing affects the active materials of lithium-ion batteries (LIBs). For instance, electrochemical impedance spectroscopy (EIS) is one of the most widely employed; this technique allows the user to identify variations in the different resistances present in LIBs. On the other hand, differential voltage analysis (DVA) has been shown to be a powerful technique for detecting the processes affecting the different capacities present in LIBs. This technique shows variations in the state of health (SOH) and in the capacities of one or both electrodes, depending on their chemistry. Finally, incremental capacity analysis (ICA) is a widely known technique capable of detecting phase equilibria. It is reminiscent of the commonly used cyclic voltammetry, as it allows the detection of some reactions taking place in the electrodes. In these studies, a set of ageing procedures has been applied to commercial batteries of different chemistries (NCA, NMC, and LFP). Afterwards, the results of EIS, DVA, and ICA have been correlated with the processes affecting each cell. Cyclability, overpotential, and temperature cycling studies show how the charge-discharge rates, cut-off voltages, and operating temperatures affect each chemistry. These studies will serve battery pack manufacturers as well as common battery users, as they determine the different conditions affecting cells of each chemistry. Taking this into account, each cell could be matched to the final purpose of the battery application. Last but not least, all the degradation parameters observed are intended to be integrated into degradation models in the future, which will allow the implementation of the widely known digital twins for degradation in LIBs.
Keywords: lithium ion batteries, non-destructive analysis, different chemistries, ante-mortem studies, ICA, DVA, EIS
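A minimal sketch of the ICA step (numerical differentiation of the charge curve to obtain dQ/dV and locate its peaks) is given below; the charge curve is synthetic, not measured data from the NCA, NMC or LFP cells.

```python
import numpy as np

# Minimal sketch of Incremental Capacity Analysis (ICA): differentiate the
# charge curve Q(V) to obtain dQ/dV, whose peaks mark phase-equilibrium
# plateaus of the electrodes.
voltage = np.linspace(3.0, 4.2, 600)                       # terminal voltage [V]
capacity = 2.5 / (1.0 + np.exp(-(voltage - 3.7) / 0.05))   # synthetic Q(V) [Ah]

dq_dv = np.gradient(capacity, voltage)                     # incremental capacity [Ah/V]
peak_v = voltage[np.argmax(dq_dv)]
print(f"main IC peak at {peak_v:.2f} V, dQ/dV = {dq_dv.max():.2f} Ah/V")
```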
Procedia PDF Downloads 128
6572 An AK-Chart for the Non-Normal Data
Authors: Chia-Hau Liu, Tai-Yue Wang
Abstract:
Traditional multivariate control charts assume that measurements from manufacturing processes follow a multivariate normal distribution. However, this assumption may not hold or may be difficult to verify because, in practice, not all measurements from manufacturing processes are normally distributed. This study develops a new multivariate control chart for monitoring processes with non-normal data. We propose a mechanism based on integrating the one-class classification method and an adaptive technique. The adaptive technique is used to improve the sensitivity of one-class classification to small shifts in statistical process control. In addition, this design provides an easy way to allocate the value of the type I error, so it is easier to implement. Finally, a simulation study and real data from industry are used to demonstrate the effectiveness of the proposed control charts.
Keywords: multivariate control chart, statistical process control, one-class classification method, non-normal data
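A minimal sketch of a one-class-classification control chart of this kind, here using a one-class SVM as the classifier and a quantile of the Phase I scores as the control limit, is given below; the data, kernel settings and allocated type I error are illustrative and do not reproduce the AK-chart's exact design.

```python
import numpy as np
from sklearn.svm import OneClassSVM

# Minimal sketch of a one-class-classification control chart for non-normal
# multivariate data: train a one-class SVM on in-control (Phase I) observations,
# set the control limit from a chosen type I error rate, then flag new (Phase II)
# observations whose decision score falls below that limit.
rng = np.random.default_rng(0)
phase1 = np.exp(rng.normal(size=(500, 3)))          # skewed (lognormal) in-control data
phase2 = np.exp(rng.normal(loc=0.6, size=(20, 3)))  # shifted process to be detected

model = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(phase1)
alpha = 0.01                                         # allocated type I error
limit = np.quantile(model.decision_function(phase1), alpha)

scores = model.decision_function(phase2)
print("control limit:", round(limit, 3))
print("out-of-control signals:", int((scores < limit).sum()), "of", len(scores))
```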
Procedia PDF Downloads 422
6571 Mobile and Hot Spot Measurement with Optical Particle Counting Based Dust Monitor EDM264
Authors: V. Ziegler, F. Schneider, M. Pesch
Abstract:
With the EDM264, GRIMM offers a solution for mobile short- and long-term measurements in outdoor areas and at production sites, for research as well as permanent areal observations at near-reference quality. The model EDM264 features a powerful and robust measuring cell based on the optical particle counting (OPC) principle, with all the advantages that users of GRIMM's portable aerosol spectrometers are used to. The system is embedded in a compact weather-protection housing with all-weather sampling, a heated inlet system, a data logger, and a meteorological sensor. With TSP, PM10, PM4, PM2.5, PM1, and PMcoarse, the EDM264 provides all fine dust fractions in real time, valid for outdoor applications and calculated with the proven GRIMM enviro-algorithm, as well as six additional dust mass fractions (pm10, pm2.5, pm1, inhalable, thoracic and respirable) for IAQ and workplace measurements. This highly versatile instrument performs real-time monitoring of particle number and particle size and provides information on particle surface distribution as well as dust mass distribution. GRIMM's EDM264 has 31 equidistant size channels, which are PSL traceable. A high-end data logger enables data acquisition and wireless communication via LTE or WLAN, or wired communication via Ethernet. Backup copies of the measurement data are stored directly in the device. The rinsing air function, which protects the laser and detector in the optical cell, further increases the reliability and long-term stability of the EDM264 under different environmental and climatic conditions. The entire sample volume flow of 1.2 L/min is analyzed (100%) in the optical cell, which assures excellent counting efficiency at low and high concentrations and complies with the ISO 21501-1 standard for OPCs. With all these features, the EDM264 is a world-leading dust monitor for precise monitoring of particulate matter and particle number concentration. This highly reliable instrument is an indispensable tool for many users who need to measure aerosol levels and air quality outdoors, on construction sites, or at production facilities.
Keywords: aerosol research, aerial observation, fence line monitoring, wild fire detection
Procedia PDF Downloads 151
6570 The Effect of Fixing Kinesiology Tape onto the Plantar Surface during Loading Phase of Gait
Authors: Albert K. Chong, Jasim Ahmed Ali Al-Baghdadi, Peter B. Milburn
Abstract:
Precise capture of the 3D plantar surface of the foot during the loading phases of gait on a rigid substrate was found to be valuable for the assessment of the physiology, health and problems of the feet. Photogrammetry, a precision 3D spatial data capture technique, is suitable for this type of dynamic application. In this research, the technique is utilised to study the effect on plantar deformation of fixing a strip of kinesiology tape onto the plantar surface while going through the loading phase of gait. For this pilot study, one healthy adult male subject was recruited under the USQ human research ethics guidelines. The 3D plantar deformation data captured both with and without the tape were analysed. The results and analyses are presented together with details of the findings.
Keywords: gait, human plantar, plantar loading, photogrammetry, kinesiology tape
Procedia PDF Downloads 494
6569 Scholastic Ability and Achievement as Predictors of College Performance among Selected Second Year College Students at University of Perpetual Help System DALTA, Calamba
Authors: Shielilo R. Amihan, Ederliza De Jesus
Abstract:
The study determined the predictors of college performance of second-year students of UPHSD-Calamba. This quantitative study administered the Scholastic Abilities Test for Adults (SATA) and retrieved the entrance examination results and current General Weighted Average (GWA) of the 242 randomly selected respondents. The mean, Pearson r and multiple regression analyses through SPSS revealed that students are capable in verbal, non-verbal and quantitative reasoning, reading vocabulary, comprehension, math calculation, and writing mechanics but have difficulty in math application and writing composition. The study found that Scholastic Ability and Achievement, except in mathematics, are significantly related to college performance. It concludes that students with high ability and achievement may perform better in college. However, only the English subtest results of the entrance exam predict the academic success of students in college, while the SATA and Math entrance exam results do not. The study recommends providing pre-college Math and Writing courses as requisites in college. It also suggests implementing formative curriculum-based enhancement programs in specific priority areas, profiling programs towards informed individual academic decision-making, revising the entrance examinations, monitoring the development of the students, and exploring other predictors of college academic performance such as non-cognitive factors.
Keywords: scholastic ability, scholastic achievement, entrance exam, college performance
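A minimal sketch of the prediction step (multiple regression of GWA on entrance-exam subtests and a SATA composite) is given below; the data are simulated so that only the English subtest drives GWA, mirroring the reported finding, and are not the study's records.

```python
import numpy as np
import statsmodels.api as sm

# Minimal sketch of the prediction step: regress college GWA on entrance-exam
# subtest scores (English, Math) and a SATA composite, then inspect which
# predictors are significant.
rng = np.random.default_rng(42)
n = 242
english = rng.normal(75, 10, n)
math = rng.normal(70, 10, n)
sata = rng.normal(100, 15, n)
gwa = 3.0 - 0.02 * english + rng.normal(0, 0.2, n)   # only English drives GWA here

X = sm.add_constant(np.column_stack([english, math, sata]))
model = sm.OLS(gwa, X).fit()
print(model.summary(xname=["const", "English", "Math", "SATA"]))
```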
Procedia PDF Downloads 260
6568 Material Detection by Phase Shift Cavity Ring-Down Spectroscopy
Authors: Rana Muhammad Armaghan Ayaz, Yigit Uysallı, Nima Bavili, Berna Morova, Alper Kiraz
Abstract:
Traditional optical methods used for material detection and sensing, such as resonance wavelength shift and cavity ring-down spectroscopy, have disadvantages: for example, they are less resistant to laser noise and temperature fluctuations, and extraction of the required information, such as the ring-down time in the case of cavity ring-down spectroscopy, can be a difficult task. Phase-shift cavity ring-down spectroscopy is not only easy to use but is also capable of overcoming these problems. This technique compares the phase of the signal coming out of the cavity with that of a reference signal; detection of a material is made from the phase difference between them. Using this technique, air, water, and isopropyl alcohol can be recognized easily. This methodology has far-reaching applications and can be used in air pollution detection, human breath analysis, and many more.
Keywords: materials, noise, phase shift, resonance wavelength, sensitivity, time domain approach
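A minimal sketch of the phase-shift relation commonly used in this technique, tan(phi) = -2*pi*f_mod*tau, together with the dependence of the ring-down time on intracavity absorption, is given below; the cavity length, mirror reflectivity, modulation frequency and absorption values are illustrative assumptions, not the paper's setup.

```python
import math

# Minimal sketch of phase-shift cavity ring-down spectroscopy: the phase lag phi
# of intensity-modulated light transmitted by the cavity relates to the ring-down
# time tau via tan(phi) = -2*pi*f_mod*tau, and extra intracavity absorption
# shortens tau, shifting the measured phase.
c = 2.998e8            # speed of light [m/s]
L = 0.50               # cavity length [m]
R = 0.9999             # mirror reflectivity
f_mod = 20e3           # modulation frequency [Hz]

def ringdown_time(alpha):
    """tau for a cavity with per-length absorption coefficient alpha [1/m]."""
    return L / (c * ((1.0 - R) + alpha * L))

def phase_shift(tau):
    return math.atan(-2.0 * math.pi * f_mod * tau)

for sample, alpha in [("empty cavity", 0.0), ("absorbing sample", 2e-5)]:
    tau = ringdown_time(alpha)
    print(f"{sample}: tau = {tau*1e6:.2f} us, phase = {math.degrees(phase_shift(tau)):.2f} deg")
```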
Procedia PDF Downloads 149
6567 Efficient Position Based Operation Code Authentication
Authors: Hashim Ali, Sheheryar Khan
Abstract:
Security for applications has always been a key issue of concern. In general, security means granting access to legitimate users and denying unauthorized access to the system. Shoulder surfing is an observation technique used to hack an account or to enter a system: a malicious observer captures or records the fingers of a user while he is entering sensitive inputs (PINs, passwords, etc.) and may thus be able to observe the user's password credentials. It is very difficult for a novice user to protect himself from shoulder surfing or an unaided observer in a public place while accessing his account. In order to secure the user account, there are five factors of authentication: "(i) something you have, (ii) something you are, (iii) something you know, (iv) somebody you know, (v) something you process". A fifth-factor authentication technique, "something you process", has been developed to provide a novel approach for the user. In this paper, we have applied position-based operation code authentication in a way that is easier and more user-friendly.
Keywords: shoulder surfing, malicious observer, sensitive inputs, authentication
Procedia PDF Downloads 272
6566 Design of Ternary Coatings System to Minimize the Residual Solvent in Polymeric Coatings
Authors: Jyoti Sharma, Raj Kumar Arya
Abstract:
Coatings of a homogeneous ternary solution of poly(styrene) (PS)-poly(ethylene glycol)-6000 (PEG)-chlorobenzene (CLB) at two different compositions (5.05%-4.98%-89.97% and 10.05%-5.12%-84.82%) were prepared and dried under quiescent conditions. The residual solvent percentage and coating thickness were calculated from gravimetric weight loss data. The residual solvent remained lower in the case of a single thick layer as compared to the layer-by-layer assembly technique. The results suggest the effectiveness of a single thick layer for minimizing the residual solvent. The single thick layer had an initial coating thickness of 1098 µm and a final thickness of 106 µm; its residual solvent was lower as compared to dried coatings of nearly the same final thickness produced by the layer-by-layer assembly technique.
Keywords: films, layer-by-layer assembly, polymeric coatings, ternary system
Procedia PDF Downloads 182
6565 Ant Lion Optimization in a Fuzzy System for Benchmark Control Problem
Authors: Leticia Cervantes, Edith Garcia, Oscar Castillo
Abstract:
Today, there are several control problems where the main objective is to obtain the best control in order to decrease the error in the application. Many techniques can be used to address these problems, such as neural networks, PID control, fuzzy logic, optimization techniques, and many more. In this work, fuzzy logic through a fuzzy system, together with an optimization technique, is used to control the case of study: Ant Lion Optimization (ALO) is used to optimize a fuzzy system that controls the velocity of a simple treadmill. The main objective is to achieve control of the velocity in the control problem using ALO. First, a simple fuzzy system with two inputs (error and error change) and one output (desired speed) was used to control the velocity of the treadmill; results were obtained, and then, to decrease the error, ALO was applied to optimize the fuzzy system of the treadmill. With the optimization in place, the simulation was performed, and the results show that with ALO the velocity control was better than with a conventional fuzzy system. This paper describes some basic concepts to help understand the idea of this work and the methodology of the investigation (control problem, fuzzy system design, optimization); the results are then presented for the optimized fuzzy system. A comparison between the simple fuzzy system and the optimized fuzzy system is presented, showing that the optimization improved the control with good results. The major finding of the study is that ALO is a good alternative for improving control because it helped to decrease the error in control applications, regardless of the control technique being optimized. As a final statement, it is important to mention that the selected methodology was good because the control of the treadmill was improved using the optimization technique.
Keywords: ant lion optimization, control problem, fuzzy control, fuzzy system
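A minimal sketch of a two-input, one-output fuzzy controller of the kind described (error and error change in, speed correction out) is given below; triangular membership functions and a zero-order Sugeno weighted-average defuzzification are assumed for brevity, the rule base is illustrative, and the ALO tuning step is not reproduced.

```python
# Minimal sketch of a two-input (error, change of error), one-output fuzzy
# controller for treadmill speed, using triangular membership functions and a
# zero-order Sugeno (weighted-average) defuzzification. Membership parameters
# and rule consequents are illustrative; ALO would tune them in the paper.
def tri(x, a, b, c):
    """Triangular membership with feet a, c and peak b."""
    left = (x - a) / (b - a)
    right = (c - x) / (c - b)
    return max(min(left, right), 0.0)

LABELS = {"N": (-2.0, -1.0, 0.0), "Z": (-1.0, 0.0, 1.0), "P": (0.0, 1.0, 2.0)}
# Rule base: IF error is X AND change-of-error is Y THEN speed correction = value
RULES = {("N", "N"): -1.0, ("N", "Z"): -0.5, ("N", "P"): 0.0,
         ("Z", "N"): -0.5, ("Z", "Z"): 0.0,  ("Z", "P"): 0.5,
         ("P", "N"): 0.0,  ("P", "Z"): 0.5,  ("P", "P"): 1.0}

def fuzzy_speed_correction(error, d_error):
    num = den = 0.0
    for (le, lde), consequent in RULES.items():
        w = min(tri(error, *LABELS[le]), tri(d_error, *LABELS[lde]))  # AND = min
        num += w * consequent
        den += w
    return num / den if den else 0.0

print(fuzzy_speed_correction(0.8, -0.2))   # small positive error, slightly decreasing
```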
Procedia PDF Downloads 399
6564 Lipidomic Response to Neoadjuvant Chemoradiotherapy in Rectal Cancer
Authors: Patricia O. Carvalho, Marcia C. F. Messias, Salvador Sanchez Vinces, Caroline F. A. Gatinoni, Vitor P. Iordanu, Carlos A. R. Martinez
Abstract:
Lipidomics methods are widely used in the identification and validation of disease-specific biomarkers and in the evaluation of therapy response. The present study aimed to identify a panel of potential lipid biomarkers to evaluate the response to neoadjuvant chemoradiotherapy in rectal adenocarcinoma (RAC). Liquid chromatography-mass spectrometry (LC-MS)-based untargeted lipidomics was used to profile human serum samples from patients with clinical stage T2 or T3 resectable RAC, before and after chemoradiotherapy treatment. A total of 28 blood plasma samples were collected from 14 patients with RAC who were recruited at the São Francisco University Hospital (HUSF/USF). The study was approved by the ethics committee (CAAE 14958819.8.0000.5514). Univariate and multivariate statistical analyses were applied to explore dysregulated metabolic pathways using untargeted lipid profiling and data mining approaches. A total of 36 statistically significantly altered lipids were identified, and the subsequent partial least-squares discriminant analysis model was both cross-validated (R², Q²) and permutation-tested. Lysophosphatidylcholine (LPC) plasmalogens containing palmitoleic and oleic acids, with high variable importance in projection scores, showed a tendency to be lower after completion of chemoradiotherapy. Chemoradiotherapy seems to change plasmanyl-phospholipid levels, indicating that these lipids play an important role in RAC pathogenesis.
Keywords: lipidomics, neoadjuvant chemoradiotherapy, plasmalogens, rectal adenocarcinoma
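A minimal sketch of the PLS-DA step (a PLS regression on a class-encoded response with a cross-validated Q²) is given below; the lipid-intensity matrix is simulated and does not reproduce the study's 36 significant lipids or their VIP scores.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Minimal sketch of a PLS-DA workflow on a lipid-intensity matrix: encode the
# class (before vs. after chemoradiotherapy) as 0/1, fit a PLS regression as a
# discriminant model, and estimate Q2 by cross-validation.
rng = np.random.default_rng(0)
X = rng.normal(size=(28, 200))           # 28 plasma samples x 200 lipid features
y = np.repeat([0, 1], 14)                # 0 = before, 1 = after treatment
X[y == 1, :10] -= 0.8                    # a few lipids decrease after therapy

pls = PLSRegression(n_components=2).fit(X, y)
r2 = pls.score(X, y)
y_cv = cross_val_predict(pls, X, y, cv=7).ravel()
q2 = 1 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R2 = {r2:.2f}, Q2 = {q2:.2f}")
```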
Procedia PDF Downloads 131
6563 Analyses of Soil Volatile Contaminants Extraction by Hot Air Injection
Authors: Abraham Dayan
Abstract:
Remediation of soil containing volatile contaminants is often conducted by the soil vapor extraction (SVE) technique. The operation is based on the injection of air at ambient temperature, with or without thermal soil warming. Thermal enhancement of soil vapor extraction (TESVE) processes is usually achieved by soil heating, sometimes assisted by added steam injection. The current study addresses a technique which has not received adequate attention and is based on using exclusively hot air as an alternative to the common TESVE practices. To demonstrate the merit of the hot air TESVE technique, a sandy soil containing contaminated water is studied. Numerical and analytical tools were used to evaluate the rate of the decontamination process for various geometries and operating conditions. The governing equations are based on Darcy's law and are applied to an expanding compressible flow within a sandy soil. The equations were solved to determine the minimal time required for complete soil remediation. An approximate closed-form solution was developed based on the assumption of local thermodynamic equilibrium and on a linearized representation of the temperature dependence of the vapor-to-air density ratio. The solution is general in nature and offers insight into the governing processes of the soil remediation operation, in which self-similar temperature profiles may exist under certain conditions, and into the noticeable role of the contaminant evaporation and recondensation processes in affecting the remediation time. Based on the analyses of the hot air TESVE technique, it is shown that it is sufficient to heat the air during a certain period of the decontamination process without compromising its full advantage, thereby minimizing the air-heating energy requirements. This is in effect achieved by regeneration: the energy stored in the soil during the early period of the remediation process heats the subsequently injected ambient air, which infiltrates through it for the decontamination of the remaining untreated soil zone. The characteristic time required to complete SVE operations is calculated as a function of both the injected air temperature and humidity. For a specific set of conditions, it is demonstrated that, by elevating the injected air temperature by 20°C, the hot air injection technique reduces the soil remediation time by 50% while requiring 30% additional energy consumption. These evaluations clearly reveal the advantage of the hot air SVE process: for an insignificant cost of added air-heating energy, the substantial cost expenditures for manpower and equipment utilization are reduced.
Keywords: porous media, soil decontamination, hot air, vapor extraction
Procedia PDF Downloads 10
6562 Graphical User Interface Testing by Using Deep Learning
Authors: Akshat Mathur, Sunil Kumar Khatri
Abstract:
This paper presents a brief overview of how the use of artificial intelligence in GUI testing can reduce workload by using a DL-fueled method. It also discusses how graphical user interface and event-driven software testing can derive benefits from the use of AI techniques. The use of AI techniques not only reduces the task and workload but also helps in getting better output than manual testing. Although the results are the same, the use of artificial intelligence techniques for GUI testing has proven to provide ideal results. The DL-fueled framework helped us to find imperfections across the entire webpage and provides the test failure result as a score between 0 and 1, which signifies whether a test meets its quality criteria or not. This paper proposes a DL-fueled method which helps us to find genuine GUI bugs and defects and also helps us to scale the existing labour-intensive and skill-intensive methodologies.
Keywords: graphical user interface, GUI, artificial intelligence, deep learning, ML technology
Procedia PDF Downloads 177