Search results for: Multiple Criteria Fuzzy Optimization
754 ADABeV: Automatic Detection of Abnormal Behavior in Video-surveillance
Authors: Nour Charara, Iman Jarkass, Maria Sokhn, Elena Mugellini, Omar Abou Khaled
Abstract:
Intelligent Video-Surveillance (IVS) systems are becoming more and more popular in security applications. The analysis and recognition of abnormal behaviours in a video sequence has gradually drawn attention in the field of IVS, since it allows filtering out a large amount of useless information, which guarantees high efficiency in security protection and saves considerable human and material resources. We present in this paper ADABeV, an intelligent video-surveillance framework for event recognition in crowded scenes to detect abnormal human behaviour. This framework is intended to achieve real-time alarming, reducing the lags in traditional monitoring systems. This architecture proposal addresses four main challenges: behaviour understanding in crowded scenes, hard lighting conditions, multiple kinds of input sensors, and context-based adaptability to recognize the active context of the scene.
Keywords: Behavior recognition, Crowded scene, Data fusion, Pattern recognition, Video-surveillance
753 Identification of Arousal and Relaxation by Using SVM-Based Fusion of PPG Features
Authors: Chi Jung Kim, Mincheol Whang, Eui Chul Lee
Abstract:
In this paper, we propose a new method to distinguish between arousal and relaxation states by using multiple features acquired from a photoplethysmogram (PPG) and a support vector machine (SVM). To induce arousal and relaxation states in subjects, two kinds of sound stimuli are used, and the corresponding biosignals are obtained using the PPG sensor. Two features, pulse-to-pulse interval (PPI) and pulse amplitude (PA), are extracted from the acquired PPG data, and a nonlinear classification between arousal and relaxation is performed using the SVM. This methodology has several advantages compared with previous similar studies. Firstly, we extracted two separate features from the PPG, i.e., PPI and PA. Secondly, in order to improve the classification accuracy, SVM-based nonlinear classification was performed. Thirdly, to solve classification problems caused by generalized features across all subjects, we defined each threshold according to individual features. Experimental results showed that the average classification accuracy was 74.67%. Also, the proposed method showed better identification performance than the single-feature-based methods. From this result, we confirmed that arousal and relaxation can be classified using the SVM and PPG features.
Keywords: Support Vector Machine, PPG, Emotion Recognition, Arousal, Relaxation
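A minimal sketch of the kind of SVM-based classification described above, assuming synthetic PPI/PA feature vectors, an RBF kernel, and scikit-learn; the feature values, class sizes, and preprocessing are illustrative stand-ins, not the authors' data or pipeline.

```python
# Illustrative sketch: nonlinear SVM classification of arousal vs. relaxation
# from two PPG-derived features (PPI, PA). Data below are synthetic stand-ins.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Synthetic features: [PPI (ms), PA (a.u.)] for relaxation (0) and arousal (1)
relax = np.column_stack([rng.normal(850, 40, 100), rng.normal(1.0, 0.1, 100)])
arouse = np.column_stack([rng.normal(750, 40, 100), rng.normal(0.8, 0.1, 100)])
X = np.vstack([relax, arouse])
y = np.array([0] * 100 + [1] * 100)

# RBF-kernel SVM with per-feature standardization (a rough analogue of the
# per-subject normalization/thresholding mentioned in the abstract).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```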
752 A Formal Property Verification for Aspect-Oriented Programs in Software Development
Authors: Moustapha Bande, Hakima Ould-Slimane, Hanifa Boucheneb
Abstract:
Software development for complex systems requires efficient and automatic tools that can be used to verify the satisfiability of some critical properties, such as security ones. With the emergence of Aspect-Oriented Programming (AOP), considerable work has been done in order to better modularize the separation of concerns in software design and implementation. The goal is to prevent the cross-cutting concerns from being scattered across the multiple modules of the program and tangled with other modules. One of the key challenges in aspect-oriented programs is to be sure that all the pieces put together at weaving time ensure the satisfiability of the overall system requirements. Our paper focuses on this problem and proposes a formal verification approach for a given property of the woven program. The approach is based on the control flow graph (CFG) of the woven program, and the use of a satisfiability modulo theories (SMT) solver to check whether each property (represented by one aspect) is satisfied or not once the weaving is done.
Keywords: Aspect-oriented programming, control flow graph, satisfiability modulo theories, property verification.
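A toy illustration of the SMT-checking step, assuming the Z3 Python bindings; the path condition and property below are hypothetical stand-ins for constraints that would actually be extracted from a woven program's CFG, not the authors' encoding.

```python
# Toy SMT check: encode a path condition from a (hypothetical) CFG path of the
# woven program and ask whether the aspect's property can be violated on it.
from z3 import Int, Bool, And, Not, Solver, sat

balance = Int("balance")
amount = Int("amount")
authorized = Bool("authorized")

# Hypothetical path condition for one CFG path after weaving a security aspect.
path_condition = And(amount > 0, authorized, balance >= amount)
# Property (from the aspect): after a withdrawal, the balance never goes negative.
property_holds = (balance - amount >= 0)

s = Solver()
s.add(path_condition, Not(property_holds))  # search for a counterexample
if s.check() == sat:
    print("Property violated, counterexample:", s.model())
else:
    print("Property holds on this path.")
```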
751 Kinetic Modeling of the Fischer-Tropsch Reactions and Modeling Steady State Heterogeneous Reactor
Authors: M. Ahmadi Marvast, M. Sohrabi, H. Ganji
Abstract:
The rate of production of the main products of the Fischer-Tropsch reactions over an Fe/HZSM5 bifunctional catalyst in a fixed bed reactor is investigated over a broad range of temperature, pressure, space velocity, H2/CO feed molar ratio and CO2, CH4 and water flow rates. Model discrimination and parameter estimation were performed according to the integral method of kinetic analysis. Due to the lack of mechanism development for Fischer-Tropsch synthesis on bifunctional catalysts, 26 different models were tested and the best model was selected. Comprehensive one- and two-dimensional heterogeneous reactor models are developed to simulate the performance of fixed-bed Fischer-Tropsch reactors. To reduce computational time for optimization purposes, an Artificial Feed Forward Neural Network (AFFNN) has been used to describe intra-particle mass and heat transfer diffusion in the catalyst pellet. It is seen that the products' reaction rates have a direct relation with H2 partial pressure and an inverse relation with CO partial pressure. The results show that the hybrid model is in good agreement with the rigorous mechanistic model, while being about 25-30 times faster.
Keywords: Fischer-Tropsch, heterogeneous modeling, kinetic study.
750 An Application for Risk of Crime Prediction Using Machine Learning
Authors: Luis Fonseca, Filipe Cabral Pinto, Susana Sargento
Abstract:
The increase of the world population, especially in large urban centers, has resulted in new challenges, particularly with the control and optimization of public safety. Thus, in the present work, a solution is proposed for the prediction of criminal occurrences in a city based on historical data of incidents and demographic information. The entire research and implementation are presented, starting with the data collection from its original source, the treatment and transformations applied to the data, the choice, evaluation and implementation of the Machine Learning model, up to the application layer. Classification models are implemented to predict criminal risk for a given time interval and location. Machine Learning algorithms such as Random Forest, Neural Networks, K-Nearest Neighbors and Logistic Regression are used to predict occurrences, and their performance is compared according to the data processing and transformation used. The results show that the use of Machine Learning techniques helps to anticipate criminal occurrences, which contributes to the reinforcement of public security. Finally, the models were implemented on a platform that provides an API to enable other entities to make requests for predictions in real time. An application is also presented where it is possible to show criminal predictions visually.
Keywords: Crime prediction, machine learning, public safety, smart city.
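A small sketch of how the classifier comparison described above could be set up with scikit-learn; the synthetic dataset, scoring metric and hyperparameters are illustrative assumptions, not the authors' data or configuration.

```python
# Illustrative comparison of the classifiers named in the abstract on a
# synthetic "crime risk" dataset (features and labels are stand-ins only).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=2000, n_features=10, n_informative=6,
                           random_state=42)  # stand-in for incident/demographic data

models = {
    "RandomForest": RandomForestClassifier(n_estimators=200, random_state=42),
    "LogisticRegression": LogisticRegression(max_iter=1000),
    "KNN": KNeighborsClassifier(n_neighbors=15),
    "NeuralNetwork": MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                                   random_state=42),
}
for name, model in models.items():
    score = cross_val_score(model, X, y, cv=5, scoring="f1").mean()
    print(f"{name}: mean F1 = {score:.3f}")
```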
749 Probabilistic Bhattacharya Based Active Contour Model in Structure Tensor Space
Authors: Hiren Mewada, Suprava Patnaik
Abstract:
Object identification and segmentation applications require extraction of the foreground object from the background. In this paper, a Bhattacharya distance based probabilistic approach is utilized with an active contour model (ACM) to segment an object from the background. In the proposed approach, the Bhattacharya histogram is calculated in a non-linear structure tensor space. Based on the histogram, a new formulation of the active contour model is proposed to segment images. The results are tested on both color and gray images from the Berkeley image database. The experimental results show that the proposed model is applicable to both color and gray images, as well as to both texture images and natural images. In comparison to the Bhattacharya based ACM in ICA space, the proposed model is able to segment multiple objects as well.
Keywords: Active Contour, Bhattacharya Histogram, Structure tensor, Image segmentation.
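A minimal sketch of the Bhattacharyya similarity measure that such a contour energy builds on, computed here on plain intensity histograms with synthetic pixel populations; the paper computes the histograms in a nonlinear structure tensor space, which is not reproduced here.

```python
# Bhattacharyya coefficient/distance between two normalized histograms, the
# region-similarity measure underlying Bhattacharyya-based active contours.
import numpy as np

def bhattacharyya(p, q, eps=1e-12):
    p = p / (p.sum() + eps)
    q = q / (q.sum() + eps)
    bc = np.sum(np.sqrt(p * q))          # Bhattacharyya coefficient in [0, 1]
    return bc, -np.log(bc + eps)         # coefficient and distance

rng = np.random.default_rng(1)
inside = rng.normal(100, 15, 5000)       # stand-in: pixel values inside the contour
outside = rng.normal(160, 20, 5000)      # stand-in: pixel values outside the contour
bins = np.linspace(0, 255, 65)
p, _ = np.histogram(inside, bins=bins)
q, _ = np.histogram(outside, bins=bins)
bc, dist = bhattacharyya(p.astype(float), q.astype(float))
print(f"coefficient={bc:.3f}, distance={dist:.3f}")
```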
748 Performance Analysis of an Adaptive Threshold Hybrid Double-Dwell System with Antenna Diversity for Acquisition in DS-CDMA Systems
Authors: H. Krouma, M. Barkat, K. Kemih, M. Benslama, Y. Yacine
Abstract:
In this paper, we consider the analysis of the acquisition process for a hybrid double-dwell system with antenna diversity for DS-CDMA (direct sequence-code division multiple access) using an adaptive threshold. Acquisition systems with a fixed threshold value are unable to adapt to fast varying mobile communications environments and may result in a high false alarm rate and/or low detection probability. Therefore, we propose an adaptively varying threshold scheme through the use of a cell-averaging constant false alarm rate (CA-CFAR) algorithm, which is well known in the field of radar detection. We derive exact expressions for the probabilities of detection and false alarm in Rayleigh fading channels. The mean acquisition time of the system under consideration is also derived. The performance of the system is analyzed and compared to that of a hybrid single-dwell system.
Keywords: Adaptive threshold, hybrid double-dwell system, CA-CFAR algorithm, DS-CDMA.
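A minimal sketch of the cell-averaging CFAR idea referenced above: the threshold for each cell under test is a scaled average of the surrounding training cells. Window sizes, the false alarm probability and the injected peak are illustrative choices, not the paper's acquisition setup.

```python
# Minimal 1-D cell-averaging CFAR (CA-CFAR): the adaptive threshold for the
# cell under test is a scaled average of the surrounding training cells.
import numpy as np

def ca_cfar(x, num_train=16, num_guard=2, pfa=1e-3):
    n = len(x)
    alpha = num_train * (pfa ** (-1.0 / num_train) - 1.0)  # scaling for given Pfa
    detections = np.zeros(n, dtype=bool)
    half = num_train // 2 + num_guard
    for i in range(half, n - half):
        lead = x[i - half : i - num_guard]          # training cells before the CUT
        lag = x[i + num_guard + 1 : i + half + 1]   # training cells after the CUT
        noise_level = np.concatenate([lead, lag]).mean()
        detections[i] = x[i] > alpha * noise_level
    return detections

rng = np.random.default_rng(2)
noise = rng.exponential(1.0, 512)        # square-law detected noise samples
noise[200] += 25.0                        # an injected "acquisition" peak
print("detections at cells:", np.flatnonzero(ca_cfar(noise)))
```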
747 Power of Doubling: Population Growth and Resource Consumption
Authors: Sarika Bahadure
Abstract:
Sustainability starts with conserving resources for future generations. Since their existence on this earth, humans have been consuming natural resources. The pace of resource consumption in the past was very slow, but industrialization in the 18th century brought a change in the human lifestyle. New inventions and discoveries shifted the human workforce to machines. The mass manufacture of goods provided easy access to products. In the last few decades, globalization and changes in technology brought a consumer-oriented market. The consumption of resources has increased at a very high scale. This overconsumption pattern brought an economic boom and provided multiple opportunities, but it also put stress on the natural resources. This paper tries to put forth the facts and figures of population growth and resource consumption with examples. This is explained with the help of the mathematical expression of doubling known as exponential growth. It compares the carrying capacity of the earth and the resource consumption of humans, i.e., the ecological footprint and bio-capacity. Further, it presents the need to conserve natural resources and re-examine the sustainable resource use approach for sustainability.
Keywords: Consumption, exponential growth, population, resources, sustainability.
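A short worked illustration of the doubling arithmetic behind exponential growth, as described above: a quantity growing at a constant rate r doubles roughly every ln(2)/r years (the "rule of 70"). The growth rates below are examples, not figures from the paper.

```python
# Doubling time for exponential growth P(t) = P0 * exp(r*t): T_d = ln(2) / r.
import math

def doubling_time(annual_rate):
    return math.log(2) / annual_rate

for rate in (0.01, 0.02, 0.03):               # 1%, 2%, 3% annual growth
    td = doubling_time(rate)
    print(f"{rate:.0%} growth -> doubles in {td:.1f} years "
          f"(rule-of-70 estimate: {70 / (rate * 100):.1f})")
```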
746 Investigating the Geopolymerization Process of Aluminosilicates and Its Impact on the Compressive Strength of the Produced Geopolymers
Authors: Heba Z. Fouad, Tarek M. Madkour, Safwan A. Khedr
Abstract:
This paper investigates multiple factors that impact the formation of geopolymers and their compressive strength, so that they can be utilized in construction as an environmentally friendly material. Bentonite and Kaolinite were thermally calcined at 750 °C to obtain Metabentonite and Metakaolinite with higher reactivity. Both source materials were activated using a solution of sodium hydroxide (NaOH). Thereafter, samples were cured at different temperatures. The samples were analyzed chemically using a host of spectroscopic techniques. The bulk density and compressive strength of the produced geopolymer pastes were studied. Findings indicate that the ratio of NaOH solution to source material affects the compressive strength, being optimal at 0.54. Moreover, controlled heat curing was proven effective in improving compressive strength. The existence of characteristic Fourier Transform Infrared Spectroscopy (FTIR) peaks at approximately 1020 cm-1 and 460 cm-1, which correspond to the asymmetric stretching vibration of Si-O-T and the bending vibration of Si-O-Si, confirms the formation of the target geopolymer.
Keywords: Calcination of metakaolinite, compressive strength, FTIR analysis, geopolymer, green cement
745 Stackelberg Security Game for Optimizing Security of Federated Internet of Things Platform Instances
Authors: Violeta Damjanovic-Behrendt
Abstract:
This paper presents an approach for optimal cyber security decisions to protect instances of a federated Internet of Things (IoT) platform in the cloud. The presented solution implements the repeated Stackelberg Security Game (SSG) and a model called Stochastic Human behaviour model with AttRactiveness and Probability weighting (SHARP). SHARP employs the Subjective Utility Quantal Response (SUQR) for formulating a subjective utility function, which is based on the evaluations of alternative solutions during decision-making. We augment the repeated SSG (including SHARP and SUQR) with a reinforcement learning algorithm called Naïve Q-Learning. Naïve Q-Learning belongs to the category of active and model-free Machine Learning (ML) techniques in which the agent (either the defender or the attacker) attempts to find an optimal security solution. In this way, we combine game theory (GT) and ML algorithms for discovering optimal cyber security policies. The proposed security optimization components will be validated in a collaborative cloud platform that is based on the Industrial Internet Reference Architecture (IIRA) and its recently published security model.
Keywords: Security, internet of things, cloud computing, Stackelberg security game, machine learning, Naïve Q-learning.
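A sketch of the tabular Q-learning update that the learning component relies on; the tiny "defense posture" environment below is a made-up toy and does not reproduce the paper's SSG, SHARP or SUQR components.

```python
# Tabular Q-learning sketch: epsilon-greedy action selection plus the standard
# update Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a)).
import numpy as np

rng = np.random.default_rng(3)
n_states, n_actions = 3, 2                 # toy: 3 threat levels, 2 defender actions
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.9, 0.1

def step(state, action):
    # Made-up dynamics/payoffs standing in for attacker responses.
    reward = 1.0 if action == state % 2 else -1.0
    return int(rng.integers(n_states)), reward

state = 0
for _ in range(5000):
    if rng.random() < epsilon:                         # explore
        action = int(rng.integers(n_actions))
    else:                                              # exploit
        action = int(np.argmax(Q[state]))
    next_state, reward = step(state, action)
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state

print(np.round(Q, 2))
```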
744 The Relationship between Adolescent Emotional Inhibition and Depression Disorder: The Moderate Effect of Gender
Authors: Jia-Ru Li, Chih-Hung Wang, Ching-Wen Lin
Abstract:
The association between emotional inhibition strategies and depression has been shown to be inconsistent across studies. Mild emotional inhibition may be beneficial for social interaction, especially for females in East Asian cultures. The present study aimed to examine whether the inhibition-depression relationship depends on the level of emotional inhibition and the gender context, given the differing value placed on suppressing emotional displays. We hypothesized that the negative association between inhibition and adolescent depression would not be direct, but would instead be affected by the interaction between emotional inhibition and gender. To test this hypothesis, we asked 309 junior high school students, aged 12 to 14 years, to report on their use of emotional inhibition and their depressive symptoms. A multiple regression analysis revealed a significant interaction, with gender moderating the relationship between emotional inhibition and adolescent depression. The group with the highest level of depression was girls with high levels of emotional inhibition, whose depression score was higher than that of boys with high levels of emotional inhibition. The result highlights the importance of context in understanding the inhibition-depression relationship.
Keywords: Emotional inhibition strategies, gender, adolescent depression.
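A sketch of how such a moderation test is commonly specified: depression regressed on inhibition, gender and their interaction, with the interaction coefficient testing moderation. The simulated data below merely mimic the reported pattern (a stronger link for girls); they are not the study's data, and statsmodels is an assumed tool.

```python
# Moderation (interaction) regression sketch with simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 309
gender = rng.integers(0, 2, n)                 # 0 = boy, 1 = girl (illustrative coding)
inhibition = rng.normal(0, 1, n)
depression = 0.2 * inhibition + 0.4 * gender * inhibition + rng.normal(0, 1, n)
df = pd.DataFrame({"depression": depression, "inhibition": inhibition,
                   "gender": gender})

# 'inhibition * gender' expands to both main effects plus the interaction term,
# whose coefficient tests whether gender moderates the relationship.
model = smf.ols("depression ~ inhibition * gender", data=df).fit()
print(model.summary().tables[1])
```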
743 When Explanations "Cause" Error: A Look at Representations and Compressions
Authors: Michael Lissack
Abstract:
We depend upon explanation in order to "make sense" of our world. And making sense is all the more important when dealing with change. But what happens if our explanations are wrong? This question is examined with respect to two types of explanatory model. Models based on labels and categories we shall refer to as "representations." More complex models involving stories, multiple algorithms, rules of thumb, questions and ambiguity we shall refer to as "compressions." Both compressions and representations are reductions. But representations are far more reductive than compressions. Representations can be treated as a set of defined meanings; coherence with regard to a representation is the degree of fidelity between the item in question and the definition of the representation, of the label. By contrast, compressions contain enough degrees of freedom and ambiguity to allow us to make internal predictions so that we may determine our potential actions in the possibility space. Compressions are explanatory via mechanism. Representations are explanatory via category. Managers often confuse their evocation of a representation (category inclusion) with the creation of a context of compression (description of mechanism). When this type of explanatory error occurs, more errors follow. In the drive for efficiency such substitutions are all too often proclaimed, at the manager's peril.
Keywords: Coherence, Emergence, Reduction, Model
742 Variable Rate Superorthogonal Turbo Code with the OVSF Code Tree
Authors: Insah Bhurtah, P. Clarel Catherine, K. M. Sunjiv Soyjaudah
Abstract:
When using modern Code Division Multiple Access (CDMA) in mobile communications, it must be possible to vary the transmission rate of users to allocate bandwidth efficiently. In this work, Orthogonal Variable Spreading Factor (OVSF) codes are used with the same principles applied in a low-rate superorthogonal turbo code, due to their variable-length properties. The introduced system is the Variable Rate Superorthogonal Turbo Code (VRSTC), where puncturing is not performed on the encoder's final output but rather before selecting the output, in order to achieve higher rates. Due to bandwidth expansion, the codes outperform an ordinary turbo code in the AWGN channel. Simulation results show decreased performance compared to that obtained with Walsh-Hadamard codes. However, with OVSF codes, the VRSTC system keeps the orthogonality of codewords while producing variable rate codes, contrary to Walsh-Hadamard codes, where puncturing is usually performed on the final output.
Keywords: CDMA, MAP Decoding, OVSF, Superorthogonal Turbo Code.
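A small sketch of how an OVSF code tree is generated: each code c at spreading factor SF spawns the two children [c, c] and [c, -c] at 2*SF, and all codes at a given spreading factor are mutually orthogonal. This illustrates the code construction only, not the VRSTC encoder itself.

```python
# Recursive OVSF code-tree generation level by level.
import numpy as np

def ovsf_codes(sf):
    codes = np.array([[1]])
    while codes.shape[1] < sf:
        codes = np.vstack([np.hstack([codes, codes]),     # children [c, c]
                           np.hstack([codes, -codes])])   # children [c, -c]
    return codes

c = ovsf_codes(8)                         # 8 codes of spreading factor 8
print(c)
print("cross-correlations:\n", c @ c.T)   # 8*I => mutual orthogonality
```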
741 Study on the Derivatization Process Using N-O-bis-(trimethylsilyl)-trifluoroacetamide, N-(tert-butyldimethylsilyl)-N-methyltrifluoroacetamide, Trimethylsilyldiazomethane for the Determination of Fecal Sterols by Gas Chromatography-Mass Spectrometry
Authors: Jingming Wu, Ruikang Hu, Junqi Yue, Zhaoguang Yang, Lifeng Zhang
Abstract:
Fecal sterol has been proposed as a chemical indicator of human fecal pollution even when fecal coliform populations have diminished due to water chlorination or toxic effects of industrial effluents. This paper describes an improved derivatization procedure for the simultaneous determination of four fecal sterols, including coprostanol, epicholestanol, cholesterol and cholestanol, using gas chromatography-mass spectrometry (GC-MS), via an optimization study on silylation procedures using N-O-bis-(trimethylsilyl)-trifluoroacetamide (BSTFA) and N-(tert-butyldimethylsilyl)-N-methyltrifluoroacetamide (MTBSTFA), which lead to the formation of trimethylsilyl (TMS) and tert-butyldimethylsilyl (TBS) derivatives, respectively. Two derivatization processes, injection-port derivatization and water bath derivatization (60 °C, 1 h), were inspected and compared. Furthermore, the methylation procedure at 25 °C for 2 h with trimethylsilyldiazomethane (TMSD) for fecal sterols analysis was also studied. It was found that most of the TMS derivatives demonstrated the highest sensitivities, followed by the methylated derivatives. For the BSTFA or MTBSTFA derivatization processes, the simple injection-port derivatization process could achieve the same efficiency as the tedious water bath derivatization procedure.
Keywords: Fecal Sterols, Methylation, Silylation, BSTFA, MTBSTFA, TMSD, GC-MS.
740 Validation and Application of a New Optimized RP-HPLC-Fluorescent Detection Method for Norfloxacin
Authors: Mahmood Ahmad, Ghulam Murtaza, Sonia Khiljee, Muhammad Asadullah Madni
Abstract:
A new reverse phase-high performance liquid chromatography (RP-HPLC) method with a fluorescence detector (FLD) was developed and optimized for Norfloxacin determination in human plasma. Mobile phase specifications, the extraction method, and the excitation and emission wavelengths were varied for optimization. The HPLC system contained a reverse phase C18 (5 μm, 4.6 mm×150 mm) column with the FLD operated at 330 nm excitation and 440 nm emission. The optimized mobile phase consisted of 14% acetonitrile in buffer solution. The aqueous phase was prepared by mixing 2 g of citric acid, 2 g of sodium acetate and 1 mL of triethylamine in 1 L of Milli-Q water, and the mobile phase was run at a flow rate of 1.2 mL/min. The standard curve was linear over the range tested (0.156-20 μg/mL) and the coefficient of determination was 0.9978. Aceclofenac sodium was used as the internal standard. A detection limit of 0.078 μg/mL was achieved. The run time was set at 10 minutes, and the retention time of Norfloxacin was 0.99 min, which shows the rapidity of this method of analysis. The present assay showed good accuracy, precision and sensitivity for Norfloxacin determination in human plasma with a new internal standard and can be applied to the pharmacokinetic evaluation of Norfloxacin tablets after oral administration in humans.
Keywords: Norfloxacin, Aceclofenac sodium, Method optimization, RP-HPLC method, Fluorescent detection, Calibration curve.
739 Optimization of Double Wishbone Suspension System with Variable Camber Angle by Hydraulic Mechanism
Authors: Mohammad Iman Mokhlespour Esfahani, Masoud Mosayebi, Mohammad Pourshams, Ahmad Keshavarzi
Abstract:
Simulation accuracy in recent multidimensional dynamic vehicle simulation has progressed significantly, and acceptable results are provided not only for passive vehicles but also for active vehicles normally equipped with advanced electronic components. Recently, one of the subjects that has been considered is increasing vehicle safety in design. Therefore, many efforts have been made to increase vehicle stability, especially in turns. One of the most important efforts is adjusting the camber angle in the car suspension system. Optimal control of the camber angle, in addition to improving vehicle stability, is effective for wheel adhesion on the road, reducing rubber abrasion, and improving acceleration and braking. Since an increase or decrease in the camber angle impacts the stability of vehicles, in this paper a car suspension mechanism is introduced that can adjust the camber angle and that is both practical and inexpensive. To reach this purpose, a passive double wishbone suspension system with a variable camber angle is introduced, and a variable camber mechanism is designed and analyzed. To study the performance of the designed system, the mechanism is modeled in Visual Nastran software and a kinematic analysis is presented.
Keywords: Suspension molding, double wishbone, variable camber, hydraulic mechanism
738 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System
Authors: Karima Qayumi, Alex Norta
Abstract:
The rapid generation of a high volume and a broad variety of data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods for the purposes of developing their business. Therefore, the recently decentralized data management environment relies on a distributed computing paradigm. While data are stored in highly distributed systems, the implementation of distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and all the datasets stemming from distributed resources. As agent technologies offer significant contributions for managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.
Keywords: Agent-oriented modeling, business Intelligence management, distributed data mining, multi-agent system.
737 Aerodynamic Models for the Analysis of Vertical Axis Wind Turbines (VAWTs)
Authors: T. Brahimi, F. Saeed, I. Paraschivoiu
Abstract:
This paper details the progress made in the development of different state-of-the-art aerodynamic tools for the analysis of vertical axis wind turbines, including the flow simulation around the blade, viscous flow, stochastic wind, and dynamic stall effects. The paper highlights the capabilities of the wind turbine aerodynamic codes developed over the last thirty years, which are currently being used in North America and Europe by Sandia Laboratories, FloWind, IMST Marseilles, and Hydro-Quebec, among others. The aerodynamic codes developed at Ecole Polytechnique de Montreal, Canada, represent valuable tools for simulating the flow around wind turbines, including secondary effects. Comparison of theoretical results with experimental data has shown good agreement. The strength of the aerodynamic codes based on the Double-Multiple Streamtube (DMS) model lies in their simplicity, accuracy, and ability to analyze secondary effects that interfere with wind turbine aerodynamic calculations.
Keywords: Aerodynamics, wind turbines, VAWT, CARDAAV, Darrieus, dynamic stall.
736 Factors Determining the Women Empowerment through Microfinance: An Empirical Study in Sri Lanka
Authors: Y. Rathiranee, D. M. Semasinghe
Abstract:
This study attempts to identify the factors influencing the empowerment of rural women in Sri Lanka through microfinance services. Data were collected from one hundred (100) rural women engaged in self-employment activities through a questionnaire, using direct personal interviews. A judgment and convenience random sampling technique was used to select the sample from three Divisional Secretariat divisions of Kandawalai, Poonakari and Karachchi in Kilinochchi District. Factor analysis was performed on fourteen (14) variables for screening and reducing the variables in order to identify the factors influencing empowerment. Multiple regression analysis was used to identify the relationship between the three empowerment factors and the impact of microfinance on the overall empowerment of rural women. The results of this study summarized the variables into three factors, namely decision making, freedom of mobility and family support, which are positively associated with empowerment. In addition, the adjusted R2 value of 0.248 indicates that the extracted variables explain 24.8% of the variation in women empowerment through microfinance. The independent variables of these three factors have a positive correlation with women empowerment as well as significance at the 5 percent level.
Keywords: Influencing factors, Micro finance, rural women and women empowerment.
734 Solving a New Mixed-Model Assembly Line Sequencing Problem in an MTO Environment
Authors: N. Manavizadeh, M. Hosseini, M. Rabbani
Abstract:
In recent decades, to supply the various and different demands of clients, many manufacturers have tended to use the mixed-model assembly line (MMAL) in their production lines, since this policy makes it possible to assemble various and different models of equivalent goods on the same line with the MTO approach. In this article, we determine the sequence of the MMAL, applying the kitting approach and planning rest time for general workers to reduce waste, increase worker effectiveness, and apply the lean production approach. This multi-objective sequencing problem is solved for small sizes with GAMS 22.2 and the PSO meta-heuristic on 10 test problems; their results are compared and found to be very similar. Next, we determine the important factors in computing the cost, the improvement of which reduces the cost. Since this problem is NP-hard for large sizes, we use the particle swarm optimization (PSO) meta-heuristic to solve it. For large sizes, we define some test problems to survey its performance and determine the important factors in calculating the cost, so that by changing or improving them, production at minimum cost becomes possible.
Keywords: Mixed-Model Assembly Line, particle swarm optimization, Multi-objective sequencing problem, MTO system, kit-to-assembly, rest time
733 In Cognitive Radio the Analysis of Bit-Error-Rate (BER) by Using PSO Algorithm
Authors: Shrikrishan Yadav, Akhilesh Saini, Krishna Chandra Roy
Abstract:
The electromagnetic spectrum is a natural resource, and hence well-organized usage of this limited natural resource is a necessity for better communication. The present static frequency allocation schemes cannot accommodate the demands of the rapidly increasing number of higher data rate services. Therefore, dynamic usage of the spectrum must be distinguished from static usage to increase the availability of frequency spectrum. Cognitive radio is not a single piece of apparatus but a technology that can incorporate components spread across a network. It offers great promise for improving system efficiency, spectrum utilization, and the effectiveness of applications, and for reducing interference and the complexity of usage for users. A cognitive radio is aware of its environment, internal state, and location, and autonomously adjusts its operations to achieve designed objectives. It first senses its spectral environment over a wide frequency band, and then adapts its parameters to maximize spectrum efficiency with high performance. This paper focuses on the analysis of the Bit-Error-Rate in cognitive radio by using the Particle Swarm Optimization algorithm. The BER is analyzed and interpreted both theoretically and practically, in terms of advantages and drawbacks and of how it affects the efficiency and performance of the communication system.
Keywords: BER, Cognitive Radio, Environmental Parameters, PSO, Radio spectrum, Transmission Parameters
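A generic PSO sketch applied to a textbook BER objective (BPSK over AWGN, BER = 0.5*erfc(sqrt(Eb/N0))) to illustrate the mechanics only; the one-dimensional search space, the power penalty and all coefficients are illustrative assumptions standing in for the transmission-parameter tuning discussed in the abstract.

```python
# Particle swarm optimization minimizing a toy BER-plus-power objective.
import numpy as np
from scipy.special import erfc

def objective(ebn0_db):
    ebn0 = 10 ** (ebn0_db / 10.0)
    ber = 0.5 * erfc(np.sqrt(ebn0))       # BPSK BER over AWGN
    power_penalty = 1e-3 * ebn0           # discourage simply maxing out power
    return ber + power_penalty

rng = np.random.default_rng(5)
n_particles, n_iter = 20, 100
lo, hi = 0.0, 15.0                         # Eb/N0 search range in dB
x = rng.uniform(lo, hi, n_particles)       # positions
v = np.zeros(n_particles)                  # velocities
pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
gbest = pbest[np.argmin(pbest_val)]

w, c1, c2 = 0.7, 1.5, 1.5                  # inertia and acceleration coefficients
for _ in range(n_iter):
    r1, r2 = rng.random(n_particles), rng.random(n_particles)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    vals = np.array([objective(p) for p in x])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)]

print(f"best Eb/N0 = {gbest:.2f} dB, objective = {objective(gbest):.2e}")
```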
732 Method of Parameter Calibration for Error Term in Stochastic User Equilibrium Traffic Assignment Model
Authors: Xiang Zhang, David Rey, S. Travis Waller
Abstract:
The Stochastic User Equilibrium (SUE) model is a widely used traffic assignment model in transportation planning, and is regarded as more advanced than the Deterministic User Equilibrium (DUE) model. However, a problem exists in that the performance of the SUE model depends on its error term parameter. The objective of this paper is to propose a systematic method of determining an appropriate error term parameter value for the SUE model. First, the significance of the parameter is explored through a numerical example. Second, the parameter calibration method is developed based on the Logit-based route choice model. The calibration process is realized through multiple nonlinear regression, using sequential quadratic programming combined with the least squares method. Finally, a case analysis is conducted to demonstrate the application of the calibration process and to validate the better performance of the SUE model calibrated by the proposed method compared to SUE models under other parameter values and the DUE model.
Keywords: Parameter calibration, sequential quadratic programming, Stochastic User Equilibrium, traffic assignment, transportation planning.
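A minimal illustration of the Logit route-choice probabilities underlying SUE and of fitting the error-term (dispersion) parameter to observed route shares by least squares; the route costs, observed shares and the use of scipy's generic least-squares solver are illustrative assumptions, not the paper's SQP-based procedure.

```python
# Logit route-choice shares P_k = exp(-theta*c_k) / sum_j exp(-theta*c_j),
# with the dispersion parameter theta calibrated to observed shares.
import numpy as np
from scipy.optimize import least_squares

costs = np.array([10.0, 12.0, 15.0])             # route travel costs (minutes)
observed_shares = np.array([0.55, 0.30, 0.15])   # observed route choice shares

def logit_shares(theta, c):
    u = np.exp(-theta * (c - c.min()))           # shift costs for numerical stability
    return u / u.sum()

def residuals(theta):
    return logit_shares(theta[0], costs) - observed_shares

fit = least_squares(residuals, x0=[0.1], bounds=(1e-6, 10.0))
theta_hat = fit.x[0]
print(f"calibrated theta = {theta_hat:.3f}")
print("modelled shares:", np.round(logit_shares(theta_hat, costs), 3))
```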
731 Application of Medium High Hydrostatic Pressure in Preserving Textural Quality and Safety of Pineapple Compote
Authors: Nazim Uddin, Yohiko Nakaura, Kazutaka Yamamoto
Abstract:
Compote (fruit in syrup) of pineapple (Ananas comosus L. Merrill) is expected to have high market potential as one of the convenient ready-to-eat (RTE) foods worldwide. High hydrostatic pressure (HHP) in combination with low temperature (LT) was applied to the processing of pineapple compote, as well as medium HHP (MHHP) in combination with medium-high temperature (MHT), since both processes can enhance liquid impregnation and inactivate microbes. The MHHP+MHT (55 or 65 °C) process, as well as the HHP+LT process, successfully inactivated the microbes in the compote to a non-detectable level. Although the compotes processed by MHHP+MHT or HHP+LT lost their fresh texture in a similar manner to those processed solely by heat, it was indicated that the texture degradation by heat was suppressed under MHHP. The degassing process reduced the hardness, while calcium (Ca) contributed to retaining hardness in the MHT and MHHP+MHT processes. Electrical impedance measurements supported the damage due to degassing and heat. The color, Brix, and appearance were not significantly affected by the processing methods. MHHP+MHT and HHP+LT processes may be applicable to produce high-quality, safe RTE pineapple compotes. Further studies on the optimization of packaging and storage conditions will be indispensable for commercialization.
Keywords: Compote of pineapple, ready-to-eat, medium high hydrostatic pressure, postharvest loss, and texture.
730 Hash Based Block Matching for Digital Evidence Image Files from Forensic Software Tools
Abstract:
Internet use, intelligent communication tools, and social media have all become an integral part of our daily life as a result of rapid developments in information technology. However, this widespread use increases crimes committed in the digital environment. Therefore, digital forensics, dealing with various crimes committed in the digital environment, has become an important research topic. It is in the research scope of digital forensics to investigate digital evidence such as computers, cell phones, hard disks, DVDs, etc., and to report whether it contains any crime-related elements. There are many software and hardware tools developed for use in the digital evidence acquisition process. Today, the most widely used digital evidence investigation tools are based on the principle of finding all the data on the digital evidence that match specified criteria and presenting them to the investigator (e.g., text files, files starting with the letter A, etc.). Then, digital forensics experts carry out data analysis to figure out whether these data are related to a potential crime. Examination of a 1 TB hard disk may take hours or even days, depending on the expertise and experience of the examiner. In addition, because the process depends on the examiner's experience, the overall result may vary between cases and relevant items may be overlooked. In this study, a hash-based matching and digital evidence evaluation method is proposed, which aims to automatically classify the evidence containing criminal elements, thereby shortening the time of the digital evidence examination process and preventing human errors.
Keywords: Block matching, digital evidence, hash list.
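A minimal sketch of the hash-based block matching idea: split an evidence image into fixed-size blocks, hash each block, and check the hashes against a known hash list. The block size, hash algorithm and file names are placeholder assumptions, not details from the paper.

```python
# Block-wise hashing of an evidence image and matching against a hash list.
import hashlib

BLOCK_SIZE = 4096  # bytes per block (placeholder choice)

def block_hashes(path, block_size=BLOCK_SIZE):
    hashes = []
    with open(path, "rb") as f:
        while True:
            block = f.read(block_size)
            if not block:
                break
            hashes.append(hashlib.sha256(block).hexdigest())
    return hashes

def match_against_list(path, known_hashes):
    """Return indices of blocks whose hash appears in the known hash list."""
    known = set(known_hashes)
    return [i for i, h in enumerate(block_hashes(path)) if h in known]

# Example usage (file names are hypothetical):
# known = set(line.strip() for line in open("crime_hashes.txt"))
# print(match_against_list("evidence.dd", known))
```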
729 DocPro: A Framework for Processing Semantic and Layout Information in Business Documents
Authors: Ming-Jen Huang, Chun-Fang Huang, Chiching Wei
Abstract:
With the recent advance of the deep neural network, we observe new applications of NLP (natural language processing) and CV (computer vision) powered by deep neural networks for processing business documents. However, creating a real-world document processing system needs to integrate several NLP and CV tasks, rather than treating them separately. There is a need to have a unified approach for processing documents containing textual and graphical elements with rich formats, diverse layout arrangement, and distinct semantics. In this paper, a framework that fulfills this unified approach is presented. The framework includes a representation model definition for holding the information generated by various tasks and specifications defining the coordination between these tasks. The framework is a blueprint for building a system that can process documents with rich formats, styles, and multiple types of elements. The flexible and lightweight design of the framework can help build a system for diverse business scenarios, such as contract monitoring and reviewing.
Keywords: Document processing, framework, formal definition, machine learning.
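A hypothetical sketch of what a unified representation model holding both layout and semantic information for document elements might look like; the class and field names are illustrative assumptions only and are not DocPro's actual API.

```python
# Hypothetical unified representation for document elements: layout (page,
# bounding box) plus semantics (type, text, labels produced by NLP/CV tasks).
from dataclasses import dataclass, field
from typing import List, Dict, Tuple

@dataclass
class DocElement:
    element_id: str
    element_type: str                            # e.g. "paragraph", "table", "figure"
    page: int
    bbox: Tuple[float, float, float, float]      # x0, y0, x1, y1 in page units
    text: str = ""
    labels: Dict[str, str] = field(default_factory=dict)  # outputs of various tasks

@dataclass
class DocRepresentation:
    doc_id: str
    elements: List[DocElement] = field(default_factory=list)

    def by_type(self, element_type: str) -> List[DocElement]:
        return [e for e in self.elements if e.element_type == element_type]

doc = DocRepresentation("contract-001")
doc.elements.append(DocElement("e1", "paragraph", 1, (72, 90, 520, 130),
                               "Termination clause ...", {"clause": "termination"}))
print([e.element_id for e in doc.by_type("paragraph")])
```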
728 Model Predictive Control with Unscented Kalman Filter for Nonlinear Implicit Systems
Authors: Takashi Shimizu, Tomoaki Hashimoto
Abstract:
The class of implicit systems is known to be a more general class of systems than the class of explicit systems. To establish a control method for such a generalized class of systems, we adopt the model predictive control method, which is a kind of optimal feedback control with a performance index that has a moving initial time and terminal time. However, the model predictive control method is inapplicable to systems whose state variables are not all exactly known; in other words, it is inapplicable to systems with limited measurable states. In fact, the state variables of systems are usually measured through outputs, and hence only limited parts of them can be used directly. It is also usual that output signals are disturbed by process and sensor noises. Hence, it is important to establish a state estimation method for nonlinear implicit systems that takes the process noise and sensor noise into consideration. To this purpose, we apply the model predictive control method and the unscented Kalman filter to solve the optimization and estimation problems of nonlinear implicit systems, respectively. The objective of this study is to establish a model predictive control with unscented Kalman filter for nonlinear implicit systems.
Keywords: Model predictive control, unscented Kalman filter, nonlinear systems, implicit systems.
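A compact statement of the moving-horizon problem described above, written as a sketch for a generic implicit system with the current state replaced by the unscented-Kalman-filter estimate; the symbols are generic and not necessarily the paper's exact notation.

```latex
\min_{u(\cdot)}\; J = \varphi\bigl(x(t+T)\bigr) + \int_{t}^{t+T} L\bigl(x(\tau),u(\tau)\bigr)\,d\tau
\quad \text{subject to} \quad
F\bigl(\dot{x}(\tau),x(\tau),u(\tau)\bigr)=0, \qquad x(t)=\hat{x}(t),
```

where \hat{x}(t) denotes the state estimate produced by the unscented Kalman filter from the noisy outputs, and the horizon [t, t+T] moves forward with the current time t.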
727 SFE as a Superior Technique for Extraction of Eugenol-Rich Fraction from Cinnamomum tamala Nees (Bay Leaf) - Process Analysis and Phytochemical Characterization
Authors: Sudip Ghosh, Dipanwita Roy, Dipan Chatterjee, Paramita Bhattacharjee, Satadal Das
Abstract:
The highest yield of eugenol-rich fractions from Cinnamomum tamala (bay leaf) leaves was obtained by supercritical carbon dioxide (SC-CO2) extraction, compared to hydro-distillation, organic solvent, liquid CO2 and subcritical CO2 extractions. Optimization of the SC-CO2 extraction parameters was carried out to obtain an extract with maximum eugenol content. This was achieved using a sample size of 10 g at 55 °C and 512 bar after 60 min at a flow rate of 25.0 cm3/s of gaseous CO2. This extract has the best combination of phytochemical properties, such as phenolic content (1.77 mg gallic acid/g dry bay leaf), reducing power (0.80 mg BHT/g dry bay leaf), antioxidant activity (IC50 of 0.20 mg/ml) and anti-inflammatory potency (IC50 of 1.89 mg/ml). Identification of compounds in this extract was performed by GC-MS analysis and its antimicrobial potency was also evaluated. The MIC values against E. coli, P. aeruginosa and S. aureus were 0.5, 0.25 and 0.5 mg/ml, respectively.
Keywords: Antimicrobial potency, Cinnamomum tamala, eugenol, supercritical carbon dioxide extraction.
726 The Effect of Glass Thickness on Stress in Vacuum Glazing
Authors: Farid Arya, Trevor Hyde, Andrea Trevisi, Paolo Basso, Danilo Bardaro
Abstract:
Heat transfer through multiple-pane windows can be reduced by creating a vacuum pressure of less than 0.1 Pa between the glass panes, with low-emittance coatings on one or more of the internal surfaces. Fabrication of vacuum glazing (VG) requires the formation of a hermetic seal around the periphery of the glass panes, together with an array of support pillars between the panes to prevent them from touching under atmospheric pressure. Atmospheric pressure and temperature differentials induce stress which can affect the integrity of the glazing. Several parameters define the stresses in VG, including the glass thickness, pillar specifications, glazing dimensions and edge seal configuration. Inherent stresses in VG can result in fractures in the glass panes and failure of the edge seal. In this study, stress in VG with different glass thicknesses is theoretically studied using Finite Element Modelling (FEM). Based on the findings of this study, suggestions are made to address problems resulting from the use of thinner glass panes in the fabrication of VG. This can lead to the development of high-performance, light and thin VG.
Keywords: ABAQUS, glazing, stress, vacuum glazing, vacuum insulation.
725 LFC Design of a Deregulated Power System with TCPS Using PSO
Authors: H. Shayeghi, H.A. Shayanfar, A. Jalili
Abstract:
In the LFC problem, the interconnections among areas are the input of disturbances, and therefore it is important to suppress the disturbances through the coordination of governor systems. In contrast, tie-line power flow control by a TCPS located between two areas makes it possible to positively stabilize the system frequency oscillations through the interconnection, which is also expected to provide a new ancillary service for future power systems. Thus, a control strategy based on controlling the phase angle of the TCPS is proposed in this paper to provide an active control facility for the system frequency. Also, the optimal adjustment of the PID controller parameters, in a robust way, under a bilateral contracted scenario following large step load demands and disturbances, with and without TCPS, is investigated by Particle Swarm Optimization (PSO), which has a strong ability to find the most optimal results. This newly developed control strategy combines the advantages of PSO and TCPS and has a simple structure that is easy to implement and tune. To demonstrate the effectiveness of the proposed control strategy, a three-area restructured power system is considered as a test system under different operating conditions and system nonlinearities. Analysis reveals that the TCPS is quite capable of suppressing the frequency and tie-line power oscillations effectively, as compared to the results obtained without TCPS, for a wide range of plant parameter changes, area load demands and disturbances, even in the presence of system nonlinearities.
Keywords: LFC, TCPS, Deregulated Power System, Power System Control, PSO.