Search results for: indirect method
17142 A Hybrid Model of Goal, Integer and Constraint Programming for Single Machine Scheduling Problem with Sequence Dependent Setup Times: A Case Study in Aerospace Industry
Authors: Didem Can
Abstract:
Scheduling problems are among the most fundamental issues in production systems. Many different approaches and models have been developed according to the production processes of the parts and the main purpose of the problem. In this study, one of the bottleneck stations of a company serving the aerospace industry is analyzed and considered as a single machine scheduling problem with sequence-dependent setup times. The objective of the problem is to assign a large number of similar parts to the same shift (to reduce chemical waste) while minimizing the number of tardy jobs. The goal programming method will be used to achieve the two objectives simultaneously. The assignment of parts to shifts will be expressed using the integer programming method. Finally, the constraint programming method will be used, as it provides a way to find a result in a short time by pruning poor feasible solutions over the defined variable set. The model to be established will be tested and evaluated with real data in the application part.
Keywords: constraint programming, goal programming, integer programming, sequence-dependent setup, single machine scheduling
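The objective the abstract describes can be sketched with a toy instance; the processing times, due dates, and setup matrix below are invented, and exhaustive search over permutations stands in for the paper's goal/integer/constraint model:

```python
from itertools import permutations

# Toy instance (invented data): processing times, due dates, and
# sequence-dependent setup times, where setup[i][j] is the setup
# incurred when job j directly follows job i.
proc = [4, 3, 6]
due = [5, 9, 16]
setup = [[0, 2, 1],
         [2, 0, 3],
         [1, 3, 0]]

def tardy_jobs(seq):
    """Number of tardy jobs when the jobs run in the given order."""
    t, prev, tardy = 0, None, 0
    for j in seq:
        if prev is not None:
            t += setup[prev][j]
        t += proc[j]
        if t > due[j]:
            tardy += 1
        prev = j
    return tardy

# Exhaustive search stands in for the goal/integer/constraint model.
best = min(permutations(range(3)), key=tardy_jobs)
print(best, tardy_jobs(best))
```

For three jobs the enumeration is trivial; the methods in the paper matter precisely because such enumeration explodes for realistic instance sizes.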
Procedia PDF Downloads 237
17141 Classifying Facial Expressions Based on a Motion Local Appearance Approach
Authors: Fabiola M. Villalobos-Castaldi, Nicolás C. Kemper, Esther Rojas-Krugger, Laura G. Ramírez-Sánchez
Abstract:
This paper presents classification results from combining a motion-based approach with a local appearance method to describe the facial motion caused by the muscle contractions and expansions present in facial expressions. The proposed feature extraction method takes advantage of knowledge about which parts of the face reflect the highest deformations, so we selected four specific facial regions to which the appearance descriptor was applied. The most commonly used approaches for feature extraction are the holistic and the local strategies. In this work we present the results of using a local appearance approach, estimating the correlation coefficient between the four landmark-localized facial templates of the expression face and the corresponding templates of the neutral face. The results show how the proposed motion estimation scheme, based on local appearance correlation, can simply and intuitively measure the motion parameters for some of the most relevant facial regions, and how these parameters can be used to recognize facial expressions automatically.
Keywords: facial expression recognition system, feature extraction, local-appearance method, motion-based approach
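The correlation step can be illustrated with plain Pearson correlation between two flattened patches; the 2x2 patch values below are hypothetical, not from the paper's dataset:

```python
import math

def pearson(a, b):
    """Pearson correlation between two equally sized flattened patches."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# Hypothetical 2x2 grey-level patches, flattened: one from a neutral
# face, one from the same region of an expression face.
neutral = [10, 20, 30, 40]
expression = [12, 22, 29, 41]
print(round(pearson(neutral, expression), 3))
```

A coefficient near 1 indicates little deformation relative to the neutral face; lower values flag regions that moved.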
Procedia PDF Downloads 413
17140 Comparative Analysis of Enzyme Activities Concerned in Decomposition of Toluene
Authors: Ayuko Itsuki, Sachiyo Aburatani
Abstract:
In recent years, pollution of the environment by toxic substances has become a serious problem. While there are many methods of environmental clean-up, methods based on microorganisms are considered reasonable and safe for the environment. Compost is known to catabolize malodorous substances during its production process; however, the mechanism of this catabolizing system is not yet known. In the catabolization process, organic matter is turned into inorganic matter by enzymes released from the many microorganisms that live in compost. In other words, the cooperation of activated enzymes in the compost decomposes malodorous substances. Thus, clarifying the interactions among enzymes is important for revealing the catabolizing system of malodorous substances in compost. In this study, we utilized a statistical method to infer the interactions among enzymes. We developed a method that combines partial correlation with cross correlation to estimate the relevance between enzymes, especially from time series data with few variables. Because cross correlation is used, we can estimate not only the associative structure but also the reaction pathway. We applied the developed method to the measured enzyme data and estimated the interactions among the enzymes in the decomposition mechanism of toluene.
Keywords: enzyme activities, comparative analysis, compost, toluene
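A sketch of the cross-correlation idea for recovering a reaction pathway from time series; the two enzyme activity series are invented so that the second lags the first by one step:

```python
def cross_corr(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag] for lag >= 0."""
    xs, ys = x[:len(x) - lag], y[lag:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    den = (sum((a - mx) ** 2 for a in xs)
           * sum((b - my) ** 2 for b in ys)) ** 0.5
    return num / den

# Invented enzyme activity series: e2 repeats e1 one step later,
# as if enzyme 1's product were enzyme 2's substrate.
e1 = [1, 3, 5, 4, 2, 1, 3, 5]
e2 = [0, 1, 3, 5, 4, 2, 1, 3]
best_lag = max(range(3), key=lambda k: cross_corr(e1, e2, k))
print(best_lag)
```

The lag maximizing the correlation suggests the direction and delay of the reaction, which is what lets cross correlation order enzymes into a pathway rather than just grouping them.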
Procedia PDF Downloads 273
17139 Optimized Simultaneous Determination of Theobromine and Caffeine in Fermented and Unfermented Cacao Beans and in Cocoa Products Using Step Gradient Solvent System in Reverse Phase HPLC
Authors: Ian Marc G. Cabugsa, Kim Ryan A. Won
Abstract:
Fast, reliable and simultaneous HPLC analysis of theobromine and caffeine in cacao and cocoa products was optimized in this study. The samples tested were raw, fermented, and roasted cacao beans as well as commercially available cocoa products. The HPLC analysis was carried out using a step gradient solvent system with acetonitrile and water buffered with H3PO4 as the mobile phase. The HPLC system was optimized at a wavelength of 273 nm and a column temperature of 35 °C, with a flow rate of 1.0 mL/min. Using this method, the theobromine percent recovery mean, limit of detection (LOD) and limit of quantification (LOQ) were 118.68 (±3.38)%, 0.727 and 1.05, respectively. The percent recovery mean, LOD and LOQ for caffeine were 105.53 (±3.25)%, 2.42 and 3.50, respectively. The inter-day and intra-day precision for theobromine were 4.31% and 4.48%, respectively, while those for caffeine were 7.02% and 7.03%. Compared to the standard AOAC method using methanol in an isocratic solvent system, the method produced less chromatogram noise for theobromine and caffeine. The method is readily usable for analyses of cacao and cocoa substances using HPLC with step gradient capability.
Keywords: cacao, caffeine, HPLC, step gradient solvent system, theobromine
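The abstract reports LOD and LOQ without stating the formulas; a common choice, assumed here, is the ICH definition LOD = 3.3σ/S and LOQ = 10σ/S from the calibration curve's residual standard deviation σ and slope S, with illustrative numbers:

```python
# ICH-style estimates (assumed, not stated in the abstract):
#   LOD = 3.3 * sigma / S,  LOQ = 10 * sigma / S
# where sigma is the residual standard deviation of the calibration
# curve and S its slope. The numbers below are illustrative only.
sigma = 0.03  # hypothetical residual SD of the detector response
slope = 0.1   # hypothetical calibration slope (response per unit conc.)

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(round(lod, 2), round(loq, 2))
```

Whatever the constants used, LOQ/LOD stays fixed at 10/3.3 under this definition, which is a quick sanity check on reported values.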
Procedia PDF Downloads 281
17138 Edge Detection in Low Contrast Images
Authors: Koushlendra Kumar Singh, Manish Kumar Bajpai, Rajesh K. Pandey
Abstract:
The edges of low contrast images are not clearly distinguishable to the human eye, and it is difficult to find the edges and boundaries in them. The present work proposes a new approach for low contrast images. A Chebyshev polynomial based fractional order filter is used for the filtering operation: the preprocessing is performed by this filter on the input image. The Laplacian of Gaussian method is then applied to the preprocessed image for edge detection. The algorithm has been tested on two test images.
Keywords: low contrast image, fractional order differentiator, Laplacian of Gaussian (LoG) method, Chebyshev polynomial
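The Laplacian of Gaussian step can be sketched as two small convolutions; note the paper's preprocessing uses a Chebyshev polynomial based fractional order filter, for which a plain 3x3 Gaussian is substituted here purely for illustration:

```python
def convolve(img, k):
    """Same-size 2D convolution with zero padding."""
    h, w, kh, kw = len(img), len(img[0]), len(k), len(k[0])
    ph, pw = kh // 2, kw // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            s = 0.0
            for a in range(kh):
                for b in range(kw):
                    y, x = i + a - ph, j + b - pw
                    if 0 <= y < h and 0 <= x < w:
                        s += img[y][x] * k[a][b]
            out[i][j] = s
    return out

# Standard 3x3 discrete approximations of a Gaussian and a Laplacian.
gauss = [[1/16, 2/16, 1/16], [2/16, 4/16, 2/16], [1/16, 2/16, 1/16]]
lap = [[0, 1, 0], [1, -4, 1], [0, 1, 0]]

# Low-contrast toy image: a faint vertical edge between columns 2 and 3.
img = [[10, 10, 10, 12, 12, 12] for _ in range(6)]
log_img = convolve(convolve(img, gauss), lap)
print([round(v, 2) for v in log_img[3]])
```

A sign change in the LoG response (here between columns 2 and 3) marks the edge location even though the intensity step is faint.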
Procedia PDF Downloads 636
17137 Adoption of Digital Storytelling Tool to Teach 21st Century Skills by Malaysian Pre-service Teachers
Authors: Siti Aisyah binti Jumpaan
Abstract:
21ˢᵗ century skills (PAK-21) integration made its way into the Malaysian curriculum when the Ministry of Education introduced its implementation in 2016. This study was conducted to explore pre-service teachers' readiness to integrate 21st century skills in the classroom via the digital storytelling (DST) method and to find gaps between theory and practice that can be integral to pre-service teachers' professional growth. A qualitative research method was used, involving six respondents selected through purposive sampling. Their interview responses and lesson plans were analysed using narrative analysis. The findings showed that pre-service teachers have a moderate level of readiness to integrate 21st century skills using DST. They demonstrated a high level of preparedness in writing their lesson plans, but the interviews revealed struggles in implementation due to several factors, such as lack of technology and failure to obtain students' participation. This study further strengthens the need for a specialised curriculum for pre-service teachers in teaching 21st century skills via DST.
Keywords: digital storytelling, 21ˢᵗ century skills, preservice teachers, teacher training
Procedia PDF Downloads 91
17136 A Novel Method for Isolation of Kaempferol and Quercetin from Podophyllum Hexandrum Rhizome
Authors: S. B. Bhandare, K. S. Laddha
Abstract:
Podophyllum hexandrum, belonging to the family Berberidaceae, has gained attention in phytochemical and pharmacological research as it shows excellent anticancer activity and has been used in the treatment of skin diseases, sunburns and radioprotection. Chemically it contains lignans and flavonoids such as kaempferol, quercetin and their glycosides. Objective: To isolate and identify kaempferol and quercetin from Podophyllum rhizome. Method: The powdered rhizome of Podophyllum hexandrum was subjected to Soxhlet extraction with methanol. This methanolic extract was used to obtain podophyllin. Podophyllin was extracted with ethyl acetate, and this extract was then concentrated and subjected to column chromatography to obtain purified kaempferol and quercetin. Result: The isolated kaempferol and quercetin were light yellow and dark yellow in colour, respectively. TLC of the isolated compounds was performed using chloroform:methanol (9:1), which showed single bands on the silica plate at Rf 0.6 and 0.4 for kaempferol and quercetin. UV spectrometric studies showed UV maxima (methanol) at 259, 360 nm and 260, 370 nm, which are identical to those of standard kaempferol and quercetin, respectively. Both IR spectra exhibited prominent absorption bands for free phenolic OH at 3277 and 3296.2 cm-1 and for conjugated C=O at 1597 and 1659.7 cm-1, respectively. The mass spectra of kaempferol and quercetin showed (M+1) peaks at m/z 287 and 303.09, respectively. 1H NMR analysis of both isolated compounds exhibited the typical four-peak pattern of two doublets at δ 6.86 and δ 8.01, assigned to H-3',5' and H-2',6', respectively. The absence of signals below δ 6.81 in the 1H NMR spectrum supported the aromatic nature of the compounds. Kaempferol and quercetin showed 98.1% and 97% purity by HPLC at UV 370 nm. Conclusion: An easy and simple method for isolation of kaempferol and quercetin was developed, and their structures were confirmed by UV, IR, NMR and mass studies. The method has shown good reproducibility, yield and purity.
Keywords: flavonoids, kaempferol, podophyllum rhizome, quercetin
Procedia PDF Downloads 304
17135 Hydrodynamic Study and Sizing of a Distillation Column by HYSYS Software
Authors: Derrouazin Mohammed Redhouane, Souakri Mohammed Lotfi, Henini Ghania
Abstract:
This work consists, first, of mastering one of the powerful process simulation tools currently used in industrial processes, the HYSYS sizing software, and second, of simulating a petroleum distillation column. The study is divided into two parts: the first is a sizing of the column, with a fast approximate method using equations of state and iterative calculations, followed by a precise simulation with the HYSYS software. The second part is a hydrodynamic study to verify, through the obtained results, the proper functioning of the plates.
Keywords: industry process engineering, water distillation, environment, HYSYS simulation tool
Procedia PDF Downloads 129
17134 Prediction of Anticancer Potential of Curcumin Nanoparticles by Means of Quasi-Qsar Analysis Using Monte Carlo Method
Authors: Ruchika Goyal, Ashwani Kumar, Sandeep Jain
Abstract:
The experimental data for the anticancer potential of curcumin nanoparticles were collected from eclectic sources. The optimal descriptors were examined using the Monte Carlo method based CORAL SEA software. The statistical quality of the model is as follows: n = 14, R² = 0.6809, Q² = 0.5943, s = 0.175, MAE = 0.114, F = 26 (sub-training set); n = 5, R² = 0.9529, Q² = 0.7982, s = 0.086, MAE = 0.068, F = 61, Av Rm² = 0.7601, ∆R²m = 0.0840, k = 0.9856 and kk = 1.0146 (test set); and n = 5, R² = 0.6075 (validation set). These data can be used to build predictive QSAR models for anticancer activity.
Keywords: anticancer potential, curcumin, model, nanoparticles, optimal descriptors, QSAR
Procedia PDF Downloads 318
17133 The Effect of Accounting Conservatism on Cost of Capital: A Quantile Regression Approach for MENA Countries
Authors: Maha Zouaoui Khalifa, Hakim Ben Othman, Hussaney Khaled
Abstract:
Prior empirical studies have investigated the economic consequences of accounting conservatism by examining its impact on the cost of equity capital (COEC). However, findings are not conclusive. We assume that the inconsistent results of such an association may be attributed to the regression models used in data analysis. To address this issue, we re-examine the effect of two dimensions of accounting conservatism, unconditional conservatism (U_CONS) and conditional conservatism (C_CONS), on the COEC for a sample of listed firms from Middle Eastern and North African (MENA) countries, applying the quantile regression (QR) approach developed by Koenker and Bassett (1978). The classical ordinary least squares (OLS) method is widely used in empirical accounting research; however, it may produce inefficient and biased estimates in the case of departures from normality or long-tailed error distributions. The QR method is more powerful than OLS in handling this kind of problem. It allows the coefficients on the independent variables to shift across the distribution of the dependent variable, whereas the OLS method only estimates the conditional mean effects on a response variable. We find, as predicted, that U_CONS has a significant positive effect on the COEC, whereas C_CONS has a negative impact. Findings also suggest that the effects of the two dimensions of accounting conservatism differ considerably across COEC quantiles. By comparing results from the QR method with those of OLS, this study sheds more light on the association between accounting conservatism and COEC.
Keywords: unconditional conservatism, conditional conservatism, cost of equity capital, OLS, quantile regression, emerging markets, MENA countries
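The core of the Koenker-Bassett approach is that a quantile minimizes the check (pinball) loss, just as the mean minimizes squared error; a one-variable sketch with invented data:

```python
def pinball_loss(q, tau, y):
    """Average check (pinball) loss of candidate value q at quantile tau."""
    return sum((tau if v >= q else tau - 1) * (v - q) for v in y) / len(y)

# A long-tailed sample: the mean (22) is dragged out by the outlier,
# but the tau = 0.5 check-loss minimizer is the median.
y = [1, 2, 3, 4, 100]
best = min(y, key=lambda q: pinball_loss(q, 0.5, y))
print(best)
```

Varying tau traces out the whole conditional distribution, which is why QR can show an effect shifting across COEC quantiles where OLS reports only a single mean effect.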
Procedia PDF Downloads 355
17132 Reliability Analysis of Heat Exchanger Cycle Using Non-Parametric Method
Authors: Apurv Kulkarni, Shreyas Badave, B. Rajiv
Abstract:
The non-parametric reliability technique is useful for assessing the reliability of systems for which failure rates are not available. This is useful when detection of malfunctioning of any component is the key purpose during ongoing operation of the system. The main purpose of the heat exchanger cycle discussed in this paper is to provide hot water at a constant temperature for longer periods of time. In such a cycle, certain components play a crucial role, and this paper presents an effective way to predict their malfunctioning through determination of system reliability. The method discussed in the paper is feasible, as clarified with the help of various test cases.
Keywords: heat exchanger cycle, k-statistics, PID controller, system reliability
Procedia PDF Downloads 390
17131 Modeling Dynamics and Control of Transversal Vibration of an Underactuated Flexible Plate Using Controlled Lagrangian Method
Authors: Mahmood Khalghollah, Mohammad Tavallaeinejad, Mohammad Eghtesad
Abstract:
The method of Controlled Lagrangians is an energy shaping control technique for underactuated Lagrangian systems. Energy shaping control design methods are appealing as they retain the underlying nonlinear dynamics and can provide stability results that hold over a larger domain than can be obtained using linear design and analysis. In the present study, the Controlled Lagrangian method is employed to design a controller for an underactuated rotating flexible plate system. In such a system, controller design is a known challenge due to its nonlinear characteristics and the coupled dynamics of the rigid and flexible components. In this paper, the controller objectives are vibration reduction of the flexible component and position control of the tip of the plate. To achieve these goals, a method based on both kinetic and potential energy shaping is introduced. The stability of the closed-loop system is investigated and proved around its equilibrium points. Moreover, the proposed controller is shown to be robust against disturbances and plant uncertainties.
Keywords: controlled lagrangian, underactuated system, flexible rotating plate, disturbance
Procedia PDF Downloads 446
17130 Integrated Best Worst PROMETHEE to Evaluate Public Transport Service Quality
Authors: Laila Oubahman, Duleba Szabolcs
Abstract:
Public transport stakeholders aim to increase the ridership ratio by encouraging citizens to use common transportation modes. To this end, improving service quality is a crucial option for reaching the quality desired by users and reducing the gap between desired and perceived quality. Multi-criteria decision aid has been applied in the literature in recent decades because it provides efficient models for assessing the criteria with the greatest impact on the overall assessment. In this paper, the PROMETHEE method is combined with the best-worst approach to construct a consensual model that avoids rank reversal, to support stakeholders in improving service quality.
Keywords: best-worst method, MCDA, PROMETHEE, public transport
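A minimal sketch of PROMETHEE II net flows with the "usual" preference function; the alternatives, scores, and weights below are hypothetical, and a best-worst weighting step is assumed to have produced the weights:

```python
def net_flows(scores, weights):
    """PROMETHEE II net outranking flows using the 'usual' preference
    function P(d) = 1 if d > 0 else 0; scores[a][c] is alternative a's
    score on criterion c, with every criterion to be maximized."""
    n = len(scores)

    def pref(a, b):
        # Weighted preference of a over b, summed across criteria.
        return sum(w for sa, sb, w in zip(scores[a], scores[b], weights)
                   if sa > sb)

    phi = []
    for a in range(n):
        plus = sum(pref(a, b) for b in range(n) if b != a) / (n - 1)
        minus = sum(pref(b, a) for b in range(n) if b != a) / (n - 1)
        phi.append(plus - minus)
    return phi

# Hypothetical bus-line alternatives scored on comfort, frequency, price;
# the weights are assumed to come from a best-worst weighting step.
scores = [[7, 5, 6], [8, 4, 7], [6, 6, 5]]
weights = [0.5, 0.3, 0.2]
phi = net_flows(scores, weights)
print(phi.index(max(phi)))
```

Ranking by net flow gives a complete order of the alternatives; the net flows always sum to zero.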
Procedia PDF Downloads 208
17129 An Enhanced Approach in Validating Analytical Methods Using Tolerance-Based Design of Experiments (DoE)
Authors: Gule Teri
Abstract:
The effective validation of analytical methods forms a crucial component of pharmaceutical manufacturing. However, traditional validation techniques can occasionally fail to fully account for inherent variations within datasets, which may result in inconsistent outcomes. This deficiency in validation accuracy is particularly noticeable when quantifying low concentrations of active pharmaceutical ingredients (APIs), excipients, or impurities, introducing a risk to the reliability of the results and, subsequently, the safety and effectiveness of the pharmaceutical products. In response to this challenge, we introduce an enhanced, tolerance-based Design of Experiments (DoE) approach for the validation of analytical methods. This approach explicitly measures variability with reference to tolerance or design margins, enhancing the precision and trustworthiness of the results. It provides a systematic, statistically grounded validation technique and offers an essential tool for industry professionals aiming to guarantee the accuracy of their measurements, particularly for low-concentration components. By incorporating this method, pharmaceutical manufacturers can substantially advance their validation processes and thereby improve the overall quality and safety of their products. This paper delves deeper into the development, application, and advantages of the tolerance-based DoE approach and demonstrates its effectiveness using High-Performance Liquid Chromatography (HPLC) data for verification. It also discusses the potential implications and future applications of this method in enhancing pharmaceutical manufacturing practices and outcomes.
Keywords: tolerance-based design, design of experiments, analytical method validation, quality control, biopharmaceutical manufacturing
Procedia PDF Downloads 80
17128 Combination of Topology and Rough Set for Analysis of Power System Control
Authors: M. Kamel El-Sayed
Abstract:
In this research, we link the concepts of rough sets and topological structures to create a new topological structure that assists in the analysis of the information systems of some electrical engineering issues. We use imprecise information whose boundary region in the resulting topological structure is non-empty, i.e., a rough set. This structure is characterized by the fact that it does not contain a large number of elements, and it facilitates the establishment of rules. We used this structure in reducing the specifications of electrical information systems. A detailed example of the method is provided, illustrating the steps used. This method opens the door to obtaining multiple topologies, each of which uses one of the non-defined groups (rough sets) in the overall information system.
Keywords: electrical engineering, information system, rough set, rough topology, topology
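The rough set vocabulary used here can be made concrete: given an indiscernibility partition, a set is rough exactly when its upper and lower approximations differ. A small sketch with invented classes:

```python
def approximations(classes, target):
    """Lower and upper approximations of target with respect to an
    indiscernibility partition (classes must partition the universe)."""
    lower = {x for c in classes if c <= target for x in c}
    upper = {x for c in classes if c & target for x in c}
    return lower, upper

# Hypothetical partition of six states of an electrical information system.
classes = [{1, 2}, {3, 4}, {5, 6}]
X = {1, 2, 3}  # the concept we try to describe

lower, upper = approximations(classes, X)
print(sorted(lower), sorted(upper))
# X is rough: its boundary region upper - lower = {3, 4} is non-empty.
```

The lower approximation collects classes certainly inside X, the upper collects classes possibly inside; rules derived from the lower approximation are certain, those from the boundary only possible.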
Procedia PDF Downloads 454
17127 Macroeconomic Implications of Artificial Intelligence on Unemployment in Europe
Authors: Ahmad Haidar
Abstract:
Modern economic systems are characterized by growing complexity, and addressing their challenges requires innovative approaches. This study examines the implications of artificial intelligence (AI) on unemployment in Europe from a macroeconomic perspective, employing data modeling techniques to understand the relationship between AI integration and labor market dynamics. To understand the AI-unemployment nexus comprehensively, this research considers factors such as sector-specific AI adoption, skill requirements, workforce demographics, and geographical disparities. The study utilizes a panel data model, incorporating data from European countries over the last two decades, to explore the potential short-term and long-term effects of AI implementation on unemployment rates. In addition to investigating the direct impact of AI on unemployment, the study also delves into the potential indirect effects and spillover consequences. It considers how AI-driven productivity improvements and cost reductions might influence economic growth and, in turn, labor market outcomes. Furthermore, it assesses the potential for AI-induced changes in industrial structures to affect job displacement and creation. The research also highlights the importance of policy responses in mitigating potential negative consequences of AI adoption on unemployment. It emphasizes the need for targeted interventions such as skill development programs, labor market regulations, and social safety nets to enable a smooth transition for workers affected by AI-related job displacement. Additionally, the study explores the potential role of AI in informing and transforming policy-making to ensure more effective and agile responses to labor market challenges. 
In conclusion, this study provides a comprehensive analysis of the macroeconomic implications of AI on unemployment in Europe, highlighting the importance of understanding the nuanced relationships between AI adoption, economic growth, and labor market outcomes. By shedding light on these relationships, the study contributes valuable insights for policymakers, educators, and researchers, enabling them to make informed decisions in navigating the complex landscape of AI-driven economic transformation.
Keywords: artificial intelligence, unemployment, macroeconomic analysis, European labor market
Procedia PDF Downloads 77
17126 Iterative Solver for Solving Large-Scale Frictional Contact Problems
Authors: Thierno Diop, Michel Fortin, Jean Deteix
Abstract:
Since the precise formulation of the elastic part is irrelevant for the description of the algorithm, we shall consider a generic case. In practice, however, we will have to deal with a nonlinear material (for instance, a Mooney-Rivlin model). We are interested in solving a finite element approximation of the problem, leading to large-scale nonlinear discrete problems and, after linearization, to large linear systems and ultimately to calculations needing iterative methods. This also implies that the penalty method, and therefore the augmented Lagrangian method, are to be avoided because of their negative effect on the condition number of the underlying discrete systems and thus on the convergence of iterative methods. This is a break from the mainstream of methods for contact, in which the augmented Lagrangian is the principal tool. We shall first present the problem and its discretization; this will lead us to describe a general solution algorithm relying on a preconditioner for saddle-point problems, which we shall describe in some detail as it is not entirely standard. We will propose an iterative approach for solving three-dimensional frictional contact problems between elastic bodies, including contact with a rigid body, contact between two or more bodies, and also self-contact.
Keywords: frictional contact, three-dimensional, large-scale, iterative method
Procedia PDF Downloads 210
17125 Distribution and Population Status of Canis spp. Threats and Conservation in Lehri Nature Park, Salt Range, District Jhelum
Authors: Muhammad Saad, AzherBaig, Anwar Maqsood, Muhammad Waseem
Abstract:
The grey wolf has been ranked as endangered and the Asiatic jackal as near threatened in Pakistan. Scientific data on the populations of and threats to these species are not available in Pakistan, although they are required for proper management and conservation. The present study was conducted to collect data on the distribution range, population status and threats to both of these Canis species in Lehri Nature Park. The data were collected using direct observations and indirect signs in the field. The populations of the grey wolf and Asiatic jackal were scattered in pockets of the study area and its surroundings. The current population was estimated at 6 grey wolves and 28 Asiatic jackals in the study area. The present study showed that the grey wolf and Asiatic jackal were distributed in the northern and southern parts of the study area, which have dense vegetation cover of trees and shrubs between the altitudes of 330 m and 515 m. The research findings revealed that the scrub forest is the most preferred habitat of both species, but due to anthropogenic pressure the scrub forest is under severe threat. The dominant tree species were Acacia modesta, Zizyphus nummularia, and Prosopis juliflora, and the shrub species were Dodonea viscosa, Calotropis procera and Adhatoda vasica. The urial is one of the natural prey species; its population is low for a number of reasons, and therefore the wolves depended mostly on the livestock of the local and nomadic shepherds. The main prey among the livestock was goats and sheep. Interviews were conducted with eye witnesses of wolf attacks, including livestock being killed by groups of 5-6 wolves in different hamlets in the study area. The killing rate of livestock by the wolves was greater when the nomadic shepherds were present in the area and decreased when they left. The presence of nomadic shepherds and the killing rate are related to the shifting of the wolves from the study area. It is further concluded that the populations of the grey wolf and Asiatic jackal have decreased over time due to lower availability of natural prey species and habitat destruction.
Keywords: wildlife ecology, population conservation, rehabilitation, conservation
Procedia PDF Downloads 501
17124 Ge₁₋ₓSnₓ Alloys with Tuneable Energy Band Gap on GaAs (100) Substrate Manufactured by a Modified Magnetron Co-Sputtering
Authors: Li Qian, Jinchao Tong, Daohua Zhang, Weijun Fan, Fei Suo
Abstract:
Photonic applications based on group IV semiconductors have always been of interest but also a challenge for the research community. We report manufacturing group IV Ge₁₋ₓSnₓ alloys with tuneable energy band gaps on (100) GaAs substrates by modified radio frequency magnetron co-sputtering. Images taken by atomic force microscopy and scanning electron microscopy clearly demonstrate a smooth surface profile, with Ge₁₋ₓSnₓ nanoclusters of several tens of nanometers in size. Transmittance spectra measured by Fourier transform infrared spectroscopy showed energy gaps changing with the variation in elementary composition. Calculation results by the 8-band k·p method are consistent with the measured gaps. Our deposition system realized direct growth of Ge₁₋ₓSnₓ thin films on GaAs (100) substrates by sputtering. This simple deposition method was modified to grow high-quality photonic materials with tuneable energy gaps. This work provides an alternative and successful method for fabricating group IV photonic semiconductor materials.
Keywords: GeSn, crystal growth, sputtering, photonic
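The composition-tuned gap can be sketched with the usual quadratic (bowing) interpolation between the endpoint direct gaps; the endpoint values and bowing parameter below are typical literature figures, not numbers from this work:

```python
def gesn_gap(x, eg_ge=0.80, eg_sn=-0.41, bow=2.1):
    """Direct band gap (eV) of Ge1-xSnx via quadratic interpolation:
    Eg(x) = (1 - x) * Eg_Ge + x * Eg_Sn - b * x * (1 - x).
    Endpoint gaps and bowing parameter b are typical literature values,
    not fitted to this work."""
    return (1 - x) * eg_ge + x * eg_sn - bow * x * (1 - x)

# The gap narrows rapidly with Sn content, which is how composition
# tunes the absorption edge.
for x in (0.0, 0.05, 0.10):
    print(x, round(gesn_gap(x), 3))
```

The strong bowing term is why even a few percent of Sn shifts the absorption edge far more than linear (Vegard-type) interpolation would suggest.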
Procedia PDF Downloads 144
17123 Ray Tracing Modified 3D Image Method Simulation of Picocellular Propagation Channel Environment
Authors: Fathi Alwafie
Abstract:
In this paper we present a simulation of the propagation characteristics of the picocellular propagation channel environment. The first aim has been to find a correct description of the environment for the received wave. The result of the first investigations is that the indoor wave environment changes significantly as we change the electrical parameters of the construction materials. A modified 3D ray tracing image method tool has been utilized for the coverage prediction. A detailed analysis of the dependence of the indoor wave on the wide-band characteristics of the channel, root mean square (RMS) delay spread and mean excess delay, is also presented.
Keywords: propagation, ray tracing, network, mobile computing
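The two wide-band characteristics named above are simple moments of the power delay profile: the mean excess delay is the power-weighted mean delay, and the RMS delay spread is the corresponding standard deviation. A sketch with an invented profile:

```python
def delay_stats(delays_ns, powers):
    """Mean excess delay and RMS delay spread of a power delay profile."""
    p_tot = sum(powers)
    mean = sum(d * p for d, p in zip(delays_ns, powers)) / p_tot
    second = sum(d * d * p for d, p in zip(delays_ns, powers)) / p_tot
    rms = (second - mean * mean) ** 0.5
    return mean, rms

# Invented indoor power delay profile: delays in ns, linear power.
delays = [0, 10, 25, 50]
powers = [1.0, 0.5, 0.2, 0.1]
mean, rms = delay_stats(delays, powers)
print(round(mean, 2), round(rms, 2))
```

In a ray tracing simulation, each ray's arrival time and received power contribute one (delay, power) pair to this profile.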
Procedia PDF Downloads 400
17122 OCR/ICR Text Recognition Using ABBYY FineReader as an Example Text
Authors: A. R. Bagirzade, A. Sh. Najafova, S. M. Yessirkepova, E. S. Albert
Abstract:
This article describes a text recognition method based on optical character recognition (OCR). The features of the OCR method were examined using the ABBYY FineReader program, which performs automatic text recognition in images. OCR is necessary because optical input devices can only deliver raster graphics. Text recognition is the task of recognizing letters shown as such, identifying them and assigning them numerical values in accordance with the usual text encodings (ASCII, Unicode). The particular contribution of this study, conducted by the authors using ABBYY FineReader as an example, is to confirm and show in practice the improvement of digital text recognition platforms for electronic publication.
Keywords: ABBYY FineReader system, algorithm symbol recognition, OCR/ICR techniques, recognition technologies
Procedia PDF Downloads 168
17121 Fuzzy Multi-Criteria Decision-Making Based on Ignatian Discernment Process
Authors: Pathinathan Theresanathan, Ajay Minj
Abstract:
The Ignatian Discernment Process (IDP) is an intense decision-making tool for deciding on life issues. Decisions are influenced by various factors outside of the decision maker as well as inclinations within. This paper develops IDP in the context of the fuzzy multi-criteria decision making (FMCDM) process. The extended VIKOR method is a decision-making method that encompasses even conflict situations and accommodates weights on various issues. Various aspects of IDP, namely the three ways of decision making and the tactics of inner desires, are observed, analyzed and articulated within the framework of fuzzy rules. Decision-making situations are broadly categorized into two types: issues outside of the decision maker that influence the person, and the inner feelings that also play a vital role in coming to a conclusion. IDP integrates both categories using the extended VIKOR method. Case studies are carried out and analyzed with the FMCDM process. Finally, IDP is verified with an illustrative case study, and the results are interpreted. A confused person who could not come to a conclusion is able to make a decision on a concrete way of life through IDP. The proposed IDP model recommends an integrated and committed approach to value-based decision making.
Keywords: AHP, FMCDM, IDP, ignatian discernment, MCDM, VIKOR
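A minimal sketch of the VIKOR ranking core (without the fuzzy extension); the alternatives, scores, and weights below are hypothetical:

```python
def vikor_q(scores, weights, v=0.5):
    """VIKOR compromise index Q for each alternative; scores[a][c] with
    every criterion to be maximized, v weighing group utility vs regret."""
    m = len(weights)
    best = [max(s[c] for s in scores) for c in range(m)]
    worst = [min(s[c] for s in scores) for c in range(m)]
    S, R = [], []
    for s in scores:
        terms = [weights[c] * (best[c] - s[c]) / (best[c] - worst[c])
                 for c in range(m)]
        S.append(sum(terms))   # group utility (weighted sum of regrets)
        R.append(max(terms))   # individual regret (worst criterion)
    s0, s1, r0, r1 = min(S), max(S), min(R), max(R)
    return [v * (si - s0) / (s1 - s0) + (1 - v) * (ri - r0) / (r1 - r0)
            for si, ri in zip(S, R)]

# Hypothetical options in a discernment, scored on three criteria;
# the lowest Q indicates the best compromise.
scores = [[7, 8, 6], [9, 5, 7], [6, 6, 9]]
weights = [0.4, 0.4, 0.2]
Q = vikor_q(scores, weights)
print(Q.index(min(Q)))
```

The blend of group utility S and individual regret R is what lets VIKOR handle conflict situations: an option that is good on average but very poor on one criterion is penalized through R.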
Procedia PDF Downloads 260
17120 Segmentation Using Multi-Thresholded Sobel Images: Application to the Separation of Stuck Pollen Grains
Authors: Endrick Barnacin, Jean-Luc Henry, Jimmy Nagau, Jack Molinie
Abstract:
Being able to identify biological particles such as spores, viruses, or pollens is important for health care professionals, as it allows for appropriate therapeutic management of patients. Optical microscopy is a technology widely used for the analysis of these types of microorganisms, because, compared to other types of microscopy, it is not expensive. The analysis of an optical microscope slide is a tedious and time-consuming task when done manually. However, using machine learning and computer vision, this process can be automated. The first step of an automated microscope slide image analysis process is segmentation. During this step, the biological particles are localized and extracted. Very often, the use of an automatic thresholding method is sufficient to locate and extract the particles. However, in some cases, the particles are not extracted individually because they are stuck to other biological elements. In this paper, we propose a stuck particles separation method based on the use of the Sobel operator and thresholding. We illustrate it by applying it to the separation of 813 images of adjacent pollen grains. The method correctly separated 95.4% of these images.
Keywords: image segmentation, stuck particles separation, Sobel operator, thresholding
Procedia PDF Downloads 130
17119 Study on Seismic Assessment of Earthquake-Damaged Reinforced Concrete Buildings
Authors: Fu-Pei Hsiao, Fung-Chung Tu, Chien-Kuo Chiu
Abstract:
In this work, to develop a method for the detailed assessment of the post-earthquake seismic performance of RC buildings in Taiwan, experimental data for several column specimens with various failure modes (flexural failure, flexural-shear failure, and shear failure) are used to derive reduction factors of seismic capacity for specified damage states. According to the damage states of RC columns and their corresponding seismic reduction factors suggested by the experimental data, this work applies the detailed seismic performance assessment method to identify the seismic capacity of earthquake-damaged RC buildings. Additionally, a post-earthquake emergency assessment procedure is proposed that can provide the data needed for decisions about earthquake-damaged buildings in a region with high seismic hazard. Finally, three actual earthquake-damaged school buildings in Taiwan are used as a case study to demonstrate the application of the proposed assessment method.
Keywords: seismic assessment, seismic reduction factor, residual seismic ratio, post-earthquake, reinforced concrete, building
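The abstract gives neither the derived reduction factors nor the aggregation rule; as a sketch only, a storey-level residual seismic capacity ratio could be assembled from per-column damage states like this (the factor values and the linear summation are illustrative assumptions, not the study's experimental results):

```python
# Hypothetical capacity reduction factors per damage state
# (illustrative values only, not those derived in the study).
REDUCTION = {"none": 1.0, "slight": 0.9, "moderate": 0.6, "severe": 0.3}

def residual_capacity_ratio(columns):
    """Residual-to-original seismic capacity ratio of a storey.

    columns: iterable of (original_capacity, damage_state) pairs; the
    storey capacity is assumed to sum linearly over its columns.
    """
    original = sum(cap for cap, _ in columns)
    residual = sum(cap * REDUCTION[state] for cap, state in columns)
    return residual / original

# Two equal columns, one undamaged and one moderately damaged.
ratio = residual_capacity_ratio([(100.0, "none"), (100.0, "moderate")])
```

A ratio near 1.0 would indicate little loss of capacity; low values would flag the building for detailed assessment.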
Procedia PDF Downloads 400
17118 Quantification of Hydrogen Sulfide and Methyl Mercaptan in Air Samples from a Waste Management Facility
Authors: R. F. Vieira, S. A. Figueiredo, O. M. Freitas, V. F. Domingues, C. Delerue-Matos
Abstract:
The presence of sulphur compounds like hydrogen sulphide and mercaptans is one of the reasons why waste-water treatment and waste management are associated with odour emissions. In this context, having a method for quantifying these compounds helps in the optimization of treatments aimed at their elimination, namely biofiltration processes. The aim of this study was the development of a method for the quantification of odorous gases in air samples from waste treatment plants. A method based on head-space solid-phase microextraction (HS-SPME) coupled with gas chromatography - flame photometric detection (GC-FPD) was used to analyse H2S and methyl mercaptan (MM). The extraction was carried out with a 75-μm Carboxen-polydimethylsiloxane fiber coating at 22 ºC for 20 min, and the analysis was performed on a GC 2010 Plus A from Shimadzu with a sulphur filter detector: splitless mode (0.3 min); the column temperature program started at 60 ºC and increased by 15 ºC/min to 100 ºC (2 min). The injector temperature was held at 250 ºC, and the detector at 260 ºC. For the calibration curve, a gas diluter (digital Hovagas G2 - Multi Component Gas Mixer) was used to prepare the standards. This unit had two input connections, one for a stream of the gas to be diluted and another for a stream of nitrogen, and an output connected to a glass bulb. A 40 ppm H2S cylinder and a 50 ppm MM cylinder were used. The equipment was programmed to the selected concentration and automatically carried out the dilution into the glass bulb. The mixture was left flowing through the glass bulb for 5 min, and then the extremities were closed. This method allowed calibration in the ranges 1-20 ppm for H2S and 0.02-0.1 ppm and 1-3.5 ppm for MM. Several quantifications of air samples from the inlet and outlet of a biofilter operating in a waste management facility in the north of Portugal allowed the evaluation of the biofilter's performance.
Keywords: biofiltration, hydrogen sulphide, mercaptans, quantification
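Once the diluted standards have been measured, quantification reduces to a linear calibration curve; a minimal sketch (the concentration and response values below are invented for illustration, not the study's data):

```python
import numpy as np

def calibration_curve(concentrations, responses):
    """Least-squares linear calibration: response = slope * conc + intercept."""
    slope, intercept = np.polyfit(concentrations, responses, 1)
    return slope, intercept

def quantify(response, slope, intercept):
    """Concentration of an unknown sample from its detector response."""
    return (response - intercept) / slope

# Illustrative standards, e.g. H2S in the 1-20 ppm calibration range.
conc = np.array([1.0, 5.0, 10.0, 20.0])        # ppm
resp = 3.0 * conc + 2.0                        # hypothetical peak areas
slope, intercept = calibration_curve(conc, resp)
unknown_ppm = quantify(32.0, slope, intercept)
```

In practice the FPD response to sulphur compounds is often non-linear, which is one reason the MM calibration was split into two ranges; a per-range linear fit as above is a common workaround.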
Procedia PDF Downloads 476
17117 Behavior of Beam-Column Nodes in Reinforced Concrete in Earthquake Zones
Authors: Zaidour Mohamed, Ghalem Ali Jr., Achit Henni Mohamed
Abstract:
This project studies beam-column joints of reinforced concrete structures subjected to seismic loads. A literature review was made to clarify the work done by researchers in the last three decades, and especially the results of the last two years, concerning the determination of the method for calculating the transverse reinforcement in the different joints of a structure. For the implementation, the internal forces in the columns and beams of an R+4 building in seismic zone 3 were calculated using the finite element method through software. These results are the basis of our work, which led to the calculation of the transverse reinforcement of the joints of the structure in question.
Keywords: beam–column joints, cyclic loading, shearing force, damaged joint
Procedia PDF Downloads 550
17116 Non-Invasive Imaging of Tissue Using Near Infrared Radiations
Authors: Ashwani Kumar Aggarwal
Abstract:
NIR light is non-ionizing and can pass easily through living tissues such as the breast without any harmful effects. Therefore, the use of NIR light to image biological tissue and quantify its optical properties is a good choice over other, invasive methods. Optical tomography involves two steps. The first is the forward problem: finding the measurements of light transmitted through the tissue from source to detector, given the spatial distribution of absorption and scattering properties. The second is the reconstruction problem. In X-ray tomography, there are standard reconstruction methods, such as filtered back projection and the algebraic reconstruction methods, but these cannot be applied as such in optical tomography due to the highly scattering nature of biological tissue. A hybrid reconstruction algorithm has been implemented in this work which takes into account the highly scattered paths taken by photons while back-projecting the forward data obtained during Monte Carlo simulation. The reconstructed image suffers from blurring due to the point spread function. This blurred reconstructed image has been enhanced using a digital filter which is optimal in the mean-square sense.
Keywords: least-squares optimization, filtering, tomography, laser interaction, light scattering
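The abstract does not specify the filter's form; a standard mean-square-optimal choice for deblurring with a known point spread function is the Wiener filter, sketched here in the frequency domain (the box-shaped PSF and regularization constant are illustrative assumptions):

```python
import numpy as np

def wiener_deblur(blurred, psf, k=1e-3):
    """Frequency-domain Wiener deconvolution.

    Optimal in the mean-square sense for a known blur kernel and a scalar
    noise-to-signal ratio k; assumes circular (periodic) blurring.
    """
    H = np.fft.fft2(psf, s=blurred.shape)          # PSF transfer function
    Y = np.fft.fft2(blurred)
    G = np.conj(H) / (np.abs(H) ** 2 + k)          # Wiener filter
    return np.real(np.fft.ifft2(G * Y))

# Illustrative test image and a 3x3 box PSF standing in for the system blur.
rng = np.random.default_rng(0)
x = rng.random((32, 32))
psf = np.ones((3, 3)) / 9.0
H = np.fft.fft2(psf, s=x.shape)
blurred = np.real(np.fft.ifft2(np.fft.fft2(x) * H))
restored = wiener_deblur(blurred, psf)
```

At frequencies where the PSF response is strong the filter approximates the inverse filter; where it is weak, the k term suppresses noise amplification rather than dividing by near-zero values.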
Procedia PDF Downloads 316
17115 Petra: Simplified, Scalable Verification Using an Object-Oriented, Compositional Process Calculus
Authors: Aran Hakki, Corina Cirstea, Julian Rathke
Abstract:
Formal methods are yet to be utilized in mainstream software development due to issues of scaling and implementation costs. This work is about developing a scalable, simplified, pragmatic formal software development method with strong correctness properties and guarantees that are easy to prove. The method aims to be easy to learn, use and apply without extensive training and experience in formal methods. Petra is proposed as an object-oriented process calculus with composable data types and sequential/parallel processes. Petra has a simple denotational semantics, which includes a definition of Correct by Construction. The aim is for Petra to be a standard which can be implemented to execute on various mainstream programming platforms such as Java. Work towards an implementation of Petra as a Java EDSL (Embedded Domain Specific Language) is also discussed.
Keywords: compositionality, formal method, software verification, Java, denotational semantics, rewriting systems, rewriting semantics, parallel processing, object-oriented programming, OOP, programming language, correct by construction
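Petra's concrete syntax is not given in the abstract; purely to illustrate the flavour of an EDSL built from sequential and parallel process combinators (everything below — the state-as-dictionary model and the disjoint-writes merge rule — is an assumption, not Petra's semantics):

```python
def seq(*procs):
    """Sequential composition: thread the state through each process in order."""
    def run(state):
        for p in procs:
            state = p(state)
        return state
    return run

def par(*procs):
    """Parallel composition, modelled denotationally: run every branch on a
    copy of the input state and merge only the keys each branch changed.
    Assumes branches write disjoint keys (no write conflicts)."""
    def run(state):
        merged = dict(state)
        for p in procs:
            out = p(dict(state))
            for key, value in out.items():
                if key not in state or state[key] != value:
                    merged[key] = value
        return merged
    return run

# Two tiny processes over a shared dictionary state.
inc = lambda s: {**s, "x": s["x"] + 1}
set_y = lambda s: {**s, "y": 7}

composed = seq(par(inc, set_y), inc)
result = composed({"x": 0})
```

A compositional semantics like this is what makes correctness proofs scale: properties of `seq(p, q)` and `par(p, q)` follow from properties of `p` and `q` alone.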
Procedia PDF Downloads 144
17114 Study of a Few Additional Posterior Projection Data to 180° Acquisition for Myocardial SPECT
Authors: Yasuyuki Takahashi, Hirotaka Shimada, Takao Kanzaki
Abstract:
Dual-detector SPECT systems are widely used for myocardial SPECT studies. With 180-degree (180°) acquisition, reconstructed images are distorted in the posterior wall of the myocardium due to the lack of sufficient posterior projection data. We hypothesized that the quality of myocardial SPECT images can be improved by adding the acquisition of only a few posterior projections to the ordinary 180° acquisition. The proposed acquisition method (the 180° plus acquisition method) uses a dual-detector SPECT system with a pair of detectors arranged perpendicularly (at 90°). The sampling angle was 5°, and the acquisition range was 180°, from 45° right anterior oblique to 45° left posterior oblique. After the 180° acquisition, the detectors moved to additional acquisition positions on the reverse side: once for 2 projections, twice for 4 projections, or 3 times for 6 projections. Since these acquisition methods cannot be performed on the present system, the actual data acquisition was done over 360° with a sampling angle of 5°, and the projection data corresponding to the above acquisition positions were extracted for reconstruction. We performed phantom studies and a clinical study. SPECT images were compared by profile curve analysis and also quantitatively by contrast ratio. The distortion was improved by the 180° plus method, and profile curve analysis showed an improvement in the cardiac cavity. Analysis of the contrast ratio revealed that the SPECT images of the phantoms and the clinical study were improved over 180° acquisition by the present methods. The difference in contrast was not clearly recognized between 180° plus 2 projections, 180° plus 4 projections, and 180° plus 6 projections. The 180° plus 2 projections method may therefore be feasible for myocardial SPECT, because both the distortion of the image and the contrast were improved.
Keywords: 180° plus acquisition method, a few posterior projections, dual-detector SPECT system, myocardial SPECT
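The abstract does not define its contrast ratio; a sketch assuming the common (wall − cavity)/(wall + cavity) form, with invented count values for illustration:

```python
def contrast_ratio(wall_counts, cavity_counts):
    """Contrast between mean myocardial-wall and cavity counts.

    The exact definition used in the study is not given; the common
    (wall - cavity) / (wall + cavity) form is assumed here.
    """
    return (wall_counts - cavity_counts) / (wall_counts + cavity_counts)

# Hypothetical mean counts from ROIs on two reconstructions of one phantom.
cr_180 = contrast_ratio(300.0, 100.0)        # plain 180° acquisition
cr_180_plus = contrast_ratio(320.0, 80.0)    # 180° plus 2 posterior projections
```

Under this definition, deeper cavity counts relative to the wall yield a higher ratio, so an improved 180° plus reconstruction would show a larger value than the plain 180° one.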
Procedia PDF Downloads 295
17113 Obstacle Classification Method Based on 2D LIDAR Database
Authors: Moohyun Lee, Soojung Hur, Yongwan Park
Abstract:
This paper proposes a method that uses only a LIDAR system to classify an obstacle and determine its type, by establishing a database for classifying obstacles based on LIDAR data. The existing LIDAR system has an advantage in recognizing obstructions for an autonomous vehicle in terms of accuracy and shorter recognition time; however, it was difficult to determine the type of obstacle, and therefore accurate path planning based on the obstacle type was not possible. To overcome this problem, a method of classifying the obstacle type based on existing LIDAR, using the width of the obstacle material, was previously proposed, but width measurement alone was not sufficient to improve accuracy. In this research, the width data was used for a first classification; a database of LIDAR intensity data for four major obstacle materials found on the road was created; the LIDAR intensity data of actual obstacle materials is compared against this database; and the obstacle type is determined by finding the material with the highest similarity value. An experiment using an actual autonomous vehicle in a real environment shows that, although the data quality declined in comparison to 3D LIDAR, it was possible to classify obstacle materials using 2D LIDAR.
Keywords: obstacle, classification, database, LIDAR, segmentation, intensity
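The abstract does not define the similarity measure; a minimal sketch of the database-lookup step, using nearest Euclidean distance between intensity profiles as the similarity criterion (the material names and all intensity values below are invented for illustration):

```python
import numpy as np

# Hypothetical mean intensity profiles for four road obstacle materials
# (illustrative values only; the study builds these from measured 2D LIDAR data).
DATABASE = {
    "metal":    np.array([0.90, 0.88, 0.91]),
    "plastic":  np.array([0.55, 0.57, 0.54]),
    "concrete": np.array([0.35, 0.33, 0.36]),
    "fabric":   np.array([0.12, 0.10, 0.11]),
}

def classify_material(intensity_profile):
    """Return the database material most similar to the measured profile.

    Similarity here is the (negated) Euclidean distance between profiles;
    the obstacle is assigned to the closest database entry.
    """
    profile = np.asarray(intensity_profile, float)
    return min(DATABASE, key=lambda m: np.linalg.norm(DATABASE[m] - profile))

# A measured profile from a segmented obstacle, after the width-based pre-filter.
material = classify_material([0.56, 0.55, 0.55])
```

In the paper's pipeline this lookup runs only on candidates that passed the width-based first classification, which narrows the set of plausible materials before the intensity comparison.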
Procedia PDF Downloads 349