Search results for: Calibration Methods
14568 Prediction of Road Accidents in Qatar by 2022
Authors: M. Abou-Amouna, A. Radwan, L. Al-kuwari, A. Hammuda, K. Al-Khalifa
Abstract:
There is growing concern over the increasing incidence of road accidents and the consequent loss of human life in Qatar. In light of the future planned event in Qatar, the World Cup 2022, Qatar should take into consideration the future deaths caused by road accidents, and past trends should be considered to give a reasonable picture of what may happen in the future. Qatar's roads should be arranged and paved in a way that accommodates the high population expected at that time, since a huge number of visitors from around the world will be present. Qatar should also consider the road-accident risks arising in that period and plan to maintain a high level of safety strategy. Based on the increase in the number of road accidents in Qatar from 1995 until 2012, an analysis of the elements affecting and causing road accidents is carried out. This paper aims to identify and critique the factors that have a strong effect on causing road accidents in the State of Qatar, and to predict the total number of road accidents in Qatar in 2022. Alternative methods are discussed, and the most applicable ones according to previous research are selected for further study. The methods that suit the existing case in Qatar were the multiple linear regression model (MLR) and the artificial neural network (ANN). These methods are analyzed and their findings compared. We conclude that by using MLR the number of accidents in 2022 will reach 355,226, and by using ANN, 216,264. We conclude that MLR gave better results than ANN because the artificial neural network does not fit data with a wide range of variation. Keywords: road safety, prediction, accident, model, Qatar
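As a sketch of how a regression-based accident forecast like the one described might be set up, the following minimal example fits a single-predictor linear trend by ordinary least squares over the 1995-2012 window and extrapolates it to 2022. The yearly counts are illustrative placeholders, not the actual Qatari accident data, and a real MLR model would use several predictors rather than the year alone.

```python
# Minimal sketch: fit a linear trend to yearly accident counts and
# extrapolate, in the spirit of the MLR model described in the abstract.
# The counts below are illustrative placeholders, not the actual data.

def fit_linear_trend(years, counts):
    """Ordinary least squares for y = a + b*x with a single predictor (year)."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(counts) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, counts))
    sxx = sum((x - mean_x) ** 2 for x in years)
    b = sxy / sxx                 # slope: extra accidents per year
    a = mean_y - b * mean_x       # intercept
    return a, b

def predict(a, b, year):
    return a + b * year

years = list(range(1995, 2013))                    # 1995..2012, as in the study
counts = [1000 + 150 * (y - 1995) for y in years]  # placeholder linear growth

a, b = fit_linear_trend(years, counts)
print(predict(a, b, 2022))  # extrapolated count for 2022
```

With perfectly linear placeholder data the extrapolation is exact; with real data one would also inspect residuals before trusting the 2022 figure.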
Procedia PDF Downloads 259
14567 Review on Optimization of Drinking Water Treatment Process
Authors: M. Farhaoui, M. Derraz
Abstract:
In drinking water treatment processes, optimization of the treatment is an issue of particular concern. In general, the process consists of many units, such as settling, coagulation, flocculation, sedimentation, filtration, and disinfection. Optimization of the process consists of measures to decrease managing and monitoring expenses and to improve the quality of the produced water. The objective of this study is to provide water treatment operators with methods and practices that enable them to attain the most effective use of the facility and, in consequence, optimize the cubic-meter price of the treated water. This paper proposes a review of the optimization of the drinking water treatment process by analyzing all of the water treatment units, and it offers some solutions for maximizing water treatment performance without compromising water quality standards. Some of these solutions and methods were implemented in the water treatment plant located in the middle of Morocco (Meknes). Keywords: coagulation process, optimization, turbidity removal, water treatment
Procedia PDF Downloads 424
14566 Passenger Preferences on Airline Check-In Methods: Traditional Counter Check-In Versus Common-Use Self-Service Kiosk
Authors: Cruz Queen Allysa Rose, Bautista Joymeeh Anne, Lantoria Kaye, Barretto Katya Louise
Abstract:
The study presents passengers' preferences regarding the quality of service provided by the two airline check-in methods currently present in airports: traditional counter check-in and common-use self-service kiosks. One study has shown that airlines perceive self-service kiosks alone to be sufficient to ensure adequate service and customer satisfaction; in contrast, agents and passengers stated that kiosks alone are not enough and that human interaction is essential. With reference to former studies that established opposing ideas about which check-in method is the more favorable to employ, the purpose of this study is to present a recommendation that fills in the gap between the conflicting ideas by comparing the perceived quality of service through the RATER model. Furthermore, this study discusses the major competencies present in each method, which are supported by two theories: the FIRO Theory of Needs, upholding the importance of inclusion, control, and affection, and the Queueing Theory, which points out the discipline of passengers and the length of the queue line as important factors affecting quality of service. The findings of the study were based on data gathered by the researchers from selected Thomasian third- and fourth-year college students, enrolled in the first semester of the academic year 2014-2015, who had already experienced both airline check-in methods and were selected through stratified probability sampling. The statistical treatments applied in order to interpret the data were mean, frequency, standard deviation, t-test, logistic regression, and the chi-square test.
The study ultimately revealed a greater effect on passenger preference from the satisfaction experienced with common-use self-service kiosks than with traditional counter check-in. Keywords: traditional counter check-in, common-use self-service kiosks, airline check-in methods
Procedia PDF Downloads 409
14565 Preparation and Characterization of Nanometric Ni-Zn Ferrite via Different Methods
Authors: Ebtesam E. Ateia, L. M. Salah, A. H. El-Bassuony
Abstract:
The aim of the present study was to develop a nanosized material with enhanced structural properties suitable for many applications. Nanostructured ferrite of composition Ni0.5Zn0.5Cr0.1Fe1.9O4 was prepared by the sol-gel, co-precipitation, citrate-gel, flash, and oxalate precursor methods. Structural and microstructural analyses of the investigated samples were carried out. It was observed that the lattice parameter of the cubic spinel was constant and that the positions of both the tetrahedral and octahedral bands were fixed. The values of the lattice parameter played a significant role in determining the stoichiometric cation distribution of the composition. The average crystallite sizes of the investigated samples ranged from 16.4 to 69 nm. The discussion is based on a comparison of the average crystallite size of the investigated samples, indicating that the co-precipitation method was the most effective one in producing samples with small crystallite size. Keywords: chemical preparation, ferrite, grain size, nanocomposites, sol-gel
Procedia PDF Downloads 341
14564 Finite-Sum Optimization: Adaptivity to Smoothness and Loopless Variance Reduction
Authors: Bastien Batardière, Joon Kwon
Abstract:
For finite-sum optimization, variance-reduced (VR) gradient methods compute at each iteration the gradient of a single function (or of a mini-batch), yet achieve faster convergence than SGD thanks to a carefully crafted lower-variance stochastic gradient estimator that reuses past gradients. Another important line of research of the past decade in continuous optimization is adaptive algorithms such as AdaGrad, which dynamically adjust the (possibly coordinate-wise) learning rate to past gradients and thereby adapt to the geometry of the objective function. Variants such as RMSprop and Adam demonstrate outstanding practical performance that has contributed to the success of deep learning. In this work, we present AdaLVR, which combines the AdaGrad algorithm with loopless variance-reduced gradient estimators such as SAGA or L-SVRG, and benefits from a straightforward construction and a streamlined analysis. We show that AdaLVR inherits both the good convergence properties of VR methods and the adaptive nature of AdaGrad: in the case of L-smooth convex functions, we establish a gradient complexity of O(n + (L + √(nL))/ε) without prior knowledge of L. Numerical experiments demonstrate the superiority of AdaLVR over state-of-the-art methods. Moreover, we empirically show that the RMSprop and Adam algorithms combined with variance-reduced gradient estimators achieve even faster convergence. Keywords: convex optimization, variance reduction, adaptive algorithms, loopless
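A minimal sketch of the combination the abstract describes: a loopless variance-reduced (L-SVRG-style) gradient estimator driven by an AdaGrad step size, on a toy one-dimensional least-squares finite sum. All problem data and hyperparameters are illustrative; this is a sketch of the idea, not the authors' AdaLVR implementation.

```python
import math
import random

# Toy finite sum: minimize (1/n) * sum_i 0.5*(a[i]*w - b[i])**2 over scalar w.
# Since b[i] = 2*a[i], every summand is minimized at w = 2 (the minimizer).
random.seed(0)
a = [1.0, 2.0, 3.0, 4.0]
b = [2.0, 4.0, 6.0, 8.0]
n = len(a)

def grad_i(w, i):                  # gradient of the i-th summand
    return a[i] * (a[i] * w - b[i])

def full_grad(w):
    return sum(grad_i(w, i) for i in range(n)) / n

w, w_ref = 0.0, 0.0
mu = full_grad(w_ref)              # full gradient at the reference point
G, eta, p = 0.0, 1.0, 0.25         # AdaGrad accumulator, base step, refresh prob

for _ in range(2000):
    i = random.randrange(n)
    g = grad_i(w, i) - grad_i(w_ref, i) + mu   # loopless SVRG-style estimator
    G += g * g                                  # AdaGrad accumulation
    w -= eta * g / math.sqrt(G + 1e-10)         # AdaGrad step
    if random.random() < p:                     # loopless reference refresh
        w_ref = w
        mu = full_grad(w_ref)

print(w)  # approaches the minimizer 2.0
```

The coin-flip reference refresh is what makes the estimator "loopless": it replaces the fixed inner loop of classical SVRG while keeping the same expected refresh frequency.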
Procedia PDF Downloads 7214563 Application the Queuing Theory in the Warehouse Optimization
Authors: Jaroslav Masek, Juraj Camaj, Eva Nedeliakova
Abstract:
The aim of store management optimization is not only the design of the store itself, including its equipment, technology, and operation. We must also consider the synchronization of technological, transport, storage, and service operations throughout the whole logistic chain, so that a natural flow of material from provider to consumer is achieved by the shortest possible route, in the shortest possible time, in the requested quality and quantity, and at minimum cost. The paper deals with the application of queuing theory to the optimization of warehouse processes. The first part gives general information about warehousing and the use of mathematical methods for logistics chain optimization. The second part prepares a model of a warehouse within queuing theory. The conclusion of the paper includes two examples of using queuing theory in practice. Keywords: queuing theory, logistics system, mathematical methods, warehouse optimization
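As an illustration of the kind of queuing-theory calculation such a warehouse model rests on, the sketch below computes the standard steady-state metrics of an M/M/1 service channel (for example, one forklift serving arriving pallets). The arrival and service rates are invented for the example, not taken from the paper.

```python
# Minimal sketch: steady-state metrics of an M/M/1 queue, the simplest
# building block of a queuing-theory warehouse model. Rates are illustrative.

def mm1_metrics(lam, mu):
    """Steady-state M/M/1 metrics; requires utilisation rho = lam/mu < 1."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu                 # server utilisation
    return {
        "rho": rho,                # fraction of time the server is busy
        "L": rho / (1 - rho),      # mean number in system
        "Lq": rho ** 2 / (1 - rho),# mean number waiting in queue
        "W": 1 / (mu - lam),       # mean time in system (hours)
        "Wq": rho / (mu - lam),    # mean waiting time in queue (hours)
    }

m = mm1_metrics(lam=6.0, mu=8.0)   # 6 pallets/h arrive, 8 pallets/h served
print(m["rho"], m["L"], m["Wq"])   # 0.75 utilisation, 3 in system, 0.375 h wait
```

Comparing such metrics across candidate layouts (more channels, faster service) is the basic mechanism by which queuing theory guides warehouse design decisions.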
Procedia PDF Downloads 59414562 Performance Evaluation of Contemporary Classifiers for Automatic Detection of Epileptic EEG
Authors: K. E. Ch. Vidyasagar, M. Moghavvemi, T. S. S. T. Prabhat
Abstract:
Epilepsy is a global problem, and with seizures eluding even the smartest of diagnoses, automatic detection based on the electroencephalogram (EEG) would have a huge impact on diagnosis of the disorder. Among a multitude of methods for automatic epilepsy detection, the best method should be identified on the basis of classification accuracy. This paper evaluates and compares the candidate classifiers. Since accuracy depends on the classifier, the paper discusses classifiers such as quadratic discriminant analysis (QDA), classification and regression trees (CART), the support vector machine (SVM), the naive Bayes classifier (NBC), linear discriminant analysis (LDA), K-nearest neighbors (KNN), and artificial neural networks (ANN). Results show that the ANN is the most accurate of all the above classifiers, with 97.7% accuracy, 97.25% specificity, and 98.28% sensitivity. It is followed closely by the SVM, with about 1% variation in the results. These results should help researchers choose the best classifier for the detection of epilepsy. Keywords: classification, seizure, KNN, SVM, LDA, ANN, epilepsy
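A minimal sketch of the evaluation loop behind accuracy figures like those quoted: train a classifier on labelled feature vectors, predict a held-out set, and score the fraction correct. For brevity this uses a pure-Python K-nearest-neighbor classifier on invented two-dimensional "EEG feature" points, not the real classifiers or EEG data of the study.

```python
# Minimal sketch of classifier accuracy evaluation with a KNN classifier.
# The 2-D "feature" points below are invented, not real EEG features.

def sqdist(p, q):
    return sum((x - y) ** 2 for x, y in zip(p, q))

def knn_predict(train, x, k=1):
    """Majority vote among the k nearest training points."""
    nearest = sorted(train, key=lambda item: sqdist(item[0], x))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

def accuracy(train, test, k=1):
    hits = sum(1 for x, label in test if knn_predict(train, x, k) == label)
    return hits / len(test)

train = [((0.1, 0.2), "normal"), ((0.2, 0.1), "normal"),
         ((0.9, 0.8), "seizure"), ((0.8, 0.9), "seizure")]
test = [((0.15, 0.15), "normal"), ((0.85, 0.85), "seizure"),
        ((0.3, 0.2), "normal"), ((0.7, 0.9), "seizure")]

print(accuracy(train, test, k=3))
```

The same scoring loop, run with each candidate classifier in place of `knn_predict` (and with specificity and sensitivity computed from the confusion counts), yields the kind of comparison table the abstract reports.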
Procedia PDF Downloads 524
14561 Screening and Optimization of Pretreatments for Rice Straw and Their Utilization for Bioethanol Production Using Developed Yeast Strain
Authors: Ganesh Dattatraya Saratale, Min Kyu Oh
Abstract:
Rice straw is one of the most abundant lignocellulosic waste materials, with annual world production of about 731 Mt. This study addresses the effective utilization of this waste biomass for biofuel production. We present a comparative assessment of numerous pretreatment strategies for rice straw, comprising the major physical, chemical, and physicochemical methods. Among the different methods employed, alkaline pretreatment in combination with sodium chlorite/acetic acid delignification proved to be an efficient pretreatment, with significant improvement in the enzymatic digestibility of the rice straw. A cellulase dose of 20 filter paper units (FPU) released a maximum of 63.21 g/L of reducing sugar, with a 94.45% hydrolysis yield and a 64.64% glucose yield from rice straw. The effects of the different pretreatment methods on biomass structure and complexity were investigated by FTIR, XRD, and SEM analytical techniques. Finally, the enzymatic hydrolysate of the rice straw was used for ethanol production with the developed Saccharomyces cerevisiae SR8 strain. The developed yeast strain enabled efficient fermentation of xylose and glucose and gave high ethanol production. Bioethanol production from lignocellulosic waste biomass is thus a generic, applicable methodology with great implications for using 'green raw materials' and producing the 'green products' much needed today. Keywords: rice straw, pretreatment, enzymatic hydrolysis, FPU, Saccharomyces cerevisiae SR8, ethanol fermentation
Procedia PDF Downloads 540
14560 Environmental Exposure Assessment among Refuellers at Brussels South Charleroi Airport
Authors: Mostosi C., Stéphenne J., Kempeneers E.
Abstract:
Introduction: Refuellers at Brussels South Charleroi Airport (BSCA) expressed concerns about the risks involved in handling JET-A1 fuel. The HSE Manager of BSCA, in collaboration with the occupational physician and the industrial hygiene unit of the External Service of Occupational Medicine, decided to assess the toxicological exposure of these workers. Materials and methods: Two measurement methods were used. The first was to assay three types of metabolites in urine to detect exposure to the xylenes, toluene, and benzene present in aircraft fuels. Of the 32 refuellers in the department, 26 participated in the sampling, and 23 samples were analyzed. The second method targeted the assessment of environmental exposure to certain potentially hazardous substances that refuellers are likely to breathe in work areas at the airport. Two ambient-air measurement campaigns were carried out, using static systems on the one hand and, on the other, individual sensors worn by the refuellers at the level of the respiratory tract. Volatile organic compounds and diesel particles were analyzed. Results: Despite the fears that motivated these analyses, the overall results showed low levels of exposure, far below the existing limit values, both in air quality and in the urinary measurements. Conclusion: These results are comparable to those of a study carried out in several French airports. The staff could be reassured, and the medical surveillance was subsequently modified by the occupational physician. As aviation develops at BSCA, equipment and methods are evolving, and the refuellers' exposure will have to be reassessed. Keywords: refuelling, airport, exposure, fuel, occupational health, air quality
Procedia PDF Downloads 86
14559 Novel Aminoglycosides to Target Resistant Pathogens
Authors: Nihar Ranjan, Derrick Watkins, Dev P. Arya
Abstract:
Current methods in the study of the antibiotic activity of ribosome-targeted antibiotics depend on cell-based bacterial inhibition assays or various forms of ribosomal binding assays. These assays are typically independent of each other, and little direct correlation between ribosomal binding and bacterial inhibition is established by the complementary assay. We have developed novel high-throughput-capable assays for ribosome-targeted drug discovery. One such assay examines a compound's ability to bind to a model ribosomal RNA A-site. We have also coupled this assay to other functional orthogonal assays. Such analysis can provide a valuable understanding of the relationships between two complementary drug screening methods and could be used as a standard analysis to correlate the affinity of a compound for its target with the effect the compound has on a cell. Keywords: bacterial resistance, aminoglycosides, screening, drugs
Procedia PDF Downloads 372
14558 Hybrid Approach for the Min-Interference Frequency Assignment
Authors: F. Debbat, F. T. Bendimerad
Abstract:
Efficient frequency assignment for radio communications becomes more and more crucial as new information technologies and their applications are developed. It consists in defining an assignment of frequencies to the radio links to be established between base stations and mobile transmitters. Separation of the assigned frequencies is necessary to avoid interference; however, unnecessary separation causes an excess requirement for spectrum, the cost of which may be very high. The problem is NP-hard and cannot be solved by conventional optimization algorithms, so it is necessary to use metaheuristic methods. This paper proposes a hybrid approach based on simulated annealing (SA) and tabu search (TS) to solve the problem. Computational results, obtained on a number of standard problem instances, testify to the effectiveness of the proposed approach. Keywords: cellular mobile communication, frequency assignment problem, optimization, tabu search, simulated annealing
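A minimal sketch of the simulated-annealing half of the proposed hybrid: links that interfere must receive different frequencies, and the Metropolis rule occasionally accepts worsening moves so the search can escape local minima. The interference graph, cooling schedule, and all parameters below are illustrative, not taken from the paper.

```python
import math
import random

# Toy min-interference frequency assignment solved by simulated annealing.
# Each of 6 radio links gets one of 3 frequencies; conflicting pairs must differ.
random.seed(1)
n_links, n_freqs = 6, 3
conflicts = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0), (0, 3)]

def cost(assign):
    """Number of interfering pairs that share a frequency (0 = feasible)."""
    return sum(1 for i, j in conflicts if assign[i] == assign[j])

assign = [random.randrange(n_freqs) for _ in range(n_links)]
cur = cost(assign)
best, best_cost = assign[:], cur
T = 2.0
for _ in range(5000):
    i = random.randrange(n_links)
    old = assign[i]
    assign[i] = random.randrange(n_freqs)          # propose a random reassignment
    new = cost(assign)
    # Metropolis rule: always accept improvements, sometimes accept worsenings
    if new > cur and random.random() >= math.exp((cur - new) / T):
        assign[i] = old                            # reject the worsening move
    else:
        cur = new
        if cur < best_cost:
            best, best_cost = assign[:], cur
    T *= 0.999                                     # geometric cooling

print(best_cost)  # 0 means an interference-free assignment was found
```

The tabu-search half of the hybrid would add a short-term memory forbidding recently changed links from being revisited; it is omitted here for brevity.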
Procedia PDF Downloads 387
14557 Circular Labour Migration and Its Consequences in Georgia
Authors: Manana Lobzhanidze
Abstract:
Introduction: The paper argues that labor migration is the most important problem Georgia faces today. The structure of Georgian labor migration by age and gender is analyzed, and the main driving factors of circular labor migration during the last ten years are identified. When studying migration, it is necessary to discuss the interconnection of economic, social, and demographic features, also taking into consideration state regulation policy in terms of education and professional training. Methodology: Different research methods are applied in the paper: statistical methods, such as selection, grouping, observation, and trend analysis, and qualitative research methods, namely analysis, synthesis, induction, deduction, and comparison. Main findings: Labor migrants fill the destination labor market as low-salary workers. The main positive effect of migration from developing countries is poverty eradication, but the process is accompanied by problems such as 'brain drain': the country loses an important part of its intellectual potential, in which households or the state itself have invested. Conclusions: Labor migration is characterized as temporary, but the socio-economic problems of the country often push labor migration toward long-term and illegal migration. Countries with developed economies tighten their migration policies and fight illegal migration with different methods; circular migration helps solve this problem. Conclusions and recommendations are included about the consequences of circular labor migration in Georgia and its influence on the reduction of the unemployment level. Keywords: migration, circular labor migration, labor migration employment, unemployment
Procedia PDF Downloads 180
14556 A Neural Approach for the Offline Recognition of the Arabic Handwritten Words of the Algerian Departments
Authors: Salim Ouchtati, Jean Sequeira, Mouldi Bedda
Abstract:
In this work, we present an offline system for the recognition of the Arabic handwritten words naming the Algerian departments. The study is based mainly on the evaluation of the performance of a neural network trained with the gradient back-propagation algorithm. The parameters used to form the input vector of the neural network are extracted from the binary image of the handwritten word by several methods: distribution parameters, the centered moments of the different projections, and Barr features. It should be noted that these methods are applied to segments obtained after dividing the binary image of the word into six segments. The classification is achieved by a multilayer perceptron. Detailed experiments are carried out, and satisfactory recognition results are reported. Keywords: handwritten word recognition, neural networks, image processing, pattern recognition, features extraction
Procedia PDF Downloads 514
14555 Generating Product Description with Generative Pre-Trained Transformer 2
Authors: Minh-Thuan Nguyen, Phuong-Thai Nguyen, Van-Vinh Nguyen, Quang-Minh Nguyen
Abstract:
Research on automatically generating descriptions for e-commerce products has gained increasing attention in recent years. However, the descriptions generated by existing systems are often less informative and attractive because of a lack of training data or the limitations of approaches that rely on templates or statistical methods. In this paper, we explore a method to generate product descriptions using the GPT-2 model. In addition, we apply text paraphrasing and task-adaptive pretraining techniques to improve the quality of the descriptions generated by the GPT-2 model. Experimental results show that our models outperform the baseline model in both automatic and human evaluation. In particular, our methods achieve promising results not only on the seen test set but also on the unseen test set. Keywords: GPT-2, product description, transformer, task-adaptive, language model, pretraining
Procedia PDF Downloads 199
14554 Forecasting Residential Water Consumption in Hamilton, New Zealand
Authors: Farnaz Farhangi
Abstract:
Many people in New Zealand believe that access to water is inexhaustible, a belief that comes from a history of virtually unrestricted access. For a region like Hamilton, one of New Zealand's fastest-growing cities, it is crucial for policy makers to know about future water consumption and the implementation of rules and regulations such as universal water metering. Hamilton residents use water freely and have little idea of how much water they use. Hence, one objective of this research is forecasting water consumption using different methods. The residential water consumption time series exhibits seasonal and trend variations. Seasonality is the pattern caused by repeating events such as weather conditions in summer and winter, public holidays, etc. The problem with this seasonal fluctuation is that it dominates the other time series components and makes it difficult to determine other variations (such as the effect of educational campaigns, regulation, etc.) in the series. Apart from seasonality, a stochastic trend is also combined with seasonality and affects the forecasting results in different ways. According to the forecasting literature, preprocessing (de-trending and de-seasonalization) is essential to obtain better forecasting results, while other researchers contend that seasonally non-adjusted data should be used. Hence, I address the question: is preprocessing essential? A wide range of forecasting methods exists, with different pros and cons. In this research, I apply double seasonal ARIMA and an artificial neural network (ANN), considering diverse elements such as seasonality and calendar effects (public and school holidays), and combine their results to find the best predicted values. I examine the results of the combined method (hybrid model) and the individual methods and compare their accuracy and robustness. In order to use ARIMA, the data should be stationary.
ANN, in turn, has successful forecasting applications for seasonal and trend time series. Using a hybrid model is a way to improve the accuracy of the methods. Because water demand is dominated by different seasonalities, I combine different methods in order to find their sensitivity to weather conditions, calendar effects, and other seasonal patterns. The advantage of this combination is the reduction of errors by averaging the individual models. It is also useful when we are not sure about the accuracy of each forecasting model, and it can ease the problem of model selection. Using daily residential water consumption data from January 2000 to July 2015 in Hamilton, I show how predictions by the different methods vary. The ANN gives more accurate forecasts than the other methods, and preprocessing is essential when using seasonal time series. The hybrid model reduces average forecasting errors and increases performance. Keywords: artificial neural network (ANN), double seasonal ARIMA, forecasting, hybrid model
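The forecast-combination step described above can be sketched in a few lines: average the predictions of the two individual models and compare mean absolute errors. The series below are invented and deliberately chosen so that the two models' errors have opposite signs, the best case for averaging; real gains are smaller.

```python
# Minimal sketch of forecast combination by simple averaging.
# The series are illustrative, not the Hamilton consumption data.

def mae(actual, pred):
    """Mean absolute error between an actual series and a forecast."""
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

actual  = [100, 120, 110, 130, 125]
arima_f = [ 95, 125, 105, 135, 120]   # stand-in for double seasonal ARIMA output
ann_f   = [105, 115, 115, 125, 130]   # stand-in for ANN output

hybrid = [(x + y) / 2 for x, y in zip(arima_f, ann_f)]

print(mae(actual, arima_f), mae(actual, ann_f), mae(actual, hybrid))
```

Here the hybrid error drops to zero only because the placeholder errors cancel exactly; in practice the hybrid typically lands between the two individual errors or somewhat below them.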
Procedia PDF Downloads 339
14553 Statistical Analysis to Select Evacuation Route
Authors: Zaky Musyarof, Dwi Yono Sutarto, Dwima Rindy Atika, R. B. Fajriya Hakim
Abstract:
Each country should be responsible for the safety of its people, especially those living in disaster-prone areas. One of those responsibilities is providing evacuation routes. Until now, however, the selection of evacuation routes has seemed poorly organized: when a disaster happens, people accumulate on the steps of the evacuation route. That condition is dangerous because it hampers the evacuation process. Using several methods of statistical analysis, the authors suggest how to prepare an evacuation route that is well organized and based on people's habits. Those methods are association rules, sequential pattern mining, hierarchical cluster analysis, and fuzzy logic. Keywords: association rules, sequential pattern mining, cluster analysis, fuzzy logic, evacuation route
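As a sketch of the association-rules idea in this list, the example below computes support and confidence for a hypothetical rule "people leaving through exit A take stairwell S" from invented route records; the exits, stairwells, and counts are illustrative only.

```python
# Minimal sketch of association-rule mining on (invented) evacuation records.
# Each transaction is the set of route elements one person used.

transactions = [
    {"exit_A", "stair_S"}, {"exit_A", "stair_S"}, {"exit_A", "stair_T"},
    {"exit_B", "stair_T"}, {"exit_A", "stair_S"}, {"exit_B", "stair_S"},
]

def support(itemset):
    """Fraction of transactions containing every item in the itemset."""
    return sum(1 for t in transactions if itemset <= t) / len(transactions)

def confidence(lhs, rhs):
    """Estimated P(rhs | lhs): support of the union over support of the left side."""
    return support(lhs | rhs) / support(lhs)

print(support({"exit_A", "stair_S"}))       # how often the pair occurs together
print(confidence({"exit_A"}, {"stair_S"}))  # P(stair_S | exit_A)
```

Rules with high support and confidence reveal the habitual routes around which an organized evacuation plan could be built.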
Procedia PDF Downloads 504
14552 Challenges in the Material and Action-Resistance Factor Design for Embedded Retaining Wall Limit State Analysis
Authors: Kreso Ivandic, Filip Dodigovic, Damir Stuhec
Abstract:
The paper deals with the proposed 'material factor' and 'action-resistance factor' design methods for the design of embedded retaining walls. A parametric analysis was performed to evaluate the differences between the output values of the methods and in comparison with the classic approach. Choosing among the calculation design methods proposed in Eurocode 7 is challenging with respect to current technical regulations and regular engineering practice. The basic criterion for applying a particular design method is to ensure at minimum an equal degree of reliability in relation to current practice. The procedure of combining the relevant partial coefficients according to the design methods was carried out. The use of these partial coefficients should result in the same level of safety regardless of load combinations, material characteristics, and problem geometry. This proposed approach to partial coefficients related to the material and/or action-resistance is aimed at building a bridge between the calculations used so far and pure probability analysis. The measure used to compare the results was an equivalent safety factor determined for each analysis. The results show a visibly wide span of equivalent values of the classic safety factors. Keywords: action-resistance factor design, classic approach, embedded retaining wall, Eurocode 7, limit states, material factor design
Procedia PDF Downloads 232
14551 Comparison of Cyclone Design Methods for Removal of Fine Particles from Plasma Generated Syngas
Authors: Mareli Hattingh, I. Jaco Van der Walt, Frans B. Waanders
Abstract:
A waste-to-energy plasma system was designed by Necsa for commercial use to create electricity from unsorted municipal waste. Fly ash particles must be removed from the syngas stream at operating temperatures of 1000 °C and recycled back into the reactor for complete combustion. A 2D2D high-efficiency cyclone separator was chosen for this purpose. During this study, two cyclone design methods were explored: the Classic Empirical Method (smaller cyclone) and the Flow Characteristics Method (larger cyclone). These designs were optimized with regard to efficiency, so as to remove at minimum 90% of the fly ash particles of average size 10 μm by 50 μm. Wood was used as the feed source at a concentration of 20 g/m3 of syngas. The two designs were then compared at room temperature, using Perspex test units and three feed gases of different densities, namely nitrogen, helium, and air. System conditions were imitated by adapting the gas feed velocity and particle load for each gas respectively. Helium, the least dense of the three gases, simulated higher temperatures, whereas air, the densest gas, simulated a lower temperature. The average cyclone efficiencies ranged between 94.96% and 98.37%, reaching up to 99.89% in individual runs. The lowest efficiency attained was 94.00%. Furthermore, the design of the smaller cyclone proved to be more robust, while the larger cyclone demonstrated a stronger correlation between its separation efficiency and the feed temperatures. The larger cyclone can be assumed to achieve slightly higher efficiencies at elevated temperatures. However, both design methods led to good designs. At room temperature, the difference in efficiency between the two cyclones was almost negligible. At higher temperatures, however, these general tendencies are expected to be amplified, so that the difference between the two design methods will become more obvious.
Though the design specifications were met by both designs, the smaller cyclone is recommended as the default particle separator for the plasma system due to its robust nature. Keywords: cyclone, design, plasma, renewable energy, solid separation, waste processing
Procedia PDF Downloads 214
14550 Supercritical CO2 Extraction of Cymbopogon martini Essential Oil and Comparison of Its Composition with Traditionally Extracted Oils
Authors: Aarti Singh, Anees Ahmad
Abstract:
Essential oil was extracted from lemongrass (Cymbopogon martini) with supercritical carbon dioxide (SC-CO2) at a pressure of 140 bar, a temperature of 55 °C, and a CO2 flow rate of 8 g/min, and its composition and yield were compared with those from other conventional oil extraction methods: HD (hydrodistillation), SE (solvent extraction), and UAE (ultrasound-assisted extraction). SC-CO2 extraction is a green and sustainable extraction technique. Each oil was analyzed by GC-MS; the major constituents were neral (44%), Z-citral (43%), geranial (27%), caryophyllene (4.6%), and linalool (1%). The essential oil of lemongrass is valued for its neral and citral concentration. The oil obtained by supercritical carbon dioxide extraction contained the maximum concentration of neral (55.05%), whereas the ultrasound-extracted oil contained the minimum (5.24%), and neral was absent from the solvent-extracted oil. The antioxidant properties were assessed by the DPPH and superoxide scavenging methods. Keywords: Cymbopogon martini, essential oil, FT-IR, GC-MS, HPTLC, SC-CO2
Procedia PDF Downloads 463
14549 Outsourcing the Front End of Innovation
Abstract:
The paper presents a new method for efficient innovation process management. Even though innovation management methods, tools, and knowledge are well established and documented in the literature, most companies still do not manage innovation efficiently. Especially in SMEs, the front end of innovation - problem identification, idea creation and selection - is often not optimally performed. Our eMIPS methodology represents a sort of 'umbrella methodology': a well-defined set of procedures that can be dynamically adapted to the concrete case in a company. In daily practice, various methods (e.g., for problem identification and idea creation) can be applied, depending on the company's needs. It is based on the proactive involvement of the company's employees, supported by the appropriate methodology and external experts. The phases are performed via a mixture of face-to-face activities (workshops) and online (eLearning) activities taking place in the eLearning Moodle environment and using other e-communication channels. One part of the outcome is an identified set of opportunities and concrete solutions ready for implementation. The other, also very important, result concerns the innovation competences the participating employees gain, related to concrete tools and methods for idea management. In addition, the employees gain solid experience in dynamic, efficient, and solution-oriented management of the invention process. eMIPS also represents a way of establishing or improving the innovation culture in an organization. A first pilot in one company showed excellent outcomes regarding both the motivation of the participants and the results achieved. Keywords: creativity, distance learning, front end, innovation, problem
Procedia PDF Downloads 331
14548 Economic Valuation of Environmental Services Sustained by Flamboyant Park in Goiania-Go, Brazil
Authors: Brenda R. Berca, Jessica S. Vieira, Lucas G. Candido, Matheus C. Ferreira, Paulo S. A. Lopes Filho, Rafaella O. Baracho
Abstract:
This study aims to estimate the economic value of the environmental services sustained by the Flamboyant Lourival Louza Municipal Park in Goiânia, Goiás, Brazil. The Flamboyant Park is one of the most relevant urban parks, and it is located near a stadium, a shopping center, and two supercenters. In order to define the methods used for the valuation of Flamboyant Park, the first step was carrying out bibliographical research with a view to better understanding which methods would be most feasible for valuing the Park. Thus, the following direct methods were selected: travel cost, hedonic pricing, and contingent valuation. In addition, an indirect method (replacement cost) was applied at Flamboyant Park. The second step was creating and applying two surveys. The first survey was aimed at visitors of the Park, addressing socio-economic issues, the use of the Park, its importance, and the visitors' willingness to pay for its existence. The second survey was directed at the vendors operating in the Park, in order to collect data regarding their profits. In the end, the visitors' profile was characterized and the contingent valuation, travel cost, replacement cost, and hedonic pricing methods were applied, thus monetarily valuing the various ecosystem services sustained by the Park. Some services were not valued due to difficulties encountered during the process.
Keywords: contingent valuation, ecosystem services, economic environmental valuation, hedonic pricing, travel cost
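The contingent valuation step described above can be sketched in a few lines. This is our hypothetical illustration only (the function name, survey figures, and visitor count are our assumptions, not the study's data): visitors' stated willingness to pay (WTP) is averaged and scaled by annual visitation to estimate an existence value.

```python
def contingent_value(stated_wtp, annual_visitors):
    """Mean stated willingness to pay per visit, scaled by annual visitation."""
    mean_wtp = sum(stated_wtp) / len(stated_wtp)
    return mean_wtp * annual_visitors

# Five hypothetical survey responses (BRL per visit) and a notional
# 200,000 visitors per year:
print(contingent_value([2.0, 5.0, 0.0, 10.0, 3.0], 200_000))
```

Zero-valued responses are kept in the mean, as protest or zero bids still count toward the sample in the simplest aggregation.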
Procedia PDF Downloads 229
14547 Compare Hot Forming and Cold Forming in Rolling Process
Authors: Ali Moarrefzadeh
Abstract:
In metalworking, rolling is a metal forming process in which metal stock is passed through a pair of rolls. Rolling is classified according to the temperature of the metal being rolled: if the temperature of the metal is above its recrystallization temperature, the process is termed hot rolling; if it is below its recrystallization temperature, the process is termed cold rolling. In terms of usage, hot rolling processes more tonnage than any other manufacturing process, and cold rolling processes the most tonnage of all cold working processes. This article also describes the use of advanced tubing inspection NDT methods for boiler and heat exchanger equipment in the petrochemical industry to supplement major turnaround inspections. The methods presented include remote field eddy current, magnetic flux leakage, internal rotary inspection system, and eddy current.
Keywords: hot forming, cold forming, metal, rolling, simulation
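The hot/cold classification above reduces to a single temperature comparison. A minimal sketch (the function name and the indicative recrystallization temperature are our assumptions, not from the article):

```python
def rolling_type(work_temp_c: float, recrystallization_temp_c: float) -> str:
    """Classify rolling: above the recrystallization temperature it is hot
    rolling, at or below it is cold rolling."""
    if work_temp_c > recrystallization_temp_c:
        return "hot rolling"
    return "cold rolling"

# Low-carbon steel recrystallizes at very roughly 450-600 °C (indicative only):
print(rolling_type(900, 550))  # typical hot-rolling mill temperature
print(rolling_type(25, 550))   # room-temperature cold rolling
```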
Procedia PDF Downloads 532
14546 Assessment of DNA Sequence Encoding Techniques for Machine Learning Algorithms Using a Universal Bacterial Marker
Authors: Diego Santibañez Oyarce, Fernanda Bravo Cornejo, Camilo Cerda Sarabia, Belén Díaz Díaz, Esteban Gómez Terán, Hugo Osses Prado, Raúl Caulier-Cisterna, Jorge Vergara-Quezada, Ana Moya-Beltrán
Abstract:
The advent of high-throughput sequencing technologies has revolutionized genomics, generating vast amounts of genetic data that challenge traditional bioinformatics methods. Machine learning addresses these challenges by leveraging computational power to identify patterns and extract information from large datasets. However, biological sequence data, being symbolic and non-numeric, must be converted into numerical formats for machine learning algorithms to process effectively. So far, some encoding methods, such as one-hot encoding or k-mers, have been explored. This work proposes additional approaches for encoding DNA sequences in order to compare them with existing techniques and determine whether they can provide improvements or whether current methods offer superior results. Data from the 16S rRNA gene, a universal marker, was used to analyze eight bacterial groups that are significant in the pulmonary environment and have clinical implications. The bacterial genera included in this analysis are Prevotella, Abiotrophia, Acidovorax, Streptococcus, Neisseria, Veillonella, Mycobacterium, and Megasphaera. These data were downloaded from the NCBI database in GenBank file format, followed by a syntactic analysis to selectively extract relevant information from each file. For data encoding, a sequence normalization process was carried out as the first step. From approximately 22,000 initial data points, a subset was generated for testing purposes. Specifically, 55 sequences from each bacterial group met the length criteria, resulting in an initial sample of approximately 440 sequences. The sequences were encoded using different methods, including one-hot encoding, k-mers, the Fourier transform, and the wavelet transform. Various machine learning algorithms, such as support vector machines, random forests, and neural networks, were trained to evaluate these encoding methods.
The performance of these models was assessed using multiple metrics, including the confusion matrix, ROC curve, and F1 score, providing a comprehensive evaluation of their classification capabilities. The results show that accuracies between encoding methods vary by up to approximately 15%, with the Fourier transform obtaining the best results for the evaluated machine learning algorithms. These findings, supported by the detailed analysis using the confusion matrix, ROC curve, and F1 score, provide valuable insights into the effectiveness of different encoding methods and machine learning algorithms for genomic data analysis, potentially improving the accuracy and efficiency of bacterial classification and related genomic studies.
Keywords: DNA encoding, machine learning, Fourier transform, wavelet transform
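As a rough illustration of two of the encodings compared above (a sketch of the generic techniques, not the authors' implementation):

```python
import numpy as np

BASES = "ACGT"

def one_hot(seq: str) -> np.ndarray:
    """One-hot encode a DNA sequence: one row per base, one column per A/C/G/T."""
    idx = {b: i for i, b in enumerate(BASES)}
    out = np.zeros((len(seq), 4), dtype=np.int8)
    for row, base in enumerate(seq):
        out[row, idx[base]] = 1
    return out

def kmer_counts(seq: str, k: int = 2) -> dict:
    """Count overlapping k-mers, giving a numeric summary of the sequence."""
    counts = {}
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k]
        counts[kmer] = counts.get(kmer, 0) + 1
    return counts

print(one_hot("ACGT"))           # 4x4 identity-like matrix
print(kmer_counts("ACGTAC", 2))  # overlapping dimer counts
```

One-hot encoding preserves positional information but yields variable-length matrices (hence the length normalization mentioned in the abstract), while k-mer counts give a fixed-size vector regardless of sequence length.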
Procedia PDF Downloads 28
14545 Regularization of Gene Regulatory Networks Perturbed by White Noise
Authors: Ramazan I. Kadiev, Arcady Ponosov
Abstract:
Mathematical models of gene regulatory networks can in many cases be described by ordinary differential equations with switching nonlinearities, where the initial value problem is ill-posed. Several regularization methods are known in the case of deterministic networks, but the presence of stochastic noise leads to several technical difficulties. In the presentation, it is proposed to apply the methods of the stochastic singular perturbation theory going back to Yu. Kabanov and Yu. Pergamentshchikov. This approach is used to regularize the above ill-posed problem, which, e.g., makes it possible to design stable numerical schemes. Several examples are provided in the presentation, which support the efficiency of the suggested analysis. The method can also be of interest in other fields of biomathematics, where differential equations contain switchings, e.g., in neural field models.
Keywords: ill-posed problems, singular perturbation analysis, stochastic differential equations, switching nonlinearities
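As a minimal sketch of the kind of regularization involved (our illustration of the generic steep-response idea, not the authors' stochastic scheme): the discontinuous switching nonlinearity is replaced by a smooth response that converges to the step as a small parameter q tends to zero.

```python
import numpy as np

def step(x, theta):
    """Discontinuous switching response: 0 below threshold theta, 1 above."""
    return np.where(np.asarray(x) >= theta, 1.0, 0.0)

def hill(x, theta, q):
    """Steep Hill-type response regularizing step(); tends to step() as q -> 0+.
    Defined for x > 0, theta > 0."""
    p = 1.0 / q
    return x**p / (x**p + theta**p)

# Away from the threshold the smooth version matches the step closely:
for x in (0.5, 2.0):
    print(x, float(step(x, 1.0)), round(float(hill(x, 1.0, 0.01)), 6))
```

Because hill() is smooth, the regularized ODE has a well-posed initial value problem and standard numerical integrators can be applied, which is the practical payoff the abstract alludes to.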
Procedia PDF Downloads 197
14544 The Colouration of Additive-Manufactured Polymer
Authors: Abisuga Oluwayemisi Adebola, Kerri Akiwowo, Deon de Beer, Kobus Van Der Walt
Abstract:
The convergence of additive manufacturing (AM) and traditional textile dyeing techniques has opened up innovative possibilities for improving the visual appeal and customization potential of 3D-printed polymer objects. Over centuries, textile dyeing techniques have progressed to transform fabrics with vibrant colours and complex patterns. The layer-by-layer deposition characteristic of AM necessitates adaptations in dye application methods to ensure even colour penetration across complex surfaces. Compatibility between dye formulations and polymer matrices influences colour uptake and stability, demanding careful selection and testing of dyes for optimal results. This study investigates the developing interaction between these areas, revealing the challenges and opportunities of applying textile dyeing methods to colour 3D-printed polymer materials. The method explores three approaches to colouring a 3D-printed polymer object: (a) additive manufacturing of a prototype, (b) the traditional dyebath method, and (c) the contemporary digital sublimation technique. The results show that the layer lines inherent to AM interact with dyes differently and affect the visual outcome compared to traditional textile fibers. Skillful manipulation of the textile dyeing methods and dye types used for this research reduced the appearance of these lines and achieved consistent, desirable colour outcomes. In conclusion, integrating textile dyeing techniques into the colouring of 3D-printed polymer materials connects historical craftsmanship with innovative manufacturing. Overcoming the challenges of colour distribution, compatibility, and layer line management requires a holistic approach that blends the technical consistency of AM with the artistic sensitivity of textile dyeing. Hence, applying textile dyeing methods to 3D-printed polymers opens new dimensions of aesthetic and functional possibilities.
Keywords: polymer, 3D-printing, sublimation, textile, dyeing, additive manufacturing
Procedia PDF Downloads 70
14543 The Decline of Verb-Second in the History of English: Combining Historical and Theoretical Explanations for Change
Authors: Sophie Whittle
Abstract:
Prior to the present day, English syntax historically exhibited an inconsistent verb-second (V2) rule, under which the verb moved to the second position in the sentence following the fronting of some type of phrase. There was considerable variation throughout the history of English with regard to the ordering of subject and verb, and many explanations attempting to account for this variation have been documented in the previous literature. However, these attempts have been contradictory, with many accounts positing the effect of previous syntactic changes as the main motivation behind the decline of V2. For instance, morphosyntactic changes, such as the loss of clitics and the loss of empty expletives, have been loosely connected to changes in frequency for the loss of V2. The questions surrounding the development of non-V2 in English have, therefore, yet to be answered. The current paper aims to bring together explanations from different linguistic fields to determine the factors driving the changes in English V2. Using historical corpus-based methods, the study analyses, both quantitatively and qualitatively, the changes in frequency of V2 across the Old, Middle, and Modern English periods to account for the variation in a range of sentential environments. These methods delve into the study of information structure, prosody and language contact to explain variation within different contexts. The analysis concludes that these factors, in addition to changes within the syntax, are responsible for the position of the verb. The loss of V2 serves as an exemplar study within the field of historical linguistics, one which combines a number of factors in explaining language change in general.
Keywords: corpora, English, language change, mixed-methods, syntax, verb-second
Procedia PDF Downloads 136
14542 Structural Changes Induced in Graphene Oxide Film by Low Energy Ion Beam Irradiation
Authors: Chetna Tyagi, Ambuj Tripathi, Devesh Avasthi
Abstract:
Graphene oxide exhibits sp³ hybridization along with sp² hybridization due to the presence of different oxygen-containing functional groups on its edges and basal planes. Its sp³/sp² hybridization ratio can be tuned by various methods to utilize it in different applications, like transistors, solar cells and biosensors. Ion beam irradiation can also be one of the methods to optimize the sp²/sp³ hybridization ratio for desirable properties. In this work, graphene oxide films were irradiated with 100 keV argon ions at different fluences varying from 10¹³ to 10¹⁶ ions/cm². Synchrotron X-ray diffraction measurements showed an increase in crystallinity at the low fluence of 10¹³ ions/cm². Raman spectroscopy performed on the irradiated samples qualitatively determined the defects induced by the ion beam. Also, different functional groups, and their removal at different fluences, were identified using Fourier transform infrared spectroscopy.
Keywords: graphene oxide, ion beam irradiation, spectroscopy, X-ray diffraction
Procedia PDF Downloads 137
14541 Julia-Based Computational Tool for Composite System Reliability Assessment
Authors: Josif Figueroa, Kush Bubbar, Greg Young-Morris
Abstract:
The reliability evaluation of composite generation and bulk transmission systems is crucial for ensuring a reliable supply of electrical energy to significant system load points. However, evaluating adequacy indices using probabilistic methods like sequential Monte Carlo simulation can be computationally expensive. Despite this, it is necessary when time-varying and interdependent resources, such as renewables and energy storage systems, are involved. Recent advances in solving power network optimization problems and in parallel computing have improved runtime performance while maintaining solution accuracy. This work introduces CompositeSystems, an open-source composite system reliability evaluation tool developed in Julia™, to address the current deficiencies of commercial and non-commercial tools. It presents the tool's design, validation, and effectiveness, including an analysis of two different formulations of the optimal power flow problem. The simulations demonstrate excellent agreement with existing published studies while improving replicability and reproducibility. Overall, the proposed tool can provide valuable insights into the performance of transmission systems, making it an important addition to the existing toolbox for power system planning.
Keywords: open-source software, composite system reliability, optimization methods, Monte Carlo methods, optimal power flow
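The adequacy indices mentioned above can be illustrated with a crude, non-sequential Monte Carlo sketch (our simplified illustration, not the CompositeSystems implementation, which uses sequential simulation and optimal power flow; all names and figures below are our assumptions):

```python
import random

def monte_carlo_lolp(unit_capacities, forced_outage_rates, load,
                     n_samples=100_000, seed=1):
    """Estimate loss-of-load probability (LOLP): sample each generating unit
    up/down independently and count the fraction of samples in which the
    available capacity falls short of the load."""
    rng = random.Random(seed)
    shortfalls = 0
    for _ in range(n_samples):
        capacity = sum(c for c, q in zip(unit_capacities, forced_outage_rates)
                       if rng.random() > q)  # unit available with prob. 1 - q
        if capacity < load:
            shortfalls += 1
    return shortfalls / n_samples

# Two 100 MW units, each unavailable 10% of the time, serving a 150 MW load:
# load is lost whenever either unit is down, so analytically
# LOLP = 1 - 0.9 * 0.9 = 0.19.
print(monte_carlo_lolp([100, 100], [0.1, 0.1], load=150))
```

A sequential simulation, by contrast, walks through chronological states so that time-varying resources and storage can be represented, which is why it is the more expensive method the abstract refers to.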
Procedia PDF Downloads 76
14540 Comparative Survival Rates of Yeasts during Freeze-Drying, Traditional Drying and Spray Drying
Authors: Latifa Hamoudi-Belarbi, L'Hadi Nouri, Khaled Belkacemi
Abstract:
The effect of three methods of drying (traditional drying, freeze-drying and spray-drying) on the survival of concentrated cultures of Geotrichum fragrans and Wickerhamomyces anomalus was studied. The survival of the yeast cultures was first compared immediately after freeze-drying using HES 12% (w/v) + sucrose 7% (w/v) as protectant, traditional drying in dry rice cakes, and spray-drying with whey proteins. The survival of G. fragrans and W. anomalus was then studied during 4 months of storage at 4 °C and 25 °C, in darkness, under vacuum and at 0% relative humidity. The results demonstrated that high survival was obtained using the traditional method of preservation in rice cakes (60% for G. fragrans and 65% for W. anomalus) and freeze-drying (68% for G. fragrans and 74% for W. anomalus). However, poor survival was obtained by the spray-drying method in whey protein, with 20% for G. fragrans and 29% for W. anomalus. During storage at 25 °C, yeast cultures of G. fragrans and W. anomalus preserved by the traditional and freeze-drying methods showed no significant loss of viable cells up to 3 months of storage. Spray-dried yeast cultures had the greatest loss of viable count during the 4 months of storage at 25 °C. During storage at 4 °C, preservation of the yeast cultures using the traditional method provided better survival than freeze-drying. This study demonstrated the effectiveness of the traditional method for preserving yeast cultures compared to high-cost methods like freeze-drying and spray-drying.
Keywords: freeze-drying, traditional drying, spray drying, yeasts
Procedia PDF Downloads 492
14539 Demetallization of Crude Oil: Comparative Analysis of Deasphalting and Electrochemical Removal Methods of Ni and V
Authors: Nurlan Akhmetov, Abilmansur Yeshmuratov, Aliya Kurbanova, Gulnar Sugurbekova, Murat Baisariyev
Abstract:
Extraction of the vanadium and nickel compounds is complex due to the high stability of porphyrins; nickel is a catalytic poison which deactivates catalysts during the catalytic cracking of the oil, while vanadyl is abrasive, and vanadium is a valuable metal. Thus, the high concentration of Ni and V in crude oil makes their removal relevant. Two methods of demetallization of crude oil were tested; the present research therefore presents a comparative analysis of deasphalting with organic solvents (cyclohexane, carbon tetrachloride, chloroform) and an electrochemical method. The percentage of Ni extraction reached a maximum of approximately 55% using the electrochemical method in an electrolysis cell, which was developed for this research and consists of three sections: an oil and protonating agent (EtOH) solution between two conducting membranes, which divide it from two capsules of 10% sulfuric acid, with two graphite electrodes closing all three parts in an electrical circuit. Ions of the metals pass through the membranes and remain in the acid solutions. The best result was obtained in 60 minutes with an ethanol-to-oil ratio of 25% to 75%, a current in the range from 0.3 A to 0.4 A, and a voltage changing from 12.8 V to 17.3 V. The maximum efficiency of deasphalting, with cyclohexane as the solvent, in a Soxhlet extractor was 66.4% for Ni and 51.2% for V. Thus, applying voltammetry, ICP-MS (inductively coupled plasma mass spectrometry) and AAS (atomic absorption spectroscopy), the mentioned types of metal extraction methods were compared in this paper.
Keywords: electrochemistry, deasphalting of crude oil, demetallization of crude oil, petroleum engineering
Procedia PDF Downloads 236