Search results for: integrated models of reading comprehension
9767 Fixed-Bed Column Studies of Malachite Green Removal by Use of Alginate-Encapsulated Aluminium Pillared Clay
Authors: Lazhar Mouloud, Chemat Zoubida, Ouhoumna Faiza
Abstract:
The main objective of this study concerns the modeling of breakthrough curves obtained in the adsorption column of malachite green onto alginate-encapsulated aluminium pillared clay in a fixed bed, according to various operating parameters such as the initial concentration, the feed rate and the fixed-bed height, applying mathematical models, namely the Bohart-Adams, Wolborska, Bed Depth Service Time (BDST), Clark and Yoon-Nelson models. These models allow us to express the different parameters controlling the performance of the dynamic adsorption system. The results have shown that all models were suitable for describing the whole, or a definite part, of the dynamic behavior of the column with respect to the flow rate, the inlet dye concentration and the fixed-bed height.
Keywords: adsorption column, malachite green, pillared clays, alginate, modeling, mathematical models, encapsulation
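For illustration only (not from the paper), a minimal Python sketch of fitting one of the named models, Yoon-Nelson, to breakthrough data with SciPy; the time points and concentration ratios below are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

def yoon_nelson(t, k_yn, tau):
    """Yoon-Nelson breakthrough model: C/C0 = 1 / (1 + exp(k_YN * (tau - t)))."""
    return 1.0 / (1.0 + np.exp(k_yn * (tau - t)))

# Hypothetical breakthrough data: time (min) and effluent/inlet concentration ratio
t_data = np.array([10, 30, 60, 90, 120, 150, 180, 240])
c_ratio = np.array([0.02, 0.05, 0.15, 0.35, 0.60, 0.80, 0.92, 0.99])

# Fit k_YN (rate constant, 1/min) and tau (time to 50% breakthrough, min)
(k_yn, tau), _ = curve_fit(yoon_nelson, t_data, c_ratio, p0=[0.05, 100])
print(f"k_YN = {k_yn:.4f} 1/min, tau = {tau:.1f} min")
```

The fitted tau can then be compared across feed rates and bed heights, which is the kind of parameter comparison the abstract describes.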
Procedia PDF Downloads 508
9766 A Memristive Device with Intrinsic Rectification Behavior and Performance of Crossbar Arrays
Authors: Yansong Gao, Damith C. Ranasinghe, Said F. Al-Sarawi, Omid Kavehei, Derek Abbott
Abstract:
Passive crossbar arrays are in principle the simplest functional electrical circuits and, together with a memristive device at each cross-point, hold great promise for future high-density, non-volatile memories. However, the greatest problem of crossbar arrays is the sneak path current. In this paper, we investigate one type of memristive device with intrinsic rectification behavior to address sneak path currents. First, a SPICE behavioral model of the memristive device, written in the Verilog-A language, is presented to fit experimental data published in the literature. Next, systematic performance simulations of a crossbar array that uses the self-rectifying memristive device as the storage element at each cross-point are conducted, covering read margin and power consumption with respect to different crossbar sizes, interconnect resistance, HRS/LRS ratio (High Resistance State / Low Resistance State), rectification ratio and different read schemes. Subsequently, trade-offs among read margin, power consumption and read schemes are analyzed to provide guidelines for circuit design. Finally, a performance comparison between memristive devices with and without intrinsic rectification behavior is given to show the value of this intrinsic rectification behavior.
Keywords: memristive device, memristor, crossbar, RRAM, read margin, power consumption
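The paper's simulations rely on a Verilog-A device model fitted to measured data; as a rough stand-in, the sketch below uses the common lumped sneak-path approximation to show how array size and rectification ratio enter a worst-case read-margin estimate. All device values are assumptions, not figures from the paper:

```python
import numpy as np

def read_margin(n, r_lrs, hrs_lrs_ratio, rect_ratio, r_sense, v_read=1.0):
    """Worst-case read margin of an n x n crossbar with a lumped sneak-path model.

    The sneak network is approximated by three resistor groups in series:
    (n-1) forward-biased cells, (n-1)^2 reverse-biased cells (whose resistance
    is scaled up by the rectification ratio), and (n-1) forward-biased cells,
    all assumed to sit in LRS (the worst-case data pattern).
    """
    r_hrs = r_lrs * hrs_lrs_ratio
    r_rev = r_lrs * rect_ratio  # reverse-biased cell resistance
    r_sneak = r_lrs / (n - 1) + r_rev / (n - 1) ** 2 + r_lrs / (n - 1)

    def v_out(r_sel):
        r_cell = 1.0 / (1.0 / r_sel + 1.0 / r_sneak)  # selected cell || sneak path
        return v_read * r_sense / (r_sense + r_cell)

    return v_out(r_lrs) - v_out(r_hrs)

for size in (16, 64, 256):
    margin = read_margin(size, r_lrs=1e4, hrs_lrs_ratio=100,
                         rect_ratio=1e3, r_sense=1e4)
    print(f"{size}x{size}: read margin = {margin:.3f} V")
```

Raising rect_ratio keeps r_sneak large as n grows, which is exactly why the intrinsic rectification studied in the paper helps read margin scale to larger arrays.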
Procedia PDF Downloads 436
9765 Modelling and Simulation of Natural Gas-Fired Power Plant Integrated to a CO2 Capture Plant
Authors: Ebuwa Osagie, Chet Biliyok, Yeung Hoi
Abstract:
Regeneration energy requirement, and ways to reduce it, is the main aim of most current CO2 capture research; thus, the post-combustion carbon capture (PCC) option is identified as the most suitable for natural gas-fired power plants. From current research and development (R&D) activities worldwide, two main areas are being examined in order to reduce the regeneration energy requirement of amine-based PCC, namely: (a) development of new solvents with better overall performance than 30 wt% monoethanolamine (MEA) aqueous solution, which is considered the baseline solvent for solvent-based PCC; and (b) integration of the PCC plant with the power plant. In scaling up a PCC pilot plant to the size required for a commercial-scale natural gas-fired power plant, process modelling and simulation are essential. In this work, an integrated process made up of a 482 MWe natural gas-fired power plant and a developed and validated MEA-based PCC plant has been modelled and simulated. The PCC plant has four absorber columns and a single stripper column; the modelling and simulation were performed with Aspen Plus® V8.4. The gas turbine, the heat recovery steam generator and the steam cycle were modelled based on a 2010 US DOE report, while the MEA-based PCC plant was modelled as a rate-based process. The scaling of the amine plant was performed using a rate-based calculation, in preference to the equilibrium-based approach, for 90% CO2 capture. The power plant was integrated with the PCC plant in three ways: (i) the flue gas stream from the power plant is divided equally into four streams, each fed into one of the four absorbers in the PCC plant; (ii) steam drawn off from the IP/LP cross-over pipe in the steam cycle of the power plant is used to regenerate solvent in the reboiler; and (iii) condensate returns from the reboiler to the power plant. The integration of the PCC plant with the NGCC plant resulted in a reduction of the power plant output by 73.56 MWe, and the net efficiency of the integrated system is reduced by 7.3 percentage points. A secondary aim of this study is the parametric studies performed to assess the impacts of natural gas on the overall performance of the integrated process, achieved through investigation of the capture efficiencies.
Keywords: natural gas-fired, power plant, MEA, CO2 capture, modelling, simulation
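As a back-of-envelope check (not part of the paper), the two reported figures imply a baseline efficiency under the assumption that the fuel input is unchanged by the retrofit; basis effects (HHV vs. LHV, treatment of auxiliary loads) can shift this:

```python
# Reported: 482 MWe reference output, 73.56 MWe output loss, 7.3-point penalty.
p_base = 482.0   # MWe (reported)
d_p = 73.56      # MWe lost to steam extraction and capture auxiliaries (reported)
d_eta = 0.073    # efficiency penalty as a fraction (reported)

q_fuel = d_p / d_eta          # implied fuel input, MW (assumes fixed firing rate)
eta_base = p_base / q_fuel    # implied baseline efficiency on that same basis
print(f"implied fuel input ~{q_fuel:.0f} MW, baseline efficiency ~{eta_base:.1%}")
# -> roughly 1008 MW and 47.8%, a plausible order of magnitude for an
#    HHV-basis NGCC figure, though not a number stated in the abstract.
```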
Procedia PDF Downloads 446
9764 An Improvement of a Dynamic Model of the Secondary Sedimentation Tank and Field Validation
Authors: Zahir Bakiri, Saci Nacefa
Abstract:
In this paper a comparison is made between two models, with and without a dispersion term, focusing on the characterization of the movement of the sludge blanket in the secondary sedimentation tank using solid flux theory and the settling velocity. This allowed us to develop one-dimensional models, with and without dispersion, based on a thorough experimental study carried out in situ and the application of online data, namely the mass load flow, transfer concentration, and influent characteristics. In the proposed model, the new settling velocity law used (a double-exponential function) is based on the Vesilind function.
Keywords: wastewater, activated sludge, sedimentation, settling velocity, settling models
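The double-exponential settling velocity built on the Vesilind function is commonly written in the Takacs form; a minimal sketch follows, with benchmark-style default parameters that are assumptions, not the values identified in this study:

```python
import numpy as np

def settling_velocity(x, v_max=250.0, v0=474.0,
                      r_h=5.76e-4, r_p=2.86e-3, x_min=20.0):
    """Double-exponential (Takacs-type) settling velocity in m/d.

    x: suspended solids concentration, g/m^3. r_h governs hindered settling,
    r_p the low-concentration branch; v_max caps the practical velocity.
    Parameter values are common benchmark defaults, used here for illustration.
    """
    x_star = np.maximum(x - x_min, 0.0)
    v = v0 * (np.exp(-r_h * x_star) - np.exp(-r_p * x_star))
    return np.clip(v, 0.0, v_max)

print(settling_velocity(np.array([500.0, 2000.0, 6000.0])))  # m/d at three SS levels
```

In a one-dimensional tank model this velocity feeds the solid flux term, with the dispersion term (the paper's main comparison point) added to the same mass balance.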
Procedia PDF Downloads 388
9763 Investigations into the in situ Enterococcus faecalis Biofilm Removal Efficacies of Passive and Active Sodium Hypochlorite Irrigant Delivered into Lateral Canal of a Simulated Root Canal Model
Authors: Saifalarab A. Mohmmed, Morgana E. Vianna, Jonathan C. Knowles
Abstract:
The issue of apical periodontitis has received considerable critical attention. Bacteria integrate into communities attached to surfaces and consequently form biofilm. The biofilm structure provides bacteria with a range of protective capabilities against antimicrobial agents and enhances pathogenicity (e.g. apical periodontitis). Sodium hypochlorite (NaOCl) has become the irrigant of choice for the elimination of bacteria from the root canal system based on its antimicrobial properties. The aim of the study was to investigate the effect of different agitation techniques on the efficacy of 2.5% NaOCl in eliminating biofilm from the surface of the lateral canal, using residual biofilm and the removal rate of biofilm as outcome measures. The effect of canal complexity (lateral canal) on the efficacy of the irrigation procedure was also assessed. Forty root canal models (n = 10 per group) were manufactured using 3D printing and resin materials. Each model consisted of two halves of an 18 mm length root canal with apical size 30 and taper 0.06, and a lateral canal of 3 mm length and 0.3 mm diameter located 3 mm from the apical terminus. E. faecalis biofilms were grown on the apical 3 mm and the lateral canal of the models for 10 days in Brain Heart Infusion broth. Biofilms were stained using crystal violet for visualisation. The model halves were reassembled, attached to an apparatus and tested under a fluorescence microscope. The syringe-and-needle irrigation protocol was performed using 9 mL of 2.5% NaOCl irrigant for 60 seconds. The irrigant was either left stagnant in the canal or activated for 30 seconds using manual (gutta-percha), sonic or ultrasonic methods. Images were then captured every second using an external camera. The percentages of residual biofilm were measured using image analysis software. The data were analysed using generalised linear mixed models. The greatest removal was associated with the ultrasonic group (66.76%), followed by the sonic (45.49%), manual (43.97%) and passive irrigation (control) (38.67%) groups. No marked reduction in the efficiency of NaOCl in removing biofilm was found between the simple and complex anatomy models (p = 0.098). The removal efficacy of NaOCl on the biofilm was limited to the 1 mm level of the lateral canal. Agitation of NaOCl results in better penetration of the irrigant into the lateral canals, and ultrasonic agitation of NaOCl improved the removal of bacterial biofilm.
Keywords: 3D printing, biofilm, root canal irrigation, sodium hypochlorite
Procedia PDF Downloads 230
9762 Fully Autonomous Vertical Farm to Increase Crop Production
Authors: Simone Cinquemani, Lorenzo Mantovani, Aleksander Dabek
Abstract:
New technologies in agriculture are opening new challenges and new opportunities. Among these, robotics, vision, and artificial intelligence are certainly the ones that will make possible a significant leap compared to traditional agricultural techniques. In particular, the indoor farming sector will be the one that benefits most from these solutions. Vertical farming is a new field of research where mechanical engineering can bring knowledge and know-how to transform a highly labor-based business into a fully autonomous system. The aim of the research is to develop a multi-purpose, modular, and perfectly integrated platform for crop production in indoor vertical farming. Activities will be based both on hardware development, such as automatic tools to perform different activities on soil and plants, and on research to introduce extensive use of monitoring techniques based on machine learning algorithms. This paper presents the preliminary results of a research project on a vertical farm living lab designed to (i) develop and test vertical farming cultivation practices, (ii) introduce a very high degree of mechanization and automation that makes all processes replicable, fully measurable, standardized and automated, (iii) develop a coordinated control and management environment for autonomous multiplatform or tele-operated robots, with the aim of carrying out complex tasks in the presence of environmental and cultivation constraints, and (iv) integrate AI-based algorithms as a decision support system to improve production quality. The coordinated management of multiplatform systems still presents innumerable challenges that require a strongly multidisciplinary approach right from the design, development, and implementation phases. The methodology is based on (i) the development of models capable of describing the dynamics of the various platforms and their interactions, (ii) the integrated design of mechatronic systems able to respond to the needs of the context and to exploit the strengths highlighted by the models, and (iii) implementation and experimental tests performed to assess the real effectiveness of the systems created and to evaluate any weaknesses, so as to proceed with targeted development. To these ends, a fully automated laboratory for growing plants in vertical farming has been developed and tested. The living lab makes extensive use of sensors to determine the overall state of the structure, crops, and systems used. The possibility of having specific measurements for each element involved in the cultivation process makes it possible to evaluate the effects of each variable of interest and allows for the creation of a robust model of the system as a whole. The automation of the laboratory is completed with the use of robots to carry out all the necessary operations, from sowing to handling to harvesting. These systems work synergistically thanks to detailed models developed from the information collected, which deepens knowledge of these types of crops and guarantees the possibility of tracing every action performed on each single plant. To this end, artificial intelligence algorithms have been developed to allow synergistic operation of all systems.
Keywords: automation, vertical farming, robot, artificial intelligence, vision, control
Procedia PDF Downloads 39
9761 Mapping Poverty in the Philippines: Insights from Satellite Data and Spatial Econometrics
Authors: Htet Khaing Lin
Abstract:
This study explores the relationship between a diverse set of variables, encompassing both environmental and socio-economic factors, and poverty levels in the Philippines for the years 2012, 2015, and 2018. Employing Ordinary Least Squares (OLS), Spatial Lag Models (SLM), and Spatial Error Models (SEM), the study delves into the dynamics of key indicators, including daytime and nighttime land surface temperature, cropland surface, urban land surface, rainfall, population size, and the normalized difference water, vegetation, and drought indices. The findings reveal consistent patterns and unexpected correlations, highlighting the need for nuanced policies that address the multifaceted challenges arising from the interplay of environmental and socio-economic factors.
Keywords: poverty analysis, OLS, spatial lag models, spatial error models, Philippines, Google Earth Engine, satellite data, environmental dynamics, socio-economic factors
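A minimal sketch of the OLS/SLM/SEM comparison using PySAL's spreg is given below; the file name and column names are hypothetical placeholders for the satellite-derived covariates named in the abstract:

```python
import geopandas as gpd
from libpysal.weights import Queen
from spreg import OLS, ML_Lag, ML_Error

# Hypothetical municipal data: poverty rate plus satellite-derived covariates
gdf = gpd.read_file("ph_municipalities_2018.gpkg")  # assumed file
y = gdf[["poverty_rate"]].values
x = gdf[["lst_day", "lst_night", "ndvi", "ndwi", "rainfall", "population"]].values

w = Queen.from_dataframe(gdf)   # contiguity-based spatial weights
w.transform = "r"               # row-standardise

ols = OLS(y, x, w=w, spat_diag=True, name_y="poverty_rate")  # + LM diagnostics
lag = ML_Lag(y, x, w)            # spatial lag model (SLM)
err = ML_Error(y, x, w)          # spatial error model (SEM)
print(ols.summary)
print("SLM rho:", lag.rho, " SEM lambda:", err.lam)
```

The Lagrange multiplier diagnostics reported with the OLS fit are the usual basis for choosing between the lag and error specifications.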
Procedia PDF Downloads 102
9760 An Integrated Tailoring Method for Thermal Cycling Tests of Spacecraft Electronics
Authors: Xin-Yan Ji, Jing Wang, Chang Liu, Yan-Qiang Bi, Zhong-Xu Xu, Xi-Yuan Li
Abstract:
Thermal tests of electronic units are critically important for the reliability validation and performance demonstration of spacecraft hardware. The tailoring equation in MIL-STD-1540 is based on the fatigue of solder joints. In the present paper, a new test condition tailoring expression is proposed to fit different thermo-mechanical fatigue mechanisms and different subsystems, by introducing an integrated evaluating method for the fatigue acceleration exponent. A validation test has been accomplished, and the data have been analyzed and compared with those from the MIL-STD-1540 tailoring equations. The results are encouraging and reasonable.
Keywords: thermal cycling test, thermal fatigue, tailoring equation, test condition planning
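For context (not the paper's new expression), MIL-STD-1540-style tailoring rests on a Coffin-Manson cycle equivalence; a sketch follows, with the exponent b playing the role of the fatigue acceleration parameter the paper proposes to evaluate per mechanism and subsystem:

```python
def equivalent_cycles(n_service, dt_service, dt_test, b=2.5):
    """Coffin-Manson style test compression: N_test = N_service * (dT_service/dT_test)**b.

    b is the fatigue acceleration exponent; ~2.5 is a commonly quoted value for
    solder-joint fatigue, but the paper's point is precisely that b should be
    chosen per failure mechanism rather than fixed at the solder value.
    """
    return n_service * (dt_service / dt_test) ** b

# 1000 service cycles of 40 K swing compressed into a 100 K test swing
print(f"{equivalent_cycles(1000, 40.0, 100.0):.0f} test cycles")  # ~101
```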
Procedia PDF Downloads 460
9759 Creativity and Innovation in Postgraduate Supervision
Authors: Rajendra Chetty
Abstract:
The paper aims to address two aspects of postgraduate studies: interdisciplinary research and creative models of supervision. Interdisciplinary research can be viewed as a key imperative to solve complex problems. While excellent research requires a context of disciplinary strength, the cutting edge is often found at the intersection between disciplines. Interdisciplinary research foregrounds a team approach, and information, methodologies, designs, and theories from different disciplines are integrated to advance fundamental understanding or to solve problems whose solutions are beyond the scope of a single discipline. Our aim should also be to generate research that transcends the original disciplines, i.e. transdisciplinary research. Complexity is characteristic of the knowledge economy; hence, postgraduate research and engaged scholarship should be viewed by universities as primary vehicles through which knowledge can be generated to have a meaningful impact on society. There are far too many 'ordinary' studies that fall into the realm of credentialism and certification, as opposed to significant studies that generate new knowledge and provide a trajectory for further academic discourse. Secondly, the paper looks at models of supervision that differ from the dominant 'apprentice' or individual approach. A reflective practitioner approach is used to discuss a range of supervision models that resonate well with the principles of interdisciplinarity, growth in the postgraduate sector and a commitment to engaged scholarship. The global demand for postgraduate education has resulted in increased intake and new demands on limited supervision capacity at institutions. Team supervision lodged within large-scale research projects, working with a cohort of students within a research theme, the journal-article route to doctoral studies and the professional PhD are some of the models that provide an alternative to the traditional approach. International cooperation should be encouraged in the production of high-impact research, and institutions should be committed to stimulating international linkages, which would result in co-supervision, mobility of postgraduate students and global significance of postgraduate research. International linkages are also valuable in increasing the capacity for supervision at new and developing universities. Innovative co-supervision and joint-degree options with global partners should be explored within strategic planning for innovative postgraduate programmes. Co-supervision of PhD students is probably the strongest driver (besides funding) for collaborative research, as it provides the glue of shared interest, advantage and commitment between supervisors. The students' field serves and informs the co-supervisors' own research agendas and helps to shape overarching research themes through shared research findings.
Keywords: interdisciplinarity, internationalisation, postgraduate, supervision
Procedia PDF Downloads 238
9758 Geopotential Models Evaluation in Algeria Using Stochastic Method, GPS/Leveling and Topographic Data
Authors: M. A. Meslem
Abstract:
For precise geoid determination, a reference field is used to subtract the long- and medium-wavelength components of the gravity field from observation data when applying the remove-compute-restore technique. Therefore, a comparison study between candidate models should be made in order to select the optimal reference gravity field to be used. In this context, two recent global geopotential models have been selected to perform this comparison study over Northern Algeria: the Earth Gravitational Model (EGM2008) and the Global Gravity Model (GECO), the latter conceived as a combination of the former with the anomalous potential derived from a GOCE satellite-only global model. Free-air gravity anomalies in the area under study have been used to compute residual data using both gravity field models, with a Digital Terrain Model (DTM) used to subtract the residual terrain effect from the gravity observations. The residual data were used to generate local empirical covariance functions, fitted to their closed form, in order to compare their statistical behaviors in both cases. Finally, height anomalies were computed from both geopotential models and compared to a set of GPS/levelled points on benchmarks using least squares adjustment. The results, described in detail in this paper, point to a slight overall advantage of the GECO global model through error degree variance comparison and ground-truth evaluation.
Keywords: quasigeoid, gravity anomalies, covariance, GGM
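A minimal sketch of the ground-truth step, fitting the classical 4-parameter corrector surface to GPS/levelling-minus-model height anomalies by least squares; all coordinates and values below are invented:

```python
import numpy as np

def fit_corrector_surface(lat, lon, zeta_gps, zeta_ggm):
    """Fit dz = a0 + a1*cos(phi)cos(lam) + a2*cos(phi)sin(lam) + a3*sin(phi)
    to GPS/levelling minus model height anomalies; return parameters and the
    post-fit RMS, the usual ground-truth comparison statistic."""
    phi, lam = np.radians(lat), np.radians(lon)
    dz = zeta_gps - zeta_ggm
    a = np.column_stack([np.ones_like(phi),
                         np.cos(phi) * np.cos(lam),
                         np.cos(phi) * np.sin(lam),
                         np.sin(phi)])
    params, *_ = np.linalg.lstsq(a, dz, rcond=None)
    rms = np.sqrt(np.mean((dz - a @ params) ** 2))
    return params, rms

# Hypothetical benchmarks evaluated against two models
lat = np.array([36.1, 36.4, 35.8, 36.7, 35.5, 36.9])
lon = np.array([2.9, 3.2, 1.5, 5.1, -0.6, 7.4])
zeta_gps = np.array([46.32, 46.10, 45.88, 46.55, 45.60, 46.91])
for name, zeta in [("EGM2008", np.array([46.41, 46.22, 45.95, 46.70, 45.77, 46.99])),
                   ("GECO",    np.array([46.38, 46.17, 45.93, 46.63, 45.70, 46.97]))]:
    print(name, "RMS after fit: %.3f m" % fit_corrector_surface(lat, lon, zeta_gps, zeta)[1])
```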
Procedia PDF Downloads 137
9757 Influence of Readability of Paper-Based Braille on Vertical and Horizontal Dot Spacing in Braille Beginners
Authors: K. Doi, T. Nishimura, H. Fujimoto
Abstract:
The number of people who become visually impaired without sufficient tactile experience has increased owing to various diseases. In particular, many persons with visual impairment acquired through accidents, disorders, or aging cannot adequately read Braille. It is known that learning Braille requires a great deal of time and the acquisition of various skills. In our previous studies, we reported one of the problems in learning Braille: concretely, the standard Braille size is too small for Braille beginners, and objective data on easily readable Braille sizes are lacking. Therefore, it is necessary to conduct various experiments to evaluate Braille sizes that would make learning easier for beginners. In this study, for the purpose of investigating easy-to-read conditions of vertical and horizontal dot spacing for beginners, we conducted a Braille reading experiment. For this experiment, we prepared test pieces using our original Braille printer, which has a Braille size control function. We specifically considered Braille beginners with acquired visual impairments who were unfamiliar with Braille; therefore, ten sighted subjects with no experience of reading Braille participated. The vertical and horizontal dot spacings tested were 2.0, 2.3, 2.5, 2.7, 2.9 and 3.1 mm. The subjects were asked to read one Braille character at each controlled Braille size. The results of this experiment reveal that Braille beginners can read Braille accurately and quickly when both vertical and horizontal dot spacing are 3.1 mm or more. This knowledge will be helpful in considering Braille sizes for persons with acquired visual impairment.
Keywords: paper-based Braille, vertical and horizontal dot spacing, readability, acquired visual impairment, Braille beginner
Procedia PDF Downloads 178
9756 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction
Authors: Radul Shishkov, Orlin Davchev
Abstract:
The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data (a collection of data points in space, typically produced by 3D scanners) into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results showcase a substantial reduction in the time and resources required for BIM model generation while maintaining high levels of accuracy and detail. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.
Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction
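One plausible building block for the wall-extraction stage (an assumption on our part, not the authors' published pipeline) is iterative RANSAC plane segmentation, sketched here with Open3D; the input file name is hypothetical:

```python
import open3d as o3d

# Load a scan (hypothetical file) and iteratively peel off planar patches with
# RANSAC; large near-vertical planes are wall candidates for the BIM model.
pcd = o3d.io.read_point_cloud("building_scan.ply")
pcd = pcd.voxel_down_sample(voxel_size=0.02)  # part of the data-cleaning stage

rest = pcd
for i in range(6):  # extract up to six dominant planes
    model, inliers = rest.segment_plane(distance_threshold=0.02,
                                        ransac_n=3,
                                        num_iterations=1000)
    a, b, c, d = model
    patch = rest.select_by_index(inliers)
    rest = rest.select_by_index(inliers, invert=True)
    # |c| ~ 0 means the plane normal is horizontal, i.e. a wall candidate;
    # |c| ~ 1 suggests a floor or ceiling.
    kind = "wall?" if abs(c) < 0.2 else "floor/ceiling?"
    print(f"plane {i}: {a:.2f}x+{b:.2f}y+{c:.2f}z+{d:.2f}=0, {len(inliers)} pts, {kind}")
```

Each accepted patch would then be fitted with a parametric wall object carrying thickness and material attributes, which is where the parametric-modeling step described in the abstract takes over.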
Procedia PDF Downloads 63
9755 Multidimensional Poverty and Child Cognitive Development
Authors: Bidyadhar Dehury, Sanjay Kumar Mohanty
Abstract:
According to the Right to Education Act of India, education is the fundamental right of all children aged 6-14 years, irrespective of their status. Using unit-level data from the India Human Development Survey (IHDS), we examined the interrelationship between the level of poverty and the academic performance of children aged 8-11 years. The level of multidimensional poverty is measured using five dimensions and ten indicators following the Alkire-Foster approach. The weighted deprivation score was obtained by giving equal weight to each dimension and to each indicator within a dimension. The weighted deprivation score varies from 0 to 1 and is grouped into four categories: non-poor, vulnerable, multidimensionally poor and severely multidimensionally poor. The academic performance index was measured from three variables (reading, math and writing skills) using PCA. Bivariate and multivariate analyses were used. As the outcome variable was ordinal, predicted probabilities were calculated using ordinal logistic regression. The predicted probability of a good academic performance index was 0.202 if the child was severely multidimensionally poor, 0.235 if multidimensionally poor, 0.264 if vulnerable, and 0.316 if non-poor. Hence, as the level of poverty among children decreases from severely multidimensionally poor to non-poor, the probability of good academic performance increases.
Keywords: multidimensional poverty, academic performance index, reading skills, math skills, writing skills, India
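A minimal sketch of the Alkire-Foster scoring described here; the indicator layout and the category cutoffs (0.2 / 0.33 / 0.5, common Alkire-Foster practice) are assumptions, since the abstract does not state them:

```python
import numpy as np

# Equal nested weights: 5 dimensions, each dimension's weight split over its
# indicators. Two indicators per dimension is an illustrative layout for the
# paper's 5 dimensions / 10 indicators.
n_per_dim = [2, 2, 2, 2, 2]
weights = np.concatenate([[1 / 5 / k] * k for k in n_per_dim])  # sums to 1

def poverty_category(deprivations):
    """deprivations: 0/1 vector over 10 indicators -> (weighted score, class)."""
    score = float(weights @ deprivations)
    if score < 0.2:
        label = "non-poor"
    elif score < 1 / 3:
        label = "vulnerable"
    elif score < 0.5:
        label = "multidimensionally poor"
    else:
        label = "severely multidimensionally poor"
    return score, label

# A child deprived in 4 of 10 indicators scores 0.4 -> multidimensionally poor
print(poverty_category(np.array([1, 0, 1, 1, 0, 0, 1, 0, 0, 0])))
```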
Procedia PDF Downloads 594
9754 Plant Identification Using Convolution Neural Network and Vision Transformer-Based Models
Authors: Virender Singh, Mathew Rees, Simon Hampton, Sivaram Annadurai
Abstract:
Plant identification is a challenging task that aims to identify the family, genus, and species according to plant morphological features. Automated deep learning-based computer vision algorithms are widely used for identifying plants and can help users narrow down the possibilities. However, numerous morphological similarities between and within species render correct classification difficult. In this paper, we tested custom convolutional neural network (CNN) and vision transformer (ViT) based models using the PyTorch framework to classify plants. We used a large dataset of 88,000 images provided by the Royal Horticultural Society (RHS) and a smaller dataset of 16,000 images from the PlantClef 2015 dataset for classifying plants at the genus and species levels, respectively. Our results show that for classifying plants at the genus level, ViT models perform better than the CNN-based models ResNet50 and ResNet-RS-420, as well as other state-of-the-art CNN-based models suggested in previous studies on a similar dataset. The ViT model achieved a top accuracy of 83.3% for classifying plants at the genus level. For classifying plants at the species level, ViT models again perform better than ResNet50 and ResNet-RS-420, with a top accuracy of 92.5%. We show that the correct set of augmentation techniques plays an important role in classification success. In conclusion, these results could help end users, professionals and the general public alike in identifying plants more quickly and with improved accuracy.
Keywords: plant identification, CNN, image processing, vision transformer, classification
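A minimal sketch of fine-tuning a ViT classifier head in PyTorch, in the spirit of the genus-level experiments; the class count and hyperparameters are assumptions, not the paper's settings:

```python
import torch
import torch.nn as nn
from torchvision.models import vit_b_16, ViT_B_16_Weights

NUM_GENERA = 500  # hypothetical number of genus classes

weights = ViT_B_16_Weights.IMAGENET1K_V1
model = vit_b_16(weights=weights)
# Swap the ImageNet classifier head for the plant-genus head
model.heads.head = nn.Linear(model.heads.head.in_features, NUM_GENERA)

preprocess = weights.transforms()  # resize/normalise exactly as pretraining did
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5, weight_decay=0.01)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_GENERA, (8,))
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
optimizer.zero_grad()
print(float(loss))
```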
Procedia PDF Downloads 104
9753 Testing of Canadian Integrated Healthcare and Social Services Initiatives with an Evidence-Based Case Definition for Healthcare and Social Services Integrations
Authors: S. Cheng, C. Catallo
Abstract:
Introduction: Canada's healthcare and social services systems are failing high-risk, vulnerable older adults. Care for vulnerable older Canadians (65 and older) is not optimal in Canada: it does not address the care needs of vulnerable, high-risk adults using a holistic approach. Given the growing aging population, and the fact that the care needs of seniors with complex conditions are among the highest in Canada's health care system, there is a sense of urgency to optimize care. Integration of health and social services is an emerging trend in Canada when compared to European countries, and there is no common and universal understanding of healthcare and social services integration within the country. Consequently, a clear understanding and definition of integrated health and social services are absent in Canada. Objectives: A study was undertaken to develop a case definition for integrated health and social care initiatives that serve older adults, which was then tested against three Canadian integrated initiatives. Methodology: A limited literature review, comprising both scientific and grey literature, was undertaken to identify common characteristics of integrated health and social care initiatives that serve older adults, in order to develop a case definition. Three Canadian integrated initiatives located in the province of Ontario were identified using an online search and a screening process. They were surveyed to determine whether the literature-based integration definition applied to them. Results: The literature showed 24 common healthcare and social services integration characteristics that could be categorized into ten themes: 1) patient-care approach; 2) program goals; 3) measurement; 4) service and care quality; 5) accountability and responsibility; 6) information sharing; 7) decision-making and problem-solving; 8) culture; 9) leadership; and 10) staff and professional interaction. The three initiatives showed agreement on all the integration characteristics except those associated with healthcare and social care professional interaction, collaborative leadership and shared culture. This disagreement may be due to several reasons, including the existing governance divide between the healthcare and social services sectors within the province of Ontario, which has created a ripple effect in how professionals in the two sectors interact. In addition, the three initiatives may be at differing levels of integration maturity, which may explain disagreement on the characteristics associated with leadership and culture. Conclusions: The development of a case definition for healthcare and social services integration that incorporates common integration characteristics can act as a useful instrument for identifying integrated healthcare and social services, particularly given the emerging and evolutionary state of this phenomenon within Canada.
Keywords: Canada, case definition, healthcare and social services integration, integration, seniors health, services delivery
Procedia PDF Downloads 155
9752 Listening Anxiety in Iranian EFL Learners
Authors: Samaneh serraj
Abstract:
Listening anxiety has a detrimental effect on language learners. Through a qualitative study of Iranian EFL learners, several factors were identified as influencing their listening anxiety. These factors were divided into three categories: individual factors (nerves and emotionality, use of inappropriate strategies, and lack of practice), input factors (lack of time to process, lack of visual support, nature of speech, and level of difficulty) and environmental factors (instructors, peers and class environment).
Keywords: listening comprehension, listening anxiety, foreign language learners
Procedia PDF Downloads 470
9751 A Computational Approach for the Prediction of Relevant Olfactory Receptors in Insects
Authors: Zaide Montes Ortiz, Jorge Alberto Molina, Alejandro Reyes
Abstract:
Insects are extremely successful organisms, and a sophisticated olfactory system is in part responsible for their survival and reproduction. The detection of volatile organic compounds can positively or negatively affect many behaviors in insects. Compounds such as carbon dioxide (CO2), ammonium, indole, and lactic acid are essential for many species of mosquitoes, like Anopheles gambiae, in order to locate vertebrate hosts. For instance, in A. gambiae, the olfactory receptor AgOR2 is strongly activated by indole, which accounts for almost 30% of human sweat. On the other hand, in some insects of agricultural importance, the detection and identification of pheromone receptors (PRs) in lepidopteran species has become a promising field for integrated pest management. For example, with the disruption of the pheromone receptor BmOR1, mediated by transcription activator-like effector nucleases (TALENs), the sensitivity to bombykol was completely removed, affecting the pheromone-source searching behavior in male moths. The detection and identification of olfactory receptors in the genomes of insects are therefore fundamental to improving our understanding of ecological interactions and to providing alternatives for integrated pest and vector management. Hence, the objective of this study is to propose a bioinformatic workflow to enhance the detection and identification of potential olfactory receptors in the genomes of relevant insects. Applying hidden Markov models (HMMs) and different computational tools, potential candidates for pheromone receptors in Tuta absoluta were obtained, as well as potential carbon dioxide receptors in Rhodnius prolixus, the main vector of Chagas disease. This study showed the validity of a bioinformatic workflow with the potential to improve the identification of certain olfactory receptors in different orders of insects.
Keywords: bioinformatic workflow, insects, olfactory receptors, protein prediction
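One plausible core step of such a workflow (an assumption; the abstract does not list its exact commands) is profile-HMM search with HMMER, sketched below; the profile and proteome file names are hypothetical:

```python
import subprocess
from pathlib import Path

def search_receptors(hmm_profile: str, proteome_fasta: str, out_tbl: str,
                     evalue: float = 1e-5):
    """Run HMMER's hmmsearch with an olfactory-receptor profile HMM (e.g. built
    from aligned OR sequences) against a predicted proteome, then parse hits."""
    subprocess.run(
        ["hmmsearch", "--tblout", out_tbl, "-E", str(evalue),
         hmm_profile, proteome_fasta],
        check=True, capture_output=True,
    )
    hits = []
    for line in Path(out_tbl).read_text().splitlines():
        if line.startswith("#"):
            continue
        fields = line.split()
        hits.append((fields[0], float(fields[4])))  # target name, full-seq E-value
    return hits

# Hypothetical inputs: an odorant-receptor profile vs. the R. prolixus proteome
for name, ev in search_receptors("or_profile.hmm", "rprolixus_proteins.fasta",
                                 "hits.tbl"):
    print(name, ev)
```

Candidate hits would then be filtered by downstream checks (e.g. transmembrane-topology prediction) before being proposed as receptor candidates.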
Procedia PDF Downloads 149
9750 Sensitivity and Uncertainty Analysis of One Dimensional Shape Memory Alloy Constitutive Models
Authors: A. B. M. Rezaul Islam, Ernur Karadogan
Abstract:
Shape memory alloys (SMAs) are known for their shape memory effect and pseudoelastic behavior. Their thermomechanical behavior has been modeled by numerous researchers from both microscopic thermodynamic and macroscopic phenomenological points of view. The Tanaka, Liang-Rogers and Ivshin-Pence models are some of the most popular SMA macroscopic phenomenological constitutive models; they describe SMA behavior in terms of stress, strain and temperature. These models involve material parameters that carry associated uncertainty. At different operating temperatures, this uncertainty propagates to the output when the material is subjected to loading followed by unloading, and its propagation when these models are utilized in real-life applications can result in performance discrepancies or failure at extreme conditions. To address this, we used a probabilistic approach to perform the sensitivity and uncertainty analysis of the Tanaka, Liang-Rogers, and Ivshin-Pence models. The Sobol and extended Fourier Amplitude Sensitivity Testing (eFAST) methods were used to perform the sensitivity analysis for simulated isothermal loading/unloading at various operating temperatures. The results show that the models vary with changes in operating temperature and loading condition. The average and stress-dependent sensitivity indices identify the most significant parameters at several temperatures. This work highlights the sensitivity and uncertainty analysis results and compares them across temperatures and loading conditions for all these models. The analysis presented will aid in designing engineering applications by reducing the probability of model failure due to uncertainty in the input parameters. Thus, a proper understanding of the sensitive parameters and of uncertainty propagation at several operating temperatures and loading conditions is recommended for the Tanaka, Liang-Rogers, and Ivshin-Pence models.
Keywords: constitutive models, FAST sensitivity analysis, sensitivity analysis, Sobol, shape memory alloy, uncertainty analysis
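A minimal sketch of the Sobol and eFAST analyses with the SALib library; the parameter names, bounds and the stand-in model response are assumptions, to be replaced by the actual constitutive-model simulation:

```python
import numpy as np
from SALib.sample import saltelli, fast_sampler
from SALib.analyze import sobol, fast

# Illustrative parameter set for a Tanaka-type model (names/bounds assumed)
problem = {
    "num_vars": 4,
    "names": ["M_s", "A_s", "C_M", "C_A"],  # transformation temps, stress slopes
    "bounds": [[250, 290], [290, 330], [6e6, 10e6], [6e6, 10e6]],
}

def model_output(x):
    """Stand-in scalar response (e.g. recovered strain); replace with the actual
    Tanaka / Liang-Rogers / Ivshin-Pence simulation at a given temperature."""
    m_s, a_s, c_m, c_a = x
    return (a_s - m_s) * 1e-4 + 0.02 * np.tanh(c_m / c_a)

# Sobol first-order indices
X = saltelli.sample(problem, 1024)
Y = np.apply_along_axis(model_output, 1, X)
print("Sobol S1:", sobol.analyze(problem, Y)["S1"])

# eFAST first-order indices
Xf = fast_sampler.sample(problem, 1024)
Yf = np.apply_along_axis(model_output, 1, Xf)
print("eFAST S1:", fast.analyze(problem, Yf)["S1"])
```

Repeating the analysis at several operating temperatures, as the abstract describes, simply means re-running the loop with the temperature fixed at each value of interest.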
Procedia PDF Downloads 144
9749 Presenting an Integrated Framework for the Introduction and Evaluation of Social Media in Enterprises
Authors: Gerhard Peter
Abstract:
In this paper, we present an integrated framework that governs the introduction of social media into enterprises and its evaluation. It is argued that the framework should address the following issues: (1) the contribution of social media to increasing efficiency and improving the quality of working life; (2) the level on which this contribution happens (i.e., individual, team, or organisation); (3) a description of the processes for implementing and evaluating social media; and the role of (4) organisational culture and (5) management. We also report the results of a case study where the framework has been employed to introduce a social networking platform at a German enterprise. This paper only considers the internal use of social media.
Keywords: case study, enterprise 2.0, framework, introducing and evaluating social media, social media
Procedia PDF Downloads 368
9748 Price Prediction Line, Investment Signals and Limit Conditions Applied for the German Financial Market
Authors: Cristian Păuna
Abstract:
In the first decades of the 21st century, in the electronic trading environment, algorithmic capital investments became the primary tool for making a profit through speculation in financial markets. A significant number of traders, private or institutional investors, participate in the capital markets every day using automated algorithms, and autonomous trading software is today a considerable part of the business intelligence system of any modern financial activity. Trading decisions and orders are made automatically by computers using different mathematical models. This paper presents one of these models, called the Price Prediction Line. A mathematical algorithm is revealed to build a reliable trend line, which is the base for limit conditions and automated investment signals and the core of a computerized investment system. The paper shows how to apply these tools to generate entry and exit investment signals and limit conditions that build a mathematical filter for investment opportunities, and presents the methodology to integrate all of these into automated investment software. The paper also presents trading results obtained on the leading German financial market index with the presented methods, in order to analyze and compare different automated investment algorithms. It was found that a specific mathematical algorithm can be optimized and integrated into an automated trading system with good and sustained results for the leading German market. Investment results are compared in order to qualify the presented model. In conclusion, a risk-to-reward ratio of 1:6.12 was obtained by applying the trigonometric method to the DAX Deutscher Aktienindex over a 24-month investment period. These results are superior to those obtained with other similar models, as this paper reveals. The general idea sustained by this paper is that the Price Prediction Line model presented is a reliable capital investment methodology that can be successfully applied to build an automated investment system with excellent results.
Keywords: algorithmic trading, automated trading systems, high-frequency trading, DAX Deutscher Aktienindex
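The paper's specific trend-line construction (the trigonometric method) is defined in the paper itself; as a generic, hypothetical stand-in, the sketch below fits a least-squares trend line and applies simple limit conditions for entry signals:

```python
import numpy as np

def trend_line_signals(closes, window=100, band=0.01):
    """Fit a least-squares trend line over the last `window` closes and emit a
    signal when price crosses a band around the line. This is NOT the paper's
    Price Prediction Line; it only illustrates the trend-line-plus-limit-
    condition structure the abstract describes."""
    t = np.arange(window)
    slope, intercept = np.polyfit(t, closes[-window:], 1)
    line_now = slope * (window - 1) + intercept
    price = closes[-1]
    if price < line_now * (1 - band) and slope > 0:   # dip in an uptrend
        return "buy", line_now
    if price > line_now * (1 + band) and slope < 0:   # spike in a downtrend
        return "sell", line_now
    return "hold", line_now

rng = np.random.default_rng(7)
closes = 13000 + np.cumsum(rng.normal(2, 30, 500))    # synthetic DAX-like series
print(trend_line_signals(closes))
```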
Procedia PDF Downloads 130
9747 The Benefits of End-To-End Integrated Planning from the Mine to Client Supply for Minimizing Penalties
Authors: G. Martino, F. Silva, E. Marchal
Abstract:
Control over the characteristics of the delivered iron ore blend is one of the most important aspects of the mining business. The iron ore price is a function of its composition, which is the outcome of the beneficiation process, so end-to-end integrated planning of mine operations can reduce the risk of penalties on the iron ore price. In a standard iron mining company, the production chain is composed of mining, ore beneficiation, and client supply. When mine planning and client supply decisions are made in an uncoordinated way, the beneficiation plant struggles to deliver the best possible blend. Technological improvements in several fields have allowed bridging the gap between departments and boosting integrated decision-making processes. Clusterization and classification algorithms over historical production data generate reasonable predictions of the quality and volume of iron ore produced for each pile of run-of-mine (ROM) processed. Mathematical modeling can use those deterministic relations to propose iron ore blends that better fit specifications within a delivery schedule. Additionally, a model capable of representing the whole production chain can clearly compare the overall impact of different decisions in the process. This study shows how flexibilization, combined with a planning optimization model spanning the mine and the ore beneficiation processes, can reduce the risk of out-of-specification deliveries. The model's capabilities are illustrated on a hypothetical iron ore mine with a magnetic separation process. Finally, this study shows ways of reducing cost or increasing profit by optimizing process indicators across the production chain and integrating the different planning stages with sales decisions.
Keywords: clusterization and classification algorithms, integrated planning, mathematical modeling, optimization, penalty minimization
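A minimal sketch of the blend-optimization idea as a linear program with PuLP; the pile data, demand and specification limits are invented:

```python
import pulp

# Hypothetical piles: available tonnage, Fe and SiO2 content (%)
piles = {"P1": (50_000, 64.5, 4.8),
         "P2": (80_000, 61.0, 6.5),
         "P3": (60_000, 58.2, 8.1)}
DEMAND, FE_MIN, SIO2_MAX = 100_000, 62.0, 6.0   # client spec for one shipment

prob = pulp.LpProblem("ore_blend", pulp.LpMaximize)
x = {p: pulp.LpVariable(f"t_{p}", 0, piles[p][0]) for p in piles}  # tonnes taken

prob += pulp.lpSum(piles[p][1] * x[p] for p in piles)                # max Fe delivered
prob += pulp.lpSum(x.values()) == DEMAND                             # shipment tonnage
prob += pulp.lpSum(piles[p][1] * x[p] for p in piles) >= FE_MIN * DEMAND    # Fe spec
prob += pulp.lpSum(piles[p][2] * x[p] for p in piles) <= SIO2_MAX * DEMAND  # silica spec

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.LpStatus[prob.status], {p: x[p].value() for p in piles})
```

In the integrated setting the abstract describes, the pile qualities themselves come from the clusterization/classification predictions, and the spec constraints carry the penalty terms from the sales contracts.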
Procedia PDF Downloads 123
9746 Measuring Environmental Efficiency of Energy in OPEC Countries
Authors: Bahram Fathi, Seyedhossein Sajadifar, Naser Khiabani
Abstract:
Data envelopment analysis (DEA) has recently gained popularity in energy efficiency analysis. A common feature of previously proposed DEA models for measuring energy efficiency performance is that they treat energy consumption as an input within a production framework, without considering undesirable outputs. However, energy use results in the generation of undesirable outputs as byproducts of producing desirable outputs. Within a joint production framework of both desirable and undesirable outputs, this paper presents several DEA-type linear programming models for measuring energy efficiency performance. In addition to considering undesirable outputs, our models treat different energy sources as different inputs, so that changes in the energy mix can be accounted for in evaluating energy efficiency. The proposed models are applied to measure the energy efficiency performance of 12 OPEC countries, and the results obtained are presented.
Keywords: energy efficiency, undesirable outputs, data envelopment analysis
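A minimal sketch of an input-oriented, CCR-type DEA program in which an undesirable output is contracted alongside the inputs; this is one common simplification, not necessarily the joint-production formulation used in the paper, and the toy data are invented:

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, B, k):
    """Input-oriented CCR efficiency of DMU k.
    X: (m, n) energy inputs, Y: (s, n) desirable outputs, B: (u, n) undesirable
    outputs, n = number of DMUs. Minimise theta subject to the envelopment
    constraints; the undesirable rows are contracted with theta like inputs."""
    m, n = X.shape
    c = np.zeros(1 + n)
    c[0] = 1.0                                        # variables: [theta, lambdas]
    A_ub, b_ub = [], []
    for i in range(m):                                # sum(l*x) <= theta*x_k
        A_ub.append(np.r_[-X[i, k], X[i]]); b_ub.append(0.0)
    for r in range(Y.shape[0]):                       # sum(l*y) >= y_k
        A_ub.append(np.r_[0.0, -Y[r]]); b_ub.append(-Y[r, k])
    for u in range(B.shape[0]):                       # sum(l*b) <= theta*b_k
        A_ub.append(np.r_[-B[u, k], B[u]]); b_ub.append(0.0)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]

# Toy data: 4 countries, inputs = (oil, gas use), output = GDP, undesirable = CO2
X = np.array([[5., 7., 6., 9.], [3., 4., 2., 6.]])
Y = np.array([[40., 55., 38., 60.]])
B = np.array([[20., 26., 18., 35.]])
print([round(dea_efficiency(X, Y, B, k), 3) for k in range(4)])
```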
Procedia PDF Downloads 736
9745 Enhancing Model Interoperability and Reuse by Designing and Developing a Unified Metamodel Standard
Authors: Arash Gharibi
Abstract:
Mankind has always used models to solve problems. Essentially, models are simplified versions of reality, whose need stems from having to deal with complexity; many processes or phenomena are too complex to be described completely. Thus a fundamental model requirement is that it contains the characteristic features that are essential in the context of the problem to be solved or described. Models are used in virtually every scientific domain to deal with various problems, and during recent decades the number of models has increased exponentially. Publication of models as part of original research has traditionally been in scientific periodicals, series, monographs, agency reports, national journals and laboratory reports, which makes it difficult for interested groups and communities to stay informed about the state of the art. During the modeling process, many important decisions are made which impact the final form of the model. Without a record of these considerations, the final model remains ill-defined and open to varying interpretations. Unfortunately, the details of these considerations are often lost or, where information about a model does exist, it is likely to be written intuitively in different layouts and in different degrees of detail. In order to overcome these issues, different domains have attempted to implement their own approaches to preserving model information in the form of model documentation. The most frequently cited model documentation approaches are domain-specific and not applicable to existing models, and evolutionary flexibility and intrinsic corrections and improvements are not possible with them. These issues all stem from a lack of unified standards for model documentation. As a way forward, this research proposes a new standard for capturing and managing model information in a unified way, so that interoperability and reusability of models become possible. The standard will also be evolutionary, meaning members of the modeling realm can contribute to its ongoing development and improvement. In this paper, three of the most common metamodels are reviewed and, based on the pros and cons of each, a new metamodel is proposed.
Keywords: metamodel, modeling, interoperability, reuse
Procedia PDF Downloads 198
9744 Implied Adjusted Volatility by Leland Option Pricing Models: Evidence from Australian Index Options
Authors: Mimi Hafizah Abdullah, Hanani Farhah Harun, Nik Ruzni Nik Idris
Abstract:
Given that implied volatility is an important factor in financial decision-making, in particular in option pricing and valuation, and that the pricing biases of the Leland option pricing models and the implied volatility structure of the options are related, this study examines the implied adjusted volatility smile patterns and term structures in S&P/ASX 200 index options using the different Leland option pricing models. The examination of the implied adjusted volatility smiles and term structures in the Australian index options market covers the global financial crisis that began in mid-2007. The implied adjusted volatility was found to escalate to approximately triple its pre-crisis rate.
Keywords: implied adjusted volatility, financial crisis, Leland option pricing models, Australian index options
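For reference, the Leland (1985) volatility adjustment and its inversion for an implied adjusted volatility can be sketched as follows; the option quote and transaction-cost parameters are invented:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def leland_sigma(sigma, k, dt):
    """Leland (1985) adjusted volatility: sigma_L^2 = sigma^2 * (1 + LN), with
    Leland number LN = sqrt(2/pi) * k / (sigma * sqrt(dt)); k is the round-trip
    proportional transaction cost and dt the rebalancing interval in years."""
    ln = np.sqrt(2 / np.pi) * k / (sigma * np.sqrt(dt))
    return sigma * np.sqrt(1 + ln)

def bs_call(s, x, t, r, sigma):
    d1 = (np.log(s / x) + (r + 0.5 * sigma**2) * t) / (sigma * np.sqrt(t))
    d2 = d1 - sigma * np.sqrt(t)
    return s * norm.cdf(d1) - x * np.exp(-r * t) * norm.cdf(d2)

def implied_adjusted_vol(price, s, x, t, r, k, dt):
    """Invert the Leland-modified Black-Scholes price for the base volatility."""
    return brentq(lambda sig: bs_call(s, x, t, r, leland_sigma(sig, k, dt)) - price,
                  1e-4, 3.0)

# Hypothetical S&P/ASX 200 index option quote
print(implied_adjusted_vol(price=180.0, s=5000.0, x=5000.0, t=0.25,
                           r=0.05, k=0.005, dt=1 / 52))
```

Repeating this inversion across strikes and maturities traces out exactly the smile patterns and term structures that the study compares before and after the crisis.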
Procedia PDF Downloads 379
9743 Evaluation of Environmental, Technical, and Economic Indicators of a Fused Deposition Modeling Process
Authors: M. Yosofi, S. Ezeddini, A. Ollivier, V. Lavaste, C. Mayousse
Abstract:
Additive manufacturing processes have developed significantly in a wide range of industries, and their application has progressed from rapid prototyping to the production of end-use products. However, their environmental impact is still a rather open question. In order to support the growth of this technology in the industrial sector, environmental aspects should be considered, and predictive models may help monitor and reduce the environmental footprint of the processes. This work presents predictive models based on a previously developed methodology for environmental impact evaluation, combined with a technical and economic assessment. Here we applied the methodology to the Fused Deposition Modeling process. First, we present the predictive models for different types of machines. Then, we present a decision-making tool designed to identify the optimum manufacturing strategy with regard to technical, economic, and environmental criteria.
Keywords: additive manufacturing, decision-making, environmental impact, predictive models
Procedia PDF Downloads 131
9742 Lean Healthcare: Barriers and Enablers in the Colombian Context
Authors: Erika Ruiz, Nestor Ortiz
Abstract:
The lean philosophy has evolved over time and has been implemented both in manufacturing and in services; more recently, lean has been integrated into companies in the health sector. It is now important to understand how to implement this philosophy successfully and to identify the barriers and enablers to the sustainability of lean healthcare. The main purpose of this research is to identify the barriers and enablers in the implementation of lean healthcare, based on case studies of Colombian healthcare centers. In order to do so, we conducted semi-structured interviews based on a maturity model. The main results indicate that the success of lean implementation depends on its adaptation to contextual factors. In addition, in the Colombian context, new factors were identified, such as organizational culture, management models, integration of the care and administrative departments, and the triple helix relationship.
Keywords: barriers, enablers, implementation, lean healthcare, sustainability
Procedia PDF Downloads 366
9741 Leveraging Unannotated Data to Improve Question Answering for French Contract Analysis
Authors: Touila Ahmed, Elie Louis, Hamza Gharbi
Abstract:
State-of-the-art question answering models have recently shown impressive performance, especially in a zero-shot setting. This approach is particularly useful when confronted with a highly diverse domain such as the legal field, in which it is increasingly difficult to have a dataset covering every notion and concept. In this work, we propose a flexible generative question answering approach to contract analysis, as well as a weakly supervised procedure to leverage unannotated data and boost our models' performance in general, and their zero-shot performance in particular.
Keywords: question answering, contract analysis, zero-shot, natural language processing, generative models, self-supervision
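A minimal zero-shot baseline sketch with the Hugging Face transformers pipeline; the model checkpoint name is an assumption (any French or multilingual QA checkpoint would serve), and the paper's contribution, a generative reader plus weak supervision, sits on top of this kind of baseline:

```python
from transformers import pipeline

# Extractive French QA as a zero-shot baseline; checkpoint name is assumed
qa = pipeline("question-answering",
              model="etalab-ia/camembert-base-squadFR-fquad-piaf")

contrat = ("Le présent contrat est conclu pour une durée de trois ans "
           "à compter du 1er janvier 2020. Il peut être résilié par "
           "chaque partie avec un préavis de six mois.")

for question in ["Quelle est la durée du contrat ?",
                 "Quel est le préavis de résiliation ?"]:
    res = qa(question=question, context=contrat)
    print(question, "->", res["answer"], f"(score {res['score']:.2f})")
```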
Procedia PDF Downloads 194
9740 Dow Polyols near Infrared Chemometric Model Reduction Based on Clustering: Reducing Thirty Global Hydroxyl Number (OH) Models to Less Than Five
Authors: Wendy Flory, Kazi Czarnecki, Matthijs Mercy, Mark Joswiak, Mary Beth Seasholtz
Abstract:
Polyurethane materials are present in a wide range of industrial segments such as furniture, building and construction, composites, automotive, electronics, and more. Dow is one of the leaders in the manufacture of the two main raw materials, isocyanates and polyols, used to produce polyurethane products, and is also a key player in the manufacture of polyurethane systems/formulations designed for targeted applications. In 1990, the first analytical chemometric models were developed and deployed for use in the Dow QC labs of the polyols business for the quantification of OH, water, cloud point, and viscosity. Over the years many models have been added; there are now over 140 models for quantification and hundreds for product identification, too many to support reasonably. There are 29 global models alone for the quantification of OH across more than 70 products at many sites. An attempt was made to consolidate these into a single model. While the consolidated model showed good statistics across the entire range of OH, several products exhibited a bias by ASTM E1655 upon individual product validation. This project summary shows the strategy for global model updates for OH, reducing the number of quantification models from over 140 to 5 or fewer using chemometric methods. In order to gain an understanding of the best product groupings, we identify clusters by reducing spectra to a few dimensions via Principal Component Analysis (PCA) and Uniform Manifold Approximation and Projection (UMAP). Results from these cluster analyses and a separate validation set allowed Dow to reduce the number of models for predicting OH from 29 to 3 without loss of accuracy.
Keywords: hydroxyl, global model, model maintenance, near infrared, polyol
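A minimal sketch of the PCA-then-UMAP clustering step on spectra; the synthetic three-group data merely stand in for a real NIR spectra matrix (rows = samples, columns = wavelengths):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
import umap  # umap-learn

# Synthetic stand-in for NIR spectra: three latent product groups
rng = np.random.default_rng(0)
base = np.sin(np.linspace(0, 6, 700))
spectra = np.vstack([base * s + rng.normal(0, 0.01, 700)
                     for s in rng.choice([0.8, 1.0, 1.2], 300)])

X = StandardScaler().fit_transform(spectra)
scores = PCA(n_components=10).fit_transform(X)        # denoise to a few dimensions
embedding = umap.UMAP(n_neighbors=15, min_dist=0.1,
                      random_state=0).fit_transform(scores)

# Candidate product groupings -> one consolidated OH model per cluster
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(embedding)
print(np.bincount(labels))
```

Each resulting cluster defines a candidate product grouping, which would then be validated against a held-out set per ASTM E1655 before a consolidated model is accepted.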
Procedia PDF Downloads 135
9739 Benchmarking Machine Learning Approaches for Forecasting Hotel Revenue
Authors: Rachel Y. Zhang, Christopher K. Anderson
Abstract:
A critical aspect of revenue management is a firm's ability to predict demand as a function of price. Historically, hotels have used simple time series models (regression and/or pick-up based models) owing to the complexity of trying to build causal models of demand. Machine learning approaches are slowly attracting attention owing to their flexibility in modeling relationships. This study provides an overview of approaches to forecasting hospitality demand, focusing on the opportunities created by machine learning approaches, including k-nearest neighbors, support vector machine, regression tree, and artificial neural network algorithms. The out-of-sample performance of the above approaches to forecasting hotel demand is illustrated using a proprietary sample of market-level (24 properties) transactional data for Las Vegas, NV. Causal predictive models can be built and evaluated owing to the availability of market-level (versus firm-level) data. This research also compares and contrasts the accuracy of firm-level models (i.e., predictive models for hotel A using only hotel A's data) with models using market-level data (prices, review scores, location, chain scale, etc. for all hotels within the market). The prospective models will be valuable for hotel revenue prediction given the basic characteristics of a hotel property, or can be applied in performance evaluation for an existing hotel. The findings will unveil the features that play key roles in a hotel's revenue performance, which would have considerable potential usefulness in both revenue prediction and evaluation.
Keywords: hotel revenue, k-nearest-neighbors, machine learning, neural network, prediction model, regression tree, support vector machine
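A minimal sketch of benchmarking the four named algorithm families with scikit-learn on synthetic price/demand data; the data-generating process and hyperparameters are assumptions:

```python
import numpy as np
from sklearn.model_selection import cross_val_score, TimeSeriesSplit
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 600  # hypothetical daily observations for one market
X = np.column_stack([rng.uniform(80, 300, n),    # room rate
                     rng.uniform(3, 5, n),       # review score
                     rng.integers(0, 7, n)])     # day of week
demand = (500 - 1.4 * X[:, 0] + 40 * X[:, 1]
          + 15 * (X[:, 2] >= 5) + rng.normal(0, 25, n))

models = {
    "kNN":  make_pipeline(StandardScaler(), KNeighborsRegressor(n_neighbors=15)),
    "SVR":  make_pipeline(StandardScaler(), SVR(C=10.0)),
    "tree": DecisionTreeRegressor(max_depth=6),
    "ANN":  make_pipeline(StandardScaler(), MLPRegressor((32, 16), max_iter=2000)),
}
cv = TimeSeriesSplit(n_splits=5)  # respect time ordering for out-of-sample tests
for name, m in models.items():
    scores = cross_val_score(m, X, demand, cv=cv,
                             scoring="neg_mean_absolute_error")
    print(f"{name}: MAE = {-scores.mean():.1f}")
```

Swapping the firm-level feature set for the market-level one (all hotels' prices, review scores, and so on) and re-running the same loop reproduces the firm-versus-market comparison the abstract describes.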
Procedia PDF Downloads 133
9738 Text Similarity in Vector Space Models: A Comparative Study
Authors: Omid Shahmirzadi, Adam Lugowski, Kenneth Younge
Abstract:
Automatic measurement of semantic text similarity is an important task in natural language processing. In this paper, we evaluate the performance of different vector space models on this task. We address the real-world problem of modeling patent-to-patent similarity and compare TFIDF (and related extensions), topic models (e.g., latent semantic indexing), and neural models (e.g., paragraph vectors). Contrary to expectations, the added computational cost of text embedding methods is justified only when: 1) the target text is condensed; and 2) the similarity comparison is trivial. TFIDF performs surprisingly well otherwise, in particular for longer and more technical texts or for making finer-grained distinctions between nearest neighbors. Unexpectedly, extensions to the TFIDF method, such as adding noun phrases or calculating term weights incrementally, were not helpful in our context.
Keywords: big data, patent, text embedding, text similarity, vector space model
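A minimal sketch of the TFIDF-plus-cosine baseline that the paper finds so competitive; the toy texts are invented stand-ins for patent abstracts:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy stand-ins; real patent texts are much longer, which is exactly the
# regime where the paper finds TFIDF most competitive.
patents = [
    "A lithium-ion battery electrode comprising a silicon composite anode.",
    "An anode material for lithium batteries using silicon nanoparticles.",
    "A method for brewing coffee under controlled water temperature.",
]

vec = TfidfVectorizer(ngram_range=(1, 2), sublinear_tf=True, stop_words="english")
tfidf = vec.fit_transform(patents)

sim = cosine_similarity(tfidf)
print(sim.round(2))  # patents 0 and 1 should be near neighbours; 2 is unrelated
```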
Procedia PDF Downloads 175