Search results for: Process models
19814 Empirical Modeling of Air Dried Rubberwood Drying System
Authors: S. Khamtree, T. Ratanawilai, C. Nuntadusit
Abstract:
Rubberwood is a crucial commercial timber in Southern Thailand. All processes in rubberwood production depend on the knowledge and expertise of the technicians, especially the drying process. This research aims to develop an empirical model for the drying kinetics of rubberwood. During the experiments, the temperature of the hot air and the average air flow velocity were kept at 80-100 °C and 1.75 m/s, respectively. Drying was considered complete when the moisture content of the samples fell below 12%. The drying kinetics were simulated using an empirical solver. The experimental results showed that the moisture content decreased as the drying temperature and time increased. The agreement between the empirical models and the experimental moisture ratio was assessed with three statistical parameters, R-square (R²), Root Mean Square Error (RMSE), and Chi-square (χ²). The experimental moisture ratio fitted the empirical models well. The Henderson and Pabis model showed the most suitable level of agreement, giving an excellent estimation of the moisture movement (R² = 0.9963) compared to the other models. Therefore, the empirical results are valid and can be used in future experiments.
Keywords: empirical models, rubberwood, moisture ratio, hot air drying
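As a rough sketch of the model-fitting step described in this abstract, the snippet below fits the Henderson and Pabis thin-layer model, MR = a·exp(−kt), to a drying curve and reports R², RMSE and χ²; the time and moisture-ratio values are invented for illustration and are not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Henderson and Pabis thin-layer drying model: MR = a * exp(-k * t)
def henderson_pabis(t, a, k):
    return a * np.exp(-k * t)

# Illustrative drying data (hours, moisture ratio) -- not the study's measurements
t = np.array([0, 2, 4, 8, 12, 24, 36, 48], dtype=float)
mr = np.array([1.00, 0.82, 0.68, 0.47, 0.33, 0.12, 0.05, 0.02])

params, _ = curve_fit(henderson_pabis, t, mr, p0=(1.0, 0.1))
mr_pred = henderson_pabis(t, *params)

# Goodness-of-fit statistics used in the abstract: R^2, RMSE and (reduced) chi-square
ss_res = np.sum((mr - mr_pred) ** 2)
ss_tot = np.sum((mr - mr.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
rmse = np.sqrt(ss_res / len(mr))
chi2 = ss_res / (len(mr) - len(params))

print(f"a={params[0]:.3f}, k={params[1]:.3f}, R2={r2:.4f}, RMSE={rmse:.4f}, chi2={chi2:.5f}")
```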
Procedia PDF Downloads 267
19813 Simulation of Optimal Runoff Hydrograph Using Ensemble of Radar Rainfall and Blending of Runoffs Model
Authors: Myungjin Lee, Daegun Han, Jongsung Kim, Soojun Kim, Hung Soo Kim
Abstract:
Recently, localized heavy rainfall and typhoons have occurred more frequently due to climate change, and the resulting damage is growing. Therefore, a more accurate prediction of rainfall and runoff is needed. However, gauge rainfall has limited accuracy in space. Radar rainfall captures the spatial variability of rainfall better than gauge rainfall, but it is mostly underestimated and involves uncertainty. Therefore, an ensemble of radar rainfall was simulated using an error structure to overcome this uncertainty and the limitations of gauge rainfall. The simulated ensemble was used as the input to rainfall-runoff models to obtain an ensemble of runoff hydrographs. Previous studies have discussed the accuracy of rainfall-runoff models. Even if the same input data, such as rainfall, are used for runoff analysis in the same basin, different models can give different results because of the uncertainty involved in the models. Therefore, we used two models, the SSARR model, which is a lumped model, and the Vflo model, which is a distributed model, and tried to simulate the optimum runoff considering the uncertainty of each rainfall-runoff model. The study basin is located in the Han River basin, and we obtained one integrated, optimum runoff hydrograph using blending methods such as Multi-Model Super Ensemble (MMSE), Simple Model Average (SMA), and Mean Square Error (MSE) weighting. From this study, we could confirm the accuracy of the rainfall and the rainfall-runoff models using ensemble scenarios and various rainfall-runoff models, and this result can be used to study flood control measures under climate change. Acknowledgements: This work is supported by the Korea Agency for Infrastructure Technology Advancement (KAIA) grant funded by the Ministry of Land, Infrastructure and Transport (Grant 18AWMP-B083066-05).
Keywords: radar rainfall ensemble, rainfall-runoff models, blending method, optimum runoff hydrograph
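A minimal illustration of the blending idea: weight each model's hydrograph by the inverse of its mean square error against observations and combine them (one simple rule alongside MMSE and SMA); all values below are hypothetical.

```python
import numpy as np

# Observed hydrograph and two model simulations (illustrative values, m^3/s)
obs   = np.array([ 50, 120, 340, 610, 480, 300, 180, 110], dtype=float)
ssarr = np.array([ 60, 140, 310, 560, 500, 320, 170, 100], dtype=float)  # lumped model
vflo  = np.array([ 45, 100, 360, 650, 450, 280, 190, 120], dtype=float)  # distributed model

def mse(sim, obs):
    return np.mean((sim - obs) ** 2)

# Inverse-MSE weights (one simple blending rule; MMSE/SMA are alternatives)
w = np.array([1 / mse(ssarr, obs), 1 / mse(vflo, obs)])
w /= w.sum()

blended = w[0] * ssarr + w[1] * vflo
print("weights:", np.round(w, 3))
print("blended hydrograph:", np.round(blended, 1))
```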
Procedia PDF Downloads 280
19812 Wind Fragility of Window Glass in 10-Story Apartment with Two Different Window Models
Authors: Viriyavudh Sim, WooYoung Jung
Abstract:
Damage due to high wind is not limited to load-resisting components such as beams and columns. The majority of damage is due to breaches in the building envelope, such as broken roofs, windows, and doors. In this paper, the wind fragility of window glass in a residential apartment was determined to compare two window configuration models. The Monte Carlo simulation method was used to derive damage data, and analytical fragilities were constructed. The fragility of the window system showed that windows located in the leeward wall had a higher probability of failure, especially those close to the edge of the structure. Between the two window models, Model 2 had the higher probability of failure, owing to the number of panels in this configuration.
Keywords: wind fragility, glass window, high rise building, wind disaster
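A hedged sketch of how a Monte Carlo wind-fragility curve for a glass panel could be derived: sample capacity and demand, estimate the failure probability per wind speed, and fit a lognormal fragility; all distributions and parameters are assumed for illustration only.

```python
import numpy as np
from scipy import stats
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
n_sim = 100_000

# Hypothetical glass panel capacity (Pa), lognormally distributed
cap_median, cap_beta = 2500.0, 0.25
capacity = rng.lognormal(np.log(cap_median), cap_beta, n_sim)

wind_speeds = np.arange(20, 81, 5)          # m/s
p_fail = []
for v in wind_speeds:
    # Wind pressure demand q = 0.5 * rho * Cp * V^2 with a variable pressure coefficient
    cp = rng.normal(1.2, 0.15, n_sim)
    demand = 0.5 * 1.225 * cp * v ** 2
    p_fail.append(np.mean(demand > capacity))

# Fit a lognormal CDF (the usual analytical fragility form) to the Monte Carlo points
frag = lambda v, med, beta: stats.norm.cdf(np.log(v / med) / beta)
(med, beta), _ = curve_fit(frag, wind_speeds, p_fail, p0=(50, 0.2))
print(f"fragility median = {med:.1f} m/s, log-std = {beta:.3f}")
```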
Procedia PDF Downloads 256
19811 Improving the Technology of Assembly by Use of Computer Calculations
Authors: Mariya V. Yanyukina, Michael A. Bolotov
Abstract:
Assembling accuracy is the degree of accordance between the actual values of the parameters obtained during assembly and the values specified in the assembly drawings and technical specifications. However, assembling accuracy depends not only on the quality of the production process but also on the correctness of the assembly process. Therefore, preliminary calculations of the assembly stages are carried out to verify the correspondence of the real geometric parameters to their acceptable values. In the aviation industry, most calculations involve interacting dimensional chains, which greatly complicates the task. Solving such problems requires a special approach. The purpose of this article is to improve the technology of assembly of aviation units by means of computer calculations. One practical example of an assembly unit containing an interacting dimensional chain is the turbine wheel of a gas turbine engine. The dimensional chain of the turbine wheel is formed by the geometric parameters of the disk and the set of blades. The interaction consists in the formation of two chains. The first chain is formed by the dimensions that determine the location of the grooves for the installation of the blades and the dimensions of the blade roots. The second dimensional chain is formed by the dimensions of the airfoil shroud platform. The interaction of the dimensional chains of the turbine wheel is the interdependence of the first and second chains through power circuits formed by the middle parts of the turbine blades. The need to improve the assembly technology of this unit makes the calculation of its dimensional chain timely. The task at hand contains geometric and mathematical components; therefore, its solution can be implemented following the algorithm: 1) research and analysis of production errors in the geometric parameters; 2) development of a parametric model in a CAD system; 3) creation of a set of CAD models of parts taking into account actual or generalized distributions of the errors of the geometric parameters; 4) calculation of the model in a CAE system, loading various combinations of models of parts; 5) accumulation of statistics and analysis. The main task is to pre-simulate the assembly process by calculating the interacting dimensional chains; a simplified sketch of such a calculation is given below. The article describes the approach to the solution from the point of view of mathematical statistics, implemented in the software package Matlab. Within the framework of the study, measurement data on the components of the turbine wheel, the blades and the disks, are available; as a result, it is expected that the assembly process of the unit will be optimized by solving the dimensional chains.
Keywords: accuracy, assembly, interacting dimension chains, turbine
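The statistical pre-simulation of the assembly can be illustrated with a small Monte Carlo tolerance stack-up of a dimensional chain; the component error distributions and the tolerance below are hypothetical, not the measured turbine wheel data (the study itself uses Matlab).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Hypothetical error distributions (mm) for a simplified disk-and-blade chain
groove_pos = rng.normal(0.0, 0.010, n)   # groove location error on the disk
blade_root = rng.normal(0.0, 0.015, n)   # blade root dimension error
shroud_gap = rng.normal(0.0, 0.020, n)   # shroud platform dimension error

# Closing link of the chain: the error components are assumed to sum linearly
closing_link = groove_pos + blade_root + shroud_gap

tolerance = 0.060                        # assumed acceptable closing-link tolerance (+/- mm)
p_ok = np.mean(np.abs(closing_link) <= tolerance)
print(f"share of assemblies within tolerance: {p_ok:.4f}")
print(f"closing link: mean={closing_link.mean():.4f} mm, std={closing_link.std():.4f} mm")
```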
Procedia PDF Downloads 373
19810 Crafting Robust Business Model Innovation Path with Generative Artificial Intelligence in Start-up SMEs
Authors: Ignitia Motjolopane
Abstract:
Small and medium enterprises (SMEs) play an important role in economies by contributing to economic growth and employment. In the fourth industrial revolution, the convergence of technologies and the changing nature of work have created pressures on economies globally. Generative artificial intelligence (AI) may support SMEs in exploring, exploiting, and transforming business models to align with their growth aspirations. SMEs' growth aspirations fall into four categories: subsistence, income, growth, and speculative. Subsistence-oriented firms focus on meeting basic financial obligations and show less motivation for business model innovation. SMEs focused on income, growth, and speculation are more likely to pursue business model innovation to support growth strategies. SMEs' strategic goals link to distinct business model innovation paths depending on whether SMEs are starting a new business, pursuing growth, or seeking profitability. Integrating generative artificial intelligence in start-up SME business model innovation enhances value creation, user-oriented innovation, and SMEs' ability to adapt to dynamic changes in the business environment. The existing literature may lack comprehensive frameworks and guidelines for effectively integrating generative AI in start-up reiterative business model innovation paths. This paper examines the start-up business model innovation path with generative artificial intelligence. A theoretical approach is used to examine the start-up-focused SME reiterative business model innovation path with generative AI, articulating how generative AI may be used to support SMEs to systematically and cyclically build the business model, covering most or all business model components, and to analyse and test the business model's viability throughout the process. As such, the paper explores generative AI usage in market exploration. Market exploration poses unique challenges for start-ups compared to established companies due to a lack of extensive customer data, sales history, and market knowledge. Furthermore, the paper examines the use of generative AI in developing and testing viable value propositions and business models. In addition, the paper looks into identifying and selecting partners with generative AI support. Selecting the right partners is crucial for start-ups and may significantly impact success. The paper also examines generative AI usage in choosing the right information technology, the funding process, revenue model determination, and stress testing business models. Stress testing business models validates strong and weak points by applying scenarios and evaluating the robustness of individual business model components and the interrelations between components. Thus, stress testing the business model may address these uncertainties, as misalignment between an organisation and its environment has been recognised as the leading cause of company failure. Generative AI may be used to generate business model stress-testing scenarios. The paper is expected to make a theoretical and practical contribution to theory and approaches in crafting a robust business model innovation path with generative artificial intelligence in start-up SMEs.
Keywords: business models, innovation, generative AI, small medium enterprises
Procedia PDF Downloads 70
19809 Non-Linear Causality Inference Using BAMLSS and Bi-CAM in Finance
Authors: Flora Babongo, Valerie Chavez
Abstract:
Inferring causality from observational data is one of the fundamental subjects, especially in quantitative finance. So far, most papers have analyzed additive noise models with either linearity, nonlinearity, or Gaussian noise. We fill in the gap by providing a nonlinear and non-Gaussian causal multiplicative noise model that aims to distinguish the cause from the effect using a two-step method based on Bayesian additive models for location, scale and shape (BAMLSS) and on causal additive models (CAM). We have tested our method on simulated and real data, and we reached an accuracy of 0.86 on average. As real data, we considered the causality between financial indices, such as the S&P 500, Nasdaq, CAC 40 and Nikkei, and companies' log-returns. Our results can be useful for inferring causality when the data is heteroskedastic or non-injective.
Keywords: causal inference, DAGs, BAMLSS, financial index
Procedia PDF Downloads 151
19808 RAPDAC: Role Centric Attribute Based Policy Driven Access Control Model
Authors: Jamil Ahmed
Abstract:
Access control models aim to decide whether a user should be denied or granted access to the user's requested activity. Various access control models have been established and proposed. The most prominent of these models include role-based, attribute-based, and policy-based access control models, as well as the role-centric attribute-based access control model. In this paper, a novel access control model is presented, called the "Role-centric Attribute-based Policy Driven Access Control (RAPDAC) model". RAPDAC incorporates the concept of "policy" in the role-centric attribute-based access control model. It leverages the concept of policy by precisely combining the evaluation of conditions, attributes, permissions and roles in order to authorize access. This approach allows capturing the access control policy of a real-time application in a well-defined manner. The RAPDAC model allows making access decisions at much finer granularity, as illustrated by the case study of a real-time library information system.
Keywords: authorization, access control model, role based access control, attribute based access control
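A toy sketch (not the authors' formal model) of how a role-centric, attribute-based, policy-driven authorization decision could combine roles, permissions, attributes and conditions, using the library system as the example; all policy details are assumed.

```python
from dataclasses import dataclass

@dataclass
class Request:
    role: str
    attributes: dict
    permission: str

@dataclass
class Policy:
    roles: set            # roles the policy applies to
    permissions: set      # permissions it can grant
    condition: callable   # predicate over the request attributes

def authorize(request: Request, policies: list) -> bool:
    """Grant access only if some policy matches the role, the permission and its conditions."""
    return any(
        request.role in p.roles
        and request.permission in p.permissions
        and p.condition(request.attributes)
        for p in policies
    )

# Hypothetical library policy: librarians may issue books only during opening hours
policies = [
    Policy(roles={"librarian"},
           permissions={"issue_book", "return_book"},
           condition=lambda a: 8 <= a.get("hour", 0) <= 20),
]

req = Request(role="librarian", attributes={"hour": 14}, permission="issue_book")
print(authorize(req, policies))   # True
```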
Procedia PDF Downloads 159
19807 Development of new Ecological Cleaning Process of Metal Sheets
Authors: L. M. López López, J. V. Montesdeoca Contreras, A. R. Cuji Fajardo, L. E. Garzón Muñoz, J. I. Fajardo Seminario
Abstract:
In this article, a new cleaning process for metal sheets for household appliances was developed using low-pressure cold plasma. In this context, the research analyzes the results of the plasma cleaning process of metal sheets and compares them with the pickling process to determine the efficiency of each process and the level of contamination produced. Surface cleaning was evaluated by measuring the contact angles with deionized water, diiodomethane and ethylene glycol and calculating the surface free energy by means of the theories of Fowkes and Wu. The results show that low-pressure cold plasma is very efficient both in the cleaning process and in its environmental impact.
Keywords: efficient use of plasma, ecological impact of plasma, metal sheets cleaning means, plasma cleaning process
Procedia PDF Downloads 354
19806 Case-Based Reasoning Approach for Process Planning of Internal Thread Cold Extrusion
Authors: D. Zhang, H. Y. Du, G. W. Li, J. Zeng, D. W. Zuo, Y. P. You
Abstract:
To address the difficult issue of process selection, case-based reasoning technology is applied to a computer-aided process planning system for cold form tapping of internal threads, on the basis of similarity in the process. A model is established based on the analysis of process planning. The case representation and the similarity computing method are given. A confidence degree is used to evaluate the cases. A rule-based reuse strategy is presented. The scheme is illustrated and verified by practical application. The case shows that the design results obtained with the proposed method are effective.
Keywords: case-based reasoning, internal thread, cold extrusion, process planning
Procedia PDF Downloads 510
19805 Gastronomy: The Preferred Digital Business Models and Impacts in Business Economics within Hospitality, Tourism, and Catering Sectors through Online Commerce
Authors: John Oupa Hlatshwayo
Abstract:
Background: There seem to be preferred digital business models, with varying impacts, within the hospitality, tourism and catering sub-sectors explored through online commerce, all of which are ingrained in the business economics domain. Aim: The study aims to establish whether such phenomena (digital business models) exist and, if so, to what extent within the hospitality, tourism and catering industries, respectively. Setting: This is a qualitative study conducted by exploring several (four) institutions globally through case studies. Method: This research used explanatory case studies to answer questions about 'how' or 'why' with little control by the researcher over the occurrence of events. It is qualitative research using deductive and inductive methods. Hence, a comprehensive approach to analyzing the qualitative data was attainable through immersion by reading to understand the information. Findings: The results corroborated the notion that digital business models are applicable, by and large, in business economics. Thus, the three sectors in which enterprises operate in the business economics sphere have been narrowed down, i.e., hospitality, tourism and catering; these are also referred to as triangular polygons due to the atypical nature of being 'stand-alone' yet 'sub-sectors', but there are confounding factors to consider. Conclusion: The significance of digital business models and digital transformation shows an inevitable merger between business and technology within hospitality, tourism, and catering. Contribution: Such a symbiotic relationship of business and technology, the persistent evolution of clients' interface with end-products, a forever changing market, and the current adaptation and adjustment to the 'new world order' by enterprises must be embraced constantly, without fail, by business practitioners, academics, business students, organizations and governments.
Keywords: digital business models, hospitality, tourism, catering, business economics
Procedia PDF Downloads 17
19804 Managing Uncertainty in Unmanned Aircraft System Safety Performance Requirements Compliance Process
Authors: Achim Washington, Reece Clothier, Jose Silva
Abstract:
System Safety Regulations (SSR) are a central component of the airworthiness certification of Unmanned Aircraft Systems (UAS). There is significant debate on the setting of appropriate SSR for UAS. Putting this debate aside, the challenge lies in how to apply the system safety process to UAS, which lack the data and operational heritage of conventionally piloted aircraft. The limited knowledge and lack of operational data result in uncertainty in the system safety assessment of UAS. This uncertainty can lead to incorrect compliance findings and the potential certification and operation of UAS that do not meet minimum safety performance requirements. The existing system safety assessment and compliance processes, as used for conventional piloted aviation, do not adequately account for this uncertainty, limiting the suitability of their application to UAS. This paper discusses the challenges of undertaking system safety assessments for UAS and presents current and envisaged research towards addressing these challenges. It aims to highlight the main advantages associated with adopting a risk-based framework for the System Safety Performance Requirement (SSPR) compliance process that is capable of taking into consideration the uncertainty associated with each of the outputs of the system safety assessment process. Based on this study, it is made clear that developing a framework tailored to UAS would allow for a more rational, transparent and systematic approach to decision making. This would reduce the need for conservative assumptions and take the risk posed by each UAS into consideration while determining its state of compliance with the SSR.
Keywords: Part 1309 regulations, risk models, uncertainty, unmanned aircraft systems
Procedia PDF Downloads 186
19803 Designing a Model to Increase the Flow of Circular Economy Startups Using a Systemic and Multi-Generational Approach
Authors: Luís Marques, João Rocha, Andreia Fernandes, Maria Moura, Cláudia Caseiro, Filipa Figueiredo, João Nunes
Abstract:
The implementation of circularity strategies other than recycling, such as reducing the amount of raw material and reusing or sharing existing products, remains marginal. The European Commission announced that the transition towards a more circular economy could lead to the net creation of about 700,000 jobs in Europe by 2030, through additional labour demand from recycling plants, repair services and other circular activities. Efforts to create new circular business models, based on completely circular processes as opposed to linear ones, have increased considerably in recent years. In order to create a societal circular economy transition model, it is necessary to include innovative solutions, in which startups play a key role. Early-stage startups based on new business models built around circular processes often face difficulties in creating enough impact. The StartUp Zero Program designs a model and approach to increase the flow of startups in the circular economy field, focusing on systemic decision analysis and a multi-generational approach, and using Multi-Criteria Decision Analysis to support a decision-making tool that combines the Analytical Hierarchy Process and Multi-Attribute Value Theory methods. We define principles, criteria and indicators for evaluating startup prerogatives, quantifying the evaluation process in a single result. Additionally, this entrepreneurship program, spanning 16 months, involved more than 2400 young people, from ages 14 to 23, in more than 200 interaction activities.
Keywords: circular economy, entrepreneurship, startups, multi-criteria decision analysis
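As one illustration of the decision-analysis layer, the snippet below computes Analytical Hierarchy Process priority weights from a pairwise comparison matrix and checks its consistency; the criteria and judgments are hypothetical, not the program's actual evaluation model.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three startup-evaluation criteria
# (circularity potential, team capability, market readiness) on Saaty's 1-9 scale
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# AHP priority weights = principal eigenvector of A, normalised to sum to 1
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio check (random index RI = 0.58 for a 3x3 matrix)
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
cr = ci / 0.58
print("weights:", np.round(w, 3), " consistency ratio:", round(cr, 3))
```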
Procedia PDF Downloads 105
19802 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity
Authors: Osayande Pascal Omondiagbe, Sherlock A. Licorish
Abstract:
Stack Overflow is a popular community question and answer portal used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official software programming language documentation. While tools have looked to aid developers by presenting interfaces to explore Stack Overflow, developers often face challenges searching through many possible answers to their questions, and this extends the development time. To this end, researchers have provided ways of predicting acceptable Stack Overflow answers by using various modeling techniques. However, less interest has been dedicated to examining the performance and quality of typically used modeling methods, especially in relation to models' and features' complexity. Such insights could be of practical significance to the many practitioners that use Stack Overflow. This study examines the performance and quality of various modeling methods that are used for predicting acceptable answers on Stack Overflow, drawn from 2014, 2015 and 2016. Our findings reveal significant differences in models' performance and quality given the type of features and the complexity of the models used. Researchers examining classifiers' performance and quality and features' complexity may leverage these findings in selecting suitable techniques when developing prediction models.
Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow
Procedia PDF Downloads 132
19801 Using Traffic Micro-Simulation to Assess the Benefits of Accelerated Pavement Construction for Reducing Traffic Emissions
Authors: Sudipta Ghorai, Ossama Salem
Abstract:
Pavement maintenance, repair, and rehabilitation (MRR) processes may have considerable environmental impacts due to the traffic disruptions associated with work zones. The simulation models in use to predict the emissions of work zones have mostly been static emission factor models (SEFD), which calculate emissions based on average operating conditions, e.g., average speed and type of vehicles. Although these models produce accurate results for large-scale planning studies, they are not suitable for analyzing driving conditions at the micro level, such as acceleration, deceleration, idling, cruising, and queuing in a work zone. The purpose of this study is to prepare a comprehensive work zone environmental assessment (WEA) framework to calculate the emissions caused by disrupted traffic by integrating traffic microsimulation tools with emission models. This will help highway officials assess the benefits of accelerated construction and opt for the most suitable TMP, not only economically but also from an environmental point of view.
Keywords: accelerated construction, pavement MRR, traffic microsimulation, congestion, emissions
Procedia PDF Downloads 449
19800 Aggregation Scheduling Algorithms in Wireless Sensor Networks
Authors: Min Kyung An
Abstract:
In wireless sensor networks, which consist of tiny wireless sensor nodes with limited battery power, one of the most fundamental applications is data aggregation, which collects nearby environmental conditions and aggregates the data to a designated destination, called a sink node. Important issues concerning data aggregation are time efficiency and energy consumption due to the limited energy, and therefore the related problem, named Minimum Latency Aggregation Scheduling (MLAS), has been the focus of many researchers. Its objective is to compute the minimum latency schedule, that is, a schedule with the minimum number of timeslots, such that the sink node can receive the aggregated data from all the other nodes without any collision or interference. For this problem, the two interference models, the graph model and the more realistic physical interference model known as Signal-to-Interference-Noise-Ratio (SINR), have been adopted with different power models, uniform power and non-uniform power (with or without power control), and different antenna models, the omni-directional and directional antenna models. In this survey article, as the problem has proven to be NP-hard, we present and compare several state-of-the-art approximation algorithms in various models on the basis of latency as the performance measure.
Keywords: data aggregation, convergecast, gathering, approximation, interference, omni-directional, directional
Procedia PDF Downloads 229
19799 Discrete Choice Modeling in Education: Evaluating Early Childhood Educators’ Practices
Authors: Michalis Linardakis, Vasilis Grammatikopoulos, Athanasios Gregoriadis, Kalliopi Trouli
Abstract:
Discrete choice models belong to the family of conjoint analysis methods that are applied to the preferences of respondents towards a set of scenarios that describe alternative choices. The scenarios are pre-designed to cover all the attributes of the alternatives that may affect the choices. In this study, we examine how preschool educators integrate physical activities into their everyday teaching practices through the use of discrete choice models. One of the advantages of discrete choice models compared to other, more traditional data collection methods (e.g., questionnaires and interviews that use ratings) is that the respondent is called to select among competitive and realistic alternatives, rather than objectively rate each attribute that the alternatives may have. We present the effort to construct and choose representative attributes that would cover all possible choices of the respondents, and the scenarios that have arisen. For the purposes of the study, we used a sample of 50 preschool educators in Greece that responded to 4 scenarios (out of the total of 16 scenarios produced by the orthogonal design), with each scenario offering three alternative teaching practices. Seven attributes of the alternatives were used in the scenarios. For the analysis of the data, we used a multinomial logit model with random effects, a multinomial probit model, and a generalized mixed logit model. The conclusions drawn from the estimated parameters of the models are discussed.
Keywords: conjoint analysis, discrete choice models, educational data, multivariate statistical analysis
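A worked illustration of the core discrete choice calculation: given assumed attribute coefficients, the (multinomial/conditional logit) choice probabilities of the three alternatives in a scenario follow from a softmax over their utilities; the numbers are invented, and the study's actual models additionally include random effects.

```python
import numpy as np

# Hypothetical part-worth coefficients for three attributes of a teaching-practice
# alternative (e.g. duration, indoor/outdoor setting, group size)
beta = np.array([0.8, -0.3, 0.5])

# Attribute levels of the three alternatives in one scenario (rows = alternatives)
X = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])

# Multinomial/conditional logit: P(i) = exp(V_i) / sum_j exp(V_j), with V = X @ beta
v = X @ beta
p = np.exp(v - v.max())          # subtract the max for numerical stability
p /= p.sum()
print("choice probabilities:", np.round(p, 3))
```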
Procedia PDF Downloads 465
19798 Forecasting Model for Rainfall in Thailand: Case Study Nakhon Ratchasima Province
Authors: N. Sopipan
Abstract:
In this paper, we study the rainfall time series of weather stations in Nakhon Ratchasima province in Thailand using various statistical methods, enabling us to analyse the behaviour of rainfall in the study areas. Time-series analysis is an important tool in modelling and forecasting rainfall. ARIMA and Holt-Winters models based on exponential smoothing were built. All the models proved to be adequate and could therefore provide information that can help decision makers establish strategies for the proper planning of agriculture, drainage systems and other water resource applications in Nakhon Ratchasima province. We found that the best-performing model for forecasting is ARIMA(1,0,1)(1,0,1)12.
Keywords: ARIMA models, exponential smoothing, Holt-Winters model
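A hedged sketch of fitting the reported ARIMA(1,0,1)(1,0,1)12 specification with statsmodels; the monthly rainfall series below is synthetic and merely stands in for the station data.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Illustrative monthly rainfall series (mm); the real study uses station data
# from Nakhon Ratchasima province
rain = pd.Series(
    [5, 20, 45, 80, 150, 110, 130, 160, 220, 180, 40, 8] * 5,
    index=pd.date_range("2008-01-01", periods=60, freq="MS"),
)

# ARIMA(1,0,1)(1,0,1)12 -- the specification reported as the best performer
model = SARIMAX(rain, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12))
fit = model.fit(disp=False)

forecast = fit.forecast(steps=12)   # 12-month-ahead rainfall forecast
print(forecast.round(1))
```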
Procedia PDF Downloads 300
19797 Concept Drifts Detection and Localisation in Process Mining
Authors: M. V. Manoj Kumar, Likewin Thomas, Annappa
Abstract:
Process mining provides methods and techniques for analyzing event logs recorded in modern information systems that support real-world operations. While analyzing an event log, the state-of-the-art techniques available in process mining treat the operational process as a static (stationary) entity. This is often not the case due to the possibility of a phenomenon called concept drift. During the period of execution, the process can experience concept drift and can evolve with respect to any of its associated perspectives, exhibiting various patterns of change at different paces. The work presented in this paper discusses the main aspects to consider while addressing the concept drift phenomenon and proposes a method for detecting and localizing sudden concept drifts in the control-flow perspective of the process by using features extracted by processing the traces in the process log. Our experimental results are promising in the direction of efficiently detecting and localizing concept drift in the context of the process mining research discipline.
Keywords: abrupt drift, concept drift, sudden drift, control-flow perspective, detection and localization, process mining
Procedia PDF Downloads 345
19796 Adsorption of Pb(II) with MOF [Co2(Btec)(Bipy)(DMF)2]N in Aqueous Solution
Authors: E. Gil, A. Zepeda, J. Rivera, C. Ben-Youssef, S. Rincón
Abstract:
Water pollution has become one of the most serious environmental problems. Multiple methods have been proposed for the removal of Pb(II) from contaminated water. Among these, adsorption processes have been shown to be more efficient, cheaper and easier to handle than other treatment methods. However, research on adsorbents with high adsorption capacities is still necessary. For this purpose, we propose in this work the study of the metal-organic framework [Co2(btec)(bipy)(DMF)2]n (MOF-Co) as an adsorbent material for Pb(II) in aqueous media. MOF-Co was synthesized by a simple method. First, 4,4'-dipyridyl, 1,2,4,5-benzenetetracarboxylic acid, and cobalt(II) nitrate hexahydrate were each dissolved in N,N-dimethylformamide (DMF) and then mixed together in a reactor. The obtained solution was heated at 363 K in a muffle furnace for 68 h to complete the synthesis. The product was washed and dried, obtaining MOF-Co. MOF-Co was characterized before and after the adsorption process by Fourier transform infrared spectroscopy (FTIR) and X-ray photoelectron spectroscopy (XPS). The Pb(II) in aqueous media was determined by atomic absorption spectroscopy (AA). In order to evaluate the adsorption process in the presence of Pb(II) in aqueous media, the experiments were carried out in flasks with a working volume of 100 mL at 200 rpm, with different MOF-Co quantities (0.0125 and 0.025 g), pH values (2-6), contact times (0.5-6 h) and temperatures (298, 308 and 318 K). The adsorption kinetics were represented by a pseudo-second-order model, which suggests that the adsorption took place through chemisorption, or chemical adsorption. The best adsorption results were obtained at pH 5. The Langmuir, Freundlich and BET equilibrium isotherm models were used to study the adsorption of Pb(II) with 0.0125 g of MOF-Co in the presence of different concentrations of Pb(II) (20-200 mg/L, 100 mL, pH 5) with 4 h of reaction. The correlation coefficients (R²) of the different models show that the Langmuir model fits better than the Freundlich and BET models, with R² = 0.97 and a maximum adsorption capacity of 833 mg/g. Therefore, the Langmuir model can best describe the Pb(II) adsorption as monolayer behavior on the MOF-Co. This value is the highest when compared to other materials such as the graphene/activated carbon composite (217 mg/g), biomass fly ashes (96.8 mg/g), PVA/PAA gel (194.99 mg/g) and a MOF with Ag12 nanoparticles (120 mg/g).
Keywords: adsorption, heavy metals, metal-organic frameworks, Pb(II)
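A brief sketch of fitting the Langmuir isotherm, qe = qmax·b·Ce/(1 + b·Ce), to equilibrium data, as done for the model comparison above; the Ce/qe values are illustrative, not the reported measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative equilibrium data: Ce (mg/L) and qe (mg/g); not the study's values
ce = np.array([5, 10, 25, 50, 100, 150, 200], dtype=float)
qe = np.array([150, 280, 480, 640, 750, 790, 810], dtype=float)

# Langmuir isotherm: qe = qmax * b * Ce / (1 + b * Ce)
def langmuir(c, qmax, b):
    return qmax * b * c / (1 + b * c)

(qmax, b), _ = curve_fit(langmuir, ce, qe, p0=(800, 0.05))
r2 = 1 - np.sum((qe - langmuir(ce, qmax, b))**2) / np.sum((qe - qe.mean())**2)
print(f"qmax = {qmax:.0f} mg/g, b = {b:.4f} L/mg, R2 = {r2:.3f}")
```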
Procedia PDF Downloads 214
19795 Interoperability Maturity Models for Consideration When Using School Management Systems in South Africa: A Scoping Review
Authors: Keneilwe Maremi, Marlien Herselman, Adele Botha
Abstract:
The main purpose and focus of this paper is to determine the Interoperability Maturity Models to consider when using School Management Systems (SMS). The importance of this is to inform and help schools know which Interoperability Maturity Model is best suited for their SMS. To address this purpose, the paper applies a scoping review to ensure that all aspects are covered. The scoping review includes papers written from 2012-2019, and a comparison of the different types of Interoperability Maturity Models is discussed in detail, including the background information, the levels of interoperability, and the areas for consideration in each Maturity Model. The literature was obtained from the following databases: IEEE Xplore and Scopus; the following search engines were used: Harzing's and Google Scholar. The topic of the paper was used as a search term for the literature, and the term 'Interoperability Maturity Models' was used as a keyword. The data were analyzed in terms of the definition of interoperability, Interoperability Maturity Models, and levels of interoperability. The results provide a table that shows the focus area of concern for each Maturity Model (based on the scoping review, in which only 24 papers out of the 740 publications initially identified in the field were found to be best suited for the paper). This resulted in the most discussed Interoperability Maturity Models for consideration: the Information Systems Interoperability Maturity Model (ISIMM) and the Organizational Interoperability Maturity Model for C2 (OIM).
Keywords: interoperability, interoperability maturity model, school management system, scoping review
Procedia PDF Downloads 209
19794 Development of a Miniature Laboratory Lactic Goat Cheese Model to Study the Expression of Spoilage by Pseudomonas Spp. In Cheeses
Authors: Abirami Baleswaran, Christel Couderc, Loubnah Belahcen, Jean Dayde, Hélène Tormo, Gwénaëlle Jard
Abstract:
Cheeses are often reported to be spoiled by Pseudomonas spp., which are responsible for defects in appearance, texture, taste, and smell, leading to cheeses not being marketed and even being destroyed. Despite preventive actions, problems linked to Pseudomonas spp. are difficult to control because of the lack of knowledge and control of these contaminants during cheese manufacturing. Lactic goat cheese producers are not spared by this problem and are looking for solutions to decrease the number of spoiled cheeses. To explore different hypotheses, experiments are needed. However, cheese-making experiments at the pilot scale are expensive and time consuming. Thus, there is a real need to develop a miniature cheese model system under controlled conditions. In a previous study, several miniature cheese models corresponding to different types of commercial cheeses were developed for different purposes. The models were, for example, used to study the influence of milk, starter cultures, pathogen-inhibiting additives, enzymatic reactions, microflora, and the freezing process on cheese. Nevertheless, no miniature model has been described for lactic goat cheese. The aim of this work was to develop a miniature cheese model system under controlled laboratory conditions which resembles commercial lactic goat cheese, in order to study Pseudomonas spp. spoilage during the manufacturing and ripening process. First, a protocol for the preparation of miniature cheeses (3.5 times smaller than a commercial one) was designed based on the cheese factory manufacturing process. The process was adapted from the "Rocamadour" technology and involves maturation of pasteurized milk, coagulation, removal of whey by centrifugation, moulding, and ripening in a small-scale cellar. Microbiological (total bacterial count, yeasts, moulds) and physicochemical (pH, salt in moisture, moisture in fat-free cheese) analyses were performed at four key stages of the process (before salting, after salting, on the first day of ripening, and at the end of ripening). The factory and miniature cheese volatilomes were also obtained after full-scan SIFT-MS cheese analysis. Then, Pseudomonas spp. strains isolated from contaminated cheeses were selected on the basis of their origin, their ability to produce pigments, and their enzymatic activities (proteolytic, lecithinase, and lipolytic). Factory and miniature curds were inoculated by spotting the selected strains on the cheese surface. The expression of cheese spoilage was evaluated by counting the level of Pseudomonas spp. during ripening and by visual observation and observation under a UV lamp. The physicochemical and microbiological compositions of the miniature cheeses made it possible to establish that the miniature process resembles the factory process. As expected, differences in volatilomes were observed, probably because the miniature cheeses are made using pasteurized milk to better control the microbiological conditions, and also because the small format of the cheese probably induces a difference during ripening, even though the humidity and temperature in the cellar were quite similar. The spoilage expression of Pseudomonas spp. was observed in both miniature and factory cheeses. This confirms that the proposed model is suitable for the preparation of miniature cheese specimens for the study of Pseudomonas spp. spoilage in lactic cheeses. This kind of model could be deployed for other applications and other types of cheese.
Keywords: cheese, miniature, model, pseudomonas spp, spoilage
Procedia PDF Downloads 133
19793 Models, Methods and Technologies for Protection of Critical Infrastructures from Cyber-Physical Threats
Authors: Ivan Župan
Abstract:
Critical infrastructure is essential for the functioning of a country and is designated for special protection by governments worldwide. Due to the increase in smart technology usage in every facet of industry, including critical infrastructure, exposure to malicious cyber-physical attacks has grown in the last few years. Proper security measures must be undertaken in order to defend against cyber-physical threats that can disrupt the normal functioning of critical infrastructure and, consequently, the functioning of the country. This paper provides a review of the scientific literature on models, methods and technologies used to protect industries from cyber-physical threats. The literature was observed from three aspects. The first aspect, resilience, concerns the robustness of the system's defense against threats, as well as preparation and education about potential future threats. The second aspect concerns security risk management for systems with cyber-physical aspects, and the third aspect investigates available testbed environments for testing developed models on scaled models of vulnerable infrastructure.
Keywords: critical infrastructure, cyber-physical security, smart industry, security methodology, security technology
Procedia PDF Downloads 75
19792 DUSP16 Inhibition Rescues Neurogenic and Cognitive Deficits in Alzheimer's Disease Mice Models
Authors: Huimin Zhao, Xiaoquan Liu, Haochen Liu
Abstract:
The major challenge facing Alzheimer's disease (AD) drug development is how to effectively improve cognitive function in clinical practice. Growing evidence indicates that stimulating hippocampal neurogenesis is a strategy for restoring cognition in animal models of AD. The mitogen-activated protein kinase (MAPK) pathway is a crucial factor in neurogenesis, and it is negatively regulated by dual-specificity phosphatase 16 (DUSP16). Transcriptome analysis of post-mortem brain tissue revealed up-regulation of DUSP16 expression in AD patients. Additionally, DUSP16 was involved in regulating the proliferation and neural differentiation of neural progenitor cells (NPCs). Nevertheless, whether DUSP16 ameliorates cognitive disorders by influencing NPC differentiation in AD mice remains unclear. Our study demonstrates an association between DUSP16 SNPs and clinical progression in individuals with mild cognitive impairment (MCI). Moreover, we found that increased DUSP16 expression in both the 3×Tg and SAMP8 models of AD led to impairments in NPC differentiation. By silencing DUSP16, cognitive benefits, the induction of adult hippocampal neurogenesis, and synaptic plasticity were observed in AD mice. Furthermore, we found that DUSP16 is involved in the process of NPC differentiation by regulating c-Jun N-terminal kinase (JNK) phosphorylation. In addition, the increased DUSP16 may be regulated by the ETS transcription factor ELK1, which binds to the promoter region of DUSP16; loss of ELK1 resulted in decreased DUSP16 mRNA and protein levels. Our data uncover a potential regulatory role for DUSP16 in adult hippocampal neurogenesis and provide a possible target for AD intervention.
Keywords: alzheimer's disease, cognitive function, DUSP16, hippocampal neurogenesis
Procedia PDF Downloads 72
19791 Recognition of Gene Names from Gene Pathway Figures Using Siamese Network
Authors: Muhammad Azam, Micheal Olaolu Arowolo, Fei He, Mihail Popescu, Dong Xu
Abstract:
The number of biological papers is growing quickly, which means that the number of biological pathway figures in those papers is also increasing quickly. Each pathway figure shows extensive biological information, such as the names of genes and how the genes are related. However, manually annotating pathway figures takes a lot of time and work. Even though using advanced image understanding models could speed up the process of curation, these models still need to be made more accurate. To improve gene name recognition from pathway figures, we applied a Siamese network to map image segments to a library of pictures containing known genes, in a similar way to person recognition from photos in many photo applications. We used a triplet loss function and a triplet spatial pyramid pooling network, combining a triplet convolutional neural network and spatial pyramid pooling (TSPP-Net). We compared VGG19 and VGG16 as the Siamese network backbone. VGG16 achieved better performance, with an accuracy of 93%, which is much higher than the OCR results.
Keywords: biological pathway, image understanding, gene name recognition, object detection, Siamese network, VGG
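A minimal sketch of the triplet loss that drives such a Siamese embedding: the anchor segment is pulled towards an image of the same gene and pushed away from a different gene by a margin; the embeddings below are toy values, not network outputs.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Hinge-style triplet loss on embedding vectors.

    Pulls the anchor towards the positive (same gene name) and pushes it away
    from the negative (different gene name) by at least `margin`.
    """
    d_pos = np.sum((anchor - positive) ** 2)
    d_neg = np.sum((anchor - negative) ** 2)
    return max(d_pos - d_neg + margin, 0.0)

# Toy 4-d embeddings standing in for a Siamese/TSPP-style backbone output
a = np.array([0.9, 0.1, 0.0, 0.2])   # image segment of one gene
p = np.array([0.8, 0.2, 0.1, 0.1])   # library picture of the same gene
n = np.array([0.1, 0.9, 0.7, 0.0])   # library picture of a different gene

print(triplet_loss(a, p, n))          # small value -> well-separated triplet
```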
Procedia PDF Downloads 291
19790 Comparative Analysis of Effecting Factors on Fertility by Birth Order: A Hierarchical Approach
Authors: Ali Hesari, Arezoo Esmaeeli
Abstract:
Given the dramatic changes in fertility and higher-order births during recent decades in Iran, access to knowledge about the factors affecting different birth orders has crucial importance. In this study, in line with the hierarchical structure of much social science data and the effect of variables at different levels of social phenomena, the determinants of different birth orders in the 365 days preceding the 1390 census have been explored using a multilevel approach. In this paper, 2% individual row data from the 1390 census are analyzed with the HLM software. Three different hierarchical linear regression models are estimated for the data analysis of the first and second, third, and fourth and higher birth orders. The research results display different outcomes for the three models. The individual-level variables entered in the equation are region of residence (rural/urban), age, educational level and labor participation status, and the province-level variable is GDP per capita. The results show that the individual-level variables have different effects in these three models, and at the second level we have different random and fixed effects in these models.
Keywords: fertility, birth order, hierarchical approach, fixed effects, random effects
Procedia PDF Downloads 339
19789 A Machine Learning Approach for Performance Prediction Based on User Behavioral Factors in E-Learning Environments
Authors: Naduni Ranasinghe
Abstract:
E-learning environments have become more popular than ever due to the impact of COVID-19. Even though e-learning is one of the best solutions for the teaching-learning process, it is not without major challenges. Nowadays, machine learning approaches are utilized to analyze how behavioral factors lead to better adoption and how they relate to better performance of students in e-learning environments. During the pandemic, we realized that the academic process in the e-learning approach had a major issue, especially regarding the performance of students. Therefore, an approach that investigates student behaviors in e-learning environments using a data-intensive machine learning approach is appreciated. A hybrid approach was used to understand how each of the previously mentioned variables relates to the others. A more quantitative approach, drawing on the literature, was used to understand the weights of each factor for adoption and performance. The data set was collected from previous research to support the training and testing process in ML. Special attention was paid to incorporating different dimensionalities of the data to understand the dependency levels of each. Five independent variables out of twelve were chosen based on their impact on the dependent variable and by considering the descriptive statistics; out of the three models developed (random forest classifier, SVM, and decision tree classifier), the random forest classifier gave the highest accuracy (0.8542). Overall, this work met its goals of improving student performance by identifying students who are at risk of failure and dropout, emphasizing the necessity of using both static and dynamic data.
Keywords: academic performance prediction, e learning, learning analytics, machine learning, predictive model
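An illustrative sketch of the winning model type, a random forest classifier trained on behavioral features; the synthetic data below merely stands in for the study's data set and will not reproduce the reported 0.8542 accuracy.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)

# Synthetic stand-in for five behavioural features (e.g. logins, forum posts,
# video time, quiz attempts, assignment submissions) and an at-risk label
X = rng.normal(size=(500, 5))
y = (X[:, 0] + 0.5 * X[:, 2] - 0.8 * X[:, 4] + rng.normal(0, 0.5, 500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print("accuracy:", round(accuracy_score(y_test, clf.predict(X_test)), 4))
print("feature importances:", np.round(clf.feature_importances_, 3))
```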
Procedia PDF Downloads 157
19788 Ground State Phases in Two-Mode Quantum Rabi Models
Authors: Suren Chilingaryan
Abstract:
We study two models describing a single two-level system coupled to two boson field modes in either a parallel or orthogonal setup. Both models may be feasible for experimental realization through Raman adiabatic driving in cavity QED. We study their ground state configurations; that is, we find the quantum precursors of the corresponding semi-classical phase transitions. We found that the ground state configurations of both models present the same critical coupling as the quantum Rabi model. Around this critical coupling, the ground state goes from the so-called normal configuration with no excitation, the qubit in the ground state and the fields in the quantum vacuum state, to a ground state with excitations, the qubit in a superposition of ground and excited state, while the fields are not in the vacuum anymore, for the first model. The second model shows a more complex ground state configuration landscape where we find the normal configuration mentioned above, two single-mode configurations, where just one of the fields and the qubit are excited, and a dual-mode configuration, where both fields and the qubit are excited.
Keywords: quantum optics, quantum phase transition, cavity QED, circuit QED
Procedia PDF Downloads 367
19787 Logical-Probabilistic Modeling of the Reliability of Complex Systems
Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia
Abstract:
The paper presents logical-probabilistic methods, models, and algorithms for the reliability assessment of complex systems, on the basis of which a web application for the structural analysis and reliability assessment of systems was created. It is important to design systems based on structural analysis, research, and the evaluation of efficiency indicators. One of the important efficiency criteria is the reliability of the system, which depends on the components of the structure. Quantifying the reliability of large-scale systems is a computationally complex process, and it is advisable to perform it with the help of a computer. Logical-probabilistic modeling is one of the effective means of describing the structure of a complex system and quantitatively evaluating its reliability, and it formed the basis of our application. The reliability assessment process included the following stages, which were reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition with a logical function in the form of a disjunctive normal form (DNF); 4) transformation of the DNF into an orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of the logical elements with probabilistic elements in the ODNF, obtaining a reliability estimation polynomial and quantifying the reliability; 6) calculation of the "weights" of the elements of the system. Using the logical-probabilistic methods, models and algorithms discussed in the paper, special software was created by means of which a quantitative assessment of the reliability of systems with a complex structure is performed. As a result, structural analysis of systems, research, and the design of systems with optimal structure are carried out.
Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability of systems, “weights” of elements
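For a small system, the reliability implied by the shortest paths of successful functioning (stages 2-5 above) can be computed exactly by enumerating component states, which is equivalent to evaluating the orthogonalized DNF with probabilities; the path sets and component reliabilities below are hypothetical.

```python
from itertools import product

# Minimal path sets of a small bridge-type system over components 1..5
# (the system works if every component in at least one path set works)
path_sets = [{1, 2}, {4, 5}, {1, 3, 5}, {2, 3, 4}]
p = {1: 0.95, 2: 0.90, 3: 0.85, 4: 0.90, 5: 0.95}   # component reliabilities

def system_reliability(path_sets, p):
    comps = sorted(p)
    r = 0.0
    # Exact computation by summing the probabilities of all working states
    # (for a small system this matches evaluating the orthogonalised DNF)
    for states in product([0, 1], repeat=len(comps)):
        up = {c for c, s in zip(comps, states) if s}
        if any(ps <= up for ps in path_sets):
            prob = 1.0
            for c, s in zip(comps, states):
                prob *= p[c] if s else 1 - p[c]
            r += prob
    return r

print(round(system_reliability(path_sets, p), 5))
```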
Procedia PDF Downloads 66
19786 Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models
Authors: Salah Alrabeei, Mohammad Yousuf
Abstract:
The goal of option pricing theory is to help investors manage their money, enhance returns and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution. Furthermore, not all numerical methods are efficient for solving these models, because the payoffs are non-smooth or have discontinuous derivatives at the exercise price. In this paper, we solve the American option under jump diffusion models by using efficient time-dependent numerical methods. Several techniques are integrated to reduce the computational complexity. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction decomposition technique is applied to the rational approximation schemes to overcome the complexity of inverting polynomials of matrices. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to prove the accuracy and efficiency of the proposed method.
Keywords: integral differential equations, jump-diffusion model, American options, rational approximation
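A sketch of the FFT-based matrix-vector multiplication mentioned above: a Toeplitz matrix (such as a discretized jump kernel) is embedded in a circulant matrix so that the product costs O(M log M) instead of O(M²); the matrix entries below are illustrative only.

```python
import numpy as np

def toeplitz_matvec(first_col, first_row, x):
    """Multiply a Toeplitz matrix (given by its first column/row) by x in O(M log M)
    by embedding it into a circulant matrix and using the FFT."""
    m = len(x)
    # Circulant embedding of size 2m: [c0, c1, ..., c_{m-1}, 0, r_{m-1}, ..., r1]
    c = np.concatenate([first_col, [0.0], first_row[1:][::-1]])
    x_pad = np.concatenate([x, np.zeros(m)])
    y = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x_pad)).real
    return y[:m]

# Check against a dense matrix built explicitly (illustrative kernel-like decay)
m = 8
col = np.exp(-0.5 * np.arange(m))
row = np.exp(-1.0 * np.arange(m))
row[0] = col[0]
dense = np.array([[col[i - j] if i >= j else row[j - i] for j in range(m)] for i in range(m)])
x = np.random.rand(m)
print(np.allclose(dense @ x, toeplitz_matvec(col, row, x)))   # True
```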
Procedia PDF Downloads 119
19785 Improving the Management Systems of the Ownership Risks in Conditions of Transformation of the Russian Economy
Authors: Mikhail V. Khachaturyan
Abstract:
The article analyzes the problems of improving the systems for managing ownership risks in the conditions of the transformation of the Russian economy. Among the main sources of threats, business owners should highlight the inefficiency of the implementation of business models and of interaction with hired managers. In this context, it is particularly important to analyze the relationship between business models and ownership risks. The analysis of this problem appears to be relevant for a number of reasons: firstly, the increased risk appetite of the owner directly affects the business model and the composition of his holdings; secondly, owners with significant stakes in the company are factors in the formation of particular types of risks for owners, whose relations have a significant influence on a firm's competitiveness and ultimately determine its survival; and thirdly, an inefficient system of ownership risk management is one of the main causes of mass bankruptcies, which significantly affects the stable operation of the economy as a whole. The separation of the processes of possession, disposal and use in modern organizations is the cause not only of problems in the interaction between the owner and managers in managing the organization as a whole, but also of asymmetric information about the kinds and forms of the main risks. Managers tend to avoid risky projects and inhibit the diversification of the organization's assets, while owners can insist on the development of such projects, with the aim not only of creating new value for themselves and consumers, but also of increasing the value of the company as a result of increasing capital. In terms of separating ownership and management, the evaluation of projects by the risk-yield ratio requires preserving the influence of the owner on the process of developing and making management decisions. It is obvious that without a clearly structured system of participation of the owner in managing the risks of their business, further development is hopeless. In modern conditions of forming a risk management system, owners are compelled to compromise between the desire to increase the organization's ability to produce new value, and consequently increase its worth through the implementation of risky projects, and the need to tolerate the cost of lost opportunities for risk diversification. Improving the effectiveness of the management of ownership risks may also contribute to the revitalization of creditors in pursuing claims against inefficient owners, which ultimately will contribute to the efficiency of models of ownership control and exclude variants of insolvency. It is obvious that in modern conditions, the success of a model of ownership risk management and audit is largely determined by the ability and willingness of the owner to find a compromise between the potential opportunities for expanding the firm's ability to create new value through risk, and maintaining the current level of new value creation and an acceptable level of risk through the use of models of diversification.
Keywords: improving, ownership risks, problem, Russia
Procedia PDF Downloads 349