Search results for: Gaussian mixture models
7225 Prediction of Permeability of Frozen Unsaturated Soil Using Van Genuchten Model and Fredlund-Xing Model in Soil Vision
Authors: Bhavita S. Dave, Jaimin Vaidya, Chandresh H. Solanki, Atul K.
Abstract:
To measure the permeability of a soil specimen, one of the basic assumptions of Darcy's law is that the soil sample should be saturated. Unlike that of saturated soils, the permeability of unsaturated soils cannot be found using conventional methods, as it does not follow Darcy's law. Many empirical models, such as the Van Genuchten model and the Fredlund-Xing model, have been suggested to predict permeability values for unsaturated soil. Such models use data from the soil-freezing characteristic curve to find fitting parameters for frozen unsaturated soils. In this study, soil specimens were subjected to 0, 1, 3, and 5 freezing-thawing (F-T) cycles at different degrees of saturation to obtain a wide range of suction, and soil-freezing characteristic curves were formulated for all F-T cycles. Changes in fitting parameters and relative permeability with subsequent F-T cycles are presented in this paper for both models.
Keywords: frozen unsaturated soil, Fredlund-Xing model, soil-freezing characteristic curve, Van Genuchten model
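As background to the models named above, the Van Genuchten-Mualem closed form relating relative permeability to suction can be sketched in a few lines; the fitting parameters alpha and n below are illustrative placeholders, not values fitted in this study.

```python
def effective_saturation(suction, alpha, n):
    """Van Genuchten effective saturation Se as a function of suction head.
    alpha and n are the curve-fitting parameters; m = 1 - 1/n."""
    m = 1.0 - 1.0 / n
    return (1.0 + (alpha * suction) ** n) ** (-m)

def relative_permeability(suction, alpha, n):
    """Van Genuchten-Mualem relative permeability k_r derived from Se."""
    m = 1.0 - 1.0 / n
    se = effective_saturation(suction, alpha, n)
    return se ** 0.5 * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2
```

At zero suction the soil is saturated and k_r = 1; k_r falls off steeply as suction grows, which is why the fitting parameters extracted from the soil-freezing characteristic curve dominate the predicted permeability.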
Procedia PDF Downloads 187
7224 Comparison of Solar Radiation Models
Authors: O. Behar, A. Khellaf, K. Mohammedi, S. Ait Kaci
Abstract:
Up to now, most validation studies have been based on the MBE and RMSE and have therefore focused only on long- and short-term performance to test and classify solar radiation models. This traditional analysis does not take into account the quality of modeling and linearity. In our analysis, we have tested 22 solar radiation models that are capable of providing instantaneous direct and global radiation at any given location worldwide. We introduce a new indicator, which we have named the Global Accuracy Indicator (GAI), to examine the linear relationship between the measured and predicted values and the quality of modeling, in addition to long- and short-term performance. Note that the quality of the model is represented by the t-statistic test, model linearity by the correlation coefficient, and long- and short-term performance by the MBE and RMSE, respectively. An important finding of this research is that using the GAI avoids the shortcomings of validation with the traditional methodology, which might result in erroneous prediction of the performance of solar power conversion systems.
Keywords: solar radiation model, parametric model, performance analysis, Global Accuracy Indicator (GAI)
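The four ingredients the GAI draws on can be computed as follows; the exact formula combining them into the GAI is this paper's contribution and is not reproduced in the abstract, and the t-statistic shown is the common Stone form, assumed here.

```python
import math

def validation_metrics(measured, predicted):
    """MBE and RMSE (long- and short-term performance), Pearson r
    (linearity), and the t-statistic (quality of modeling), with
    t = sqrt((n-1) * MBE^2 / (RMSE^2 - MBE^2))."""
    n = len(measured)
    diffs = [p - m for p, m in zip(predicted, measured)]
    mbe = sum(diffs) / n
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    mean_m = sum(measured) / n
    mean_p = sum(predicted) / n
    cov = sum((m - mean_m) * (p - mean_p) for m, p in zip(measured, predicted))
    var_m = sum((m - mean_m) ** 2 for m in measured)
    var_p = sum((p - mean_p) ** 2 for p in predicted)
    r = cov / math.sqrt(var_m * var_p)
    t = (math.sqrt((n - 1) * mbe ** 2 / (rmse ** 2 - mbe ** 2))
         if rmse > abs(mbe) else float("inf"))
    return mbe, rmse, r, t
```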
Procedia PDF Downloads 348
7223 Estimating 3D-Position of a Stationary Random Acoustic Source Using Bispectral Analysis of 4-Point Detected Signals
Authors: Katsumi Hirata
Abstract:
To develop a useful acoustic environmental recognition system, a method for estimating the 3D position of a stationary random acoustic source using bispectral analysis of 4-point detected signals is proposed. The method uses information about amplitude attenuation and propagation delay extracted from the amplitude ratios and angles of the auto- and cross-bispectra of the detected signals. Bispectral analysis is expected to be less influenced by Gaussian noise than conventional power spectral analysis. In this paper, the basic principle of the method is described first, and its validity and features are assessed from the results of fundamental experiments under assumed ideal circumstances.
Keywords: 4-point detection, a stationary random acoustic source, auto- and cross-bispectra, estimation of 3D-position
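A minimal segment-averaged bispectrum estimator, for orientation only (the segment length and averaging scheme are assumptions, not the paper's exact estimator):

```python
import numpy as np

def cross_bispectrum(x, y, z, k1, k2, nfft=64):
    """Estimate the cross-bispectrum B(k1, k2) = E[X(k1) Y(k2) Z*(k1+k2)]
    by averaging the triple spectral product over non-overlapping segments.
    For a zero-mean Gaussian process the bispectrum vanishes, which is the
    noise-suppression property the abstract appeals to."""
    nseg = len(x) // nfft
    acc = 0.0 + 0.0j
    for s in range(nseg):
        sl = slice(s * nfft, (s + 1) * nfft)
        X, Y, Z = np.fft.fft(x[sl]), np.fft.fft(y[sl]), np.fft.fft(z[sl])
        acc += X[k1] * Y[k2] * np.conj(Z[(k1 + k2) % nfft])
    return acc / nseg
```

Amplitude ratios between auto- and cross-bispectra of the four detection points carry the attenuation information, while the bispectral phase angles carry the propagation delays; per the abstract, the 3D position is then estimated from both.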
Procedia PDF Downloads 357
7222 Botulism Clinical Experience and Update
Authors: Kevin Yeo, Christine Hall, Babinchak Tim
Abstract:
BAT® [Botulism Antitoxin Heptavalent (A,B,C,D,E,F,G)-(Equine)] antitoxin is a mixture of equine immune globulin fragments indicated for the treatment of symptomatic botulism in adult and pediatric patients. The effectiveness of BAT antitoxin is based on efficacy studies conducted in animal models. A general explanation of the pivotal animal studies, post-market surveillance, and the outcomes of an observational patient registry for patients treated with BAT product distributed in the USA is briefly discussed. Overall, it took 20 animal studies to arrive at two well-designed and appropriately powered pivotal efficacy studies: one in which the effectiveness of BAT was assessed against all 7 serotypes in the guinea pig, and another in which efficacy was confirmed in the rhesus macaque using serotype A. Clinical experience with BAT to date involves approximately 600 adult and pediatric patients with suspected botulism. In pre-licensure, patient data were recorded under the US CDC expanded access program (259 adult and pediatric patients between 10 days and 88 years of age). Post-licensure, more than 350 patients to date have received BAT and been followed up under an enhanced expanded access program. The analysis of the post-market surveillance data provided a unique opportunity to demonstrate clinical benefit in the field study required by the animal rule. While the animal rule is applied because human efficacy studies are not ethical or feasible, a post-marketing requirement is to conduct a study to evaluate safety and clinical benefit when circumstances arise and to demonstrate the favourable benefit-risk profile that supported licensure.
Keywords: botulism, threat, clinical benefit, observational patient registry
Procedia PDF Downloads 178
7221 Analysis of the Interference from Risk-Determining Factors of Cooperative and Conventional Construction Contracts
Authors: E. Harrer, M. Mauerhofer, T. Werginz
Abstract:
As a result of intensive competition, the building sector is suffering from a high degree of rivalry. Furthermore, an unbalanced distribution of project risks can be observed: clients aim to shift their own risks into the sphere of the constructors or planners. The consequence is that the number of conflicts between the involved parties is inordinately high and even increasing. An alternative approach to counter these developments is cooperative project forms in the construction sector. This research compares conventional contract models with models based on partnering agreements to examine the influence of an early integration of the involved parties on project risks. The goal is to show deviations in different project stages, from the design phase to the project transfer phase. These deviations are evaluated by a survey of experts from three spheres: clients, contractors, and planners. By rating the influence of the participants on specific risk factors, it is possible to identify factors which are relevant for a smooth project execution.
Keywords: building projects, contract models, partnering, project risks
Procedia PDF Downloads 270
7220 Characteristics of Business Models of Industrial-Internet-of-Things Platforms
Authors: Peter Kress, Alexander Pflaum, Ulrich Loewen
Abstract:
The number of Internet-of-Things (IoT) platforms is steadily increasing across various industries, especially for smart factories, smart homes, and smart mobility. In the manufacturing industry as well, the number of Industrial-IoT platforms is growing. IT players, start-ups, and increasingly also established industry players and small and medium-sized enterprises introduce offerings for the connection of industrial equipment on platforms, enabled by advanced information and communication technology. Besides the offered functionalities, the established ecosystem of partners around a platform is one of the key differentiators for generating a competitive advantage. The key question is how platform operators design the business model around their platform to attract a high number of customers and partners to co-create value for the entire ecosystem. The present research tries to answer this question by determining the key characteristics of the business models of successful platforms in the manufacturing industry. To achieve that, the authors selected an explorative qualitative research approach and created an inductive comparative case study. The authors generated valuable descriptive insights into the business model elements (e.g., value proposition, pricing model, or partnering model) of various established platforms. Furthermore, patterns across the various cases were identified to derive propositions for the successful design of business models of platforms in the manufacturing industry.
Keywords: industrial-internet-of-things, business models, platforms, ecosystems, case study
Procedia PDF Downloads 242
7219 Modelling Social Influence and Cultural Variation in Global Low-Carbon Vehicle Transitions
Authors: Hazel Pettifor, Charlie Wilson, David Mccollum, Oreane Edelenbosch
Abstract:
Vehicle purchase is a technology adoption decision that will strongly influence future energy and emission outcomes. Global integrated assessment models (IAMs) provide valuable insights into the medium- and long-term effects of socio-economic development, technological change, and climate policy. In this paper, we present a unique and transparent approach for improving the behavioural representation of these models by incorporating social influence effects to more accurately represent consumer choice. This work draws together strong conceptual thinking and robust empirical evidence to introduce heterogeneous and interconnected consumers who vary in their aversion to new technologies. Focussing on vehicle choice, we conduct novel empirical research to parameterise consumer risk aversion and how it is shaped by social and cultural influences. We find robust evidence for social influence effects, and variation between countries as a function of cultural differences. We then formulate an approach to modelling social influence which is implementable in both simulation- and optimisation-type models. We use two global integrated assessment models (IMAGE and MESSAGE) to analyse four scenarios that introduce social influence and cultural differences between regions. These scenarios allow us to explore the interactions between consumer preferences and social influence. We find that incorporating social influence effects into global models accelerates the early deployment of electric vehicles and stimulates more widespread deployment across adopter groups. Incorporating cultural variation leads to significant differences in deployment between culturally divergent regions such as the USA and China. Our analysis significantly extends the ability of global integrated assessment models to provide policy-relevant analysis grounded in real-world processes.
Keywords: behavioural realism, electric vehicles, social influence, vehicle choice
Procedia PDF Downloads 186
7218 Bayesian Estimation of Hierarchical Models for Genotypic Differentiation of Arabidopsis thaliana
Authors: Gautier Viaud, Paul-Henry Cournède
Abstract:
Plant growth models have been used extensively for the prediction of the phenotypic performance of plants. However, they most often remain calibrated for a given genotype and therefore do not take into account genotype-by-environment interactions. One way of achieving such an objective is to consider Bayesian hierarchical models. Three levels can be identified in such models: the first level describes how a given growth model describes the phenotype of the plant as a function of individual parameters; the second level describes how these individual parameters are distributed within a plant population; the third level corresponds to the attribution of priors on population parameters. Thanks to the Bayesian framework, choosing appropriate priors for the population parameters makes it possible to derive analytical expressions for the full conditional distributions of these population parameters. As plant growth models are of a nonlinear nature, individual parameters cannot be sampled explicitly, and a Metropolis step must be performed. This allows for the use of a hybrid Gibbs-Metropolis sampler. A generic approach was devised for the implementation of both general state-space models and estimation algorithms within a programming platform. It was designed using the Julia language, which combines an elegant syntax with metaprogramming capabilities and high efficiency. Results were obtained for Arabidopsis thaliana on both simulated and real data. An organ-scale GreenLab model for the latter is thus presented, in which the surface area of each individual leaf can be simulated. It is assumed that the error made in the measurement of leaf areas is proportional to the leaf area itself; multiplicative normal noises for the observations are therefore used.
Real data were obtained via image analysis of zenithal images of Arabidopsis thaliana over a period of 21 days using a two-step segmentation and tracking algorithm which notably takes advantage of the Arabidopsis thaliana phyllotaxy. Since the model formulation is rather flexible, there is no need for the data for a single individual to be available at all times, nor for the times at which data are available to be the same for all the different individuals. This makes it possible to discard data from image analysis when they are not considered reliable enough, thereby providing low-biased data in large quantity for leaf areas. The proposed model precisely reproduces the dynamics of Arabidopsis thaliana's growth while accounting for the variability between genotypes. In addition to the estimation of the population parameters, the level of variability is an interesting indicator of the genotypic stability of model parameters. A promising perspective is to test whether some of the latter should be considered as fixed effects.
Keywords: Bayesian, genotypic differentiation, hierarchical models, plant growth models
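The hybrid Gibbs-Metropolis scheme can be illustrated on a deliberately simplified two-level hierarchy; the model, priors, and proposal scale below are toy assumptions, not the paper's GreenLab setup.

```python
import math
import random

def hybrid_gibbs_metropolis(data, n_iter=2000, seed=1):
    """Toy hybrid Gibbs-Metropolis sampler for a two-level hierarchy:
    y_ij ~ N(theta_i, 1), theta_i ~ N(mu, 1), flat prior on mu.
    mu has a closed-form full conditional (Gibbs step); each individual
    parameter theta_i is updated by random-walk Metropolis, standing in
    for the nonlinear growth-model case where no closed form exists."""
    random.seed(seed)
    theta = [sum(y) / len(y) for y in data]  # initialise at individual means
    mu_samples = []
    for _ in range(n_iter):
        # Gibbs step: mu | theta ~ N(mean(theta), 1/len(theta))
        mu = random.gauss(sum(theta) / len(theta), 1.0 / math.sqrt(len(theta)))
        # Metropolis step for each individual parameter
        for i, y in enumerate(data):
            prop = theta[i] + random.gauss(0.0, 0.5)
            def log_post(t):
                return -0.5 * (t - mu) ** 2 - 0.5 * sum((obs - t) ** 2 for obs in y)
            if math.log(random.random() + 1e-300) < log_post(prop) - log_post(theta[i]):
                theta[i] = prop
        mu_samples.append(mu)
    return mu_samples
```

In the paper's setting the Metropolis target is the likelihood of the nonlinear organ-scale growth model with multiplicative observation noise; the structure of the sweep (one conjugate Gibbs update for the population level, one Metropolis update per individual) is the same.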
Procedia PDF Downloads 302
7217 Hand in Hand with Indigenous People Worldwide through the Discovery of Indigenous Entrepreneurial Models: A Systematic Literature Review of International Indigenous Entrepreneurship
Authors: Francesca Croce
Abstract:
Governmental development strategies target entrepreneurship as a major resource for economic development and poverty reduction among indigenous people. As initiatives and programs are locally based, there is a need to better understand the contextual factors of indigenous entrepreneurial models. The purpose of this paper is, therefore, to analyze and integrate the indigenous entrepreneurship literature in order to identify the main models of indigenous entrepreneurship. To answer this need, a systematic literature review was conducted. Relevant articles were identified in selected electronic databases (ABI/Inform Global, Business Source Premier, Web of Science, International Bibliography of the Social Sciences, Academic Search, Sociological Abstracts, Entrepreneurial Studies Source, and Bibliography of Native North America) and in selected electronic reviews. Starting from 1st January 1995 (the first International Day of the World's Indigenous People), 59 academic articles were selected from 1411. Through systematic analysis of the cultural, social, and organizational variables, the paper highlights that a typology of indigenous entrepreneurial models is possible through the concept of the entrepreneurial ecosystem, which includes the geographical position and the environment of the indigenous communities. The results show three models of indigenous entrepreneurship: urban indigenous entrepreneurship, semi-urban indigenous entrepreneurship, and rural indigenous entrepreneurship. After the introduction, the paper is organized as follows. In the first part, the theoretical and practical needs for a systematic literature review on indigenous entrepreneurship are presented. In the second part, the methodology and the selection and evaluation process for the articles are explained. In the third part, the findings are presented and the characteristics of each indigenous entrepreneurial model are discussed.
The results of this study offer a new theorization of indigenous entrepreneurship and may be useful for scientists in the field seeking to overcome the cognitive boundaries of indigenous business models, which are still too little known. The study is also addressed to policy makers in charge of indigenous entrepreneurial development strategies that are more focused on contextual factors.
Keywords: community development, entrepreneurial ecosystem, indigenous entrepreneurship model, indigenous people, systematic literature review
Procedia PDF Downloads 277
7216 Comprehensive Analysis of Power Allocation Algorithms for OFDM Based Communication Systems
Authors: Rakesh Dubey, Vaishali Bahl, Dalveer Kaur
Abstract:
The spiralling demand for high-rate data transmission over wireless media requires intelligent use of electromagnetic resources, considering restrictions such as power consumption, spectral efficiency, robustness against multipath propagation, and implementation complexity. Orthogonal frequency division multiplexing (OFDM) is a promising technique for next-generation wireless communication systems. Such high-rate data transfer requires the proper allocation of resources like power and capacity among the subchannels. This paper illustrates various available methods of allocating power, and the capacity achievable, under the constraint of the Shannon limit.
Keywords: Additive White Gaussian Noise, Multi-Carrier Modulation, Orthogonal Frequency Division Multiplexing (OFDM), Signal to Noise Ratio (SNR), Water Filling
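The classic water-filling allocation, one of the standard methods such a comparison covers, can be sketched directly from the Shannon capacity expression; the noise levels and power budget below are illustrative.

```python
import math

def water_filling(noise_levels, total_power):
    """Water-filling power allocation across OFDM subchannels.
    noise_levels holds the effective noise-to-gain ratio n_k of each
    subchannel; power p_k = max(0, mu - n_k), with the water level mu
    chosen so that sum(p_k) = total_power. Per-subchannel capacity then
    follows Shannon: c_k = log2(1 + p_k / n_k)."""
    levels = sorted(noise_levels)
    active = len(levels)
    mu = 0.0
    while active > 0:
        mu = (total_power + sum(levels[:active])) / active
        if mu > levels[active - 1]:  # all 'active' channels get positive power
            break
        active -= 1  # deepest-faded channel is switched off
    powers = [max(0.0, mu - n) for n in noise_levels]
    capacity = sum(math.log2(1.0 + p / n) for p, n in zip(powers, noise_levels))
    return powers, capacity
```

Note how a sufficiently poor subchannel receives zero power: the water level never rises above its noise floor, which is the behaviour that distinguishes water-filling from uniform allocation.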
Procedia PDF Downloads 552
7215 Quantitative Structure-Property Relationship Study of Base Dissociation Constants of Some Benzimidazoles
Authors: Sanja O. Podunavac-Kuzmanović, Lidija R. Jevrić, Strahinja Z. Kovačević
Abstract:
Benzimidazoles are a group of compounds with significant antibacterial, antifungal, and anticancer activity. The studied compounds consist of the main benzimidazole structure with different combinations of substituents. This study is based on two-dimensional and three-dimensional molecular modeling and the calculation of molecular descriptors (physicochemical and lipophilicity descriptors) of structurally diverse benzimidazoles. Molecular modeling was carried out using ChemBio3D Ultra version 14.0 software. The obtained 3D models were subjected to energy minimization using the molecular mechanics force field method (MM2). The cutoff for structure optimization was set at a gradient of 0.1 kcal/(Å·mol). The obtained set of molecular descriptors was used in a principal component analysis (PCA) of possible similarities and dissimilarities among the studied derivatives. After the molecular modeling, quantitative structure-property relationship (QSPR) analysis was applied in order to obtain mathematical models which can be used to predict pKb values of structurally similar benzimidazoles. The obtained models are based on statistically valid multiple linear regression (MLR) equations. The calculated cross-validation parameters indicate the high prediction ability of the established QSPR models. This study is financially supported by COST action CM1306 and by project No. 114-451-347/2015-02, financially supported by the Provincial Secretariat for Science and Technological Development of Vojvodina.
Keywords: benzimidazoles, chemometrics, molecular modeling, molecular descriptors, QSPR
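The MLR-with-cross-validation workflow behind such a QSPR model can be sketched as follows; the descriptor matrix here is a hypothetical placeholder, not the study's descriptors, and Q² is the usual leave-one-out internal-validation statistic.

```python
import numpy as np

def mlr_fit(X, y):
    """Ordinary least squares with intercept: returns b with
    y ≈ b[0] + X @ b[1:], the MLR step of a QSPR model."""
    X = np.asarray(X, dtype=float)
    A = np.column_stack([np.ones(len(X)), X])
    b, *_ = np.linalg.lstsq(A, np.asarray(y, dtype=float), rcond=None)
    return b

def q2_loo(X, y):
    """Leave-one-out cross-validated Q^2 = 1 - PRESS / SS_tot,
    a standard internal validation parameter for QSPR models."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    press = 0.0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        b = mlr_fit(X[mask], y[mask])   # refit without compound i
        pred = b[0] + X[i] @ b[1:]      # predict the held-out compound
        press += (y[i] - pred) ** 2
    return 1.0 - press / ((y - y.mean()) ** 2).sum()
```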
Procedia PDF Downloads 285
7214 Analytical Tools for Multi-Residue Analysis of Some Oxygenated Metabolites of PAHs (Hydroxylated, Quinones) in Sediments
Authors: I. Berger, N. Machour, F. Portet-Koltalo
Abstract:
Polycyclic aromatic hydrocarbons (PAHs) are toxic and carcinogenic pollutants produced mainly by incomplete combustion processes in industrialized and urbanized areas. After being emitted into the atmosphere, these persistent contaminants are deposited in soils or sediments. Even if persistent, some can be partially degraded (photodegradation, biodegradation, chemical oxidation), leading to oxygenated metabolites (oxy-PAHs) which can be more toxic than their parent PAHs. Oxy-PAHs are measured less often than PAHs in sediments, and this study aims to compare different analytical tools for extracting and quantifying a mixture of four hydroxylated PAHs (OH-PAHs) and four carbonyl PAHs (quinones) in sediments. Methodologies: Two analytical systems, HPLC with on-line UV and fluorescence detectors (HPLC-UV-FLD) and GC coupled to a mass spectrometer (GC-MS), were compared for separating and quantifying oxy-PAHs. Microwave-assisted extraction (MAE) was optimized to extract oxy-PAHs from sediments. Results: First, OH-PAHs and quinones were analyzed by HPLC with on-line UV and fluorimetric detectors. OH-PAHs were detected with the sensitive FLD, while the non-fluorescent quinones were detected with UV. The limits of detection (LODs) obtained were in the range (2-3)×10⁻⁴ mg/L for OH-PAHs and (2-3)×10⁻³ mg/L for quinones. Second, even if GC-MS is not well suited to the analysis of the thermodegradable OH-PAHs and quinones without a derivatization step, it was used because of the advantages of its detector in terms of identification and of GC in terms of efficiency. Without derivatization, only two of the four quinones were detected, in the range 1-10 mg/L (LODs = 0.3-1.2 mg/L), and the LODs were not very satisfactory for the four OH-PAHs either (0.18-0.6 mg/L). So two derivatization processes were optimized with respect to the literature: one for the silylation of OH-PAHs and one for the acetylation of quinones.
Silylation using BSTFA/TMCS 99/1 was enhanced by using a mixture of catalyst solvents (pyridine/ethyl acetate) and finding the appropriate reaction duration (5-60 minutes). Acetylation was optimized at different steps of the process, including the initial volume of compounds to derivatize, the added amount of Zn (0.1-0.25 g), the nature of the derivatization reagent (acetic anhydride, heptafluorobutyric acid…), and the liquid/liquid extraction at the end of the process. After derivatization, LODs were decreased by a factor of 3 for OH-PAHs and by a factor of 4 for quinones, all the quinones now being detected. Thereafter, quinones and OH-PAHs were extracted from spiked sediments using microwave-assisted extraction (MAE) followed by GC-MS analysis. Several solvent mixtures, with different volumes (10-25 mL) and extraction temperatures (80-120°C), were tested to obtain the best recovery yields. Satisfactory recoveries could be obtained for quinones (70-96%) and for OH-PAHs (70-104%). Temperature was a critical factor which had to be controlled to avoid oxy-PAH degradation during the MAE extraction process. Conclusion: Even if MAE-GC-MS was satisfactory for analyzing these oxy-PAHs, MAE optimization has to be carried further to obtain a more appropriate extraction solvent mixture, allowing direct injection into the HPLC-UV-FLD system, which is more sensitive than GC-MS and does not require a long prior derivatization step.
Keywords: derivatizations for GC-MS, microwave assisted extraction, on-line HPLC-UV-FLD, oxygenated PAHs, polluted sediments
Procedia PDF Downloads 286
7213 User Intention Generation with Large Language Models Using Chain-of-Thought Prompting
Authors: Gangmin Li, Fan Yang
Abstract:
Personalized recommendation is crucial for any recommendation system. One of the techniques for personalized recommendation is to identify the user's intention. Traditional user intention identification uses the user's selections when facing multiple items. This modeling relies primarily on historical behaviour data, resulting in challenges such as the cold start, unintended choices, and failure to capture intention when items are new. Motivated by recent advancements in Large Language Models (LLMs) like ChatGPT, we present an approach to user intention identification that embraces LLMs with Chain-of-Thought (CoT) prompting. We use the initial user profile as input to the LLM and design a collection of prompts to align the LLM's responses across various recommendation tasks encompassing rating prediction, search and browse history, user clarification, etc. Our tests on real-world datasets demonstrate the improvements in recommendation gained by explicit user intention identification, with that intention merged into a user model.
Keywords: personalized recommendation, generative user modelling, user intention identification, large language models, chain-of-thought prompting
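The prompt-assembly side of such a CoT approach can be made concrete as a small sketch; the step wording and field labels below are illustrative assumptions, since the abstract does not reproduce the actual prompt collection, and the returned string would be sent to an LLM such as ChatGPT.

```python
def build_intention_prompt(profile, history, clarification=None):
    """Assemble a Chain-of-Thought prompt for user intention identification
    from a user profile, a search/browse history, and an optional user
    clarification. All wording here is a hypothetical illustration."""
    steps = [
        "Step 1: Summarise the user's stated preferences from the profile.",
        "Step 2: Infer current interests from the search and browse history.",
        "Step 3: Reconcile any conflict between profile and history.",
        "Step 4: State the user's current intention in one sentence.",
    ]
    parts = [f"User profile: {profile}",
             "History: " + "; ".join(history)]
    if clarification:
        parts.append(f"User clarification: {clarification}")
    parts.append("Reason step by step before answering:")
    parts.extend(steps)
    return "\n".join(parts)
```

The explicit intermediate steps are what distinguishes CoT prompting from asking for the intention directly; the identified intention is then merged into the user model for downstream tasks such as rating prediction.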
Procedia PDF Downloads 52
7212 An Adaptive CFAR Algorithm Based on Automatic Censoring in Heterogeneous Environments
Authors: Naime Boudemagh
Abstract:
In this work, we aim to improve the detection performance of radar systems. To this end, we propose and analyze a novel censoring technique for undesirable samples, with a priori unknown positions, that may be present in the environment under investigation. We therefore consider heterogeneous backgrounds characterized by the presence of irregularities such as clutter edge transitions and/or interfering targets. The proposed detector, termed automatic censoring constant false alarm rate (AC-CFAR), operates exclusively in a Gaussian background. It is built to allow the segmentation of the environment into regions and to switch automatically to the appropriate detector; namely, the cell-averaging CFAR (CA-CFAR), the censored mean level detector CFAR (CMLD-CFAR), or the order statistic CFAR (OS-CFAR). Monte Carlo simulations show that the AC-CFAR detector performs like the CA-CFAR in a homogeneous background. Moreover, the proposed processor exhibits considerable robustness in a heterogeneous background.
Keywords: CFAR, automatic censoring, heterogeneous environments, radar systems
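The CA-CFAR baseline that the adaptive detector falls back to in homogeneous backgrounds can be sketched as follows; window sizes and the false-alarm probability are illustrative.

```python
def ca_cfar(samples, guard=2, train=8, pfa=1e-3):
    """Cell-averaging CFAR: for each cell under test, estimate the noise
    power from 'train' cells on each side (skipping 'guard' cells), and
    declare a detection when the cell exceeds alpha * noise_estimate.
    alpha follows the standard CA-CFAR relation for exponentially
    distributed noise power: alpha = N * (pfa**(-1/N) - 1), N = 2*train."""
    n = 2 * train
    alpha = n * (pfa ** (-1.0 / n) - 1.0)
    detections = []
    for i in range(train + guard, len(samples) - train - guard):
        lead = samples[i - guard - train : i - guard]
        lag = samples[i + guard + 1 : i + guard + train + 1]
        noise = (sum(lead) + sum(lag)) / n
        if samples[i] > alpha * noise:
            detections.append(i)
    return detections
```

The AC-CFAR's contribution is upstream of this step: it first censors outlying training cells (interferers, clutter edges) so that the noise estimate fed to the threshold stays representative, then selects CA-, CMLD-, or OS-CFAR as appropriate.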
Procedia PDF Downloads 600
7211 A Sentence-to-Sentence Relation Network for Recognizing Textual Entailment
Authors: Isaac K. E. Ampomah, Seong-Bae Park, Sang-Jo Lee
Abstract:
Over the past decade, there have been promising developments in Natural Language Processing (NLP), with several investigations of approaches focusing on Recognizing Textual Entailment (RTE). These include models based on lexical similarities, models based on formal reasoning, and, most recently, deep neural models. In this paper, we present a sentence encoding model that exploits sentence-to-sentence relation information for RTE. In terms of sentence modeling, convolutional neural networks (CNNs) and recurrent neural networks (RNNs) adopt different approaches. RNNs are known to be well suited for sequence modeling, whilst CNNs are suited for the extraction of n-gram features through their filters and can learn ranges of relations via the pooling mechanism. We combine the strengths of RNNs and CNNs as stated above in a unified model for the RTE task. Our model combines relation vectors computed from the phrasal representations of each sentence with the final encoded sentence representations. Firstly, we pass each sentence through a convolutional layer to extract a sequence of higher-level phrase representations, from which the first relation vector is computed. Secondly, the phrasal representation of each sentence from the convolutional layer is fed into a Bidirectional Long Short Term Memory (Bi-LSTM) to obtain the final sentence representations, from which a second relation vector is computed. The relation vectors are combined and then used, in the same fashion as an attention mechanism, over the Bi-LSTM outputs to yield the final sentence representations for the classification. Experiments on the Stanford Natural Language Inference (SNLI) corpus suggest that this is a promising technique for RTE.
Keywords: deep neural models, natural language inference, recognizing textual entailment (RTE), sentence-to-sentence relation
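A heavily simplified forward-pass sketch of this architecture, in plain NumPy: the bigram conv layer and the product/difference relation vector follow common NLI practice, while the relation-weighted pooling stands in for the attention over Bi-LSTM states (the Bi-LSTM itself is omitted). Everything here is a schematic assumption, not the paper's trained model.

```python
import numpy as np

def conv_phrases(words, W):
    """Conv layer: each filter spans a bigram window of word vectors
    (n-gram feature extraction), followed by a ReLU."""
    out = []
    for i in range(len(words) - 1):
        window = np.concatenate([words[i], words[i + 1]])
        out.append(np.maximum(0.0, W @ window))
    return np.array(out)

def relation_vector(a, b):
    """Relation features between two representations: elementwise product
    and absolute difference, a common matching scheme in NLI models."""
    return np.concatenate([a * b, np.abs(a - b)])

def encode_pair(premise, hypothesis, W):
    """Phrase-level relation vector from the conv outputs, used to weight
    phrase positions (toy attention), then a final relation vector; the
    concatenation would feed the entailment classifier."""
    p_phr = conv_phrases(premise, W)
    h_phr = conv_phrases(hypothesis, W)
    rel = relation_vector(p_phr.mean(axis=0), h_phr.mean(axis=0))
    key = rel[: p_phr.shape[1]]  # project relation onto phrase space
    def attend(phr):
        scores = phr @ key
        w = np.exp(scores - scores.max())
        return (w / w.sum()) @ phr
    final = relation_vector(attend(p_phr), attend(h_phr))
    return np.concatenate([rel, final])
```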
Procedia PDF Downloads 348
7210 Zero Valent Iron Algal Biocomposite for the Removal of Crystal Violet from Aqueous Solution: Box-Behnken Optimization and Fixed Bed Column Studies
Authors: M. Jerold, V. Sivasubramanian
Abstract:
In this study, a nano zero-valent iron Sargassum swartzii (nZVI-SS) biocomposite, a marine-algae-based biosorbent, was used for the removal of simulated crystal violet (CV) in batch and continuous fixed-bed operation. The Box-Behnken design (BBD) experimental results revealed that biosorption was maximal at pH 7.5, a biosorbent dosage of 0.1 g/L, and an initial CV concentration of 100 mg/L. The effects of various column parameters, namely bed depth (3, 6, and 9 cm), flow rate (5, 10, and 15 mL/min), and influent CV concentration (5, 10, and 15 mg/L), were investigated. The exhaustion time increased with increasing bed depth and influent CV concentration and with decreasing flow rate. The Adams-Bohart, Thomas, and Yoon-Nelson models were used to predict the breakthrough curves and to evaluate the model parameters. Of these, the Thomas and Yoon-Nelson models described the experimental data well. The results therefore imply that the nZVI-SS biocomposite is a cheap and highly promising biosorbent for the removal of CV from wastewater.
Keywords: algae, biosorption, zero-valent, dye, wastewater
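The two breakthrough-curve models that fit the data well have simple logistic closed forms, sketched below; the parameter values in the usage example are illustrative, not the study's fitted constants.

```python
import math

def thomas(t, k_th, q0, m, c0, flow):
    """Thomas model breakthrough curve:
    C_t/C_0 = 1 / (1 + exp(k_th*q0*m/flow - k_th*c0*t)),
    with rate constant k_th, adsorption capacity q0, bed mass m,
    inlet concentration c0, and volumetric flow rate."""
    return 1.0 / (1.0 + math.exp(k_th * q0 * m / flow - k_th * c0 * t))

def yoon_nelson(t, k_yn, tau):
    """Yoon-Nelson model: C_t/C_0 = 1 / (1 + exp(k_yn * (tau - t))),
    where tau is the time required for 50% breakthrough."""
    return 1.0 / (1.0 + math.exp(k_yn * (tau - t)))
```

Both curves rise from 0 toward 1 as the bed exhausts; fitting them to measured C_t/C_0 versus time yields the model parameters (k_th and q0, or k_yn and tau) reported in column studies like this one.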
Procedia PDF Downloads 193
7209 Transition from Linear to Circular Business Models with Service Design Methodology
Authors: Minna-Maari Harmaala, Hanna Harilainen
Abstract:
Estimates of the economic value of transitioning to circular economy models vary, but the transition has been estimated to represent $1 trillion worth of new business for the global economy. In Europe alone, estimates claim that adopting circular-economy principles could not only have environmental and social benefits but also generate a net economic benefit of €1.8 trillion by 2030. Proponents of a circular economy argue that it offers a major opportunity to increase resource productivity, decrease resource dependence and waste, and increase employment and growth. A circular system could improve competitiveness and unleash innovation. Yet most companies are not capturing these opportunities, and thus even abundant circular opportunities remain uncaptured, though they would seem inherently profitable. Service design, in broad terms, relates to developing an existing or a new service or service concept with emphasis and focus on the customer experience from the onset of the development process. Service design may even mean starting from scratch and co-creating the service concept entirely with the help of customer involvement. Service design methodologies provide a structured way of incorporating customer understanding and involvement into the process of designing better services with better resonance to customer needs. A business model is a depiction of how the company creates, delivers, and captures value, i.e., how it organizes its business. The process of business model development and adjustment or modification is also called business model innovation, and innovating business models has become a part of business strategy. Our hypothesis is that, in addition to linear models still being easier to adopt and often having lower threshold costs, companies lack an understanding of how circular models can be adopted into their business and of how willing and ready customers are to adopt the new circular business models.
In our research, we use robust service design methodology to develop circular economy solutions with two case study companies. The aim of the process is not only to develop the service concepts and portfolio, but to demonstrate that the willingness to adopt circular solutions exists in the customer base. In addition to service design, we employ business model innovation methods to further develop, test, and validate the new circular business models. The results clearly indicate that among the customer groups there are specific customer personas that are willing to adopt circular solutions and, in fact, expect the companies to take a leading role in the transition towards a circular economy. At the same time, there is a group of indifferent customers, to whom the idea of circularity provides no added value. In addition, the case studies clearly show what changes the adoption of circular economy principles brings to the existing business model and how they can be integrated.
Keywords: business model innovation, circular economy, circular economy business models, service design
Procedia PDF Downloads 133
7208 Antioxidative Maillard Reaction Products Derived from Gelatin Hydrolysate of Unicorn Leatherjacket Skin
Authors: Supatra Karnjanapratum, Soottawat Benjakul
Abstract:
Gelatin hydrolysate, especially from marine resources, has been known to possess antioxidative activity. Nevertheless, this activity is still lower than that of commercially available antioxidants. Maillard reactions can be used to increase the antioxidative activity of gelatin hydrolysate, in which the numerous amino groups can be involved in glycation. In the present study, gelatin hydrolysate (GH) from unicorn leatherjacket skin, prepared using glycyl endopeptidase with a prior autolysis-assisted process, was used for the preparation of Maillard reaction products (MRPs) under dry conditions. The impacts of different factors, including type of saccharide, GH-to-saccharide ratio, incubation temperature, relative humidity (RH), and time, on the antioxidative activity of MRPs were investigated. MRPs prepared from the mixture of GH and galactose showed the highest antioxidative activity, as determined by both ABTS radical scavenging activity and ferric reducing antioxidant power, during heating (0-48 h) at 60 °C and 65% RH, compared with those derived from the other saccharides tested. A GH-to-galactose ratio of 2:1 (w/w) yielded the MRPs with the highest antioxidative activity, followed by the ratios of 1:1 and 1:2, respectively. When the effects of incubation temperature (50, 60, 70 °C) and RH (55, 65, 75%) were examined, the highest browning index and absorbance at 280 nm were found at 70 °C, regardless of RH. The pH and free amino group content of MRPs decreased, with a concomitant increase in antioxidative activity, as the reaction time increased. Antioxidative activity of MRPs generally increased with increasing temperature, and the highest antioxidative activity was found at 55% RH. Based on electrophoresis of the MRPs, polymerization along with the formation of high-molecular-weight material was observed. The optimal condition for preparing antioxidative MRPs was heating the mixture of GH and galactose (2:1) at 70 °C and 55% RH for 36 h.
Therefore, the antioxidative activity of GH was improved by the Maillard reaction, and the resulting MRPs could be used as a natural antioxidant in food products.
Keywords: antioxidative activity, gelatin hydrolysate, Maillard reaction, unicorn leatherjacket
Procedia PDF Downloads 247
7207 Numerical Study of the Influence of the Primary Stream Pressure on the Performance of the Ejector Refrigeration System Based on Heat Exchanger Modeling
Authors: Elhameh Narimani, Mikhail Sorin, Philippe Micheau, Hakim Nesreddine
Abstract:
Numerical models of the heat exchangers in an ejector refrigeration system (ERS) were developed and validated with experimental data. The models were based on the switched heat exchanger model using the moving boundary method and were capable of estimating the zones' lengths, the outlet temperatures of both sides, and the heat loads at various experimental points. The developed models were utilized to investigate the influence of the primary flow pressure on the performance of an R245fa ERS based on its coefficient of performance (COP) and exergy efficiency. It was illustrated numerically and proved experimentally that increasing the primary flow pressure slightly reduces the COP, while the exergy efficiency goes through a maximum before decreasing.
Keywords: Coefficient of Performance, COP, Ejector Refrigeration System, ERS, exergy efficiency (ηII), heat exchanger modeling, moving boundary method
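The moving boundary method divides a heat exchanger into zones (e.g., for a condenser: superheated vapor, two-phase, subcooled liquid) whose boundaries shift with operating conditions. As a minimal sketch of the underlying zone bookkeeping (not the authors' model; the enthalpy values below are hypothetical, not R245fa property data), the fraction of the total heat duty falling in each zone follows from a simple energy balance on the refrigerant side:

```python
def zone_heat_fractions(h_in, h_sat_vap, h_sat_liq, h_out):
    """Split a condenser's heat duty into de-superheating, condensing, and
    subcooling zone fractions from refrigerant enthalpies (kJ/kg)."""
    total = h_in - h_out  # total specific heat rejected
    return ((h_in - h_sat_vap) / total,       # superheated-vapor zone
            (h_sat_vap - h_sat_liq) / total,  # two-phase (condensing) zone
            (h_sat_liq - h_out) / total)      # subcooled-liquid zone

# Hypothetical enthalpies: most of the duty lies in the two-phase zone.
fractions = zone_heat_fractions(450.0, 420.0, 250.0, 240.0)
```

In a full moving boundary model, these duty fractions combined with zone-specific heat transfer coefficients yield the zones' lengths and outlet temperatures that the paper's models estimate.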
Procedia PDF Downloads 199
7206 Correction Factors for Soil-Structure Interaction Predicted by Simplified Models: Axisymmetric 3D Model versus Fully 3D Model
Authors: Fu Jia
Abstract:
The effects of soil-structure interaction (SSI) are often studied using axisymmetric three-dimensional (3D) models to avoid the high computational cost of the more realistic, fully 3D models, which require 2-3 orders of magnitude more computer time and storage. This paper analyzes the error and presents correction factors for the system frequency, system damping, and peak amplitude of structural response computed by axisymmetric models embedded in a uniform or layered half-space. The results are compared with those for fully 3D rectangular foundations of different aspect ratios. Correction factors are presented for a range of the model parameters, such as fixed-base frequency, structure mass, height and length-to-width ratio, foundation embedment, and soil-layer stiffness and thickness. It is shown that the errors are larger for stiffer, taller, and heavier structures, deeper foundations, and deeper soil layers. For example, for a stiff structure like the Millikan Library (NS response; length-to-width ratio 1), the error is 6.5% in system frequency, 49% in system damping, and 180% in peak amplitude. Analysis of a case study shows that the NEHRP-2015 provisions for the reduction of base shear force due to SSI effects may be unsafe for some structures and need revision. The presented correction factor diagrams can be used in practical design and other applications.
Keywords: 3D soil-structure interaction, correction factors for axisymmetric models, length-to-width ratio, NEHRP-2015 provisions for reduction of base shear force, rectangular embedded foundations, SSI system frequency, SSI system damping
Procedia PDF Downloads 264
7205 Multi-Index Performance Investigation of Rubberized Reclaimed Asphalt Mixture
Authors: Ling Xu, Giuseppe Loprencipe, Antonio D'Andrea
Abstract:
Asphalt pavement with recycled and sustainable materials has become a commonly adopted strategy for road construction, including reclaimed asphalt pavement (RAP) and crumb rubber (CR) from waste tires. However, the adhesion and cohesion characteristics of rubberized reclaimed asphalt pavement remain ambiguous, resulting in deteriorated adhesion behavior and service-life performance. This research investigated the effect of bonding characteristics on the rutting resistance and moisture susceptibility of rubberized reclaimed asphalt pavement, considering two RAP sources with different oxidation levels and two tire rubbers with different particle sizes. First, the binder bond strength (BBS) test with bonding-failure classification was conducted to analyze the surface behaviors of binder-aggregate interaction. Then, the compatibility and penetration grade of the rubberized RAP binder were evaluated by the rotational viscosity test and the penetration test, respectively. The Hamburg wheel track (HWT) test with high-temperature viscoelastic deformation analysis was adopted to illustrate the rutting resistance. Additionally, a water boiling test was employed to evaluate the moisture susceptibility of the mixture, and the texture features were characterized with statistical parameters of the image colors. Finally, a colloid structure model of the rubberized RAP binder with surface interaction was proposed, and statistical analysis was carried out to reveal the correlations among the various indexes. This study concluded that the gel-phase colloid structure and molecular diffusion of the free light fraction affect the surface interaction with the aggregate, determining the bonding characteristics of rubberized RAP asphalt.
Keywords: bonding characteristics, reclaimed asphalt pavement, rubberized asphalt, sustainable material
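The abstract does not specify which statistical parameters of the image colors were used to characterize texture after the boiling test; a plausible minimal version, computing the per-channel mean and standard deviation of a specimen photograph, could look like this (the pixel values are purely illustrative):

```python
import statistics

def texture_stats(pixels):
    """Per-channel (R, G, B) mean and population standard deviation,
    simple stand-ins for image-color texture parameters."""
    channels = zip(*pixels)  # regroup pixel tuples into three channel sequences
    return [(statistics.mean(c), statistics.pstdev(c)) for c in channels]

# Illustrative pixels from a boiled-specimen photograph, as (R, G, B) tuples.
stats = texture_stats([(0, 10, 20), (10, 10, 20), (20, 10, 20)])
```

A higher per-channel standard deviation would indicate a more uneven (partially stripped) surface after boiling.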
Procedia PDF Downloads 60
7204 Modeling of Induced Voltage in Disconnected Grounded Conductor of Three-Phase Power Line
Authors: Misho Matsankov, Stoyan Petrov
Abstract:
The paper presents the methodology and the obtained mathematical models for determining the value of the grounding resistance of a disconnected conductor in a three-phase power line for which the contact voltage is safe, taking into account the potentials induced by the non-disconnected phase conductors. The mathematical models were obtained by applying experimental design techniques.
Keywords: contact voltage, experimental design, induced voltage, safety
Procedia PDF Downloads 174
7203 Practical Skill Education for Doctors in Training: Economical and Efficient Methods for Students to Receive Hands-on Experience
Authors: Nathaniel Deboever, Malcolm Breeze, Adrian Sheen
Abstract:
Basic surgical and suturing techniques are a fundamental requirement for all doctors. In order to gain confidence and competence, doctors in training need to obtain sufficient teaching and, just as importantly, practice. Young doctors with an apt level of expertise in these simple surgical skills, which are often used in the Emergency Department, can help alleviate some pressure during a busy evening. Unfortunately, learning these skills can be quite difficult during medical school or even during junior doctor years. The aim of this project was to train medical students attending the University of Sydney's Nepean Clinical School through a series of workshops highlighting practical skills, with hopes to further extend this program to junior doctors in the hospital. The sessions taught basic skills via tutorials and demonstrations, and then cemented these proficiencies with practical exercises. During such an endeavor, it is fundamental to employ models that appropriately resemble what students will encounter in the clinical setting. The sustainability of the workshops is similarly important to the continuity of such a program. To address both these challenges, the authors have developed models including suturing platforms, knot-tying and vessel-ligation stations, shave- and punch-biopsy models, and an ophthalmologic foreign body device. The unique aspect of this work is that hands-on teaching sessions were used to address a gap in the doctors-in-training and junior doctor curriculum. Presented through this poster are our approaches to creating models that do not employ animal products and therefore do not require special facilities or disposal procedures. Covering numerous skills that would be beneficial to all young doctors, these models are easily replicable and affordable.
This exciting work allows for countless sessions at low cost, providing enough practice for students to perform these skills confidently, as shown by attendee questionnaires.
Keywords: medical education, surgical models, surgical simulation, surgical skills education
Procedia PDF Downloads 155
7202 Evaluation of Gesture-Based Password: User Behavioral Features Using Machine Learning Algorithms
Authors: Lakshmidevi Sreeramareddy, Komalpreet Kaur, Nane Pothier
Abstract:
Graphical-based passwords have existed for decades. Their major advantage is that they are easier to remember than an alphanumeric password. However, their disadvantage (especially for recognition-based passwords) is the smaller password space, making them more vulnerable to brute-force attacks. Graphical passwords are also highly susceptible to shoulder-surfing. The gesture-based password method that we developed is a grid-free, template-free method. In this study, we evaluated gesture-based passwords for usability and vulnerability. The results of the study are significant. We developed a gesture-based password application for data collection. Two modes of data collection were used: creation mode and replication mode. In creation mode (Session 1), users were asked to create six different passwords and re-enter each password five times. In replication mode, users saw a password image created by another user for a fixed duration of time. Three different durations, 5 seconds (Session 2), 10 seconds (Session 3), and 15 seconds (Session 4), were used to mimic a shoulder-surfing attack. After the timer expired, the password image was removed, and users were asked to replicate the password. A total of 74, 57, 50, and 44 users participated in Sessions 1, 2, 3, and 4, respectively. In this study, machine learning algorithms were applied to determine whether a person is a genuine user or an imposter based on the password entered. Five different machine learning algorithms were deployed to compare performance in user authentication: Decision Trees, Linear Discriminant Analysis, Naive Bayes Classifier, Support Vector Machines (SVMs) with a Gaussian radial basis kernel function, and K-Nearest Neighbor. Gesture-based password features vary from one entry to the next, so it is difficult to distinguish between a creator and an intruder for authentication.
For each password entered by the user, four features were extracted: password score, password length, password speed, and password size. All four features were normalized before being fed to a classifier. Three different classifiers were trained using data from all four sessions. Classifiers A, B, and C were trained and tested using data from the password creation session and the password replication sessions with timers of 5 seconds, 10 seconds, and 15 seconds, respectively. The classification accuracies for Classifier A using the five ML algorithms are 72.5%, 71.3%, 71.9%, 74.4%, and 72.9%, respectively. The classification accuracies for Classifier B are 69.7%, 67.9%, 70.2%, 73.8%, and 71.2%, respectively. The classification accuracies for Classifier C are 68.1%, 64.9%, 68.4%, 71.5%, and 69.8%, respectively. SVMs with a Gaussian radial basis kernel outperform the other ML algorithms for gesture-based password authentication. The results confirm that the shorter the duration of the shoulder-surfing attack, the higher the authentication accuracy. In conclusion, behavioral features extracted from gesture-based passwords lead to less vulnerable user authentication.
Keywords: authentication, gesture-based passwords, machine learning algorithms, shoulder-surfing attacks, usability
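The classification step described above can be sketched with the K-Nearest Neighbor option from the algorithm list: min-max normalize the four features, then label an entry by majority vote of its nearest neighbors. The feature values below are made up for illustration, not taken from the study's data:

```python
import numpy as np

def normalize(X):
    """Min-max scale each feature column (score, length, speed, size) to [0, 1]."""
    X = np.asarray(X, dtype=float)
    lo, hi = X.min(axis=0), X.max(axis=0)
    return (X - lo) / np.where(hi > lo, hi - lo, 1.0)

def knn_is_genuine(X_train, y_train, x, k=3):
    """Majority vote of the k nearest password entries: 1 = genuine user, 0 = imposter."""
    dists = np.linalg.norm(X_train - x, axis=1)
    votes = y_train[np.argsort(dists)[:k]]
    return int(votes.sum() * 2 > k)

# Illustrative entries as [score, length, speed, size]; first three are genuine.
raw = [[90, 8, 1.2, 40], [85, 8, 1.1, 42], [88, 7, 1.3, 39],
       [30, 3, 3.0, 10], [25, 4, 2.8, 12], [35, 3, 3.1, 11]]
y = np.array([1, 1, 1, 0, 0, 0])
Xn = normalize(raw)
```

The reported SVM-with-RBF-kernel results would replace the voting step with a kernelized decision function, but the normalization and the four-feature representation stay the same.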
Procedia PDF Downloads 102
7201 Aerodynamic Investigation of Rear Vehicle by Geometry Variations on the Backlight Angle
Authors: Saud Hassan
Abstract:
This paper presents simulations for the prediction of the flow around the backlight angle of a passenger vehicle. The CFD simulations are carried out on different car models. The Ahmed model, a “bluff body”, is used as the standard model to study the aerodynamics of the backlight angle. This paper describes the airflow over the different car models with different backlight angles, and also over the Ahmed model, to determine the trailing vortices with a varying backlight angle of a passenger vehicle body. The CFD simulation is carried out with the Ahmed body, a simplified car model widely used in the automotive industry to investigate the flow over the car body surface. The main goal of the simulation is to study the behavior of the trailing vortices of these models. In this paper, the airflow over slant angles of 0°, 5°, 12.5°, 20°, 30°, and 40° is considered. When investigating the rear backlight angle, two-dimensional flow occurred at the rear slant; on the other hand, when the slant angle is 30°, the flow becomes three-dimensional. Above this angle, a sudden drop in drag occurred.
Keywords: aerodynamics, Ahmed vehicle, backlight angle, finite element method
Procedia PDF Downloads 779
7200 Recurrent Neural Networks for Complex Survival Models
Authors: Pius Marthin, Nihal Ata Tutkun
Abstract:
Survival analysis has become one of the paramount procedures in the modeling of time-to-event data. When we encounter complex survival problems, the traditional approach remains limited in accounting for the complex correlational structure between the covariates and the outcome, due to strong assumptions that limit the inference and prediction ability of the resulting models. Several studies exist on deep learning approaches to survival modeling; however, their application to complex survival problems still needs improvement. In addition, the existing models do not fully address the complexity of the data structure and are subject to noise and redundant information. In this study, we design a deep learning technique (CmpXRnnSurv_AE) that removes the limitations imposed by traditional approaches and addresses the above issues to jointly predict the risk-specific probabilities and the survival function for recurrent events with competing risks. We introduce a component termed Risk Information Weights (RIW) as an attention mechanism to compute the weighted cumulative incidence function (WCIF), and an external auto-encoder (ExternalAE) as a feature selector to extract complex characteristics among the set of covariates responsible for the cause-specific events. We train our model using synthetic and real data sets and employ the appropriate metrics for complex survival models for evaluation. As benchmarks, we selected both traditional and machine learning models, and our model demonstrates better performance across all datasets.
Keywords: cumulative incidence function (CIF), risk information weight (RIW), autoencoders (AE), survival analysis, recurrent events with competing risks, recurrent neural networks (RNN), long short-term memory (LSTM), self-attention, multilayer perceptrons (MLPs)
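The cumulative incidence function at the heart of the model has a simple empirical counterpart. As a hedged baseline (ignoring censoring and the learned RIW weights, which the paper adds on top), the CIF for one competing cause is just the fraction of subjects who experienced that cause by a given time:

```python
def cumulative_incidence(times, causes, cause, horizon):
    """Empirical CIF: P(event of `cause` occurs by `horizon`), assuming no censoring.
    `times` are event times; `causes` label which competing risk occurred."""
    hits = sum(1 for t, c in zip(times, causes) if c == cause and t <= horizon)
    return hits / len(times)

# Four subjects, two competing causes: only one cause-1 event occurs by t = 2.
cif_cause1 = cumulative_incidence([1, 2, 3, 4], [1, 2, 1, 1], cause=1, horizon=2)
```

The paper's WCIF replaces the implicit equal 1/n weights with the attention-derived Risk Information Weights; with censored data, an Aalen-Johansen-type estimator is the standard replacement for this naive fraction.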
Procedia PDF Downloads 88
7199 Machine Learning for Classifying Risks of Death and Length of Stay of Patients in Intensive Unit Care Beds
Authors: Itamir de Morais Barroca Filho, Cephas A. S. Barreto, Ramon Malaquias, Cezar Miranda Paula de Souza, Arthur Costa Gorgônio, João C. Xavier-Júnior, Mateus Firmino, Fellipe Matheus Costa Barbosa
Abstract:
Information and Communication Technologies (ICT) in healthcare are crucial for efficiently delivering medical services to patients. These ICTs, also known as e-health, comprise technologies such as electronic record systems, telemedicine systems, and personalized devices for diagnosis. The focus of e-health is to improve the quality of health information, strengthen national health systems, and ensure accessible, high-quality health care for all. All the data gathered by these technologies make it possible to support clinical staff with automated decisions using machine learning. In this context, we collected patient data such as heart rate, oxygen saturation (SpO2), blood pressure, respiration, and others. With these data, we were able to develop machine learning models for patients' risk of death and for estimating the length of stay in ICU beds. This paper presents the methodology for applying machine learning techniques to develop these models. Although we implemented the models on an IoT healthcare platform that helps clinical staff in an ICU, a robust clinical validation process and ongoing monitoring of the proposed models remain essential.
Keywords: ICT, e-health, machine learning, ICU, healthcare
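The abstract does not specify the model family used for the risk-of-death classifier; as one hedged illustration of the idea, a logistic risk score over a few of the listed vitals (with made-up weights, not clinically derived or fitted to the study's data) would look like:

```python
import math

def death_risk(heart_rate, spo2, resp_rate,
               weights=(0.03, -0.08, 0.05), bias=2.0):
    """Toy logistic risk-of-death score from vital signs.
    Weights and bias are illustrative placeholders, not fitted values."""
    z = bias + weights[0] * heart_rate + weights[1] * spo2 + weights[2] * resp_rate
    return 1.0 / (1.0 + math.exp(-z))  # probability in (0, 1)
```

A fitted model would learn the weights from the collected ICU data; the point of the clinical-validation step stressed above is precisely that such learned weights must be checked against outcomes before guiding care.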
Procedia PDF Downloads 107
7198 Daily Probability Model of Storm Events in Peninsular Malaysia
Authors: Mohd Aftar Abu Bakar, Noratiqah Mohd Ariff, Abdul Aziz Jemain
Abstract:
Storm Event Analysis (SEA) provides a method to define rainfall events as storms, where each storm has its own amount and duration. By modelling the daily probability of different types of storms, the onset, offset, and cycle of rainfall seasons can be determined and investigated. Furthermore, researchers in the field of meteorology will be able to study the dynamical characteristics of rainfall and make predictions for future reference. In this study, four categories of storms, short, intermediate, long, and very long, are introduced based on the length of the storm duration. Daily probability models of storms are built for these four categories in Peninsular Malaysia. The models are constructed using the Bernoulli distribution and by applying linear regression to the first Fourier harmonic equation. From the models obtained, it is found that the daily probability of storms in the eastern part of Peninsular Malaysia shows a unimodal pattern, with a high probability of rain beginning at the end of the year and lasting until early the next year. This is very likely due to the Northeast monsoon season, which occurs from November to March every year. Meanwhile, short and intermediate storms in other regions of Peninsular Malaysia exhibit a bimodal cycle due to the two inter-monsoon seasons. Overall, these models indicate that Peninsular Malaysia can be divided into four distinct regions based on the daily pattern of the probability of various storm events.
Keywords: daily probability model, monsoon seasons, regions, storm events
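The fitting step, daily Bernoulli occurrence probabilities regressed on the first Fourier harmonic, can be sketched as follows (a synthetic unimodal cycle stands in for the Malaysian rainfall records):

```python
import numpy as np

def fit_first_harmonic(day_of_year, occurrence):
    """Least-squares fit of p(t) = b0 + b1*cos(2*pi*t/365) + b2*sin(2*pi*t/365)."""
    t = np.asarray(day_of_year, dtype=float)
    X = np.column_stack([np.ones_like(t),
                         np.cos(2 * np.pi * t / 365),
                         np.sin(2 * np.pi * t / 365)])
    beta, *_ = np.linalg.lstsq(X, np.asarray(occurrence, dtype=float), rcond=None)
    return beta  # (b0, b1, b2)

# Synthetic cycle peaking around the turn of the year, as on the east coast.
days = np.arange(365)
p_true = 0.3 + 0.2 * np.cos(2 * np.pi * days / 365)
beta = fit_first_harmonic(days, p_true)
```

With real data, `occurrence` would be the observed 0/1 daily storm indicators for each category, and a second harmonic could be added to capture the bimodal inter-monsoon cycle noted for the other regions.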
Procedia PDF Downloads 341
7197 Optimizing Production Yield Through Process Parameter Tuning Using Deep Learning Models: A Case Study in Precision Manufacturing
Authors: Tolulope Aremu
Abstract:
This paper explores the use of deep learning to optimize production yield by tuning a few key process parameters in a manufacturing environment. The study examines how to maximize production yield and minimize operational costs by utilizing advanced neural network models, specifically Long Short-Term Memory (LSTM) and Convolutional Neural Networks (CNN). These models were implemented using the Python-based frameworks TensorFlow and Keras. The research targets precision molding processes in which the temperature ranges between 150°C and 220°C, the pressure between 5 and 15 bar, and the material flow rate between 10 and 50 kg/h; these are critical parameters that strongly affect yield. A dataset of 1 million production cycles, collected over five continuous years, was considered, with detailed logs of the exact parameter settings and yield output. The LSTM model captures time-dependent trends in the production data, while the CNN analyzes the spatial correlations between parameters. The models are trained in a supervised manner with an MSE loss function, optimized with the Adam optimizer. After 100 training epochs, the models achieved 95% accuracy in recommending optimal parameter configurations. Compared with the traditional RSM and DOE methods, production yield increased by 12%. In addition, the error margin was reduced by 8%, yielding consistently high-quality products from the deep learning models. The monetary benefit was around $2.5 million annually, the cost saved from material waste, energy consumption, and equipment wear as a result of implementing the optimized process parameters. The system was deployed in an industrial production environment using a hybrid cloud setup: Microsoft Azure for data storage, with model training and deployment performed on Google Cloud AI.
Real-time process monitoring and automatic parameter tuning rely on this cloud infrastructure. In summary, deep learning models, especially those employing LSTM and CNN architectures, optimize production yield by fine-tuning process parameters. Future research will consider reinforcement learning to further enhance system autonomy and scalability across various manufacturing sectors.
Keywords: production yield optimization, deep learning, tuning of process parameters, LSTM, CNN, precision manufacturing, TensorFlow, Keras, cloud infrastructure, cost saving
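The tuning loop itself can be sketched independently of the networks: scale each parameter by its stated operating range, then search candidate settings against the trained yield predictor. Here a hypothetical concave surrogate stands in for the LSTM/CNN model, and the peak location is made up for illustration:

```python
import itertools

# Operating ranges stated in the study: temperature (C), pressure (bar), flow (kg/h).
RANGES = {"temp": (150.0, 220.0), "pressure": (5.0, 15.0), "flow": (10.0, 50.0)}

def scale(value, lo, hi):
    """Min-max scale a raw parameter to [0, 1] before feeding it to a model."""
    return (value - lo) / (hi - lo)

def best_setting(yield_model, steps=5):
    """Grid-search candidate (temp, pressure, flow) settings; `yield_model`
    stands in for the trained predictor."""
    grids = [[lo + i * (hi - lo) / (steps - 1) for i in range(steps)]
             for lo, hi in RANGES.values()]
    return max(itertools.product(*grids), key=lambda p: yield_model(*p))

# Hypothetical surrogate with a yield peak at 185 C, 10 bar, 30 kg/h.
surrogate = lambda t, p, f: -(t - 185) ** 2 - 10 * (p - 10) ** 2 - (f - 30) ** 2
best = best_setting(surrogate)
```

In the deployed system this search would run against the cloud-hosted model, and a reinforcement learning agent (the stated future direction) would replace the fixed grid with adaptive exploration.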
Procedia PDF Downloads 25
7196 Optimization of Quercus cerris Bark Liquefaction
Authors: Luísa P. Cruz-Lopes, Hugo Costa e Silva, Idalina Domingos, José Ferreira, Luís Teixeira de Lemos, Bruno Esteves
Abstract:
The liquefaction of cork-based tree barks has attracted increasing interest due to its potential for innovation in the lumber and wood industries. In this particular study, the bark of Quercus cerris (Turkish oak) is used due to its appreciable amount of cork tissue, although of inferior quality compared to the cork provided by other Quercus trees. This study aims to optimize the alkaline-catalyzed liquefaction conditions with regard to several parameters. To better understand the chemical characteristics of the bark of Quercus cerris, a complete chemical analysis was performed. The liquefaction process was performed in a double-jacketed reactor heated with oil, using glycerol and a glycerol/ethylene glycol mixture as solvents and potassium hydroxide as a catalyst, while varying the temperature, liquefaction time, and granulometry. Due to the low liquefaction efficiency of the first experimental runs, different washing techniques after the filtration step were studied, using methanol and methanol/water. The chemical analysis showed that the bark of Quercus cerris is mostly composed of suberin (ca. 30%) and lignin (ca. 24%), as well as hemicelluloses insoluble in hot water (ca. 23%). In the liquefaction stage, the conditions that led to the highest yields were using the glycerol/ethylene glycol mixture as solvent at 200 ºC for 120 minutes. A granulometry of <80 mesh leads to slightly better results, although this parameter barely influences the liquefaction efficiency. Regarding the filtration stage, washing the residue with methanol and then distilled water leads to a considerable increase in the final liquefaction percentage, which indicates that this procedure is effective at recovering the liquefied suberin and lignocellulosic fractions.
Keywords: liquefaction, Quercus cerris, polyalcohol liquefaction, temperature
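The liquefaction percentage discussed above is commonly computed from the solid residue left after filtration and washing; a minimal helper under that assumption (the definition and the masses below are illustrative, not quoted from the paper) looks like:

```python
def liquefaction_percent(initial_mass_g, residue_mass_g):
    """Liquefaction yield: percentage of bark converted, taken as
    100 * (1 - dry residue mass / initial dry mass). A common definition,
    assumed here rather than taken from the paper."""
    return 100.0 * (1.0 - residue_mass_g / initial_mass_g)

# Illustrative run: 10 g of bark leaving 3 g of washed, dried residue.
yield_pct = liquefaction_percent(10.0, 3.0)
```

This also makes clear why the washing step matters: soluble liquefied material trapped in the filter cake inflates the residue mass and thus understates the true yield.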
Procedia PDF Downloads 331