Search results for: conditional proportional reversed hazard rate model

16454 Virtual Reality and Avatars in Education

Authors: Michael Brazley

Abstract:

Virtual Reality (VR) and 3D videos are the most current generation of learning technology today. Virtual Reality and 3D videos are now being used in professional offices and schools for marketing and education. Technology in the field of design has progressed from two-dimensional drawings to 3D models, using computers and sophisticated software. Virtual Reality is being used as a collaborative means to allow designers and others to meet and communicate inside models or VR platforms using avatars. This research proposes to teach students from different backgrounds how to take a digital model into a 3D video, then into VR, and finally into VR with multiple avatars communicating with each other in real time. The next step would be to develop the model so that people from three or more different locations can meet as avatars in real time, in the same model, and talk to each other. This research is longitudinal, studying the use of 3D videos in graduate design and Virtual Reality in XR (Extended Reality) courses. The research methodology is a combination of quantitative and qualitative methods. The qualitative methods begin with the literature review and case studies. The quantitative methods come by way of students' 3D videos, surveys, and Extended Reality (XR) course work. The end product is a VR platform with multiple avatars able to communicate in real time. This research is important because it will allow multiple users to remotely enter a model or VR platform from any location in the world and effectively communicate in real time. This research will lead to improved learning and training using Virtual Reality and avatars, and it is generalizable because most colleges and universities, and many individuals, own VR equipment and computer labs. This research did produce a VR platform with multiple avatars having the ability to move and speak to each other in real time. Major implications of the research include, but are not limited to, improved learning, teaching, communication, marketing, designing, and planning. Both hardware and software played a major role in project success.

Keywords: virtual reality, avatars, education, XR

Procedia PDF Downloads 86
16453 Competitors’ Influence Analysis of a Retailer by Using Customer Value and Huff’s Gravity Model

Authors: Yepeng Cheng, Yasuhiko Morimoto

Abstract:

Customer relationship analysis is vital for retail stores, especially for supermarkets. Point of sale (POS) systems make it possible to record the daily purchasing behaviors of customers as an identification point of sale (ID-POS) database, which can be used to analyze the customer behaviors of a supermarket. The customer value is an indicator based on the ID-POS database for detecting customer loyalty to a store. In general, there are many supermarkets in a city, and nearby competitor supermarkets significantly affect the customer value of a supermarket's customers. However, it is impossible to obtain detailed ID-POS databases of competitor supermarkets. This study first focused on the customer value and the distance between a customer's home and the supermarkets in a city, and then constructed models based on logistic regression analysis to analyze correlations between distance and purchasing behaviors using only the POS database of a supermarket chain. During the modeling process, three primary problems arose: the incomparability of customer values, multicollinearity between the customer value and distance data, and the number of valid partial regression coefficients. The improved customer value, Huff's gravity model, and the inverse attractiveness frequency are considered to solve these problems. This paper presents three types of models based on these three methods for loyal customer classification and competitors' influence analysis. In numerical experiments, all types of models are useful for loyal customer classification. The model that incorporates all three methods is the most suitable one for evaluating the influence of other nearby supermarkets on customers' purchasing from a supermarket chain, from the viewpoint of valid partial regression coefficients and accuracy.
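
As an illustration of the two building blocks described above, the minimal sketch below computes Huff's gravity model patronage probabilities and fits a logistic regression on distance-based features. The store attractiveness values, distances, distance-decay exponent, and synthetic labels are assumptions for illustration, not the study's data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def huff_probabilities(attractiveness, distances, beta=2.0):
    """Huff's gravity model: P_j proportional to A_j / d_j**beta for one customer."""
    utilities = attractiveness / distances ** beta
    return utilities / utilities.sum()

# Hypothetical example: one customer, three competing supermarkets.
attractiveness = np.array([1200.0, 800.0, 500.0])   # e.g. floor area in m^2 (assumed)
distances = np.array([1.5, 0.8, 2.3])               # km from the customer's home (assumed)
print(huff_probabilities(attractiveness, distances))

# Logistic regression of loyalty (1 = loyal customer) on distance-derived features.
rng = np.random.default_rng(0)
X = rng.uniform(0.2, 5.0, size=(200, 2))            # [distance to own store, distance to rival]
y = (X[:, 1] - X[:, 0] + rng.normal(0, 0.5, 200) > 0).astype(int)   # synthetic labels
model = LogisticRegression().fit(X, y)
print(model.coef_, model.intercept_)
```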

Keywords: customer value, Huff's gravity model, POS, retailer

Procedia PDF Downloads 108
16452 Low Light Image Enhancement with Multi-Stage Interconnected Autoencoders Integration in Pix to Pix GAN

Authors: Muhammad Atif, Cang Yan

Abstract:

The enhancement of low-light images is a significant area of study aimed at improving the quality of images captured in challenging lighting environments. Recently, methods based on convolutional neural networks (CNN) have gained prominence as they offer state-of-the-art performance. However, many CNN-based approaches rely on increasing the size and complexity of the neural network. In this study, we propose an alternative method for improving low-light images using an autoencoder-based multiscale knowledge transfer model. Our method leverages the power of three autoencoders, where the encoders of the first two autoencoders are directly connected to the decoder of the third autoencoder. Additionally, the decoders of the first two autoencoders are connected to the encoder of the third autoencoder. This architecture enables effective knowledge transfer, allowing the third autoencoder to learn from and benefit from the enhanced knowledge extracted by the first two autoencoders. We further integrate the proposed model into the Pix2Pix GAN framework. By using the proposed model as the generator in the GAN framework, we aim to produce enhanced images that not only exhibit improved visual quality but also possess a more authentic and realistic appearance. The experimental results, both qualitative and quantitative, show that our method outperforms state-of-the-art methodologies.
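
A minimal PyTorch sketch of the kind of wiring described above is shown below: features from two helper encoders are concatenated with the third autoencoder's own encoding before decoding, so the third network can act as the generator of a Pix2Pix-style GAN. The layer sizes and the exact cross-connections are assumptions for illustration, not the authors' architecture.

```python
import torch
import torch.nn as nn

def encoder():
    return nn.Sequential(nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
                         nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU())

def decoder(in_ch):
    return nn.Sequential(nn.ConvTranspose2d(in_ch, 32, 4, stride=2, padding=1), nn.ReLU(),
                         nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid())

class FusionGenerator(nn.Module):
    """Third autoencoder whose decoder also receives features from two helper encoders."""
    def __init__(self):
        super().__init__()
        self.enc1, self.enc2, self.enc3 = encoder(), encoder(), encoder()
        self.dec3 = decoder(in_ch=64 * 3)    # concatenated features from all three encoders

    def forward(self, x):
        fused = torch.cat([self.enc1(x), self.enc2(x), self.enc3(x)], dim=1)
        return self.dec3(fused)

g = FusionGenerator()
low_light = torch.rand(1, 3, 64, 64)          # dummy low-light image batch
enhanced = g(low_light)
print(enhanced.shape)                          # torch.Size([1, 3, 64, 64])
```

In a Pix2Pix-style setup, this generator would be trained adversarially against a patch discriminator together with a pixel-wise reconstruction loss.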

Keywords: low light image enhancement, deep learning, convolutional neural network, image processing

Procedia PDF Downloads 49
16451 Preventative Maintenance, Impact on the Optimal Replacement Strategy of Secondhand Products

Authors: Pin-Wei Chiang, Wen-Liang Chang, Ruey-Huei Yeh

Abstract:

This paper investigates optimal replacement and preventive maintenance policies for secondhand products under a finite planning horizon (FPH). Under the FPH, a product is restored by minimal repair whenever it fails, and the replacement product is required to undergo periodic preventive maintenance to avoid product failure. A mathematical formula for the disbursement cost of products under the FPH can then be derived, and optimal policies are obtained to minimize this cost. In the first of the paper's two segments, a model for the initial purchase of either a new or a secondhand product is used. This model is built by analyzing the product purchasing price, the surplus value of the product, and the minimal repair cost. The second segment uses a model for replacement products, which are also secondhand products with no limit on usage. This model analyzes the same components as the first, as well as the expected preventive maintenance cost. Using these two models, a formula for the expected final total cost can be developed. The formula is minimized over four variables (preventive maintenance level, preventive maintenance frequency, replacement timing, and age of the replacement product) to find the minimal cost. Based on the analysis of these variables using the expected total final cost model, it was found that the purchasing price and the length of ownership were directly related. Also, consumers should choose the secondhand product with the higher usage for replacement. Products with higher initial usage upon acquisition require an earlier replacement schedule; in this case, replacements should be made with a secondhand product with less usage. In addition, preventive maintenance also significantly reduces cost. Consumers who plan to use products for longer periods of time replace their products later; hence these consumers should choose the secondhand product with lesser initial usage for replacement. Preventive maintenance also creates significant total cost savings in this case. This study provides consumers with a method of calculating both the ideal amount of usage of the products they should purchase and the frequency and level of preventive maintenance that should be conducted in order to minimize cost and maintain product function.
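
To make the optimization step concrete, the sketch below runs a brute-force grid search over the four decision variables against a placeholder cost function. The cost expression and every numeric value in it are illustrative assumptions only, not the paper's derived formula.

```python
import itertools
import numpy as np

def expected_total_cost(pm_level, pm_freq, replace_time, replace_age,
                        horizon=10.0, price_per_age=50.0, pm_unit_cost=20.0,
                        repair_base=200.0):
    """Illustrative placeholder cost, NOT the paper's formula: purchase cost falls with the
    replacement product's age (usage), while the minimal-repair cost grows with age and is
    reduced by more frequent / higher-level preventive maintenance."""
    purchase = max(1000.0 - price_per_age * replace_age, 100.0)
    pm_cost = pm_unit_cost * pm_level * pm_freq * (horizon - replace_time)
    repair = repair_base * (replace_age + horizon - replace_time) / (1.0 + 0.3 * pm_level * pm_freq)
    return purchase + pm_cost + repair

grid = itertools.product(range(1, 4),             # preventive maintenance level
                         range(1, 5),             # maintenance actions per period
                         np.arange(2.0, 9.0),     # replacement timing within the horizon
                         np.arange(0.0, 6.0))     # age (usage) of the secondhand replacement
best = min(grid, key=lambda policy: expected_total_cost(*policy))
print("minimum-cost policy (level, frequency, timing, age):", best)
```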

Keywords: finite planning horizon, second hand product, replacement, preventive maintenance, minimal repair

Procedia PDF Downloads 462
16450 Estimation of Scour Using a Coupled Computational Fluid Dynamics and Discrete Element Model

Authors: Zeinab Yazdanfar, Dilan Robert, Daniel Lester, S. Setunge

Abstract:

Scour has been identified as the most common threat to bridge stability worldwide. Traditionally, scour around bridge piers is calculated using empirical approaches that have considerable limitations and are difficult to generalize. The multi-physics nature of scour, which involves turbulent flow, soil mechanics, and solid-fluid interactions, cannot be captured by simple empirical equations developed from limited laboratory data. These limitations can be overcome by direct numerical modeling of the coupled hydro-mechanical scour process, which provides a robust prediction of bridge scour and valuable insights into the scour process. Several numerical models have been proposed in the literature for bridge scour estimation, including Eulerian flow models and coupled Euler-Lagrange models incorporating an empirical sediment transport description. However, the contact forces between particles and the flow-particle interaction have not been taken into consideration. Incorporating the collisional and frictional forces between soil particles, as well as the effect of flow-driven forces on particles, will facilitate accurate modeling of the complex nature of scour. In this study, a coupled Computational Fluid Dynamics and Discrete Element Model (CFD-DEM) has been developed to simulate the scour process by directly modeling the hydro-mechanical interactions between the sediment particles and the flowing water. This approach obviates the need for an empirical description, as the fundamental fluid-particle and particle-particle interactions are fully resolved. The sediment bed is simulated as a dense pack of particles, and the frictional and collisional forces between particles are calculated, whilst the turbulent fluid flow is modeled using a Reynolds-Averaged Navier-Stokes (RANS) approach. The CFD-DEM model is validated against experimental data in order to assess its reliability. The modeling results reveal the criticality of particle impact on the assessment of scour depth which, to the authors' best knowledge, has not been considered in previous studies. The results of this study open new perspectives on scour depth and time assessment, which is key to managing the failure risk of bridge infrastructure.
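
The DEM side of such a coupling can be illustrated with a single sediment particle advanced explicitly under gravity, a fluid drag force from a prescribed near-bed flow velocity, and a linear spring-dashpot contact with the bed. All parameter values and the prescribed flow are assumptions for illustration; in a real CFD-DEM run the fluid velocity would come from the RANS solver at each time step.

```python
import numpy as np

# Minimal illustrative particle update (assumed parameter values) for CFD-DEM style coupling:
# Schiller-Naumann drag from a prescribed flow plus a normal spring-dashpot contact with a flat bed.
dt, steps = 1e-5, 50000
d, rho_p, rho_f, mu_f = 1e-3, 2650.0, 1000.0, 1e-3       # particle diameter, densities, viscosity
m = rho_p * np.pi * d**3 / 6.0                           # particle mass
k_n, c_n = 100.0, 0.02                                   # contact stiffness and damping (assumed)
g = np.array([0.0, -9.81])
u_fluid = np.array([0.3, 0.0])                           # prescribed near-bed flow velocity (assumed)

x, v = np.array([0.0, 5e-3]), np.zeros(2)                # start 5 mm above the bed (y = 0)
for _ in range(steps):
    rel = u_fluid - v
    re = rho_f * np.linalg.norm(rel) * d / mu_f + 1e-12
    cd = 24.0 / re * (1.0 + 0.15 * re**0.687)            # Schiller-Naumann drag coefficient
    f_drag = 0.5 * cd * rho_f * np.pi * (d / 2)**2 * np.linalg.norm(rel) * rel
    overlap = d / 2 - x[1]                               # penetration into the flat bed
    f_contact = np.array([0.0, k_n * overlap - c_n * v[1]]) if overlap > 0 else np.zeros(2)
    a = (f_drag + f_contact) / m + g
    v += a * dt
    x += v * dt
print("final particle position (m):", x)
```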

Keywords: bridge scour, discrete element method, CFD-DEM model, multi-phase model

Procedia PDF Downloads 116
16449 Traffic Safety and Risk Assessment Model by Analysis of Questionnaire Survey: A Case Study of S. G. Highway, Ahmedabad, India

Authors: Abhijitsinh Gohil, Kaushal Wadhvaniya, Kuldipsinh Jadeja

Abstract:

Road safety is a multi-sectoral and multi-dimensional issue. An effective model can assess the risk associated with highway safety. A questionnaire survey is essential to identify the events or activities that create unsafe conditions for traffic on an urban highway. A questionnaire of standard questions covering vehicular, human, and infrastructure characteristics can be prepared, and responses can be collected in the field from road users grouped by age. Each question or event holds a specific risk weightage, which contributes to creating an inappropriate and unsafe flow of traffic. The probability of occurrence of an event can be calculated from the data collected from the road users. Finally, the risk score of each event can be calculated by combining its risk factor with its probability of occurrence, and the addition of the risk scores of the individual events gives the total risk score of a particular road. Standards for the risk score can be defined, and the total risk score compared with these standards. Thus, a road can be categorized based on the associated risk and its traffic safety. With this model, one can assess the need for traffic safety improvement on a given road, and qualitative data can be analysed.
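
The scoring rule described above is simple arithmetic; the short sketch below works through it with hypothetical events, weightages, and questionnaire counts (none of them from the study).

```python
# Risk score of an event = risk weightage x probability of occurrence (from survey responses);
# the road's total risk score is the sum over all events. All numbers here are hypothetical.
events = {
    # event: (risk weightage, responses reporting the event, total responses)
    "over-speeding":          (8, 140, 200),
    "wrong-side driving":     (9,  45, 200),
    "jaywalking pedestrians": (6,  90, 200),
    "faded lane markings":    (4, 120, 200),
}

total = 0.0
for name, (weight, observed, n) in events.items():
    p = observed / n                     # probability of occurrence from the questionnaire
    score = weight * p                   # risk score of the individual event
    total += score
    print(f"{name:24s} p = {p:.2f}  score = {score:.2f}")
print("total risk score of the road:", round(total, 2))
```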

Keywords: probability of occurrence, questionnaire, risk factor, risk score

Procedia PDF Downloads 327
16448 Multiscale Model of Blast Explosion Human Injury Biomechanics

Authors: Raj K. Gupta, X. Gary Tan, Andrzej Przekwas

Abstract:

Bomb blasts from Improvised Explosive Devices (IEDs) account for the vast majority of terrorist attacks worldwide. Injuries caused by IEDs result from a combination of the primary blast wave, penetrating fragments, and human body accelerations and impacts. This paper presents a multiscale computational model of coupled blast physics, whole human body biodynamics, and injury biomechanics of sensitive organs. The disparity of the involved space and time scales is used to conduct sequential modeling of an IED explosion event, CFD simulation of blast loads on the human body, and FEM modeling of body biodynamics and injury biomechanics. The paper presents simulation results for blast-induced brain injury, coupling macro-scale brain biomechanics and the micro-scale response of sensitive neuro-axonal structures. Validation results on animal models and physical surrogates are discussed. Results of our model can be used to 'replicate' field blast loadings in laboratory-controlled experiments using animal models and in vitro neuro-cultures.

Keywords: blast waves, improvised explosive devices, injury biomechanics, mathematical models, traumatic brain injury

Procedia PDF Downloads 237
16447 Improvement of Soft Clay Using Floating Cement Dust-Lime Columns

Authors: Adel Belal, Sameh Aboelsoud, Mohy Elmashad, Mohammed Abdelmonem

Abstract:

The two main criteria that control the design and performance of footings are the bearing capacity and the settlement of the soil. The construction of buildings, storage tanks, warehouses, etc. on soft, weak soils usually involves excessive settlement problems. To improve bearing capacity or reduce settlement, soil improvement may be considered using different techniques, including encased cement dust-lime columns. The proposed research studies the effect of adding floating encased cement dust and lime mix columns to soft clay on the clay's bearing capacity. Four experimental tests were carried out. Column diameters of 3.0 cm, 4.0 cm, and 5.0 cm and a column length of 60% of the clay layer thickness were used. A numerical model was constructed and verified using a commercial finite element package (PLAXIS 2D, V8.5). The verified model was used to study the effect of distributing columns around the footing at different distances. The study showed that the floating cement dust-lime columns enhanced the clay's bearing capacity by 262%. The numerical model showed that the columns around the footing have a limited effect on the clay improvement.

Keywords: bearing capacity, cement dust – lime columns, ground improvement, soft clay

Procedia PDF Downloads 186
16446 Supplemental Visco-Friction Damping for Dynamical Structural Systems

Authors: Sharad Singh, Ajay Kumar Sinha

Abstract:

Coupled dampers, such as viscoelastic-frictional dampers, are a newer technique for supplemental damping. In this paper, innovative visco-frictional damping models are presented and investigated. The paper attempts to couple frictional and fluid viscous dampers into a single unit of supplemental dampers. The visco-frictional damping model is developed by series and parallel coupling of frictional and fluid viscous dampers using the Maxwell and Kelvin-Voigt models. The time-domain analysis has been performed using numerical simulation on an SDOF system with varying fundamental periods, subjected to a set of 12 ground motions. The simulation was performed using the direct time integration method, and the MATLAB programming tool was used to carry out the numerical simulation. The response behavior has been analyzed for the varying time period and added damping. This paper compares the response reduction behavior of the two modes of coupling and highlights the performance efficiency of the suggested damping models. It also presents a mathematical modeling approach to visco-frictional dampers and suggests the suitable mode of coupling between the two sub-units.
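
A minimal sketch of the kind of SDOF time-history simulation described above is shown below, with supplemental viscous plus Coulomb-friction damping and a simple pulse-type ground acceleration. The system properties, damper parameters, and excitation are assumptions for illustration, not the paper's coupled models or its ground-motion set.

```python
import numpy as np
from scipy.integrate import solve_ivp

m, k = 1000.0, 4.0e5                      # mass (kg) and stiffness (N/m), assumed (T ~ 0.31 s)
c_visc, f_fric = 2000.0, 800.0            # added viscous coefficient (N.s/m) and friction force (N), assumed

def ground_accel(t):                      # illustrative one-cycle sine pulse (m/s^2)
    return 3.0 * np.sin(2 * np.pi * t) if t < 1.0 else 0.0

def rhs(t, y):
    x, v = y
    damping = c_visc * v + f_fric * np.tanh(50.0 * v)    # smooth approximation of sign(v)
    return [v, (-k * x - damping) / m - ground_accel(t)]

sol = solve_ivp(rhs, (0.0, 10.0), [0.0, 0.0], max_step=1e-3)
print("peak relative displacement (m):", np.max(np.abs(sol.y[0])))
```

Running the same loop with the friction force set to zero, or with the two damper sub-units arranged in series through an internal degree of freedom, is how the relative benefit of the coupling modes would be compared.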

Keywords: hysteretic damping, Kelvin model, Maxwell model, parallel coupling, series coupling, viscous damping

Procedia PDF Downloads 146
16445 Fuzzy Logic Based Ventilation for Controlling Harmful Gases in Livestock Houses

Authors: Nuri Caglayan, H. Kursat Celik

Abstract:

There are many factors that influence the health and productivity of the animals in livestock production, including temperature, humidity, carbon dioxide (CO2), ammonia (NH3), hydrogen sulfide (H2S), physical activity, and particulate matter. High NH3 concentrations reduce feed consumption and daily weight gain. At high concentrations, H2S causes respiratory problems, and CO2 displaces oxygen, which can cause suffocation or asphyxiation. Good air quality in livestock facilities can have an impact on the health and well-being of animals and humans. Air quality assessment basically depends on strictly given limits without taking into account specific local conditions between harmful gases and other meteorological factors. These limitations may be eliminated using control systems based on neural networks and fuzzy logic. This paper describes a fuzzy logic based ventilation algorithm, which can calculate different fan speeds under pre-defined boundary conditions, for removing harmful gases from the production environment. In the paper, a fuzzy logic model has been developed based on Mamdani's fuzzy method. The model has been built in MATLAB software. As a result, optimum fan speeds under pre-defined boundary conditions are presented.
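
To make the inference step concrete, here is a minimal hand-rolled Mamdani sketch with one input (NH3 concentration) and one output (fan speed). The membership functions, rule base, and limits are illustrative assumptions, not the authors' MATLAB design, which also considers humidity, temperature, CO2, and H2S.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

nh3_low    = lambda x: tri(x, -1, 0, 15)        # ppm, assumed ranges
nh3_medium = lambda x: tri(x, 10, 20, 30)
nh3_high   = lambda x: tri(x, 25, 50, 51)

speed = np.linspace(0, 100, 501)                # fan speed universe (%)
fan_slow = tri(speed, -1, 0, 40)
fan_mid  = tri(speed, 30, 50, 70)
fan_fast = tri(speed, 60, 100, 101)

def fan_speed(nh3_ppm):
    # Mamdani min implication per rule, max aggregation, centroid defuzzification.
    agg = np.maximum.reduce([np.minimum(nh3_low(nh3_ppm), fan_slow),
                             np.minimum(nh3_medium(nh3_ppm), fan_mid),
                             np.minimum(nh3_high(nh3_ppm), fan_fast)])
    return np.sum(speed * agg) / (np.sum(agg) + 1e-12)

for ppm in (5, 18, 35):
    print(f"NH3 = {ppm} ppm -> fan speed ~ {fan_speed(ppm):.1f} %")
```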

Keywords: air quality, fuzzy logic model, livestock housing, fan speed

Procedia PDF Downloads 358
16444 Effective Charge Coupling in Low Dimensional Doped Quantum Antiferromagnets

Authors: Suraka Bhattacharjee, Ranjan Chaudhury

Abstract:

The interaction between the charge degrees of freedom in itinerant antiferromagnets is investigated in terms of a generalized charge stiffness constant corresponding to the nearest-neighbour t-J model and the t1-t2-t3-J model. Low-dimensional hole-doped antiferromagnets are well-known systems that can be described by t-J-like models. Accordingly, we have used these models to investigate the fermionic pairing possibilities and the coupling between the itinerant charge degrees of freedom. A detailed comparison between spin and charge couplings highlights that the charge and spin couplings show very similar behaviour in the over-doped region, whereas they show completely different trends in the lower doping regimes. Moreover, a qualitative equivalence between the generalized charge stiffness and the effective Coulomb interaction is also established, based on comparisons with other theoretical and experimental results. Thus, it is evident that the enhanced possibility of fermionic pairing is inherent in the reduction of the Coulomb repulsion with an increase in doping concentration. However, this increased possibility cannot give rise to pairing without the presence of another pair-producing mechanism outside the t-J model. Therefore, one can conclude that t-J-like models alone are not capable of producing conventional momentum-space superconducting pairing.

Keywords: generalized charge stiffness constant, charge coupling, effective Coulomb interaction, t-J-like models, momentum-space pairing

Procedia PDF Downloads 146
16443 Developing a Cybernetic Model of Interdepartmental Logistic Interactions in SME

Authors: Jonas Mayer, Kai-Frederic Seitz, Thorben Kuprat

Abstract:

In today's competitive environment, production logistics objectives such as 'delivery reliability' and 'delivery time', and distribution logistics objectives such as 'service level' and 'delivery delay', are attributed great importance. Especially for small and mid-sized enterprises (SMEs), attaining these objectives poses a key challenge. Within this context, one of the difficulties is that interactions between departments within the enterprise and their specific objectives are insufficiently taken into account and aligned. Interdepartmental interdependencies, along with contradictory targets set within the different departments, result in enterprises having sub-optimal logistic performance capability. This paper presents a research project which will systematically describe the interactions between departments and convert them into a quantifiable form.

Keywords: department-specific actuating and control variables, interdepartmental interactions, cybernetic model, logistic objectives

Procedia PDF Downloads 359
16442 Development of Tutorial Courseware on Selected Topics in Mathematics, Science and the English Language

Authors: Alice D. Dioquino, Olivia N. Buzon, Emilio F. Aguinaldo, Ruel Avila, Erwin R. Callo, Cristy Ocampo, Malvin R. Tabajen, Marla C. Papango, Marilou M. Ubina, Josephine Tondo, Cromwell L. Valeriano

Abstract:

The main purpose of this study was to develop, evaluate, and validate courseware on selected topics in Mathematics, Science, and the English language. Specifically, it aimed to: 1. identify the appropriate Instructional Systems Design (ISD) model for the development of the courseware material; 2. assess the courseware material according to its: a. content characteristics; b. instructional characteristics; and c. technical characteristics; and 3. find out if there is a significant difference in the performance of students before and after using the tutorial CAI. This research used a developmental as well as a one-group pretest-posttest design. The study had two phases. Phase I included the needs analysis and the writing of lessons and storyboards by the respective experts in each field. Phase II included the digitization, or actual development, of the courseware by the faculty of the ICT department. In this phase, the ADDIE instructional systems design (ISD) model was adopted. ADDIE stands for Analysis, Design, Development, Implementation, and Evaluation. Formative evaluation was conducted simultaneously with the different phases to detect and remedy any bugs in the courseware along the areas of content, instructional, and technical characteristics. The expected outputs are the digitized lessons in Algebra, Biology, Chemistry, Physics, and Communication Arts in English. Students and some IT experts validated the CAI materials using the evaluation form by Wong & Wong. They rated the CAI materials as Highly Acceptable, with an overall mean rating of 4.527 and a standard deviation of 0, which means that they were unanimous in the ratings they gave the CAI materials. A mean gain was recorded, and the t-test for dependent samples showed that there were significant differences in the mean achievement of the students before and after the treatment (using CAI). The ISD model identified for the development of the tutorial courseware was the ADDIE model. The quantitative analyses of the data based on the respondents' ratings show that the tutorial courseware possesses the characteristics and qualities of a very good computer-based courseware. The ratings given by the different evaluators with regard to the content, instructional, and technical aspects of the tutorial courseware are in conformity towards being excellent. Students performed better in Mathematics, Biology, Chemistry, Physics, and English Communication Arts after they were exposed to the tutorial courseware.
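
The pretest-posttest comparison uses a t-test for dependent (paired) samples; a small illustrative computation with hypothetical scores (not the study's data) is shown below.

```python
from scipy import stats

# Hypothetical pretest and posttest scores for the same ten students.
pretest  = [12, 15, 9, 14, 11, 10, 13, 8, 16, 12]
posttest = [18, 19, 14, 17, 16, 15, 19, 13, 20, 17]

t_stat, p_value = stats.ttest_rel(posttest, pretest)     # t-test for dependent samples
mean_gain = sum(b - a for a, b in zip(pretest, posttest)) / len(pretest)
print(f"mean gain = {mean_gain:.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")
```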

Keywords: CAI, tutorial courseware, Instructional Systems Design (ISD) Model, education

Procedia PDF Downloads 335
16441 Acceptance of Health Information Application in Smart National Identity Card (SNIC) Using a New I-P Framework

Authors: Ismail Bile Hassan, Masrah Azrifah Azmi Murad

Abstract:

This study proposes a novel framework of individual-level technology adoption, known as I-P (Individual-Privacy), for the Smart National Identity Card health information application. Many countries have introduced smart national identity cards (SNIC) with various applications, such as a health information application, embedded inside them. However, the degree to which citizens accept and use some of the applications embedded in the smart national identity card remains unknown to many governments and application providers. Moreover, previous studies revealed that the factors of trust, perceived risk, privacy concern, and perceived credibility need to be incorporated into more comprehensive models, such as the extended Unified Theory of Acceptance and Use of Technology, known as UTAUT2. UTAUT2 is currently a widespread and leading theory in the information systems literature. This research identifies the factors affecting citizens' behavioural intention to use the health information application embedded in the SNIC and provides a better understanding of the relevant factors that the government and the application providers would need to consider in predicting citizens' acceptance of new technology in the future. We propose a conceptual framework by combining the UTAUT2 and Privacy Calculus Model constructs and also adding perceived credibility as a new variable. The proposed framework may provide assistance to government planners, decision-makers, and policy makers involved in e-government projects. An empirical study may be conducted in the future to provide proof and empirically validate this I-P framework.

Keywords: unified theory of acceptance and use of technology (UTAUT) model, UTAUT2 model, smart national identity card (SNIC), health information application, privacy calculus model (PCM)

Procedia PDF Downloads 450
16440 ESG and Corporate Financial Performance: Empirical Evidence from Vietnam’s Listed Construction Companies

Authors: My Linh Hoang, Van Dung Hoang

Abstract:

Environmental, Social, and Governance (ESG) factors have become a focus for companies globally, as businesses are now focusing on long-term sustainability goals rather than operating solely for profit maximization. According to recent research, companies in several countries have shown positive financial performance by improving their ESG performance. The construction industry is one of the most crucial components of social and economic development; as a result, consideration of ESG factors is becoming more and more essential for companies in this sector. In Vietnam, the construction industry has been growing rapidly in recent years; however, how ESG factors affect corporate financial performance in general, and construction corporations' financial performance in particular, has yet to be discussed and studied extensively. This research aims to examine the relationship between ESG factors and financial indicators in construction companies from 2011 to 2021 through panel data analysis of 75 listed construction companies in Vietnam, and to provide insights into how these companies can better integrate ESG considerations into their operations to enhance their financial performance. The data were analyzed through three main methods: descriptive statistics, correlation coefficient analysis applied to all dependent, explanatory, and control variables, and panel data analysis. In the panel data analysis, the study uses the fixed effects model (FEM) and the random effects model (REM), with the Hausman test used to select the suitable model; FGLS estimation is performed when autocorrelation or heteroskedasticity appears in the model. The findings indicate that maintaining a strong commitment to ESG principles can have a positive impact on financial performance. This is significant for all parties involved, including investors, company managers, decision-makers, and industry regulators.
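
A compact sketch of this estimation strategy on synthetic panel data is shown below, assuming the Python linearmodels package for the FEM/REM fits and computing the Hausman statistic manually. The variable names, data-generating process, and coefficients are illustrative placeholders, not the study's dataset or results.

```python
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS, RandomEffects
from scipy import stats

rng = np.random.default_rng(1)
firms, years = 75, range(2011, 2022)
idx = pd.MultiIndex.from_product([range(firms), years], names=["firm", "year"])
df = pd.DataFrame({"esg": rng.uniform(0, 100, len(idx)),
                   "size": rng.normal(10, 1, len(idx))}, index=idx)
firm_effect = rng.normal(0, 2, firms)[idx.get_level_values("firm")]
df["roa"] = 0.03 * df["esg"] + 0.5 * df["size"] + firm_effect + rng.normal(0, 1, len(idx))

X = df[["esg", "size"]]
fe = PanelOLS(df["roa"], X, entity_effects=True).fit()     # fixed effects model (FEM)
re = RandomEffects(df["roa"], X).fit()                     # random effects model (REM)

# Hausman test: H = (b_FE - b_RE)' [V_FE - V_RE]^(-1) (b_FE - b_RE), chi2 with k d.o.f.
b_diff = fe.params - re.params
v_diff = fe.cov - re.cov
hausman = float(b_diff.T @ np.linalg.inv(v_diff) @ b_diff)
p = 1 - stats.chi2.cdf(hausman, df=len(b_diff))
print(f"Hausman chi2 = {hausman:.2f}, p = {p:.3f}  (small p favours the fixed effects model)")
```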

Keywords: ESG, financial performance, construction company, Vietnam

Procedia PDF Downloads 69
16439 A Mathematical Model for Studying Landing Dynamics of a Typical Lunar Soft Lander

Authors: Johns Paul, Santhosh J. Nalluveettil, P. Purushothaman, M. Premdas

Abstract:

Lunar landing is one of the most critical phases of a lunar mission. The lander is provided with a soft landing system to prevent structural damage to the lunar module by absorbing the landing shock and also to assure stability during landing. Presently available software is not capable of simulating rigid body dynamics coupled with contact simulation and elastic/plastic deformation analysis. Hence, a separate mathematical model has been generated for studying the dynamics of a typical lunar soft lander. Parameters used in the analysis include the lunar surface slope, coefficient of friction, initial touchdown velocity (vertical and horizontal), mass and moment of inertia of the lander, crushing force due to the energy absorbing material in the legs, number of legs, and geometry of the lander. The mathematical model is capable of simulating the plastic and elastic deformation of the honeycomb, the frictional force between the landing leg and the lunar soil, surface contact, the lunar gravitational force, rigid body dynamics, and the linkage dynamics of the inverted tripod landing gear. The nonlinear differential equations generated for studying the dynamics of the lunar lander are solved by a numerical method, with a MATLAB program used as the computational tool. The position of each kinematic joint is defined by mathematical equations for the generation of the equations of motion, and all hinged locations are defined by position vectors with respect to the body-fixed coordinate frame. The vehicle's rigid body rotations and motions about the body coordinate frame are due only to the external forces and moments arising from the footpad reaction force due to impact, the footpad frictional force, and the weight of the vehicle. All these forces are mathematically simulated for the generation of the equations of motion. The validation of the mathematical model is done in two phases. The first phase validates the plastic deformation of the crushable elements by employing the conservation of energy principle. The second phase validates the rigid body dynamics of the model by simulating a lander model in ADAMS software after replacing the crushable elements with elastic spring elements. Since the simulation of plastic deformation along with rigid body dynamics and contact force cannot be modeled in ADAMS, the plastic element of the primary strut is replaced with a spring element and the analysis is carried out in ADAMS software. The same analysis is also carried out using the mathematical model, where the simulation of honeycomb crushing is replaced by elastic spring deformation, and the results are compared with the ADAMS analysis. In this way, the rotational motion of the linkages and the six-degree-of-freedom motion of the lunar lander about its CG are validated against ADAMS. The model is also validated by the drop test results of a four-legged lunar lander. This paper presents the details of the mathematical model generated and its validation.
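
As a much simplified illustration of the energy-absorption step, the sketch below integrates a one-dimensional vertical touchdown in which the honeycomb cartridge crushes at a constant force. The mass, crush force, and touchdown velocity are assumed values; the paper's full model additionally handles the 3-D linkage kinematics, friction, surface slope, and rigid-body rotations.

```python
import numpy as np
from scipy.integrate import solve_ivp

m = 300.0                     # landed mass per leg (kg), assumed
g_moon = 1.62                 # lunar gravity (m/s^2)
F_crush = 6000.0              # constant crushing force of the honeycomb (N), assumed
v0 = -2.0                     # vertical touchdown velocity (m/s), assumed

def rhs(t, y):
    z, v = y                  # z = crush stroke (m, negative = compressed), v = vertical velocity
    resist = F_crush if v < 0 else 0.0   # honeycomb resists only while being crushed (plastic)
    return [v, -g_moon + resist / m]

def stopped(t, y):            # stop the integration once the downward motion is arrested
    return y[1]
stopped.terminal, stopped.direction = True, 1

sol = solve_ivp(rhs, (0.0, 5.0), [0.0, v0], events=stopped, max_step=1e-4)
print(f"stroke used: {abs(sol.y[0, -1]) * 100:.1f} cm in {sol.t[-1] * 1000:.0f} ms")
```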

Keywords: honeycomb, landing leg tripod, lunar lander, primary link, secondary link

Procedia PDF Downloads 332
16438 Developing a Maturity Model of Digital Twin Application for Infrastructure Asset Management

Authors: Qingqing Feng, S. Thomas Ng, Frank J. Xu, Jiduo Xing

Abstract:

Faced with unprecedented challenges including aging assets, a lack of maintenance budget, overtaxed and inefficient usage, and an outcry from society for better service quality, today's infrastructure systems have become the main focus of many metropolises pursuing sustainable urban development and improved resilience. Digital twin, being one of the most innovative enabling technologies nowadays, may open up new ways of tackling various infrastructure asset management (IAM) problems. A digital twin application for IAM, as its name indicates, represents an evolving digital model of the intended infrastructure that possesses functions including real-time monitoring; what-if event simulation; and scheduling, maintenance, and management optimization based on technologies like IoT, big data, and AI. Up to now, there are already numerous global initiatives of digital twin applications, like 'Virtual Singapore' and 'Digital Built Britain'. With digital twin technology permeating the IAM field progressively, it is necessary to consider the maturity of the application and how institutional or industrial digital twin application processes will evolve in the future. To address the lack of such a benchmark, a draft maturity model is developed for digital twin application in the IAM field. Firstly, an overview of current smart city maturity models is given, based on which the draft Maturity Model of Digital Twin Application for Infrastructure Asset Management (MM-DTIAM) is developed for multiple stakeholders to evaluate and derive informed decisions. The process of development follows a systematic approach with four major procedures, namely scoping, designing, populating, and testing. Through an in-depth literature review, interviews, and focus group meetings, the key domain areas are populated, defined, and iteratively tuned. Finally, case studies of several digital twin projects are conducted for self-verification. The findings of the research reveal that: (i) the developed maturity model outlines five maturing levels leading to an optimised digital twin application from the aspects of strategic intent, data, technology, governance, and stakeholder engagement; (ii) based on the case studies, levels 1 to 3 are already partially implemented in some initiatives while level 4 is on the way; and (iii) more practice is still needed to refine the draft to be mutually exclusive and collectively exhaustive in the key domain areas.

Keywords: digital twin, infrastructure asset management, maturity model, smart city

Procedia PDF Downloads 143
16437 Chemistry and Biological Activity of Feed Additive for Poultry Farming

Authors: Malkhaz Jokhadze, Vakhtang Mshvildadze, Levan Makaradze, Ekaterine Mosidze, Salome Barbaqadze, Mariam Murtazashvili, Dali Berashvili, Koba sivsivadze, Lasha Bakuridze, Aliosha Bakuridze

Abstract:

Essential oils are one of the most important groups of biologically active substances present in plants. Due to the chemical diversity of their components, essential oils and their preparations have a wide spectrum of pharmacological action. They have bactericidal, antiviral, fungicidal, antiprotozoal, anti-inflammatory, spasmolytic, sedative, and other activities, and are used as expectorant, spasmolytic, sedative, hypotensive, secretion-enhancing, and antioxidant remedies. Based on preliminary pharmacological studies, we have developed a formulation called 'Phytobiotic' containing essential oils, a feed additive for poultry intended as an alternative to antibiotics. Phytobiotic is a water-soluble powder containing a composition of essential oils of thyme, clary, and monarda and auxiliary substances: dry extract of liquorice and inhalation lactose. At this stage of the research, the goal was to study the chemical composition of the phytobiotic, identify the main substances and determine their quantity, and investigate the biological activity of the phytobiotic through in vitro and in vivo studies. Using gas chromatography-mass spectrometry, 38 components were identified in the phytobiotic, representing acyclic, monocyclic, bicyclic, and sesquiterpenes. Together with the identification of the main active substances, their quantitative content was determined, including the acyclic terpene alcohol β-linalool, the ester linalyl acetate, the monocyclic terpenes D-limonene and γ-terpinene, and the monocyclic aromatic terpene thymol. The phytobiotic has a pronounced and at the same time broad spectrum of antibacterial activity. In the cell model, the phytobiotic showed weak antioxidant activity, which was stronger in the ORAC (chemical model) tests; anti-inflammatory activity was also observed. When fowls were supplied feed enriched with the phytobiotic, the weight gained by the chickens in the experimental group exceeded that of the control group during the entire period of the experiment. The survival rate of broilers in the experimental group during the growth period was 98%, compared to 94% in the control group. As a result of the conducted research, four probable mechanisms important for the action of phytobiotics were identified: sensory, metabolic, antioxidant, and antibacterial action. The general toxic, possible local irritant, and allergenic effects of the phytobiotic were also investigated, and the performed assays showed that the formulation is safe.

Keywords: clary, essential oils, monarda, poultry, phytobiotics, thyme

Procedia PDF Downloads 158
16436 Effects of Temperature and the Use of Bacteriocins on Cross-Contamination from Animal Source Food Processing: A Mathematical Model

Authors: Benjamin Castillo, Luis Pastenes, Fernando Cerdova

Abstract:

The contamination of food by microbial agents is a common problem in the industry, especially regarding the elaboration of animal source products. Incorrect manipulation of the machinery or of the raw materials can cause a decrease in production or an epidemiological outbreak due to intoxication. In order to improve food product quality, different methods have been used to reduce or, at least, slow down the growth of pathogens, especially deteriorating, infectious, or toxigenic bacteria. These methods are usually carried out under low temperatures and short processing times (abiotic agents), along with the application of antibacterial substances such as bacteriocins (biotic agents), in a controlled and efficient way that fulfills the purpose of bacterial control without damaging the final product. Therefore, the objective of the present study is to design a secondary mathematical model that allows the prediction of the impact of both the biotic and abiotic factors associated with animal source food processing. In order to accomplish this objective, the authors propose a three-dimensional differential equation model, whose components are: bacterial growth; the release, production, and artificial incorporation of bacteriocins; and changes in the pH level of the medium. These three dimensions are constantly influenced by the temperature of the medium. Secondly, this model is adapted to an idealized situation of cross-contamination in animal source food processing, with the study agents being both the animal product and the contact surface. Thirdly, stochastic simulations and a parametric sensitivity analysis are compared with reference data. The main results obtained from the analysis and simulations of the mathematical model show that, although bacterial growth can be stopped at lower temperatures, even lower ones are needed to eradicate it; however, this can be not only expensive but also counterproductive in terms of the quality of the raw materials, while, on the other hand, higher temperatures accelerate bacterial growth. In other respects, the use of bacteriocins is an effective alternative in the short and medium terms. Moreover, a low pH is an indicator of bacterial growth, since many deteriorating bacteria are lactic acid bacteria. Lastly, the processing times are a secondary agent of concern when the rest of the aforementioned agents are under control. Our main conclusion is that, when a mathematical model is adapted to the context of the industrial process, it can generate new tools that predict bacterial contamination, the impact of bacterial inhibition, and processing times. In addition, the proposed mathematical modeling provides a logistic input of broad application, which can be replicated for non-meat food products, other pathogens, or even contamination by cross-contact with allergenic foods.
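
For illustration only, the sketch below integrates a three-state system of the kind described (bacterial count, bacteriocin concentration, pH) with a temperature-dependent growth rate. Every rate law and parameter value is an assumption made for the sketch, not the authors' secondary model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def model(t, y, T, B_dose):
    N, B, pH = y
    mu = 0.8 * max(0.0, (T - 4.0) / 30.0)          # growth rate slows toward refrigeration temperatures
    dN = mu * N * (1 - N / 1e9) - 5e-4 * B * N     # logistic growth minus a bacteriocin kill term
    dB = 1e-7 * N - 0.05 * B + B_dose              # release/production, decay, artificial dosing
    dpH = -5e-12 * mu * N                          # lactic-acid producers slowly lower the pH
    return [dN, dB, dpH]

for T in (4.0, 12.0, 25.0):                        # storage/processing temperatures (deg C), assumed
    sol = solve_ivp(model, (0, 48), [1e4, 0.0, 6.5], args=(T, 0.2), max_step=0.1)
    print(f"T = {T:4.1f} C -> N(48 h) = {sol.y[0, -1]:.3e} CFU, pH = {sol.y[2, -1]:.2f}")
```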

Keywords: bacteriocins, cross-contamination, mathematical model, temperature

Procedia PDF Downloads 127
16435 Weakly Non-Linear Stability Analysis of Newtonian Liquids and Nanoliquids in Shallow, Square and Tall High-Porosity Enclosures

Authors: Pradeep G. Siddheshwar, K. M. Lakshmi

Abstract:

The present study deals with a weakly non-linear stability analysis of Rayleigh-Benard-Brinkman convection in nanoliquid-saturated porous enclosures. The modified Buongiorno-Brinkman model (MBBM) is used for the conservation of linear momentum in a nanoliquid-saturated porous medium under the Boussinesq approximation. Thermal equilibrium is imposed between the base liquid and the nanoparticles. The thermophysical properties of the nanoliquid are modeled using phenomenological laws and mixture theory. The fifth-order Lorenz model is derived for the problem and is then reduced to the first-order Ginzburg-Landau equation (GLE) using the multi-scale method. The analytical solution of the GLE for the amplitude is then used to quantify the heat transport in closed form, in terms of the Nusselt number. It is found that the addition of a dilute concentration of nanoparticles significantly enhances the heat transport, and the dominant reason for this is the high thermal conductivity of the nanoliquid in comparison to that of the base liquid. This aspect of nanoliquids helps in the speedy removal of heat. The porous medium serves the purpose of retaining energy in the system due to its low thermal conductivity. The present model allows a unified study covering the base liquid, the nanoliquid, the base-liquid-saturated porous medium, and the nanoliquid-saturated porous medium. Three different types of enclosures are considered for the study by taking different values of the aspect ratio, and it is observed that heat transport is maximum in the tall porous enclosure and least in the shallow one. A detailed discussion is also given on estimating the heat transport for different volume fractions of nanoparticles. The results of the single-phase model are shown to be a limiting case of the present study. The study is made for three boundary combinations, viz., free-free, rigid-rigid, and rigid-free.
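
As a generic illustration of the final reduction step, the sketch below integrates a cubic Landau/Ginzburg-Landau amplitude equation dA/dt = a1*A - a2*A^3 and converts the saturated amplitude into a Nusselt-type heat-transport measure. The coefficients and the Nu relation are placeholders, not the expressions derived from the fifth-order Lorenz model in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

a1, a2, c = 0.8, 1.5, 2.0     # placeholder coefficients, NOT the paper's derived values

sol = solve_ivp(lambda t, A: a1 * A - a2 * A**3, (0.0, 30.0), [1e-3], max_step=0.01)
A_sat = sol.y[0, -1]
print("saturated amplitude:", round(A_sat, 4),
      " analytical sqrt(a1/a2):", round(np.sqrt(a1 / a2), 4))
print("Nusselt-type estimate Nu = 1 + c*A^2 =", round(1 + c * A_sat**2, 3))
```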

Keywords: Buongiorno model, Ginzburg-Landau equation, Lorenz equations, porous medium

Procedia PDF Downloads 311
16434 Modeling SET Effect on Charge Pump Phase Locked Loop

Authors: Varsha Prasad, S. Sandya

Abstract:

Cosmic ray effects in microelectronics, such as single event transients (SET) and total ionizing dose (TID), have been of major concern in space electronics since the 1970s. Advanced CMOS technologies have demonstrated reduced sensitivity to the TID effect. However, the charge pump phase locked loop is highly vulnerable to the single event transient effect. This paper presents an SET analysis model, where the SET is modeled as a double exponential pulse. The time domain analysis reveals that the settling time of the voltage controlled oscillator (VCO) depends on the SET pulse strength, the time constant, and the damping factor. The proposed SET analysis model is confirmed by the simulation results.
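
The double exponential pulse mentioned above is the standard compact model for an SET strike current; the sketch below evaluates it for illustrative (assumed) charge and time-constant values and checks that the injected charge is recovered.

```python
import numpy as np

def set_pulse(t, q=100e-15, tau_rise=10e-12, tau_fall=200e-12):
    """Double exponential SET current: I(t) = Q/(tau_f - tau_r) * (exp(-t/tau_f) - exp(-t/tau_r))."""
    return q / (tau_fall - tau_rise) * (np.exp(-t / tau_fall) - np.exp(-t / tau_rise))

t = np.linspace(0, 2e-9, 2001)
i = set_pulse(t)
charge = (i[:-1] * np.diff(t)).sum()        # numerical integral of the pulse, ~ Q
print(f"peak current ~ {i.max() * 1e6:.0f} uA at t = {t[i.argmax()] * 1e12:.0f} ps")
print(f"collected charge ~ {charge * 1e15:.0f} fC")
```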

Keywords: charge pump, phase locked loop, SET, VCO

Procedia PDF Downloads 424
16433 Diabetes Mellitus and Blood Glucose Variability Increase the 30-day Readmission Rate after Kidney Transplantation

Authors: Harini Chakkera

Abstract:

Background: Inpatient hyperglycemia is an established independent risk factor for hospital readmission in several patient cohorts. This has not been studied after kidney transplantation. Nearly one-third of patients who have undergone a kidney transplant reportedly experience 30-day readmission. Methods: Data on first-time solitary kidney transplantations were retrieved for the period from September 2015 to December 2018. The information was linked to the electronic health record to determine a diagnosis of diabetes mellitus and to extract glucometric and insulin therapy data. Univariate logistic regression analysis and the XGBoost algorithm were used to predict 30-day readmission. We report the average performance of the models on the testing set over five bootstrapped partitions of the data to ensure statistical significance. Results: The cohort included 1036 patients who received a kidney transplant, and 224 (22%) experienced 30-day readmission. The machine learning algorithm was able to predict 30-day readmission with an average AUC of 77.3% (95% CI 75.3-79.3%). We observed statistically significant differences in the presence of pretransplant diabetes, inpatient hyperglycemia, inpatient hypoglycemia, and minimum and maximum glucose values among those with higher 30-day readmission rates. The XGBoost model identified the index admission length of stay, the presence of hyper- and hypoglycemia, and the recipient and donor BMI values as the most predictive risk factors for 30-day readmission. Additionally, significant variations in the therapeutic management of blood glucose by providers were observed. Conclusions: Suboptimal glucose metrics during hospitalization after kidney transplantation are associated with an increased risk of 30-day hospital readmission. Optimizing hospital blood glucose management, a modifiable factor, after kidney transplantation may reduce the risk of 30-day readmission.
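
A compact sketch of this kind of workflow is given below on synthetic data, assuming the xgboost and scikit-learn packages: a gradient-boosted classifier is fit and the test AUC is averaged over repeated random partitions (standing in for the bootstrapped partitions described). The features, class balance, and hyperparameters are illustrative assumptions, not the study's clinical variables.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Synthetic stand-in for the cohort: ~22% positive (readmitted) class.
X, y = make_classification(n_samples=1036, n_features=12, weights=[0.78], random_state=0)

aucs = []
for seed in range(5):                                   # five random partitions
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, stratify=y, random_state=seed)
    model = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
    model.fit(Xtr, ytr)
    aucs.append(roc_auc_score(yte, model.predict_proba(Xte)[:, 1]))
print(f"AUC = {np.mean(aucs):.3f} +/- {1.96 * np.std(aucs):.3f}")
```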

Keywords: kidney, transplant, diabetes, insulin

Procedia PDF Downloads 69
16432 Loan Repayment Prediction Using Machine Learning: Model Development, Django Web Integration and Cloud Deployment

Authors: Seun Mayowa Sunday

Abstract:

Loan prediction is one of the most significant and recognised fields of research in the banking, insurance, and financial security industries. Some prediction systems on the market are built as static software; however, because static software operates only with strictly regulated rules, it cannot aid customers beyond these limitations. The application of machine learning (ML) techniques is therefore required for loan prediction. Four separate machine learning models, random forest (RF), decision tree (DT), k-nearest neighbour (KNN), and logistic regression, are used to create the loan prediction model. Using Anaconda Navigator and the required machine learning (ML) libraries, the models are created and evaluated using the appropriate measuring metrics. From the findings, the random forest performs with the highest accuracy of 80.17%, and it was later implemented into the Django framework. For real-time testing, the web application is deployed on Alibaba Cloud, which is among the top four biggest cloud computing providers. Hence, to the best of our knowledge, this research serves as the first academic paper which combines the model development and the Django framework with deployment on the Alibaba cloud computing platform.
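
A minimal sketch of the model-comparison step, and of persisting the winning model so a Django view can load it for predictions, is shown below. The data are synthetic and the file name is hypothetical; the study's own dataset, preprocessing, and 80.17% accuracy are not reproduced here.

```python
from joblib import dump
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=42)

models = {"random_forest": RandomForestClassifier(random_state=42),
          "decision_tree": DecisionTreeClassifier(random_state=42),
          "knn": KNeighborsClassifier(),
          "logistic_regression": LogisticRegression(max_iter=1000)}

scores = {name: m.fit(Xtr, ytr).score(Xte, yte) for name, m in models.items()}
best = max(scores, key=scores.get)
print(scores, "-> best:", best)
dump(models[best], "loan_model.joblib")   # loaded later inside a Django view with joblib.load
```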

Keywords: k-nearest neighbor, random forest, logistic regression, decision tree, django, cloud computing, alibaba cloud

Procedia PDF Downloads 114
16431 The Dao of Political Economy - A Holistic Perspective

Authors: Tao Peng

Abstract:

This paper presents a holistic model of political economy based on Daoism, the foundational philosophy of classical Chinese epistemology. Daoism is both comprehensive and subtle in its manifestations and applications in all aspects of nature and society. Based on the Daoist creation theory of the universe, life theory, and five-element functioning theory, a holistic model of economics with minimal assumptions and independent of ideology is constructed. Under this framework, different schools of economics, such as the neoliberal, Marxist, and Austrian schools, are explored and new light is shed on them. Economic and financial predictions can be realized in applications to Qi Men Dun Jia. This framework can provide guidelines and inspiration for economic modelling, economic policy formulation, and strategy development, and guide society towards a more sustainable future.

Keywords: daoism, economics, holistic, philosophy

Procedia PDF Downloads 70
16430 Increasing Photosynthetic H2 Production by in vivo Expression of Re-Engineered Ferredoxin-Hydrogenase Fusion Protein in the Green Alga Chlamydomonas reinhardtii

Authors: Dake Xiong, Ben Hankamer, Ian Ross

Abstract:

The most urgent challenge of our time is to replace the depleting resources of fossil fuels with sustainable, environmentally friendly alternatives. Hydrogen is a promising CO2-neutral fuel for a more sustainable future, especially when produced photo-biologically. Hydrogen can be photosynthetically produced in unicellular green algae like Chlamydomonas reinhardtii, catalysed by the inducible, highly active, and bidirectional [FeFe]-hydrogenase enzymes (HydA). However, evolutionary and physiological constraints severely restrict the hydrogen yield of algae for industrial scale-up, mainly due to competition between hydrogen production and other metabolic pathways for photosynthetic electrons. Among these, a major challenge to be resolved is the inferior competitiveness of hydrogen production (catalysed by HydA) with NADPH production (catalysed by ferredoxin-NADP+-reductase (FNR)), which is essential for cell growth and takes up ~95% of the photosynthetic electrons. In this work, the in vivo hydrogen production efficiency of mutants carrying the ferredoxin-hydrogenase (Fd*-HydA1*) fusion protein construct, in which the electron donor ferredoxin (Fd*) is fused to HydA1* and expressed in the model organism C. reinhardtii, was investigated. Once the Fd*-HydA1* fusion gene is expressed in algal cells, the fusion enzyme is able to draw the redistributed photosynthetic electrons and use them for efficient hydrogen production. From preliminary data, mutants with the Fd*-HydA1* transgene showed a ~2-fold increase in the photosynthetic hydrogen production rate compared with their parental strain, which only possesses the native HydA in vivo. Therefore, more efficient hydrogen production in microalgae can be achieved through the expression of such synthetic enzymes.

Keywords: Chlamydomonas reinhardtii, ferredoxin, fusion protein, hydrogen production, hydrogenase

Procedia PDF Downloads 244
16429 High Temperature Deformation Behavior of Al0.2CoCrFeNiMo0.5 High Entropy alloy

Authors: Yasam Palguna, Rajesh Korla

Abstract:

The efficiency of thermally operated systems can be improved by increasing the operating temperature, thereby decreasing the fuel consumption and carbon footprint. Hence, there is a continuous need to replace existing materials with new alloys with higher temperature working capabilities. During the last decade, multi-principal-element alloys, commonly known as high entropy alloys, have been getting more attention because of their superior high temperature strength along with good high temperature corrosion and oxidation resistance. The present work focuses on the microstructure and high temperature tensile behavior of the Al0.2CoCrFeNiMo0.5 high entropy alloy (HEA). The wrought Al0.2CoCrFeNiMo0.5 high entropy alloy, produced by vacuum induction melting followed by thermomechanical processing, is tested in the temperature range of 200 to 900 °C. It exhibits very good resistance to softening with increasing temperature up to 700 °C, and thereafter there is a rapid decrease in strength, especially beyond 800 °C, which may be due to the simultaneous occurrence of recrystallization and precipitate coarsening. Further, it exhibits superplastic-like behavior, with a uniform elongation of ~275% at a temperature of 900 °C and a strain rate of 1 × 10⁻³ s⁻¹, which may be due to the presence of fine, stable equiaxed grains. A strain rate sensitivity of 0.3 was observed, suggesting that solute-drag dislocation glide might be the active mechanism during the superplastic-like deformation. The post-deformation microstructure suggests that cavitation at the sigma phase-matrix interface is the failure mechanism during high temperature deformation. Finally, the high temperature properties of the present alloy are compared with contemporary high temperature materials such as ferritic and austenitic steels and superalloys.

Keywords: high entropy alloy, high temperature deformation, super plasticity, post-deformation microstructures

Procedia PDF Downloads 150
16428 Complementing Assessment Processes with Standardized Tests: A Work in Progress

Authors: Amparo Camacho

Abstract:

ABET-accredited programs must assess the development of student learning outcomes (SOs) in engineering programs. Different institutions implement different strategies for this assessment, and they are usually designed 'in house.' This paper presents a proposal for including standardized tests to complement the ABET assessment model in an engineering college made up of six distinct engineering programs. The engineering college formulated a model of quality assurance in education to be implemented throughout the six engineering programs to regularly assess and evaluate the achievement of SOs in each program offered. The model uses diverse techniques and sources of data to assess student performance and to implement actions of improvement based on the results of this assessment. The model is called the 'Assessment Process Model' and it includes SOs A through K, as defined by ABET. SOs can be divided into two categories: 'hard skills' and 'professional skills' (soft skills). The first includes abilities such as applying knowledge of mathematics, science, and engineering and designing and conducting experiments, as well as analyzing and interpreting data. The second category, 'professional skills', includes communicating effectively and understanding professional and ethical responsibility. Within the Assessment Process Model, various tools were used to assess SOs related to both 'hard' and 'soft' skills. The assessment tools designed included rubrics, surveys, questionnaires, and portfolios. In addition to these instruments, the engineering college decided to use tools that systematically gather consistent quantitative data. For this reason, an in-house exam was designed and implemented, based on the curriculum of each program. Even though this exam was administered during various academic periods, it is not currently considered standardized. In 2017, the engineering college included three standardized tests: one to assess mathematical and scientific reasoning and two more to assess reading and writing abilities. With these exams, the college hopes to obtain complementary information that can help better measure the development of both the hard and soft skills of students in the different engineering programs. In the first semester of 2017, the three exams were given to three sample groups of students from the six different engineering programs. Students in the sample groups were from the first, fifth, or tenth semester cohorts. At the time of submission of this paper, the engineering college has descriptive statistical data and is working with statisticians to carry out a more in-depth and detailed analysis of the sample students' achievement on the three exams. The overall objective of including standardized exams in the assessment model is to identify more precisely the least developed SOs in order to define and implement the educational strategies necessary for students to achieve them in each engineering program.

Keywords: assessment, hard skills, soft skills, standardized tests

Procedia PDF Downloads 270
16427 Indoor Air Assessment and Health Risk of Volatile Organic Compounds in Secondary School Classrooms in Benin City, Edo State, Nigeria

Authors: Osayomwanbor E. Oghama, John O. Olomukoro

Abstract:

The school environment, apart from the home, is probably the most important indoor environment for children. Children spend as much as 80-90% of their indoor time either at school or at home, an average of 35-40 hours per week in schools, and hence are at risk from indoor air pollutants such as volatile organic compounds (VOCs). Concentrations of VOCs vary widely but are generally higher indoors than outdoors. This research was, therefore, carried out to evaluate the levels of VOCs in secondary school classrooms in Benin City, Edo State. Samples were obtained from a total of 18 classrooms in 6 secondary schools. Samples were collected 3 times from each school, from 3 different classrooms in each school, using Draeger ORSA 5 tubes; the samplers were left in place for a school week (5 days). The VOCs detected and analyzed were benzene, ethylbenzene, isopropylbenzene, naphthalene, n-butylbenzene, n-propylbenzene, toluene, m-xylene, p-xylene, o-xylene, styrene, chlorobenzene, chloroform, 1,2-dichloropropane, 2,2-dichloropropane, tetrachloroethane, tetrahydrofuran, isopropyl acetate, α-pinene, and camphene. The results showed that chloroform, o-xylene, and styrene were the most abundant, while α-pinene and camphene were the least abundant. The health risk assessment was done in terms of carcinogenic risk (CRI) and non-carcinogenic risk (THR). The CRI values of the schools ranged from 1.03 × 10⁻⁵ to 1.36 × 10⁻⁵ μg/m³ (a mean of 1.16 × 10⁻⁵ μg/m³), with School 6 and School 3 having the highest and lowest values, respectively. The THR values of the study schools ranged from 0.071 to 0.086 μg/m³ (a mean of 0.078 μg/m³), with School 3 and School 2 having the highest and lowest values, respectively. The results show that all the schools pose a potential carcinogenic risk, having CRI values greater than the recommended limit of 1 × 10⁻⁶ µg/m³, and no non-carcinogenic risk, having THR values less than the USEPA hazard quotient of 1 µg/m³. It is recommended that school authorities ensure adequate ventilation in their schools, supplementing natural ventilation with mechanical sources where necessary. In addition, indoor air quality should be taken into consideration in the design and construction of classrooms.
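
The two indicators can be illustrated with the usual risk formulas: carcinogenic risk as the sum of concentration times inhalation unit risk, and the non-carcinogenic hazard as the sum of concentration divided by a reference concentration. The concentrations and toxicity values in the sketch are hypothetical placeholders, not the study's measurements or official USEPA values, which should be taken from the IRIS database in practice.

```python
# CRI = sum(C_i * IUR_i)  and  THR = sum(C_i / RfC_i) over the detected VOCs (values hypothetical).
classroom = {
    # VOC: (measured concentration ug/m3, inhalation unit risk per ug/m3, reference conc. ug/m3)
    "benzene":    (2.1, 7.8e-6, 30.0),
    "chloroform": (3.4, 2.3e-5, 98.0),
    "styrene":    (1.8, None,   1000.0),      # no carcinogenic unit risk applied here
}

cri = sum(c * iur for c, iur, _ in classroom.values() if iur is not None)
thr = sum(c / rfc for c, _, rfc in classroom.values())
print(f"CRI = {cri:.2e}   (compare with the 1e-6 acceptable-risk benchmark)")
print(f"THR = {thr:.3f}  (values below 1 indicate no appreciable non-carcinogenic risk)")
```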

Keywords: carcinogenic risk indicator, health risk, indoor air, non-carcinogenic risk indicator, secondary schools, volatile organic compounds

Procedia PDF Downloads 174
16426 Essay on Theoretical Modeling of the Wealth Effect of Sukuk

Authors: Jamel Boukhatem, Mouldi Djelassi

Abstract:

Contrary to the existing literature, which generally focuses on the role played by Sukuk in enhancing investors' and shareholders' wealth, this paper sheds some light on the Sukuk wealth effect across all economic agents, households, government, and investors, by implementing a two-period life-cycle model with overlapping generations to show whether Sukuk is net wealth. The main findings are threefold: i) the effect of a change in Sukuk issuances on consumers' utility level will differ from one generation to another, ii) an increase in taxes due to the increase in Sukuk and rents is covered by transfers made by the members of generation 1 in the form of inheritance, and iii) there is a positive relationship between the asset prices representative of Sukuk and real activity.

Keywords: Sukuk, households, investors, overlapping generations model, wealth, modeling

Procedia PDF Downloads 63
16425 Evaluation of the Boiling Liquid Expanding Vapor Explosion Thermal Effects in Hassi R'Mel Gas Processing Plant Using Fire Dynamics Simulator

Authors: Brady Manescau, Ilyas Sellami, Khaled Chetehouna, Charles De Izarra, Rachid Nait-Said, Fati Zidani

Abstract:

During a fire in an oil and gas refinery, several thermal accidents can occur and cause serious damage to people and the environment. Among these accidents, the BLEVE (Boiling Liquid Expanding Vapor Explosion) is the most frequently observed and remains a major concern for risk decision-makers. It corresponds to a violent vaporization of explosive nature following the rupture of a vessel containing a liquid at a temperature significantly higher than its normal boiling point at atmospheric pressure. Its effects on the environment generally appear in three ways: blast overpressure, radiation from the fireball if the liquid involved is flammable, and fragment hazards. In order to estimate the potential damage that would be caused by such an explosion, risk decision-makers often use quantitative risk analysis (QRA). This analysis is a rigorous and advanced approach that requires reliable data in order to obtain a good estimate and control of the risks. However, in most cases, the data used in QRA are obtained from empirical correlations. These empirical correlations generally overestimate BLEVE effects because they are based on simplifications and do not take into account real parameters like the geometry effect. Considering that these risk analyses are based on an assessment of BLEVE effects on human life and plant equipment, more precise and reliable data should be provided. From this point of view, CFD modeling of BLEVE effects appears as a solution to the limitations of the empirical laws. In this context, the main objective is to develop a numerical tool in order to predict BLEVE thermal effects using the CFD code FDS version 6. Simulations are carried out with a mesh size of 1 m. The fireball source is modeled as a vertical release of hot fuel over a short time. The modeling of the fireball dynamics is based on single-step combustion using an EDC model coupled with the default LES turbulence model. Fireball characteristics (diameter, height, heat flux, and lifetime) obtained from the large-scale BAM experiment are used to demonstrate the ability of FDS to simulate the various steps of the BLEVE phenomenon from ignition up to total burnout. The influence of release parameters such as the injection rate and the radiative fraction on the fireball heat flux is also presented. The predictions are very encouraging and show good agreement with the BAM experimental data. In addition, a numerical study is carried out on an operational propane accumulator in an Algerian gas processing plant of the SONATRACH company located in the Hassi R'Mel gas field (the largest gas field in Algeria).
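
The empirical correlations that such CFD predictions are usually benchmarked against can be evaluated in a few lines; the sketch below uses widely quoted fireball relations of the form D = 5.8 M^(1/3), with a hypothetical propane mass rather than the Hassi R'Mel accumulator inventory. These coefficients are the commonly cited literature values and should be checked against the specific reference adopted in a given QRA.

```python
def fireball_correlations(mass_kg):
    """Commonly cited empirical fireball correlations (coefficients from the open literature)."""
    d_max = 5.8 * mass_kg ** (1.0 / 3.0)                    # maximum fireball diameter (m)
    duration = (0.45 * mass_kg ** (1.0 / 3.0) if mass_kg < 3.0e4
                else 2.6 * mass_kg ** (1.0 / 6.0))          # fireball duration (s)
    height = 0.75 * d_max                                   # fireball centre height (m)
    return d_max, duration, height

for m in (1.0e3, 1.0e4, 5.0e4):                             # released propane mass (kg), hypothetical
    d, t, h = fireball_correlations(m)
    print(f"M = {m:8.0f} kg -> D = {d:6.1f} m, t = {t:5.1f} s, H = {h:6.1f} m")
```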

Keywords: BLEVE effects, CFD, FDS, fireball, LES, QRA

Procedia PDF Downloads 175