Search results for: restructuring digital factory model
17109 Time/Temperature-Dependent Finite Element Model of Laminated Glass Beams
Authors: Alena Zemanová, Jan Zeman, Michal Šejnoha
Abstract:
The polymer foil used in the manufacture of laminated glass members behaves in a viscoelastic, temperature-dependent manner. This contribution aims at incorporating the time/temperature-dependent behavior of the interlayer into our earlier elastic finite element model for laminated glass beams. The model is based on a refined beam theory: each layer behaves according to the finite-strain shear-deformable formulation of Reissner, and the adjacent layers are connected via Lagrange multipliers ensuring the inter-layer compatibility of the laminated unit. The time/temperature-dependent behavior of the interlayer is accounted for by the generalized Maxwell model and by the time-temperature superposition principle of Williams, Landel, and Ferry. The resulting system is solved by the Newton method with consistent linearization, and the viscoelastic response is determined incrementally by the exponential algorithm. By comparing the model predictions against available experimental data, we demonstrate that the proposed formulation is reliable and accurately reproduces the behavior of laminated glass units.
Keywords: finite element method, finite-strain Reissner model, Lagrange multipliers, generalized Maxwell model, laminated glass, Newton method, Williams-Landel-Ferry equation
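As an illustrative sketch of the two ingredients named in the abstract, the WLF shift factor and a Prony-series (generalized Maxwell) relaxation modulus can be coded as follows; the C1, C2 defaults are the textbook "universal" constants, and the branch moduli and relaxation times in the example are placeholders, not values fitted to any interlayer in this work:

```python
import math

def wlf_shift(T, T_ref, C1=17.44, C2=51.6):
    """log10 of the time-temperature shift factor a_T from the WLF equation.
    C1, C2 default to the 'universal' constants; a real interlayer needs
    its own fitted values."""
    return -C1 * (T - T_ref) / (C2 + (T - T_ref))

def maxwell_relaxation(t, G_inf, branches):
    """Relaxation modulus G(t) of a generalized Maxwell (Prony series) model.
    branches: list of (G_i, tau_i) pairs for the spring-dashpot arms."""
    return G_inf + sum(G_i * math.exp(-t / tau_i) for G_i, tau_i in branches)
```

At T = T_ref the shift factor is 1 (log10 a_T = 0), and at t = 0 the modulus equals the instantaneous value G_inf + sum of the branch moduli, which the code reproduces.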
Procedia PDF Downloads 431
17108 Model Predictive Control Using Thermal Inputs for Crystal Growth Dynamics
Authors: Takashi Shimizu, Tomoaki Hashimoto
Abstract:
Recently, crystal growth technologies have progressed in response to the demand for high-quality crystal materials. Actively controlling the crystal growth dynamics by external forces is useful for reducing composition non-uniformity. In this study, a control method based on model predictive control using thermal inputs is proposed for the crystal growth dynamics of semiconductor materials. The control system of crystal growth dynamics considered here is governed by the continuity, momentum, energy, and mass transport equations. To establish the control method for such thermal fluid systems, we adopt model predictive control, a kind of optimal feedback control in which the control performance over a finite future is optimized with a performance index that has a moving initial time and terminal time. The objective of this study is to establish a model predictive control method for the crystal growth dynamics of semiconductor materials.
Keywords: model predictive control, optimal control, process control, crystal growth
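A minimal sketch of the receding-horizon idea described above, applied to a scalar linear model x_{k+1} = a*x_k + b*u_k rather than the paper's thermal-fluid equations; the dynamics coefficients, cost weights, and candidate input set are all made up for illustration:

```python
import itertools

def mpc_step(x, horizon, candidates, a=0.9, b=0.5, q=1.0, r=0.1):
    """One receding-horizon step for the scalar model x+ = a*x + b*u.
    Exhaustively scores every candidate input sequence over the horizon
    with a quadratic cost and returns the first input of the best one."""
    best_u, best_cost = 0.0, float("inf")
    for seq in itertools.product(candidates, repeat=horizon):
        xk, cost = x, 0.0
        for u in seq:
            xk = a * xk + b * u
            cost += q * xk * xk + r * u * u
        if cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u

# drive the state toward zero from x = 2.0, re-optimizing at every step
x = 2.0
for _ in range(10):
    u = mpc_step(x, horizon=3, candidates=[-1.0, -0.5, 0.0, 0.5, 1.0])
    x = 0.9 * x + 0.5 * u
```

At each step only the first input of the best sequence is applied and the optimization is repeated from the new state; that moving window is what "a performance index that has a moving initial time and terminal time" refers to.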
Procedia PDF Downloads 359
17107 Improving Ride Comfort of a Bus Using Fuzzy Logic Controlled Suspension
Authors: Mujde Turkkan, Nurkan Yagiz
Abstract:
In this study, an active controller is presented for vibration suppression of a full-bus model. The bus is modelled with seven degrees of freedom. The system equations of motion are derived from this model via the Lagrange equations. The suspension of the bus model includes air springs with two auxiliary chambers. A fuzzy logic controller is used to improve the ride comfort. The numerical results verify that the presented fuzzy logic controller improves the ride comfort.
Keywords: ride comfort, air spring, bus, fuzzy logic controller
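As a toy illustration of a fuzzy controller of the kind described, the sketch below maps a body velocity to a suspension force using triangular membership functions and weighted-average defuzzification; the membership breakpoints and consequent forces are invented and are not the rule base of the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_force(vel):
    """Illustrative rules: 'velocity negative -> push up (+force)',
    'velocity zero -> no force', 'velocity positive -> push down (-force)'.
    Sugeno-style weighted average defuzzification; gains are made up."""
    mu_neg = tri(vel, -2.0, -1.0, 0.0)
    mu_zero = tri(vel, -1.0, 0.0, 1.0)
    mu_pos = tri(vel, 0.0, 1.0, 2.0)
    # crisp rule consequents in newtons, purely illustrative
    num = mu_neg * 500.0 + mu_zero * 0.0 + mu_pos * (-500.0)
    den = mu_neg + mu_zero + mu_pos
    return num / den if den else 0.0
```

The controller output always opposes the body velocity, which is the basic sky-hook-like behavior an active suspension controller aims for.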
Procedia PDF Downloads 430
17106 Selection of Variogram Model for Environmental Variables
Authors: Sheikh Samsuzzhan Alam
Abstract:
The present study investigates the selection of a variogram model in analyzing spatial variations of environmental variables with trend. Sometimes the auto-fitted theoretical variogram does not really capture the true nature of the empirical semivariogram, so proper exploration and analysis are needed to select the best variogram model. For this study, open-source data collected from the California Soil Resource Lab is used to illustrate the problems that arise when fitting a theoretical variogram. The five most commonly used variogram models, Linear, Gaussian, Exponential, Matern, and Spherical, were fitted to the experimental semivariogram. Ordinary kriging methods were considered to evaluate the accuracy of the selected variograms through cross-validation. This study is beneficial for selecting an appropriate theoretical variogram model for environmental variables.
Keywords: anisotropy, cross-validation, environmental variables, kriging, variogram models
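For reference, sketches of three of the listed variogram models as functions of lag distance h; the 3h/a scaling for the exponential and Gaussian forms follows the "practical range" convention used by some geostatistics packages, and conventions differ between software:

```python
import math

def spherical(h, nugget, sill, a):
    """Spherical variogram: rises to nugget + sill at range a, flat beyond."""
    if h >= a:
        return nugget + sill
    r = h / a
    return nugget + sill * (1.5 * r - 0.5 * r ** 3)

def exponential(h, nugget, sill, a):
    """Exponential variogram; reaches ~95% of the sill at h = a."""
    return nugget + sill * (1.0 - math.exp(-3.0 * h / a))

def gaussian(h, nugget, sill, a):
    """Gaussian variogram; parabolic near the origin, ~95% of sill at h = a."""
    return nugget + sill * (1.0 - math.exp(-3.0 * (h / a) ** 2))
```

Comparing how each candidate curve tracks the empirical semivariogram near the origin (where kriging weights are most sensitive) is the visual counterpart of the cross-validation step described in the abstract.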
Procedia PDF Downloads 334
17105 Mathematical Modeling of Activated Sludge Process: Identification and Optimization of Key Design Parameters
Authors: Ujwal Kishor Zore, Shankar Balajirao Kausley, Aniruddha Bhalchandra Pandit
Abstract:
Some key design parameters of the activated sludge process (ASP) for wastewater treatment must be optimally defined for the plant to operate at its best. Developing a mathematical model is a way to identify them, as a model closely approximates how the real plant works. In this study, a mathematical model was developed for the ASP, solved under the conditions of Activated Sludge Model No. 1 (ASM1), and MATLAB was used to solve the model equations. For real-life validation, the developed model was tested on inputs from a municipal wastewater treatment plant, and the results were quite promising. Additionally, the most cardinal assumptions required to design the treatment plant are discussed in this paper. With the need for computerization and digitalization surging in every aspect of engineering, the mathematical model developed here might prove to be a boon to many biological wastewater treatment plants, as they can quickly determine the design parameters required for a particular type of wastewater treatment.
Keywords: waste water treatment, activated sludge process, mathematical modeling, optimization
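A minimal sketch of the Monod kinetics that underlie ASM1-style models, integrated by forward Euler for a single substrate/biomass pair; the kinetic constants are generic illustrative values, not the calibrated parameters of the study:

```python
def asp_simulate(S0, X0, hours, dt=0.01, mu_max=0.2, Ks=20.0, Y=0.67, b=0.01):
    """Forward-Euler integration of Monod growth kinetics, the basic
    building block of ASM1 (all parameter defaults are illustrative).
    S: substrate concentration, X: biomass concentration."""
    S, X = S0, X0
    for _ in range(int(hours / dt)):
        mu = mu_max * S / (Ks + S)   # Monod specific growth rate
        dX = (mu - b) * X            # growth minus endogenous decay
        dS = -(mu / Y) * X           # substrate consumed for growth
        X += dX * dt
        S += dS * dt
        S = max(S, 0.0)              # concentration cannot go negative
    return S, X
```

A full ASM1 implementation carries thirteen state variables and eight processes in the same ODE form; this two-state version only shows the numerical scheme.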
Procedia PDF Downloads 144
17104 Systematic Mapping Study of Digitization and Analysis of Manufacturing Data
Authors: R. Clancy, M. Ahern, D. O’Sullivan, K. Bruton
Abstract:
The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, the processes must first be digitized. This is the conversion of information from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was utilized to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data. Research papers were classified according to the type of research and type of contribution to the research area. Upon analyzing the 54 papers identified in this area, it was noted that 23 of them originated in Germany. This is an unsurprising finding, as Industry 4.0 is originally a German strategy, with strong policy instruments utilized in Germany to support its implementation. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing institution, with three papers published. The literature suggested future research directions and highlighted one specific gap in the area: an unresolved gap between the data science experts and the manufacturing process experts in industry. Data analytics expertise is not useful unless the manufacturing process information is utilized.
A legitimate understanding of the data is crucial to perform accurate analytics and gain true, valuable insights into the manufacturing process. There lies a gap between the manufacturing operations and the information technology/data analytics departments within enterprises, which was borne out by the results of many of the case studies reviewed as part of this work. To test whether this gap exists, the researcher initiated an industrial case study in which they embedded themselves between the subject matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 contributed a framework, another 12 were based on a case study, and 11 focused on theory. However, only three papers contributed a methodology. This provides further evidence of the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research.
Keywords: analytics, digitization, industry 4.0, manufacturing
Procedia PDF Downloads 111
17103 Conceptual Model for Knowledge Sharing Model in Creating Idea for Mobile Application
Authors: Hanafizan Hussain
Abstract:
This study describes several projects conducted in a workshop in which the conceptual model for knowledge sharing was used to create ideas for mobile applications. Idea sharing was done through a collaborative activity in which a group drawn from different fields sought to define a mobile application, leading to a new-media approach using a social media platform. The collaborative activity was provided and implemented in the form of a one-day workshop to determine the approach towards the given theme. The activity then continued for four weeks so the participants could prepare for the pitch-day workshop. This paper presents the pitched ideas, including the interface and prototype for the said products. The collaboration between members from different fields of study shows that social media influenced the knowledge sharing model and its creations or innovations. One of the projects supported a collaborative activity in which a group of young designers sought to define the knowledge sharing model of their ability to create ideas for mobile applications.
Keywords: mobile application, collaborative activity, conceptual knowledge sharing model, social media platform
Procedia PDF Downloads 143
17102 The Social Enterprise Model and Its Beneficiaries
Authors: Lorryn Williams
Abstract:
This study will explore how the introduction of the for-profit social enterprise model affects the real lives of the individuals and communities that this model aims to help in South Africa. Key to answering this question is the congruence between organisational need construction and the real needs of beneficiaries, and whether the adoption of a profit-driven model such as social entrepreneurship supports or discards these needs. By making use of qualitative methods, the study aims to collect empirical evidence that either supports the social entrepreneurship approach when compared to other programs, such as vocational training, or rejects it as less beneficial. The objective of this research is to answer the question of whether the social enterprise model of conducting charity leaves the beneficiaries of non-profit organisations generally better or worse off. The study will specifically explore the underlying assumptions the social entrepreneurship model makes, since the assumptions concerning its uplifting effects may produce either real or assumed change for beneficiaries. The meaning of social cohesion and social capital for these organisations, the construction of beneficiary dependence and independence, the consideration of the formal and informal economies beneficiaries engage in, and the extent to which sustainability is used as a brand will be investigated. Through engaging the relevant literature, experts in the field of non-profit donorship and need implementation, organisations that have and have not adopted social enterprise programs, and, most importantly, the beneficiaries themselves, it will be possible to answer the questions this study aims to address.
Keywords: social enterprise, beneficiaries, profit driven model, non-profit organizations
Procedia PDF Downloads 140
17101 On Four Models of a Three Server Queue with Optional Server Vacations
Authors: Kailash C. Madan
Abstract:
We study four models of a three-server queueing system with Bernoulli-schedule optional server vacations. Customers arriving at the system one by one in a Poisson process are provided identical exponential service by three parallel servers according to a first-come, first-served queue discipline. In model A, all three servers may be allowed a vacation at one time; in model B, at most two of the three servers may be allowed a vacation at one time; in model C, at most one server is allowed a vacation; and in model D, no server is allowed a vacation. We study the steady-state behavior of the four models and obtain steady-state probability generating functions for the queue size at a random point of time for all states of the system. In model D, a known result for a three-server queueing system without server vacations is derived.
Keywords: a three server queue, Bernoulli schedule server vacations, queue size distribution at a random epoch, steady state
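As a baseline for model D (no vacations), the classical M/M/c steady-state formulas with c = 3 can be evaluated directly; this sketch deliberately omits the Bernoulli-schedule vacations that distinguish models A through C:

```python
import math

def mmc_p0(lam, mu, c):
    """Empty-system probability P0 for an M/M/c queue (c = 3 here matches
    the no-vacation baseline, model D). lam: arrival rate, mu: service rate."""
    rho = lam / (c * mu)
    assert rho < 1, "queue must be stable"
    a = lam / mu                     # offered load
    s = sum(a ** n / math.factorial(n) for n in range(c))
    s += a ** c / (math.factorial(c) * (1 - rho))
    return 1.0 / s

def mmc_lq(lam, mu, c):
    """Mean number waiting in queue, Lq, for an M/M/c queue."""
    a = lam / mu
    rho = lam / (c * mu)
    return mmc_p0(lam, mu, c) * a ** c * rho / (math.factorial(c) * (1 - rho) ** 2)
```

With c = 1 these reduce to the familiar M/M/1 results P0 = 1 - rho and Lq = rho^2/(1 - rho), which is a convenient sanity check.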
Procedia PDF Downloads 296
17100 A Novel Algorithm for Parsing IFC Models
Authors: Raninder Kaur Dhillon, Mayur Jethwa, Hardeep Singh Rai
Abstract:
Information technology has made pivotal progress across disparate disciplines, one of which is the AEC (Architecture, Engineering and Construction) industry. CAD is a form of computer-aided design that architects, engineers and contractors use to create and view two- and three-dimensional models. The AEC industry also uses building information modeling (BIM), a newer computerized modeling system that can create four-dimensional models; this software can greatly increase productivity in the AEC industry. BIM models generate open-source IFC (Industry Foundation Classes) files, which aim at interoperability for exchanging information throughout the project lifecycle among various disciplines. The methods developed in previous studies require either an IFC schema or MVD and software applications, such as an IFC model server or a BIM authoring tool, to extract a partial or complete IFC instance model. This paper proposes an efficient algorithm for extracting a partial and total model from an Industry Foundation Classes (IFC) instance model without an IFC schema or a complete IFC model view definition (MVD).
Procedia PDF Downloads 300
17099 Digital Image Correlation: Metrological Characterization in Mechanical Analysis
Authors: D. Signore, M. Ferraiuolo, P. Caramuta, O. Petrella, C. Toscano
Abstract:
The Digital Image Correlation (DIC) is a newly developed optical technique that is spreading in all engineering sectors because it allows the non-destructive estimation of the entire surface deformation without any contact with the component under analysis. These characteristics make the DIC very appealing in all the cases where the global deformation state is to be known without using strain gages, which are the most used measuring devices. The DIC is applicable to any material subjected to distortion caused by either thermal or mechanical load, allowing high-definition mapping of displacements and deformations. That is why, in the civil and transportation industries, DIC is very useful for studying the behavior of metallic as well as composite materials. DIC is also used in the medical field for the characterization of the local strain field of vascular tissue surfaces subjected to uniaxial tensile loading. DIC can be carried out in two-dimensional mode (2D DIC) if a single camera is used, or in three-dimensional mode (3D DIC) if two cameras are involved. Each point of the test surface framed by the cameras can be associated with a specific pixel of the image, and the coordinates of each point are calculated knowing the relative distance between the two cameras together with their orientation. In both arrangements, when a component is subjected to a load, several images related to different deformation states can be acquired through the cameras. A specific software analyzes the images via the mutual correlation between the reference image (obtained without any applied load) and those acquired during the deformation, giving the relative displacements. In this paper, a metrological characterization of digital image correlation is performed on aluminum and composite targets, both in static and dynamic loading conditions, by comparison between DIC and strain gauge measurements.
In the static test, interesting results have been obtained thanks to an excellent agreement between the two measuring techniques. In addition, the deformation detected by the DIC is compliant with the result of an FEM simulation. In the dynamic test, the DIC was able to follow the periodic deformation of the specimen with good accuracy, giving results coherent with those given by the FEM simulation. In both situations, it was seen that the DIC measurement accuracy depends on several parameters, such as the optical focusing, the parameters chosen to perform the mutual correlation between the images and, finally, the reference points on the image to be analyzed. In the future, the influence of these parameters will be studied, and a method to increase the accuracy of the measurements will be developed in accordance with the requirements of industry, especially the aerospace industry.
Keywords: accuracy, deformation, image correlation, mechanical analysis
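The "mutual correlation between the reference image and those acquired during the deformation" is typically scored with zero-normalized cross-correlation (ZNCC). A one-dimensional toy version might look like the following (real DIC correlates 2D subsets and adds subpixel refinement, neither of which is shown here):

```python
def zncc(f, g):
    """Zero-normalized cross-correlation of two equal-length patches
    (flattened gray levels); 1.0 means a perfect match up to a linear
    brightness/contrast change."""
    n = len(f)
    mf, mg = sum(f) / n, sum(g) / n
    num = sum((a - mf) * (b - mg) for a, b in zip(f, g))
    df = sum((a - mf) ** 2 for a in f) ** 0.5
    dg = sum((b - mg) ** 2 for b in g) ** 0.5
    if df == 0.0 or dg == 0.0:
        return 0.0        # constant patch: correlation undefined, score 0
    return num / (df * dg)

def track(ref, cur, start, size, max_shift):
    """Integer displacement of a 1-D patch between a reference and a
    deformed signal, found by exhaustive ZNCC search."""
    patch = ref[start:start + size]
    best_c, best_d = -2.0, 0
    for d in range(-max_shift, max_shift + 1):
        i = start + d
        if i < 0 or i + size > len(cur):
            continue
        c = zncc(patch, cur[i:i + size])
        if c > best_c:
            best_c, best_d = c, d
    return best_d
```

The invariance of ZNCC to brightness and contrast changes is what makes the correlation robust to the lighting variations between the unloaded reference image and the loaded ones.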
Procedia PDF Downloads 311
17098 Long- and Short-Term Impacts of COVID-19 and Gold Price on Price Volatility: A Comparative Study of MIDAS and GARCH-MIDAS Models for USA Crude Oil
Authors: Samir K. Safi
Abstract:
The purpose of this study was to compare the performance of two types of models, namely MIDAS and GARCH-MIDAS, in predicting the volatility of crude oil returns based on gold price returns and the COVID-19 pandemic. The study aimed to identify which model would provide more accurate short-term and long-term predictions and which model would perform better in handling the increased volatility caused by the pandemic. The findings of the study revealed that the MIDAS model performed better in predicting short-term and long-term volatility before the pandemic, while the GARCH-MIDAS model performed significantly better in handling the increased volatility caused by the pandemic. The study highlights the importance of selecting appropriate models to handle the complexities of real-world data and shows that the choice of model can significantly impact the accuracy of predictions. The practical implications of model selection and potential methodological adjustments for future research are highlighted and discussed.
Keywords: GARCH-MIDAS, MIDAS, crude oil, gold, COVID-19, volatility
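For orientation, the short-run component of a GARCH-MIDAS model is the familiar GARCH(1,1) variance recursion; the sketch below shows only that recursion with made-up parameters, not the MIDAS long-run component (driven here by gold returns and the pandemic regressor) or the fitted models of the study:

```python
def garch11_variance(returns, omega, alpha, beta):
    """Conditional-variance path of a GARCH(1,1) model:
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}.
    Requires alpha + beta < 1; the path starts at the unconditional variance."""
    sigma2 = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2
```

In the full GARCH-MIDAS decomposition, this short-run variance is multiplied by a slowly moving long-run component built from low-frequency (e.g., monthly) data via MIDAS weights; the recursion above is only the high-frequency half.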
Procedia PDF Downloads 65
17097 Defining Processes of Gender Restructuring: The Case of Displaced Tribal Communities of North East India
Authors: Bitopi Dutta
Abstract:
Development-induced displacement (DID) of subaltern groups has been an issue of intense debate in India. This research will conduct a gender analysis of displacement induced by mining projects in tribal indigenous societies of North East India, centering on the primary research question: 'How does DID reorder gendered relationships in tribal matrilineal societies?' This paper will not focus primarily on the impacts of coal mining-induced displacement on indigenous tribal women in North East India; it will rather study 'what' processes lead to these transformations and 'how' they operate. In doing so, the paper will locate the cracks in traditional social systems that the discourse of displacement manipulates for its own benefit. DID in this sense will be understood not only as physical displacement but also as social and cultural displacement. The study will cover one matrilineal tribe in the state of Meghalaya in North East India affected by several coal mining projects over the last 30 years. In-depth unstructured interviews used to collect life narratives will be the primary mode of data collection, because the indigenous culture of the tribes in Meghalaya, including the matrilineal tribes, is based on oral history, where knowledge and experiences exist in a continuum. This is unlike modern societies, which produce knowledge in a compartmentalized system. An interview guide designed around specific themes, rather than specific questions, will be used to ensure the flow of narratives from the interviewee. In addition to this, a number of focus groups will be held. The data collected through the life narratives will be supplemented and contextualized through documentary research using government data and local media sources of the region.
Keywords: displacement, gender-relations, matriliny, mining
Procedia PDF Downloads 195
17096 Rapid Building Detection in Population-Dense Regions with Overfitted Machine Learning Models
Authors: V. Mantey, N. Findlay, I. Maddox
Abstract:
The quality and quantity of global satellite data have been increasing exponentially in recent years as spaceborne systems become more affordable and the sensors themselves become more sophisticated. This is a valuable resource for many applications, including disaster management and relief. However, while more information can be valuable, the volume of data available is impossible to examine manually. Therefore, the question becomes how to extract as much information as possible from the data with limited manpower. Buildings are a key feature of interest in satellite imagery, with applications including telecommunications, population models, and disaster relief. Machine learning tools are fast becoming one of the key resources to solve this problem, and models have been developed to detect buildings in optical satellite imagery. However, most models focus on affluent regions, where buildings are generally larger and constructed further apart. This work is focused on the more difficult problem of detection in densely populated regions. The primary challenge with detecting small buildings in densely populated regions is both the spatial and spectral resolution of the optical sensor. Densely packed buildings with similar construction materials will be difficult to separate due to a similarity in color and because the physical separation between structures is either non-existent or smaller than the spatial resolution. This study finds that models trained until they overfit the input sample can perform better in these areas than a more robust, generalized model. An overfitted model takes less time to fine-tune from a generalized pre-trained model and requires less input data. The model developed for this study has also been fine-tuned using existing, open-source building vector datasets. This is particularly valuable in the context of disaster relief, where information is required in a very short time span.
Leveraging existing datasets means that little to no manpower or time is required to collect data in the region of interest. The training period itself is also shorter for smaller datasets. Requiring less data means that only a few quality areas are necessary, and so any weaknesses or underpopulated regions in the data can be skipped over in favor of areas with higher quality vectors. In this study, a landcover classification model was developed in conjunction with the building detection tool to provide a secondary source to quality check the detected buildings. This has greatly reduced the false positive rate. The proposed methodologies have been implemented and integrated into a configurable production environment and have been employed for a number of large-scale commercial projects, including continent-wide DEM production, where the extracted building footprints are being used to enhance digital elevation models. Overfitted machine learning models are often considered too specific to have any predictive capacity. However, this study demonstrates that, in cases where input data is scarce, overfitted models can be judiciously applied to solve time-sensitive problems.
Keywords: building detection, disaster relief, mask-RCNN, satellite mapping
Procedia PDF Downloads 169
17095 Problems and Challenges in Social Economic Research after COVID-19: The Case Study of Province Sindh
Authors: Waleed Baloch
Abstract:
This paper investigates the problems and challenges in social-economic research in the province of Sindh after the COVID-19 pandemic; the pandemic has significantly impacted various aspects of society and the economy, necessitating a thorough examination of the resulting implications. The study also investigates potential strategies and solutions to mitigate these challenges, ensuring the continuation of robust social and economic research in the region. Through an in-depth analysis of data and interviews with key stakeholders, the study reveals several significant findings. Firstly, researchers encountered difficulties in accessing primary data due to disruptions caused by the pandemic, leading to limitations in the scope and accuracy of their studies. Secondly, the study highlights the challenges faced in conducting fieldwork, such as restrictions on travel and face-to-face interactions, which impacted the ability to gather reliable data. Lastly, the research identifies the need for innovative research methodologies and digital tools to adapt to the new research landscape brought about by the pandemic. The study concludes by proposing recommendations to address these challenges, including utilizing remote data collection methods, leveraging digital technologies for data analysis, and establishing collaborations among researchers to overcome resource constraints. By addressing these issues, researchers in the social-economic field can effectively navigate the post-COVID-19 research landscape, facilitating a deeper understanding of the socioeconomic impacts and enabling evidence-based policy interventions.
Keywords: social economic, sociology, developing economies, COVID-19
Procedia PDF Downloads 63
17094 Genetic Algorithm for Bi-Objective Hub Covering Problem
Authors: Abbas Mirakhorli
Abstract:
A hub covering problem is a type of hub location problem that tries to maximize the coverage area with the fewest installed hubs. There have not been many studies in the literature about multi-objective hub covering location problems. Thus, in this paper, a bi-objective model for the hub covering problem is presented. The two objectives considered are the minimization of total transportation costs and the maximization of coverage of origin-destination nodes. A genetic algorithm is presented to solve the model as the number of nodes increases. The genetic algorithm is capable of solving the model when the number of nodes grows beyond 20, and it solves the model in less time.
Keywords: facility location, hub covering, multi-objective optimization, genetic algorithm
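A minimal GA of the kind described, with a bit-string chromosome marking which hubs are opened; here the two objectives are collapsed into a single weighted fitness (coverage minus a cost penalty), which is a simplification of the paper's bi-objective treatment, and all numbers are illustrative:

```python
import random

def ga_hub_cover(cover, cost, n_hubs, pop_size=30, gens=60, seed=1):
    """Toy GA for hub covering. cover[h] is the set of demand nodes hub h
    covers; cost[h] its installation cost. Chromosome: bit-list of opened
    hubs. Fitness: nodes covered minus a (made-up) 0.1-weighted cost."""
    rng = random.Random(seed)

    def fitness(ch):
        covered = set()
        for h, bit in enumerate(ch):
            if bit:
                covered |= cover[h]
        return len(covered) - 0.1 * sum(c for c, bit in zip(cost, ch) if bit)

    pop = [[rng.randint(0, 1) for _ in range(n_hubs)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]          # elitism: keep the best half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)       # parents from the elite
            cut = rng.randrange(1, n_hubs)    # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:            # bit-flip mutation
                i = rng.randrange(n_hubs)
                child[i] ^= 1
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)
```

A faithful bi-objective version would instead maintain a Pareto front (e.g., NSGA-II style non-dominated sorting) rather than one scalarized fitness.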
Procedia PDF Downloads 60
17093 3D Liver Segmentation from CT Images Using a Level Set Method Based on a Shape and Intensity Distribution Prior
Authors: Nuseiba M. Altarawneh, Suhuai Luo, Brian Regan, Guijin Tang
Abstract:
Liver segmentation from medical images poses more challenges than analogous segmentations of other organs. This contribution introduces a liver segmentation method for a series of computed tomography images. Overall, we present a novel method for segmenting the liver by coupling density matching with shape priors. Density matching signifies a tracking method which operates via maximizing the Bhattacharyya similarity measure between the photometric distribution from an estimated image region and a model photometric distribution. Density matching controls the direction of the evolution process and slows down the evolving contour in regions with weak edges. The shape prior improves the robustness of density matching and discourages the evolving contour from exceeding the liver's boundaries at regions with weak boundaries. The model is implemented using a modified distance regularized level set (DRLS) model. The experimental results show that the method achieves a satisfactory result. By comparing with the original DRLS model, it is evident that the proposed model is more effective in addressing the over-segmentation problem. Finally, we gauge the performance of our model against metrics comprising accuracy, sensitivity, and specificity.
Keywords: Bhattacharyya distance, distance regularized level set (DRLS) model, liver segmentation, level set method
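For discrete normalized histograms, the Bhattacharyya similarity used by the density-matching term reduces to a one-line sum over bins; a sketch:

```python
import math

def bhattacharyya_coefficient(p, q):
    """Overlap between two normalized histograms; 1.0 means identical
    distributions, 0.0 means disjoint support."""
    return sum(math.sqrt(a * b) for a, b in zip(p, q))

def bhattacharyya_distance(p, q):
    """Distance derived from the coefficient (undefined for disjoint
    histograms, where the coefficient is 0)."""
    return -math.log(bhattacharyya_coefficient(p, q))
```

In the segmentation context, p would be the photometric (intensity) histogram inside the evolving contour and q the model liver histogram; the contour evolves to drive the coefficient toward 1.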
Procedia PDF Downloads 313
17092 Developing Alternative Recovery Technology of Waste Heat in Automobile Factory
Authors: Kun-Ping Cheng, Dong-Shang Chang, Rou-Wen Wang
Abstract:
Pre-treatment in automobile paint-shop procedures comprises the preparation of the warm water rinsing tank, hot water rinsing tank, degreasing tank, and phosphate tank. Conventionally, a natural gas-fired boiler produces steam to supply the heat exchange of each tank. In this study, a high-frequency soldering economizer is developed for recovering waste heat in the automotive paint shop (RTO, Regenerative Thermal Oxidation). The heat recovery rate of the new economizer is 20% to 30% higher than that of the conventional embedded heat pipe. The adaptive control system responds to both the RTO furnace exhaust gas and the heat demands. In order to maintain the temperature range of the tanks, the pre-treatment tanks are directly heated by the waste heat recovery device (a gas-to-water heat exchanger) through the hot water cycle of heat transfer. The developed waste heat recovery system annually recovers 1,226,411,483 kcal of heat (equivalent to 137.8 thousand cubic meters of natural gas). The boiler can reduce fuel consumption by 20 to 30 percent compared with operation without waste heat recovery. In order to alleviate environmental impacts, the temperature at the end of the flue is further reduced from 160 to 110°C. The innovative waste heat recovery contributes to energy savings and a sustainable environment.
Keywords: waste heat recovery system, sustainability, RTO (Regenerative Thermal Oxidation), economizer, automotive industry
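The abstract's two recovery figures are mutually consistent if one assumes a natural-gas heating value of roughly 8,900 kcal per cubic meter; that heating value is an assumption of this sketch, not a number stated in the paper:

```python
def gas_equivalent(heat_kcal, heating_value=8900.0):
    """Natural-gas volume (m^3) equivalent to a recovered heat duty,
    assuming a heating value of ~8,900 kcal/m^3 (assumed, not from the paper)."""
    return heat_kcal / heating_value

# check against the abstract: 1,226,411,483 kcal ~ 137.8 thousand m^3
volume = gas_equivalent(1_226_411_483)
```

The computed volume lands within about one cubic meter of 137,800 m^3, matching the "137.8 thousand cubic meters" figure under that assumed heating value.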
Procedia PDF Downloads 262
17091 AM/E/c Queuing Hub Maximal Covering Location Model with Fuzzy Parameter
Authors: M. H. Fazel Zarandi, N. Moshahedi
Abstract:
The hub location problem appears in a variety of applications, such as medical centers, firefighting facilities, cargo delivery systems, and telecommunication network design. The location of service centers has a strong influence on the congestion at each of them and, consequently, on the quality of service. This paper presents a fuzzy maximal hub covering location problem (FMCHLP) in which the travel costs between any pair of nodes are considered as fuzzy variables. In order to consider the quality of service, we model each hub as a queue. The arrival rate follows a Poisson distribution, and the service rate follows an Erlang distribution. A nonlinear mathematical programming model is first presented and then converted to a linear one. We solved the linear model using GAMS software for up to 25 nodes; for larger sizes, owing to the complexity of hub covering location problems, a simulated annealing algorithm is developed to solve and test the model. We also used the possibilistic c-means clustering method to find an initial solution.
Keywords: fuzzy modeling, location, possibilistic clustering, queuing
Procedia PDF Downloads 394
17090 The Study of Applying Models: House, Temple and School for Sufficiency Development to Participate in ASEAN Economic Community: A Case Study of Trimitra Temple (China Town) Bangkok, Thailand
Authors: Saowapa Phaithayawat
Abstract:
The purposes of this study are: 1) to study the impact of the 3-community-core model: House (H), Temple (T), and School (S), with the cooperation of official departments, on community development toward ASEAN Economic Community involvement, and 2) to study the procedures and extension of the model. The research is qualitative, based on formal and informal interviews. Local people in the community are observed, and group interviews are conducted with executors and cooperators in the community school. In terms of the social and cultural dimension, the 3-community-core model consisting of house, temple, and school is the base of Thai culture, bringing understanding, happiness, and unity to the community. The result of this research is that the official departments, together with the model's developers, work cooperatively in the community to provide support such as budgets, plans, and activities. Moreover, the model's implementation satisfies the community's needs and yields continual results that sustain the community. In terms of the procedures of the model's implementation, executors and cooperators can work, coordinate, think, and launch their public relations together. Concerning the model's development, it enables the community to achieve its goal of preparing for ASEAN Economic Community involvement.
Keywords: ASEAN Economic Community, the applying models and sufficiency development, house, temple, school
Procedia PDF Downloads 314
17089 Cybernetic Modeling of Growth Dynamics of Debaryomyces nepalensis NCYC 3413 and Xylitol Production in Batch Reactor
Authors: J. Sharon Mano Pappu, Sathyanarayana N. Gummadi
Abstract:
Growth of Debaryomyces nepalensis on mixed substrates in batch culture follows a diauxic pattern: glucose is completely utilized during the first exponential growth phase, followed by an intermediate lag phase and a second exponential growth phase consuming xylose. The present study deals with the development of a cybernetic mathematical model for the prediction of xylitol production and yield. Production of xylitol from xylose in batch fermentation is investigated in the presence of glucose as the co-substrate. Different ratios of glucose and xylose concentrations are assessed to study the impact of multiple substrates on xylitol production in batch reactors. The parameters in the model equations were estimated from experimental observations using the integral method, and the model equations were solved simultaneously by numerical techniques in MATLAB. The developed cybernetic model of xylose fermentation in the presence of a co-substrate can answer how the ratio of glucose to xylose influences the yield and rate of xylitol production. The model is expected to accurately predict the growth of the microorganism on mixed substrates, the duration of the intermediate lag phase, substrate consumption, and xylitol production. A model built on the cybernetic modelling framework is also helpful for simulating the dynamic competition between metabolic pathways.
Keywords: co-substrate, cybernetic model, diauxic growth, xylose, xylitol
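The diauxic competition that the cybernetic framework captures can be sketched with a simple two-pathway Monod model in which cybernetic variables weight each uptake rate by its relative magnitude. All kinetic constants below are illustrative assumptions, not the study's fitted parameters, and the sketch omits the xylitol-formation kinetics:

```python
# Two-pathway cybernetic Monod sketch of diauxic growth on glucose (index 0)
# and xylose (index 1), integrated by forward Euler.  Constants are
# illustrative assumptions only.
mu_max = [0.40, 0.25]    # maximum specific uptake rates, 1/h
Ks     = [0.5, 1.0]      # saturation constants, g/L
Y      = [0.5, 0.4]      # biomass yields, g biomass / g substrate
S = [20.0, 20.0]         # glucose, xylose concentrations, g/L
X = 0.1                  # biomass, g/L
dt, steps = 0.01, 3000   # 30 h horizon

for _ in range(steps):
    r = [mu_max[i] * S[i] / (Ks[i] + S[i]) for i in range(2)]
    u = [ri / (sum(r) or 1.0) for ri in r]   # cybernetic control variables
    for i in range(2):
        # uptake of each substrate, clamped at zero
        S[i] = max(0.0, S[i] - r[i] * u[i] * X / Y[i] * dt)
    X += sum(r[i] * u[i] for i in range(2)) * X * dt
```

Because the glucose pathway has the larger rate, the cybernetic weights favor it early on, reproducing the preferential glucose consumption of diauxic growth before xylose is taken up.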
Procedia PDF Downloads 328
17088 A Simulation Model and Parametric Study of Triple-Effect Desalination Plant
Authors: Maha BenHamad, Ali Snoussi, Ammar Ben Brahim
Abstract:
A steady-state analysis of a triple-effect thermal vapor compressor desalination unit was performed. A mathematical model based on mass, salinity, and energy balances is developed. The purpose of this paper is to develop a connection between a process simulator and a process optimizer in order to study the influence of several operating variables on the performance and produced-water cost of the unit. A MATLAB program is used to solve the model equations, and Aspen HYSYS is used to model the plant. The model's validity is examined against a commercial plant, showing good agreement between industrial data and simulation results. Results show that the pressures of the last effect and of the compressed vapor have an important influence on the produced-water cost, and that increasing the temperature difference in the condenser decreases the specific heat-transfer area by about 22%.
Keywords: steady-state, triple effect, thermal vapor compressor, MATLAB, Aspen HYSYS
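The building block of such a model, a single-effect mass and salinity balance, can be illustrated as follows; the flow and salinity values are invented for illustration, not plant data:

```python
# Single-effect mass and salinity balance, the building block of the
# triple-effect model:  F = D + B  and  F*Xf = B*Xb, with the distillate
# assumed salt-free.  Numbers are illustrative, not plant data.
F, Xf, Xb = 100.0, 36.0, 60.0   # feed (kg/s), feed and brine salinity (g/kg)
B = F * Xf / Xb                 # brine flow from the salinity balance
D = F - B                       # distillate flow from the overall mass balance
```

Chaining three such balances, with the vapor of each effect heating the next, gives the steady-state backbone onto which the energy balances and the thermo-compressor model are added.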
Procedia PDF Downloads 172
17087 Predictive Models of Ruin Probability in Retirement Withdrawal Strategies
Authors: Yuanjin Liu
Abstract:
Retirement withdrawal strategies are very important for minimizing the probability of ruin in retirement. The ruin probability is modeled as a function of initial withdrawal age, gender, asset allocation, inflation rate, and initial withdrawal rate. It is obtained by simulation based on the 2019 Social Security period life table, IRS Required Minimum Distribution (RMD) worksheets, US historical bond and equity returns, and inflation rates. Several popular machine learning algorithms are built: the generalized additive model, random forest, support vector machine, extreme gradient boosting, and artificial neural network. Model validation and selection are based on test errors using hyperparameter tuning and a train-test split. The optimal model is recommended for retirees to monitor the ruin probability, and the optimal withdrawal strategy can be obtained from the optimal predictive model.
Keywords: ruin probability, retirement withdrawal strategies, predictive models, optimal model
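The simulation step can be sketched as a simple Monte Carlo estimate of ruin probability. The Gaussian return distribution, 30-year horizon, and withdrawal rates below are illustrative assumptions, not the life-table and historical-return inputs used in the study:

```python
import random

# Monte Carlo sketch of the ruin probability of an inflation-adjusted
# withdrawal strategy.  Return distribution and parameters are assumed
# for illustration only.
random.seed(42)

def ruin_probability(initial_rate, years=30, trials=10_000,
                     mean_ret=0.05, sd_ret=0.12, inflation=0.03):
    ruins = 0
    for _ in range(trials):
        balance, withdrawal = 1.0, initial_rate
        for _ in range(years):
            balance -= withdrawal           # withdraw at start of year
            if balance <= 0.0:
                ruins += 1
                break
            balance *= 1.0 + random.gauss(mean_ret, sd_ret)
            withdrawal *= 1.0 + inflation   # hold withdrawal constant in real terms
    return ruins / trials

p4 = ruin_probability(0.04)   # 4% initial withdrawal rate
p6 = ruin_probability(0.06)   # 6% initial withdrawal rate
```

Running such simulations over a grid of ages, allocations, and rates produces the labeled data on which the predictive models in the study could then be trained.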
Procedia PDF Downloads 74
17086 Numerical Simulation of the Bond Behavior Between Concrete and Steel Reinforcing Bars in Specialty Concrete
Authors: Camille A. Issa, Omar Masri
Abstract:
In this study, the commercial finite element software Abaqus was used to develop a three-dimensional nonlinear finite element model capable of simulating the pull-out test of reinforcing bars from underwater concrete. The results of thirty-two pull-out tests with different parameters were implemented in the software to study the effects of the concrete cover, the bar size, the use of stirrups, and the compressive strength of the concrete. The interaction properties used in the model provided accurate results in comparison with the experimental bond-slip results; thus, the model successfully simulated the pull-out test. The results of the finite element model are used to better understand and visualize the distribution of stresses in each component of the model, and to study the effects of the various parameters considered, including the role of the stirrups in preventing the stress from reaching the sides of the specimens.
Keywords: pull-out test, bond strength, underwater concrete, nonlinear finite element analysis, Abaqus
Procedia PDF Downloads 442
17085 The Effect of 12-Week Pilates Training on Flexibility and Level of Perceived Exertion of Back Muscles among Karate Players
Authors: Seyedeh Nahal Sadiri, Ardalan Shariat
Abstract:
Developing flexibility through Pilates can benefit karate players by reducing the stiffness of muscles and tendons. This study aimed to determine the effects of 12 weeks of Pilates training on flexibility and the level of perceived exertion of the back muscles among karate players. In this experimental study, 29 male karate players (age: 16-18 years) were randomized to Pilates (n=15) and control (n=14) groups, with assessments at baseline and after the 12-week intervention. Both groups completed 12 weeks of intervention (2 hours of training, 3 times weekly). The experimental group performed 30 minutes of Pilates within their warm-up and preparation phase, while the control group only attended their usual karate training. A digital backward flexmeter was used to evaluate trunk extensor flexibility, and a digital forward flexmeter to measure trunk flexor flexibility. The Borg CR-10 scale was also used to determine the perceived exertion of the back muscles. Independent-samples and paired-samples t-tests were used to analyze the data. After the 12-week intervention, there was a significant difference between the mean scores of the experimental and control groups in backward trunk flexibility (P < 0.05) and forward trunk flexibility (P < 0.05). The Borg CR-10 results also showed a significant improvement in the Pilates group (P < 0.05). Karate instructors, coaches, and athletes can integrate Pilates exercises with karate training to improve flexibility and the level of perceived exertion of the back muscles.
Keywords: pilates training, karate players, flexibility, Borg CR-10
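The between-group comparison described above can be illustrated with a pooled-variance independent-samples t-test computed from first principles; the flexibility scores below are invented for illustration and are not the study's data:

```python
import math

# Independent-samples t-test sketch (pooled variance) for comparing
# post-intervention trunk flexibility between two groups.  The scores
# are invented for illustration only.
pilates = [34.1, 36.5, 35.2, 37.0, 33.8, 36.1, 35.5, 34.9]   # cm
control = [30.2, 31.5, 29.8, 30.9, 31.1, 29.5, 30.4, 30.7]   # cm

def t_independent(a, b):
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)    # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)   # pooled variance
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2    # t statistic and degrees of freedom

t_stat, df = t_independent(pilates, control)
```

The resulting t statistic is compared against the critical value for the computed degrees of freedom at the chosen significance level (e.g., α = 0.05, two-tailed).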
Procedia PDF Downloads 165
17084 Revolutionizing Oil Palm Replanting: Geospatial Terrace Design for High-precision Ground Implementation Compared to Conventional Methods
Authors: Nursuhaili Najwa Masrol, Nur Hafizah Mohammed, Nur Nadhirah Rusyda Rosnan, Vijaya Subramaniam, Sim Choon Cheak
Abstract:
Replanting in oil palm cultivation is vital: it enables the introduction of new planting materials and provides an opportunity to improve the road network, drainage, terrace design, and planting density. Oil palm replanting is fundamentally necessary every 25 years. The adoption of a digital replanting blueprint is imperative, as it can assist the Malaysian oil palm industry in addressing challenges such as labour shortages and limited replanting expertise. Effective replanting planning should commence at least 6 months prior to the actual replanting process. Therefore, this study helps to plan and design a replanting blueprint that can be translated to the ground with high precision. With the advancement of geospatial technology, it is now feasible to engage in thoroughly researched planning, which can help maximize the potential yield. A blueprint designed before replanting enhances management's ability to optimize the planting program, address manpower issues, and increase productivity. In terrace-planting blueprints, geographic tools were utilized to design the roads, drainages, terraces, and planting points based on the ARM standards. These designs are mapped with location information and undergo statistical analysis. The geospatial approach is essential in precision agriculture and in ensuring an accurate translation of the design to the ground by implementing high-accuracy technologies. In this study, geospatial and remote sensing technologies played a vital role: LiDAR data was employed to derive the Digital Elevation Model (DEM), enabling the precise selection of terraces, while ortho imagery was used for validation. Throughout the design process, Geographical Information System (GIS) tools were extensively utilized.
To assess the design's reliability on the ground compared with the current conventional method, high-precision GPS instruments such as the EOS Arrow Gold and HIPER VR GNSS were used, both offering accuracy levels between 0.3 cm and 0.5 cm. A nearest-distance analysis was performed to compare the design with the actual planting on the ground. The analysis could not be applied to the roads because of discrepancies between the actual roads and the blueprint design, which resulted in minimal variance. In contrast, the terraces closely adhered to the GPS markings, with the largest variance distance being less than 0.5 meters relative to the terraces actually constructed. Considering the required slope for terrace planting, which must be greater than 6 degrees, the study found that approximately 65% of the terracing was constructed on 12-degree slopes, and over 50% on slopes exceeding the minimum. Blueprint replanting offers a promising strategy for optimizing land utilization in agriculture: it harnesses technology and meticulous planning to yield advantages including increased efficiency, enhanced sustainability, and cost reduction. The practical implementation of this technique can lead to tangible and significant improvements in the agricultural sector. To boost efficiency further, future initiatives will require more sophisticated techniques and the incorporation of precision GPS devices in upcoming blueprint replanting projects; this strategic progression aims to guarantee the precision of both the blueprint design stages and their subsequent implementation in the field. Looking ahead, automating digital blueprints is necessary to reduce time, workforce, and costs in commercial production.
Keywords: replanting, geospatial, precision agriculture, blueprint
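The nearest-distance analysis can be sketched as follows; the blueprint and as-built coordinates below are invented planar points in meters, whereas the study used survey-grade GNSS positions:

```python
import math

# Nearest-distance sketch: for each as-built GPS point, find the distance
# to the closest blueprint (design) point.  Coordinates are invented
# planar meters for illustration only.
design  = [(0.0, 0.0), (9.0, 0.0), (18.0, 0.0), (0.0, 8.0), (9.0, 8.0)]
asbuilt = [(0.2, 0.1), (8.9, 0.3), (18.4, -0.1), (0.1, 8.2), (9.3, 7.8)]

def nearest_distance(p, points):
    # Euclidean distance from p to its nearest neighbor in points
    return min(math.dist(p, q) for q in points)

deviations = [nearest_distance(p, design) for p in asbuilt]
max_dev = max(deviations)
mean_dev = sum(deviations) / len(deviations)
```

A maximum deviation below 0.5 m would correspond to the level of blueprint-to-ground agreement the study reports for the constructed terraces.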
Procedia PDF Downloads 83
17083 Development of Star Image Simulator for Star Tracker Algorithm Validation
Authors: Zoubida Mahi
Abstract:
A successful satellite mission requires a reliable attitude and orbit control system to command, control, and position the satellite in appropriate orbits. Several sensors are used for attitude control, such as magnetic sensors, earth sensors, horizon sensors, gyroscopes, and solar sensors. The star tracker is the most accurate of these sensors and can offer high-accuracy attitude control without the need for prior attitude information. There are mainly three approaches in star sensor research: digital simulation, hardware-in-the-loop simulation, and field tests of star observation. In the digital simulation approach, all of the processes are done in software, including star image simulation. Hence, it is necessary to develop star image simulation software that can simulate real space environments and various star sensor configurations. In this paper, we present a new stellar image simulation tool used to test and validate star sensor algorithms; the developed tool allows the simulation of stellar images with several types of noise, such as background noise, Gaussian noise, Poisson noise, and multiplicative noise, and with several scenarios that occur in space, such as the presence of the moon, optical system problems, illumination effects, and false objects. We also present a new star extraction algorithm based on a new centroid calculation method. We compared our algorithm with other star extraction algorithms from the literature, and the results obtained demonstrate the star extraction capability of the proposed algorithm.
Keywords: star tracker, star simulation, star detection, centroid, noise, scenario
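The classical baseline for star extraction is the intensity-weighted (center-of-mass) centroid, sketched below; the paper's own centroid method is not described in the abstract, so this shows only the standard estimator it would be compared against:

```python
# Intensity-weighted (center-of-mass) centroid sketch for star extraction:
# the sub-pixel star position is the intensity-weighted mean of the pixel
# coordinates inside a detected spot.
def centroid(image):
    """image: 2-D list of pixel intensities; returns (row, col) centroid."""
    total = sum(sum(row) for row in image)
    r_c = sum(i * v for i, row in enumerate(image) for v in row) / total
    c_c = sum(j * v for row in image for j, v in enumerate(row)) / total
    return r_c, c_c

# a small symmetric "star" spot whose true center is pixel (1, 1)
spot = [[0, 1, 0],
        [1, 8, 1],
        [0, 1, 0]]
rc, cc = centroid(spot)
```

Because the estimator weights coordinates by intensity, background and Poisson noise of the kinds the simulator injects bias the centroid, which is what motivates improved centroiding methods.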
Procedia PDF Downloads 96
17082 Soil-Structure Interaction Models for the Reinforced Foundation System – A State-of-the-Art Review
Authors: Ashwini V. Chavan, Sukhanand S. Bhosale
Abstract:
Challenges posed by weak soil subgrades are often resolved either by stabilizing the soil or by reinforcing it. It is also common practice to reinforce a granular fill to improve the load-settlement behavior of the system over weak soil strata. The inclusion of reinforcement in the engineered granular fill provided a new impetus for the development of enhanced Soil-Structure Interaction (SSI) models, also known as mechanical foundation models or lumped parameter models. Several researchers have worked in this direction to understand the mechanism of granular fill-reinforcement interaction and the response of weak soil under applied load. These models were developed by extending available SSI models such as the Winkler, Pasternak, Hetenyi, and Kerr models, and they help visualize the load-settlement behavior of a physical system through 1-D and 2-D analyses considering a beam and a plate resting on the foundation, respectively. Based on the literature survey, these models are categorized as the 'Reinforced Pasternak Model,' 'Double Beam Model,' 'Reinforced Timoshenko Beam Model,' and 'Reinforced Kerr Model.' The present work reviews the past 30+ years of research in the field of SSI models for reinforced foundation systems, presenting the conceptual development of these models systematically and discussing their limitations. Special effort is taken to tabulate the parameters and their significance in the load-settlement analysis, which may be helpful in future studies for the comparison and enhancement of the results and findings of physical models.
Keywords: geosynthetics, mathematical modeling, reinforced foundation, soil-structure interaction, ground improvement, soft soil
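As a concrete instance of the simplest model in this family, the closed-form deflection of an infinite beam resting on a Winkler foundation under a point load can be sketched; the parameter values below are illustrative only, not taken from any reviewed study:

```python
import math

# Classical Winkler-foundation result: deflection of an infinite beam on
# an elastic (Winkler) foundation under a point load P,
#   w(x) = (P*lam / (2*k)) * exp(-lam*|x|) * (cos(lam*|x|) + sin(lam*|x|)),
#   lam  = (k / (4*EI))**0.25   (characteristic of the beam-soil system).
# Parameter values are illustrative assumptions.
EI = 2.0e4     # flexural rigidity of the beam, kN*m^2
k = 5.0e3      # subgrade (Winkler spring) modulus, kN/m^2
P = 100.0      # point load, kN
lam = (k / (4.0 * EI)) ** 0.25

def deflection(x):
    ax = lam * abs(x)
    return (P * lam) / (2.0 * k) * math.exp(-ax) * (math.cos(ax) + math.sin(ax))

w0 = deflection(0.0)   # peak deflection directly under the load
```

The reinforced models reviewed above generalize exactly this setup by adding shear layers, membrane (reinforcement) terms, or a second beam on top of the Winkler springs.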
Procedia PDF Downloads 123
17081 Simulation and Controller Tunning in a Photo-Bioreactor Applying by Taguchi Method
Authors: Hosein Ghahremani, MohammadReza Khoshchehre, Pejman Hakemi
Abstract:
This study involves numerical simulations of a vertical plate-type photo-bioreactor to investigate the performance of the microalga Spirulina, with the parameters of the digital controller tuned and optimized by the Taguchi method using MATLAB software and Qualitek-4. Control faces many challenges because, in addition to parameters such as temperature, dissolved carbon dioxide, and biomass, new physical parameters such as light intensity and physiological conditions such as photosynthetic efficiency and light inhibitors are involved in the biological processes. Photo-bioreactors are efficient systems that not only facilitate the commercial production of microalgae as feed for aquaculture and as food supplements, but are also used as a possible platform for the production of active molecules such as antibiotics or innovative anti-tumor agents, for carbon dioxide removal, and for the removal of heavy metals from wastewater. A digital controller is designed for controlling the light of the bioreactor, and the microalgae growth rate and carbon dioxide concentration inside the bioreactor are investigated. The optimal values of the controller parameters obtained from S/N and ANOVA analysis in the Qualitek-4 software were compared with those obtained by the reaction curve, Cohen-Coon, and Ziegler-Nichols methods. Based on the sum of squared errors obtained for each of the control methods mentioned, the Taguchi method was selected as the best method for controlling the light intensity of the photo-bioreactor. Compared with the other control methods, it showed higher stability and a shorter response time.
Keywords: photo-bioreactor, control and optimization, light intensity, Taguchi method
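The Taguchi comparison rests on the signal-to-noise ratio; for a "smaller-is-better" response such as control error, it is S/N = -10·log10(mean(y²)), so a higher S/N means a better (smaller) response. A sketch with invented error values (not the study's Qualitek-4 runs):

```python
import math

# Taguchi "smaller-is-better" signal-to-noise ratio,
#   S/N = -10 * log10( mean(y_i^2) ),
# applied to control-error measurements.  The error values are invented
# for illustration, not the study's experimental runs.
def sn_smaller_is_better(values):
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

taguchi_err = [0.8, 0.9, 0.7]   # hypothetical errors, Taguchi-tuned controller
ziegler_err = [1.6, 1.9, 1.7]   # hypothetical errors, Ziegler-Nichols tuning

sn_taguchi = sn_smaller_is_better(taguchi_err)
sn_ziegler = sn_smaller_is_better(ziegler_err)
```

In a Taguchi study, the controller settings that maximize this S/N across the orthogonal-array runs are selected as the optimal parameter levels.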
Procedia PDF Downloads 394
17080 Concurrent Validity of Synchronous Tele-Audiology Hearing Screening
Authors: Thidilweli Denga, Bessie Malila, Lucretia Petersen
Abstract:
The Coronavirus Disease of 2019 (COVID-19) pandemic should be taken as a wake-up call on the importance of hearing health care, considering, among other things, the electronic methods of communication now used. The World Health Organization (WHO) estimates that by 2050, more than 2.5 billion people will be living with hearing loss; these numbers show that more people will need rehabilitation services. Studies have shown that most people living with hearing loss reside in low- and middle-income countries (LMIC). Innovative technological solutions, such as digital health interventions that can deliver hearing health services to remote areas, now exist, and tele-audiology implementation can potentially bring hearing-loss services to rural and remote areas. This study aimed to establish the concurrent validity of tele-audiology practice in school-based hearing screening. The study employed a cross-sectional design with a within-group comparison. The portable KUDUwave audiometer was used to conduct hearing screening on 50 participants (n=50). In phase I of the study, the audiologist conducted on-site hearing screening, while the synchronous remote hearing screening (tele-audiology) using a 5G network was done in phase II. On-site hearing screening results were obtained for the first 25 participants (aged between 5 and 6 years); the second half started with the synchronous tele-audiology model to avoid order effects. Paired-sample t-tests compared the threshold results obtained in the left and right ears for on-site and remote screening. There was good correspondence between the two methods, with threshold averages within ±5 dB (decibels). The synchronous tele-audiology model has the potential to reduce audiologists' case overload while reaching populations that lack access because of distance and the shortage of hearing professionals in their areas. With reliable broadband connectivity, tele-audiology delivers the same service quality as the conventional method while reducing audiologists' travel costs.
Keywords: hearing screening, low-resource communities, portable audiometer, tele-audiology
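The ±5 dB agreement check between on-site and remote thresholds can be sketched per ear as follows; the threshold values below are invented for illustration, not data from the KUDUwave screenings:

```python
# Within-subject agreement sketch between on-site and remote hearing
# thresholds for one ear: per-subject differences, their mean, and the
# largest absolute difference.  Threshold values are invented.
onsite = [20, 25, 15, 20, 30, 25, 20, 15]   # dB HL, on-site screening
remote = [20, 25, 20, 20, 25, 25, 20, 20]   # dB HL, synchronous remote screening

diffs = [r - o for o, r in zip(onsite, remote)]
mean_diff = sum(diffs) / len(diffs)
max_abs_diff = max(abs(d) for d in diffs)
```

A mean difference near zero with every per-subject difference inside ±5 dB is the kind of correspondence the study reports between the two screening modes.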
Procedia PDF Downloads 116