Search results for: autoregressive integrated moving average model selection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 23013

16473 Fully Autonomous Vertical Farm to Increase Crop Production

Authors: Simone Cinquemani, Lorenzo Mantovani, Aleksander Dabek

Abstract:

New technologies in agriculture are opening new challenges and new opportunities. Among these, robotics, vision, and artificial intelligence are certainly the ones that will make a significant leap beyond traditional agricultural techniques possible. In particular, the indoor farming sector will be the one that benefits most from these solutions. Vertical farming is a new field of research where mechanical engineering can bring knowledge and know-how to transform a highly labor-based business into a fully autonomous system. The aim of the research is to develop a multi-purpose, modular, and perfectly integrated platform for crop production in indoor vertical farming. Activities will be based both on hardware development, such as automatic tools to perform different operations on soil and plants, and on research to introduce extensive use of monitoring techniques based on machine learning algorithms. This paper presents the preliminary results of a research project on a vertical farm living lab designed to (i) develop and test vertical farming cultivation practices, (ii) introduce a very high degree of mechanization and automation that makes all processes replicable, fully measurable, standardized and automated, (iii) develop a coordinated control and management environment for autonomous multiplatform or tele-operated robots with the aim of carrying out complex tasks in the presence of environmental and cultivation constraints, and (iv) integrate AI-based algorithms as a decision support system to improve production quality. The coordinated management of multiplatform systems still presents innumerable challenges that require a strongly multidisciplinary approach right from the design, development, and implementation phases. The methodology is based on (i) the development of models capable of describing the dynamics of the various platforms and their interactions, (ii) the integrated design of mechatronic systems able to respond to the needs of the context and to exploit the strengths highlighted by the models, and (iii) implementation and experimental tests performed to assess the real effectiveness of the systems created and evaluate any weaknesses so as to proceed with targeted development. To these ends, a fully automated laboratory for growing plants in vertical farming has been developed and tested. The living lab makes extensive use of sensors to determine the overall state of the structure, crops, and systems used. The possibility of having specific measurements for each element involved in the cultivation process makes it possible to evaluate the effects of each variable of interest and allows for the creation of a robust model of the system as a whole. The automation of the laboratory is completed with the use of robots to carry out all the necessary operations, from sowing to handling to harvesting. These systems work synergistically thanks to detailed models developed from the information collected, which deepens the knowledge of these types of crops and guarantees the possibility of tracing every action performed on each individual plant. To this end, artificial intelligence algorithms have been developed to allow synergistic operation of all systems.

Keywords: automation, vertical farming, robot, artificial intelligence, vision, control

Procedia PDF Downloads 21
16472 Fiber Based Pushover Analysis of Reinforced Concrete Frame

Authors: Shewangizaw Tesfaye Wolde

Abstract:

The engineering community has developed a method called performance-based seismic design, in which structures are designed for predefined performance levels set by the parties. Since structures are designed economically, they go beyond their elastic limit under the maximum actions expected during their life, which calls for nonlinear analysis. In this paper, conventional pushover analysis (nonlinear static analysis) is used for the performance assessment of a case-study Reinforced Concrete (RC) frame building located in Addis Ababa City, Ethiopia, where the peak ground acceleration proposed by the RADIUS 1999 project and others is more than twice that of EBCS-8:1995, by taking the critical planar frame. A fiber beam-column model is used to capture material nonlinearity with the tension stiffening effect. The reliability of the fiber model and the validation of the software outputs are checked in the verification chapter. The aim of this paper is therefore to propose a way to assess the structural performance of existing reinforced concrete frame buildings as well as to check designs.

Keywords: seismic, performance, fiber model, tension stiffening, reinforced concrete

Procedia PDF Downloads 58
16471 Study of the Polymer Elastic Behavior in the Displacement Oil Drops at Pore Scale

Authors: Luis Prada, Jose Gomez, Arlex Chaves, Julio Pedraza

Abstract:

Polymeric liquids have been used in the oil industry, especially in enhanced oil recovery (EOR). From the rheological point of view, polymers have the particularity of being viscoelastic liquids. One of the most common and useful models to describe that behavior is the Upper Convected Maxwell (UCM) model. The main characteristic of the polymers used in EOR processes is the increase in viscosity, which pushes the oil out of the reservoir. The elasticity could contribute to dragging out the oil that stays in the reservoir. Studying the elastic effect on the oil drop at the pore scale helps explain whether the addition of an elastic force could mobilize the oil. This research explores whether the contraction and expansion of the polymer at the pore scale may increase the elastic behavior of this kind of fluid. For that reason, this work simplified the pore geometry and built two simple geometries with micrometer lengths. Using source terms implemented through user-defined functions, this work introduces the UCM model into the ANSYS Fluent simulator with the purpose of evaluating the elastic effect of the polymer in a contraction and expansion geometry. Also, using the Eulerian multiphase model, this research considers the possibility that the extra elastic force will show a deformation effect on the oil; for that reason, this work considers an oil drop on the upper wall of the geometry. Finally, all the simulations show that, under pore-scale conditions, extra vortices exist with the UCM model, but it is not possible to deform the oil completely and push it out of the restrictions; this research also finds the conditions for oil displacement.
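
For reference, a standard textbook statement of the UCM constitutive relation that such simulations rely on is given below, in generic notation (this is the general form of the model, not the authors' specific user-defined-function variables):

```latex
\boldsymbol{\tau} + \lambda\,\overset{\nabla}{\boldsymbol{\tau}} = 2\,\eta_p\,\mathbf{D},
\qquad
\overset{\nabla}{\boldsymbol{\tau}}
  = \frac{\partial \boldsymbol{\tau}}{\partial t}
  + \mathbf{u}\cdot\nabla\boldsymbol{\tau}
  - \mathbf{L}\cdot\boldsymbol{\tau}
  - \boldsymbol{\tau}\cdot\mathbf{L}^{\mathsf{T}},
\qquad
\mathbf{D} = \tfrac{1}{2}\left(\mathbf{L} + \mathbf{L}^{\mathsf{T}}\right),
\quad L_{ij} = \partial u_i / \partial x_j,
```

where tau is the polymeric extra-stress tensor, lambda the relaxation time, eta_p the polymer viscosity, and the overset nabla denotes the upper-convected derivative.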

Keywords: ANSYS Fluent, interfacial fluid mechanics, polymers, pore scale, viscoelasticity

Procedia PDF Downloads 119
16470 An Assessment of the Socio-Economic Impacts of Smallholder Eucalyptus Tree Plantation: The Case of Northwest Ethiopia

Authors: Mersha Tewodros Getnet, Mengistu Ketema, Bamlaku Alemu, Girma Demilew

Abstract:

The availability of forest products determines the possibilities for forest-based livelihood options. Plantation forestry is a widespread economic activity in the highland areas of the Amhara regional state, owing primarily to degradation and limited access to natural forests. As a result, tree plantation has become one of the rural livelihood options in the area. Therefore, given the increasing importance of smallholder plantations in the highland areas of the Amhara Regional State, the aim of this research was to evaluate the extent of smallholder plantations and their socio-economic impact. To address this objective, a sequential embedded mixed research design was employed. Qualitative and quantitative information was gathered from both primary and secondary sources. Primary data were collected from 385 sample households, which were chosen using a three-stage sampling procedure with the sample size determined by the Cochran formula. Both descriptive and inferential statistics were used to analyze the data. Smallholder eucalyptus plantations in the study area were found to be common, and they are now part of the livelihood portfolio for meeting both household wood consumption and cash income generation. According to the ATT results of the PSM model, income from selling farm forest products clearly contributes more to total household income, farm expenditure per cultivated land, and education spending among planter households than among non-planter households. Consequently, the government should strengthen plantation practices by prioritizing specific intervention areas while implementing measures to counteract the plantation's inequality-increasing effect through a variety of means, including progressive taxation.
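
For orientation, the quantity the propensity score matching (PSM) model estimates is the standard average treatment effect on the treated (ATT), written here in generic notation rather than the authors':

```latex
\mathrm{ATT} = \mathbb{E}\left[\, Y(1) - Y(0) \mid D = 1 \,\right],
\qquad
\widehat{\mathrm{ATT}} = \frac{1}{N_1} \sum_{i:\, D_i = 1}
\Big( Y_i - \sum_{j:\, D_j = 0} w_{ij}\, Y_j \Big),
```

where D = 1 indicates planter households, Y is the outcome of interest (e.g., total household income), the weights w_ij come from matching on the estimated propensity score p(x) = Pr(D = 1 | X = x), and N_1 is the number of treated households.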

Keywords: smallholder plantation, Eucalyptus, propensity score matching, average treatment effect, income

Procedia PDF Downloads 116
16469 Control Algorithm for Home Automation Systems

Authors: Marek Długosz, Paweł Skruch

Abstract:

One of the purposes of home automation systems is to provide appropriate comfort to the users through suitable air temperature control and stabilization inside the rooms. The control of the temperature level is not a simple task, and the basic difficulty results from the fact that accurate parameters of the object of control, that is, the building, remain unknown. Whereas the structure of the model is known, the identification of the model parameters is a difficult task. In this paper, a control algorithm is presented that allows the preset temperature to be reached inside the building within the specified time without the need to know accurate parameters of the building itself.

Keywords: control, home automation system, wireless networking, automation engineering

Procedia PDF Downloads 599
16468 Integrating Explicit Instruction and Problem-Solving Approaches for Efficient Learning

Authors: Slava Kalyuga

Abstract:

There are two opposing major points of view on the optimal degree of initial instructional guidance, usually discussed in the literature by the advocates of the corresponding learning approaches. Using unguided or minimally guided problem-solving tasks prior to explicit instruction has been suggested by productive failure and several other instructional theories, whereas an alternative approach - using fully guided worked examples followed by problem solving - has been demonstrated as the most effective strategy within the framework of cognitive load theory. An integrated approach discussed in this paper could combine the above frameworks within a broader theoretical perspective that would allow bringing together their best features and advantages in the design of learning tasks for STEM education. This paper presents a systematic review of the available empirical studies comparing the above alternative sequences of instructional methods to explore the effects of several possible moderating factors. The paper concludes that different approaches and instructional sequences should coexist within complex learning environments. Selecting optimal sequences depends on such factors as the specific goals of learner activities, types of knowledge to learn, levels of element interactivity (task complexity), and levels of learner prior knowledge. This paper offers an outline of a theoretical framework for the design of complex learning tasks in STEM education that would integrate explicit instruction and inquiry (exploratory, discovery) learning approaches in ways that depend on a set of defined specific factors.

Keywords: cognitive load, explicit instruction, exploratory learning, worked examples

Procedia PDF Downloads 112
16467 A Conceptual Model of the 'Driver – Highly Automated Vehicle' System

Authors: V. A. Dubovsky, V. V. Savchenko, A. A. Baryskevich

Abstract:

The current trend in the automotive industry towards automated vehicles is creating new challenges related to human factors. This occurs because the driver is increasingly relieved of the need to be constantly involved in driving the vehicle, which can negatively impact his/her situation awareness when manual control is required and degrade driving skills and abilities. These new problems need to be studied in order to ensure road safety during the transition towards self-driving vehicles. For this purpose, it is important to develop an appropriate conceptual model of the interaction between the driver and the automated vehicle, which could serve as a theoretical basis for the development of mathematical and simulation models to explore different aspects of driver behaviour in different road situations. Well-known driver behaviour models describe the impact of different stages of the driver's cognitive process on driving performance but do not describe how the driver controls and adjusts his actions. A more complete description of the driver's cognitive process, including the evaluation of the results of his/her actions, will make it possible to model various aspects of the human factor in different road situations more accurately. This paper presents a conceptual model of the 'driver – highly automated vehicle' system based on P. K. Anokhin's theory of functional systems, which is a theoretical framework for describing internal processes in purposeful living systems based on such notions as the goal and the desired and actual results of purposeful activity. A central feature of the proposed model is a dynamic coupling mechanism between the driver's decision to perform a particular action and the changes of road conditions due to the driver's actions. This mechanism is based on the stage-by-stage evaluation of the deviations of the actual values of the driver's action results from the expected values. The overall functional structure of the highly automated vehicle in the proposed model includes a driver/vehicle/environment state analyzer to coordinate the interaction between driver and vehicle. The proposed conceptual model can be used as a framework to investigate different aspects of human factors in transitions between automated and manual driving for future improvements in driving safety, and for understanding how the driver-vehicle interface must be designed for comfort and safety. A major finding of this study is the demonstration that the theory of functional systems is promising and has the potential to describe the interaction of the driver with the vehicle and the environment.

Keywords: automated vehicle, driver behavior, human factors, human-machine system

Procedia PDF Downloads 128
16466 Liquid Unloading of Wells with Scaled Perforation via Batch Foamers

Authors: Erwin Chan, Aravind Subramaniyan, Siti Abdullah Fatehah, Steve Lian Kuling

Abstract:

Foam-assisted lift technology is proven across the industry to provide efficient deliquification in gas wells. Such deliquification is typically achieved by delivering the foamer chemical downhole via capillary strings. In highly liquid-loaded wells where capillary strings are not readily available, foamer can be delivered via batch injection or bull-heading. These techniques differ from capillary-string delivery in that cap strings allow liquid to be unloaded continuously, whereas foamer batches require periodic batching for the liquid to be unloaded. Although batch injection allows liquid to be unloaded in wells with a suitable water-to-gas ratio (WGR) and condensate-to-gas ratio (CGR) without well intervention for capillary string installation, this technique comes with its own set of challenges - for foamer to de-liquify the well, the chemical needs to reach the perforation locations where gas bubbling is observed. In wells with highly scaled perforation zones, foamer delivered in batches is unable to reach the gas bubbling zone, thus achieving poor lift efficiency. This paper discusses the techniques and challenges of unloading liquid via batch injection in scaled perforation wells X and Y, whose WGR is 6 bbl/MMscf, whose scale build-up is observed at the bottom of the perforation interval, whose water column is 400 feet, and whose 'bubbling zone' is less than 100 feet. Variables such as foamer Z dosage, batching technique, and well flow control valve opening times were manipulated over the course of the trial to achieve maximum liquid unloading and gas rates. During the field trial, the team found optimal values for the three aforementioned parameters for the best unloading results, in which each cycle's gas and liquid rates are compared with baselines at similar flowing tubing head pressures (FTHP). It was discovered that, amongst other factors, a good agitation technique is a primary determinant of efficient liquid unloading. An average increment of 2 MMscf/d against an average production of 4 MMscf/d at stable FTHP was recorded during the trial.

Keywords: foam, foamer, gas lift, liquid unloading, scale, batch injection

Procedia PDF Downloads 168
16465 Estimating the Traffic Impacts of Green Light Optimal Speed Advisory Systems Using Microsimulation

Authors: C. B. Masera, M. Imprialou, L. Budd, C. Morton

Abstract:

Even though signalised intersections are necessary for urban road traffic management, they can act as bottlenecks and disrupt traffic operations. Interrupted traffic flow causes congestion, delays, stop-and-go conditions (i.e. excessive acceleration/deceleration) and longer journey times. Vehicle and infrastructure connectivity offers the potential to provide improved new services with additional functions of assisting drivers. This paper focuses on one application of vehicle-to-infrastructure communication, namely Green Light Optimal Speed Advisory (GLOSA). To assess the effectiveness of GLOSA in the urban road network, an integrated microscopic traffic simulation framework is built in VISSIM software. Vehicle movements and vehicle-infrastructure communications are simulated through the External Driver Model interface. A control algorithm is developed for recommending an optimal speed that is continuously updated at every time step for all vehicles approaching a signal-controlled point. This algorithm allows vehicles to pass a traffic signal without stopping or to minimise stopping times at a red phase. This study is performed with all vehicles connected, i.e. at a 100% penetration rate. Conventional vehicles are also simulated in the same network as a reference. A straight road segment composed of two opposite directions with two traffic lights per lane is studied. The simulation is implemented under traffic volumes of 150 and 200 vehicles per hour per lane to identify how different traffic densities influence the benefits of GLOSA. The results indicate that traffic flow is improved by the application of GLOSA. According to this study, vehicles passed through the traffic lights more smoothly, and waiting times were reduced by up to 28 seconds. Average delays decreased for the entire network by 86.46% and 83.84% under traffic densities of 150 vehicles per hour per lane and 200 vehicles per hour per lane, respectively.
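
As a rough illustration of this type of speed advisory logic, the sketch below computes a recommended approach speed from the distance to the stop line and the next green window; the function name, speed bounds, and fallback behaviour are assumptions for illustration, not the study's VISSIM External Driver Model implementation.

```python
def glosa_advisory_speed(distance_m, green_start_s, green_end_s,
                         v_min=5.0, v_max=13.9):
    """Recommend a speed (m/s) so the vehicle reaches the stop line during
    the green window [green_start_s, green_end_s] (seconds from now).

    Sketch logic: arrive as early as possible within the window, clipped to
    the allowed speed range; fall back to v_min if no feasible speed avoids a stop."""
    if green_start_s <= 0 < green_end_s:
        # Signal is already green: keep the maximum allowed speed if that
        # still gets the vehicle to the stop line before the green ends.
        if distance_m / v_max <= green_end_s:
            return v_max
    if green_start_s > 0:
        # Speed needed to arrive exactly at the start of the next green.
        v_target = distance_m / green_start_s
        if v_min <= v_target <= v_max:
            return v_target
        if v_target > v_max and distance_m / v_max <= green_end_s:
            return v_max          # arriving shortly after the green starts is fine
    return v_min                  # no feasible speed: slow down, a stop is likely


# Example: 250 m from the stop line, green starts in 20 s and lasts 15 s.
print(round(glosa_advisory_speed(250.0, 20.0, 35.0), 1))  # -> 12.5 m/s
```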

Keywords: connected vehicles, GLOSA, intelligent transport systems, vehicle-to-infrastructure communication

Procedia PDF Downloads 152
16464 Modeling of Strong Motion Generation Areas of the 2011 Tohoku, Japan Earthquake Using Modified Semi-Empirical Technique Incorporating Frequency Dependent Radiation Pattern Model

Authors: Sandeep, A. Joshi, Kamal, Piu Dhibar, Parveen Kumar

Abstract:

In the present work, strong ground motion has been simulated using a modified semi-empirical technique (MSET) with a frequency-dependent radiation pattern model. Joshi et al. (2014) modified the semi-empirical technique to incorporate the modeling of strong motion generation areas (SMGAs). A frequency-dependent radiation pattern model is applied to simulate high-frequency ground motion more precisely. The identified SMGAs (Kurahashi and Irikura 2012) of the 2011 Tohoku earthquake (Mw 9.0) were modeled using this modified technique. Records were simulated for both frequency-dependent and constant radiation pattern functions. Simulated records for both cases are compared with observed records in terms of peak ground acceleration and pseudo-acceleration response spectra at different stations. Comparison of simulated and observed records in terms of root mean square error suggests that the method is capable of simulating records that match over a wide frequency range for this earthquake and bear a realistic appearance in terms of shape and strong motion parameters. The results confirm the efficacy and suitability of the rupture model defined by five SMGAs for the developed modified technique.

Keywords: strong ground motion, semi-empirical, strong motion generation area, frequency dependent radiation pattern, 2011 Tohoku Earthquake

Procedia PDF Downloads 520
16463 Concentration Conditions of Industrially Valuable Accumulations of Gold Ore Mineralization of the Tulallar Ore-Bearing Structure

Authors: Narmina Ismayilova, Shamil Zabitov, Fuad Askerzadeh, Raqif Seyfullayev

Abstract:

The Tulallar volcano-tectonic structure is located in the conjugation zone of the Gekgel horst-uplift and the Dashkesan and Agzhakend synclinoria. Regionally, these geological structures are an integral part of the Lok-Karabakh island arc system. The Tulallar field is represented by three areas (Central, East, West). The ore field is located within a partially eroded oblong volcano-tectonic depression. In the central part, the core is divided by the deep Tulallar-Chiragdara-Toganalinsky fault, together with arcuate fragments of the ring structure, into three blocks - East, Central, and West - within which the corresponding areas of the Tulallar field are located. In general, for the deposit, the position of both the ore-bearing vein zones and the ore-bearing blocks is controlled by fractures of two systems - of sub-latitudinal and near-meridional orientation. Mineralization of gold-sulfide ores is confined to these zones of disturbance. The zones have northwestern and northeastern (near-meridional) strikes with a steep dip (70-85°) to the southwest and southeast. The average thickness of the zones is 35 m; they are traced for 2.5 km along the strike and 500 m along the dip. In general, over the indicated thickness, the zones contain an average of 1.56 ppm Au; however, areas enriched in noble metal are distinguished within them. The zones are complicated by post-ore fault tectonics. Gold mineralization is localized in the Kimmeridgian volcanics of andesite-basalt porphyritic composition and their vitrolithoclastic and agglomerate tuffs and tuff breccias. For the central part of the Tulallar ore field, a map of geochemical anomalies was built on the basis of analyses carried out in an international laboratory. The total gold content ranges from 0.1 to 5 g/t, and in some places even exceeds 5 g/t. The highest gold content is observed in the monoquartz facies among the secondary quartzites with quartz veins. The lowest gold content appears in the quartz-kaolin facies. Anomalous values of gold content are also located in the upper part of the quartz veins. As a result, an en-echelon arrangement of anomalous gold values along the strike and dip was revealed.

Keywords: geochemical anomaly, gold deposit, mineralization, Tulallar

Procedia PDF Downloads 178
16462 Biophysical Consideration in the Interaction of Biological Cell Membranes with Virus Nanofilaments

Authors: Samaneh Farokhirad, Fatemeh Ahmadpoor

Abstract:

Biological membranes are constantly in contact with various filamentous soft nanostructures that either reside on their surface or are transported between the cell and its environment. In particular, viral infections are determined by the interaction of viruses (such as filoviruses) with cell membranes, membrane protein organization (such as cytoskeletal proteins and actin filament bundles) has been proposed to influence the mechanical properties of lipid membranes, and the adhesion of filamentous nanoparticles influences their delivery yield into target cells or tissues. The goal of this research is to integrate the rapidly increasing but still fragmented experimental observations on the adhesion and self-assembly of nanofilaments (including filoviruses, actin filaments, as well as natural and synthetic nanofilaments) on cell membranes into a general, rigorous, and unified knowledge framework. The global outbreak of the coronavirus disease in 2020, which has persisted for over three years, highlights the crucial role that nanofilament-based delivery systems play in human health. This work will unravel the role of a unique property of all cell membranes, namely flexoelectricity, and the significance of nanofilaments' flexibility in the adhesion and self-assembly of nanofilaments on cell membranes. This will be achieved utilizing a set of continuum mechanics, statistical mechanics, and molecular dynamics and Monte Carlo simulations. The findings will help address the societal need to understand the biophysical principles that govern the attachment of filoviruses and flexible nanofilaments onto living cells and provide guidance on the development of nanofilament-based vaccines for a range of diseases, including infectious diseases and cancer.

Keywords: virus nanofilaments, cell mechanics, computational biophysics, statistical mechanics

Procedia PDF Downloads 79
16461 Case Report: Opioid Sparing Anaesthesia with Dexmedetomidine in General Surgery

Authors: Shang Yee Chong

Abstract:

Perioperative pain is a complex mechanism activated by various nociceptive, neuropathic, and inflammatory pathways. Opioids have long been a mainstay for analgesia in this period, even as we continuously move towards a multimodal model to improve pain control while minimising side effects. Dexmedetomidine, a potent alpha-2 agonist, is a useful sedative and hypnotic agent. Its use in the intensive care unit has been well described, and it is increasingly used as an adjunct intraoperatively for its opioid-sparing effects and to decrease pain scores. We describe the case of a general surgical patient in whom minimal opioid use was required with dexmedetomidine. The patient was a 61-year-old Indian gentleman with a history of hyperlipidaemia and type 2 diabetes mellitus, presenting with rectal adenocarcinoma detected on colonoscopy. He was scheduled for a robotic ultra-low anterior resection. The patient was induced with intravenous fentanyl 75 mcg, propofol 160 mg and atracurium 40 mg. He was intubated conventionally and mechanically ventilated. Anaesthesia was maintained with inhalational desflurane, and anaesthetic depth was measured with the Masimo EEG SedLine brain function monitor. An initial intravenous dexmedetomidine bolus of 1 mcg/kg over 10 minutes was given prior to anaesthetic induction and, thereafter, an infusion of 0.2-0.4 mcg/kg/hr to the end of surgery. In addition, a bolus dose of intravenous lignocaine 1.5 mg/kg followed by an infusion at 1 mg/kg/hr throughout the surgery was administered. A total of 10 mmol of magnesium sulphate and intravenous paracetamol 1000 mg were also given for analgesia. There were no significant episodes of bradycardia or hypotension. A total of 650 mcg of intravenous phenylephrine was given throughout to maintain the patient's mean arterial pressure within 10-15 mmHg of baseline. The surgical time was 5 hours and 40 minutes. Postoperatively, the patient was reversed and extubated successfully. He was alert and comfortable, and pain scores were minimal in the immediate postoperative period in the postoperative recovery unit. Time to first analgesia was 4 hours postoperatively, when paracetamol 1 g was administered. This was given at strict 6-hourly intervals for 5 days post-surgery, along with celecoxib 200 mg BD as prescribed by the surgeon regardless of pain scores. Oral oxycodone was prescribed as a rescue analgesic for pain scores > 3/10, but the patient did not require any dose. Neither was there nausea nor vomiting. The patient was discharged on postoperative day 5. This case reinforces the use of dexmedetomidine as an adjunct in general surgery cases, highlighting its excellent opioid-sparing effects. During the patient's entire hospital stay, the only dose of opioid he received was 75 mcg of fentanyl at the time of anaesthetic induction. The patient suffered no opioid adverse effects such as nausea, vomiting or postoperative ileus, and pain scores varied from 0-2/10. However, an intravenous lignocaine infusion was also used in this instance, which would have helped improve pain scores. Paracetamol, lignocaine, and dexmedetomidine is thus an effective, opioid-sparing combination of multimodal analgesia for major abdominal surgery cases.

Keywords: analgesia, dexmedetomidine, general surgery, opioid sparing

Procedia PDF Downloads 119
16460 Vehicular Speed Detection Camera System Using Video Stream

Authors: C. A. Anser Pasha

Abstract:

In this paper, a new Vehicular Speed Detection Camera System (SDCS), applicable as an alternative to traditional radars with the same or even better accuracy, is presented. The real-time measurement and analysis of various traffic parameters such as speed and number of vehicles are increasingly required in traffic control and management. Image processing techniques are now considered an attractive and flexible method for automatic analysis and data collection in traffic engineering. Various algorithms based on image processing techniques have been applied to detect multiple vehicles and track them. The SDCS process can be divided into three successive phases. The first phase is object detection, which uses a hybrid algorithm combining an adaptive background subtraction technique with a three-frame differencing algorithm, which rectifies the major drawback of using only adaptive background subtraction. The second phase is object tracking, which consists of three successive operations - object segmentation, object labeling, and object center extraction. The object tracking operation takes into consideration the different possible scenarios of the moving object, such as simple tracking, the object leaving the scene, the object entering the scene, the object being crossed by another object, and one object leaving while another enters the scene. The third phase is speed calculation, in which the speed is calculated from the number of frames consumed by the object to pass through the scene.
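
The sketch below illustrates the hybrid detection idea of the first phase in OpenCV terms (adaptive background subtraction combined with three-frame differencing, followed by contour-based segmentation and centre extraction); it is an illustrative reconstruction rather than the SDCS implementation, and the video filename, thresholds, and area cutoff are assumptions.

```python
import cv2

cap = cv2.VideoCapture("traffic.mp4")           # placeholder video source
backsub = cv2.createBackgroundSubtractorMOG2()  # adaptive background subtraction

ok, prev2 = cap.read()
ok, prev1 = cap.read()
while ok:
    ok, frame = cap.read()
    if not ok:
        break

    # Adaptive background subtraction mask.
    fg_mask = backsub.apply(frame)

    # Three-frame differencing mask.
    g0, g1, g2 = (cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in (prev2, prev1, frame))
    d1 = cv2.absdiff(g1, g0)
    d2 = cv2.absdiff(g2, g1)
    _, d1 = cv2.threshold(d1, 25, 255, cv2.THRESH_BINARY)
    _, d2 = cv2.threshold(d2, 25, 255, cv2.THRESH_BINARY)
    diff_mask = cv2.bitwise_and(d1, d2)

    # Hybrid mask: a pixel counts as moving if either cue flags it.
    moving = cv2.bitwise_or(fg_mask, diff_mask)

    # Segmentation, labeling, and centre extraction (phase-two ingredients).
    contours, _ = cv2.findContours(moving, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        if cv2.contourArea(c) > 500:            # ignore small blobs/noise
            x, y, w, h = cv2.boundingRect(c)
            centers.append((x + w // 2, y + h // 2))

    # Phase three idea: speed ~ known scene length / (frames spent in scene / fps).
    prev2, prev1 = prev1, frame
```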

Keywords: radar, image processing, detection, tracking, segmentation

Procedia PDF Downloads 452
16459 A Relative Analysis of Carbon and Dust Uptake by Important Tree Species in Tehran, Iran

Authors: Sahar Elkaee Behjati

Abstract:

Air pollution, particularly dust, is one of the biggest issues Tehran is dealing with, and the city's green space, which consists of trees, has a critical role in absorbing it. The question this study aimed to investigate was which tree species have the highest uptake capacity for the dust and carbon suspended in the air. On this basis, 30 tree samples from two different districts in Tehran were collected, and after washing and centrifuging, the samples were oven dried. The results of the study revealed that Ulmus minor had the highest amount of deposited dust in both districts. In addition, it was found that Ailanthus altissima in the Chamran district and Ulmus minor in the Gandi district had the highest absorption of deposited carbon. Therefore, it could be argued that decision-making on the selection of species for urban green spaces should take the above-mentioned parameters into account.

Keywords: dust, leaves, total carbon uptake, Tehran, tree species

Procedia PDF Downloads 121
16458 An Analysis of the Effectiveness of Computer-Assisted Instruction on Student Achievement in Differing Science Content Areas

Authors: Edwin Christmann, John Hicks

Abstract:

This meta-analysis compared the mathematics achievement of students who received either traditional instruction or traditional instruction supplemented with computer-assisted instruction (CAI). From the 27 conclusions, an overall mean effect size of 0.236 was calculated, indicating that, on average, students receiving traditional instruction supplemented with CAI attained higher mathematics achievement than did 59.48 percent of those receiving traditional instruction per se.
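
For readers unfamiliar with the metric, the percentile figure follows from interpreting the mean effect size as a standard normal deviate (Cohen's U3), a standard meta-analytic conversion:

```latex
U_3 = \Phi(\bar{d}), \qquad \Phi(0.24) \approx 0.5948,
```

so a student at the mean of the CAI-supplemented group is expected to outperform about 59.48 percent of the traditionally instructed group (the reported value corresponds to the mean effect size rounded to two decimal places).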

Keywords: CAI, science, meta-analysis, traditional

Procedia PDF Downloads 159
16457 Dynamic Modeling of Advanced Wastewater Treatment Plants Using BioWin

Authors: Komal Rathore, Aydin Sunol, Gita Iranipour, Luke Mulford

Abstract:

Advanced wastewater treatment plants have complex biological kinetics, time-variant influent flow rates and long processing times. Due to these factors, the modeling and operational control of advanced wastewater treatment plants become complicated. However, the development of a robust model for advanced wastewater treatment plants has become necessary in order to increase the efficiency of the plants, reduce energy costs and meet the discharge limits set by the government. A dynamic model was designed using BioWin, the platform software from EnviroSim (Canada), for several wastewater treatment plants in Hillsborough County, Florida. Proper control strategies for various parameters such as mixed liquor suspended solids, recycle activated sludge and waste activated sludge were developed for the models to match the plant performance. The models were tuned using both the influent and effluent data from the plants and their laboratories. The plant SCADA was used to obtain the influent wastewater rates and concentration profiles as a function of time. The kinetic parameters were tuned based on sensitivity analysis and trial-and-error methods. The dynamic models were validated by using experimental data for influent and effluent parameters. Dissolved oxygen measurements were taken to validate the model by coupling them with Computational Fluid Dynamics (CFD) models. The BioWin models were able to closely mimic the plant performance and predict effluent behavior for extended periods. The models are useful for plant engineers and operators, as they can take decisions beforehand by predicting the plant performance with the use of BioWin models. One of the important findings from the model was the effect of recycle and wastage ratios on the mixed liquor suspended solids. The model was also useful in determining the significant kinetic parameters for biological wastewater treatment systems.

Keywords: BioWin, kinetic modeling, flowsheet simulation, dynamic modeling

Procedia PDF Downloads 142
16456 A Numerical Hybrid Finite Element Model for Lattice Structures Using 3D/Beam Elements

Authors: Ahmadali Tahmasebimoradi, Chetra Mang, Xavier Lorang

Abstract:

Thanks to the additive manufacturing process, lattice structures are replacing traditional structures in the aeronautical and automobile industries. In order to evaluate the mechanical response of lattice structures, one has to resort to numerical techniques. Ansys is a globally well-known and trusted commercial software package that allows us to model lattice structures and analyze their mechanical responses using either solid or beam elements. In this software, a script may be used to systematically generate lattice structures of any size. On the one hand, solid elements allow us to correctly model the contact between the substrates (the supports of the lattice structure) and the lattice structure, the local plasticity, and the junctions of the microbeams. However, their computational cost increases rapidly with the size of the lattice structure. On the other hand, although beam elements reduce the computational cost drastically, they do not correctly model the contact between the lattice structure and the substrates, nor the junctions of the microbeams. Also, the notion of local plasticity is no longer valid. Moreover, the deformed shape of the lattice structure does not correspond to that obtained using 3D solid elements. In this work, motivated by the pros and cons of the 3D and beam models, a numerical hybrid model is presented for lattice structures to reduce the computational cost of the simulations while avoiding the aforementioned drawbacks of the beam elements. This approach consists of using solid elements for the junctions and beam elements for the microbeams connecting the corresponding junctions to each other. When the global response of the structure is linear, the results from the hybrid models are in good agreement with those from the 3D models for body-centered cubic with z-struts (BCCZ) and body-centered cubic without z-struts (BCC) lattice structures. However, the hybrid models have difficulty converging when the effects of large deformation and local plasticity are considerable in the BCCZ structures. Furthermore, the effect of the junction size in the hybrid models on the results is investigated. For BCCZ lattice structures, the results are not affected by the junction size. This is also valid for BCC lattice structures as long as the ratio of the junction size to the diameter of the microbeams is greater than 2. The hybrid model can take into account geometric defects. As a demonstration, the point clouds of two lattice structures are parametrized in a platform called LATANA (LATtice ANAlysis) developed by IRT-SystemX. In this process, for each microbeam of the lattice structures, an ellipse is fitted to capture the effect of shape variation and roughness. Each ellipse is represented by three parameters: semi-major axis, semi-minor axis, and angle of rotation. Having the parameters of the ellipses, the lattice structures are constructed in SpaceClaim (ANSYS) using the geometrical hybrid approach. The results show a negligible discrepancy between the hybrid and 3D models, while the computational cost of the hybrid model is lower than that of the 3D model.

Keywords: additive manufacturing, Ansys, geometric defects, hybrid finite element model, lattice structure

Procedia PDF Downloads 105
16455 Modelling of the Relocation and Battery Autonomy Problem in Electric Car Sharing Dynamics Using Discrete Event Simulation and Petri Nets

Authors: Taha Benarbia, Kay W. Axhausen, Anugrah Ilahi

Abstract:

Electric car sharing systems, as an ecological mode of transportation, are increasing around the world. The complexity of managing electric car sharing systems, especially one-way trips and battery autonomy, has a direct influence on the supply and demand of the system. One must be able to precisely model the demand and supply of these systems to better operate electric car sharing and estimate its effect on mobility management and the accessibility that it provides in urban areas. In this context, our work focuses on developing a performance optimization model of the system based on discrete event simulation and stochastic Petri nets. The objective is to search for optimal decisions and management parameters of the system in order to best fulfil demand while minimizing undesirable situations. In this paper, we present a new model of electric car sharing with relocation based on a monitoring system. The proposed approach also helps to specify the influence of the battery charging level on the behaviour of the system as an important decision parameter of this complex and dynamic system.

Keywords: electric car-sharing systems, smart mobility, Petri nets modelling, discrete event simulation

Procedia PDF Downloads 165
16454 Urban Big Data: An Experimental Approach to Building-Value Estimation Using Web-Based Data

Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin

Abstract:

Current real-estate value estimation, difficult for laymen, is usually performed by specialists. This paper presents an automated estimation process based on big data and machine-learning technology that calculates the influence of building conditions on real-estate price measurement. The present study analyzed actual building sales sample data for Nonhyeon-dong, Gangnam-gu, Seoul, Korea, measuring the major influencing factors among the various building conditions. Following that analysis, a prediction model was established and applied using RapidMiner Studio, a graphical user interface (GUI)-based tool for the derivation of machine-learning prototypes. The prediction model is formulated by reference to previous examples. When new examples are applied, it analyses and predicts accordingly. The analysis process discerns the crucial factors affecting price increases by calculating weighted values. The model was verified, and its accuracy determined, by comparing its predicted values with actual price increases.
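
As a rough sketch of this kind of supervised price model (illustrative only: the feature names and numbers below are hypothetical, and the study itself used RapidMiner Studio rather than scikit-learn), a linear model's coefficients play the role of the weighted values described above.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Hypothetical building-condition features and observed price increase rates.
data = pd.DataFrame({
    "floor_area_m2":   [84, 102, 59, 84, 114, 76, 95, 66],
    "building_age_yr": [12, 25, 8, 30, 5, 18, 22, 10],
    "floor_level":     [7, 3, 15, 2, 20, 9, 5, 12],
    "dist_subway_m":   [250, 600, 120, 800, 300, 450, 700, 200],
    "price_increase":  [0.11, 0.04, 0.18, 0.02, 0.15, 0.08, 0.05, 0.14],
})

X = data.drop(columns="price_increase")
y = data["price_increase"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LinearRegression().fit(X_train, y_train)

# Coefficients act as "weighted values": how strongly each condition moves price.
print(dict(zip(X.columns, model.coef_.round(5))))
print("R^2 on held-out examples:", round(model.score(X_test, y_test), 3))
```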

Keywords: apartment complex, big data, life-cycle building value analysis, machine learning

Procedia PDF Downloads 362
16453 Relationships Between the Petrophysical and Mechanical Properties of Rocks and Shear Wave Velocity

Authors: Anamika Sahu

Abstract:

The Himalayas, like many mountainous regions, are susceptible to multiple hazards. In recent times, the frequency of such disasters has been continuously increasing due to extreme weather phenomena. These natural hazards are responsible for irreparable human and economic losses. The Indian Himalayas have repeatedly been ruptured by great earthquakes in the past and have the potential for a future large seismic event, as the region falls within a seismic gap. Damage caused by earthquakes differs from locality to locality. It is well known that, during earthquakes, damage to structures is associated with the subsurface conditions and the quality of construction materials. So, for sustainable mountain development, prior site characterization will be valuable for designing and constructing built space and for efficient mitigation of seismic risk. Both geotechnical and geophysical investigation of the subsurface is required to describe the subsurface complexity. In mountainous regions, geophysical methods are gaining popularity, as areas can be studied without disturbing the ground surface, and these methods are also time- and cost-effective. The MASW method is used to calculate Vs30, the average shear wave velocity for the top 30 m of soil. Shear wave velocity is considered the best stiffness indicator, and the average shear wave velocity over the top 30 m is used in the National Earthquake Hazards Reduction Program (NEHRP) provisions (BSSC, 1994) and the Uniform Building Code (UBC, 1997) classification. Parameters obtained through the geotechnical investigation have been integrated with findings obtained through the subsurface geophysical survey. Joint interpretation has been used to establish inter-relationships among mineral constituents, various textural parameters, and unconfined compressive strength (UCS) with shear wave velocity. It is found that the results obtained through the MASW method fit well with the laboratory tests. In both cases, mineral constituents and textural parameters (grain size, grain shape, grain orientation, and degree of interlocking) control the petrophysical and mechanical properties of rocks and the behavior of shear wave velocity.
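
For reference, the Vs30 used above is defined in the NEHRP/UBC provisions as the time-averaged shear wave velocity of the top 30 m:

```latex
V_{S30} = \frac{30\ \mathrm{m}}{\sum_{i=1}^{N} d_i / V_{Si}},
```

where d_i and V_{Si} are the thickness (in metres) and shear wave velocity of the i-th layer within the upper 30 m, and the thicknesses d_i sum to 30 m.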

Keywords: MASW, mechanical, petrophysical, site characterization

Procedia PDF Downloads 75
16452 Impacts of Climate Elements on the Annual Periodic Behavior of the Shallow Groundwater Level: Case Study from Central-Eastern Europe

Authors: Tamas Garamhegyi, Jozsef Kovacs, Rita Pongracz, Peter Tanos, Balazs Trasy, Norbert Magyar, Istvan G. Hatvani

Abstract:

Like most environmental processes, shallow groundwater fluctuation under natural circumstances behaves periodically. With the statistical tools at hand, it can easily be determined whether a period exists in the data or not. Thus, the question may be raised: does the estimated average period characterize the whole time period, or not? This is especially important in the case of such complex phenomena as shallow groundwater fluctuation, which is driven by numerous factors. Because of the continuous changes in the oscillating components of shallow groundwater time series, the most appropriate method should be used to investigate their periodicity, namely wavelet spectrum analysis. The aims of the research were to investigate the periodic behavior of the shallow groundwater time series of an agriculturally important and drought-sensitive region in Central-Eastern Europe and its relationship to the European pressure action centers. During the research, ~216 shallow groundwater observation wells located in the eastern part of the Great Hungarian Plain, with a temporal coverage of 50 years, were scanned for periodicity. By taking the full time interval as 100%, the presence of any period could be expressed as a percentage. With the complex hydrogeological/meteorological model developed in this study, non-periodic time intervals were found in the shallow groundwater levels. On the local scale, this phenomenon was linked to drought conditions, and on the regional scale to the maxima of the regional air pressures in the Gulf of Genoa. The study documented an important link between shallow groundwater levels and climate variables/indices, facilitating the necessary adaptation strategies on national and/or regional scales, which have to take into account the predictions of drought-related climatic conditions.
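
The sketch below illustrates, on synthetic data, how a continuous wavelet transform exposes time intervals where the annual period weakens; the Morlet wavelet, the synthetic series, and the thresholding rule are assumptions for illustration, not the study's actual processing chain.

```python
import numpy as np
import pywt

# Synthetic monthly groundwater levels: an annual cycle that fades away
# halfway through the record, mimicking a non-periodic (drought) interval.
n_months = 600
t = np.arange(n_months)
annual = np.where(t < 300, np.sin(2 * np.pi * t / 12.0), 0.0)
levels = annual + 0.3 * np.random.default_rng(0).standard_normal(n_months)

# Continuous wavelet transform with a Morlet wavelet (sampling period = 1 month).
scales = np.arange(2, 128)
coefs, freqs = pywt.cwt(levels, scales, "morl", sampling_period=1.0)
periods = 1.0 / freqs

# Fraction of the record where the ~12-month band carries above-average power.
band = (periods > 10) & (periods < 14)
power = np.abs(coefs[band]).mean(axis=0)
periodic_share = (power > power.mean()).mean()
print(f"share of record with a strong annual component: {periodic_share:.0%}")
```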

Keywords: climate change, drought, groundwater periodicity, wavelet spectrum and coherence analyses

Procedia PDF Downloads 378
16451 Investigating the Effectiveness of Multilingual NLP Models for Sentiment Analysis

Authors: Othmane Touri, Sanaa El Filali, El Habib Benlahmar

Abstract:

Natural Language Processing (NLP) has gained significant attention lately. It has proved its ability to analyze and extract insights from unstructured text data in various languages. One of the most popular NLP applications is sentiment analysis, which aims to identify the sentiment expressed in a piece of text, such as positive, negative, or neutral, in multiple languages. While several multilingual NLP models are available for sentiment analysis, there is a need to investigate their effectiveness in different contexts and applications. In this study, we aim to investigate the effectiveness of different multilingual NLP models for sentiment analysis on a dataset of online product reviews in multiple languages. The performance of several NLP models, including Google Cloud Natural Language API, Microsoft Azure Cognitive Services, Amazon Comprehend, Stanford CoreNLP, spaCy, and Hugging Face Transformers, is compared. The models are evaluated on several metrics, including accuracy, precision, recall, and F1 score, and their performance is compared across different categories of product reviews. To run the study, the dataset was preprocessed by cleaning and tokenizing the text data in multiple languages. Each model was then trained and tested using a cross-validation approach, in which the dataset was randomly divided into training and testing sets and the process was repeated multiple times. A grid search approach was applied to optimize the hyperparameters of each model and select the best-performing model for each category of product reviews and each language. The findings of this study provide insights into the effectiveness of different multilingual NLP models for multilingual sentiment analysis and their suitability for different languages and applications. The strengths and limitations of each model were identified, and recommendations for selecting the most performant model based on the specific requirements of a project were provided. This study contributes to the advancement of research methods in multilingual NLP and provides a practical guide for researchers and practitioners in the field.
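
A minimal sketch of how one open-source option from such a comparison can be scored on a small multilingual sample is shown below; the checkpoint name, the three example reviews, and the star-to-polarity mapping are assumptions for illustration (the commercial APIs in the comparison are accessed through their own SDKs instead).

```python
from sklearn.metrics import accuracy_score, f1_score
from transformers import pipeline

# Any public multilingual sentiment checkpoint could be substituted here.
clf = pipeline("sentiment-analysis",
               model="nlptown/bert-base-multilingual-uncased-sentiment")

reviews = [
    ("Great battery life, totally worth the price.", "positive"),   # English
    ("La calidad es pésima, no lo recomiendo.", "negative"),        # Spanish
    ("Produit correct mais livraison très lente.", "negative"),     # French
]

def to_polarity(label):
    # This checkpoint outputs "1 star" .. "5 stars"; map to a coarse polarity.
    stars = int(label.split()[0])
    return "positive" if stars >= 4 else "negative"

preds = [to_polarity(out["label"]) for out in clf([text for text, _ in reviews])]
gold = [lab for _, lab in reviews]

print("accuracy:", accuracy_score(gold, preds))
print("F1 (positive):", f1_score(gold, preds, pos_label="positive"))
```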

Keywords: NLP, multilingual, sentiment analysis, texts

Procedia PDF Downloads 82
16450 Knowledge Management Best Practice Model in Higher Learning Institution: A Systematic Literature Review

Authors: Ismail Halijah, Abdullah Rusli

Abstract:

Introduction: This systematic literature review aims to identify the Knowledge Management Best Practice components in the Knowledge Management Model for the Higher Learning Institution environment. Study design: Systematic literature review. Methods: A systematic literature review of Knowledge Management Best Practice was conducted to identify and define the components of Best Practice from the Knowledge Management models. Results: This review of published conference and journal articles shows that the components of Best Practice in Knowledge Management are basically divided into two aspects: the soft aspect and the hard aspect. The lack of a combination of these two aspects into an integrated model prevents Knowledge Management Best Practice from operating at full throttle. Evidence from the literature shows that the lack of integration of these two aspects leads to the immaturity of Higher Learning Institutions (HLIs) with respect to the implementation of Knowledge Management Systems. Conclusion: The first step of identifying the attributes to measure the Knowledge Management Best Practice components from the models in the literature will lead to the definition of the Knowledge Management Best Practice components for the higher learning environment.

Keywords: knowledge management, knowledge management system, knowledge management best practice, knowledge management higher learning institution

Procedia PDF Downloads 578
16449 The Resistance Reader Program Based on Image Processing

Authors: Janpen Srijan, Nahathai Tanmang, Thanit Purathanang, Anun Dowchern, Saksit Summart, Seangduan Kampimpa

Abstract:

This paper presents a resistance reader program based on image processing using MATLAB. The proposed program is divided into six parts: the first part is the web camera; the second part is the wattage selection before imaging the resistor; the third part finds the positions of the color bands at the mid-point of the resistor; the fourth part identifies the color code of the resistor; the fifth part takes the numeric value of each color for the resistance calculation; and the last part displays the resulting resistance value. In the experiments, the resistance reader program based on image processing was able to display the resistance value of the resistor. The accuracy of the proposed program is 85 percent for 1-watt resistors. The 15 percent reading error is due to the color code of some resistors being too bright.
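
The color-to-value mapping behind the resistance calculation step follows the standard four-band resistor code; a small sketch of that decoding (independent of the MATLAB implementation) is given below.

```python
DIGITS = {"black": 0, "brown": 1, "red": 2, "orange": 3, "yellow": 4,
          "green": 5, "blue": 6, "violet": 7, "grey": 8, "white": 9}
TOLERANCE = {"brown": 1.0, "red": 2.0, "gold": 5.0, "silver": 10.0}

def four_band_resistance(band1, band2, multiplier, tolerance):
    """Return (ohms, tolerance_percent) for a four-band resistor."""
    value = (10 * DIGITS[band1] + DIGITS[band2]) * 10 ** DIGITS[multiplier]
    return value, TOLERANCE[tolerance]

# Example: yellow-violet-red-gold -> 47 * 10^2 = 4700 ohms, +/- 5 %.
print(four_band_resistance("yellow", "violet", "red", "gold"))
```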

Keywords: resistance reader program, image processing, resistor, MATLAB

Procedia PDF Downloads 369
16448 Integration of STEM Education in Quebec, Canada – Challenges and Opportunities

Authors: B. El Fadil, R. Najar

Abstract:

STEM education is promoted by many scholars and curricula around the world, but it is not yet well established in the province of Quebec in Canada. In addition, effective instructional STEM activities and design methods are required to ensure that students' and teachers' needs are being met. One potential method is the Engineering Design Process (EDP), a methodology that emphasizes the importance of creativity and collaboration in problem-solving strategies. This article reports on a case study that focused on using the EDP to develop instructional materials by means of making a technological artifact to teach mathematical variables and functions at the secondary level. The five iterative stages of the EDP (design, make, test, infer, and iterate) were integrated into the development of the course materials. Data were collected from different sources: pre- and post-questionnaires, as well as a working document dealing with pupils' understanding based on designing, making, testing, and simulating. Twenty-four grade seven (13-year-old) students in Northern Quebec participated in the study. The findings of this study indicate that STEM activities have a positive impact not only on students' engagement in classroom activities but also on learning new mathematical concepts. Furthermore, STEM-focused activities have a significant effect on the development of problem-solving skills in an interdisciplinary approach. Based on the study's results, we can conclude, inter alia, that teachers should integrate STEM activities into their teaching practices to increase learning outcomes and attach more importance to STEM-focused activities to develop students' reflective thinking and hands-on skills.

Keywords: engineering design process, motivation, STEM, integration, variables, functions

Procedia PDF Downloads 79
16447 A Newspapers Expectations Indicator from Web Scraping

Authors: Pilar Rey del Castillo

Abstract:

This document describes the building of an average indicator of the general sentiment about the future expressed in newspapers in Spain. The raw data are collected through scraping of the Digital Periodical and Newspaper Library website. Basic natural language processing tools are then applied to the collected information to evaluate the sentiment strength of each word in the texts using a polarity dictionary. The last step consists of summarizing these sentiments to produce daily indices. The results are a first insight into the applicability of these techniques for producing periodic sentiment indicators.
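
A toy version of the scoring and aggregation steps is sketched below; the miniature polarity dictionary and example headlines are hypothetical, and the real indicator is built from the scraped Digital Periodical and Newspaper Library corpus with a full Spanish polarity lexicon.

```python
from collections import defaultdict

# Hypothetical polarity dictionary: word -> sentiment strength.
POLARITY = {"crecimiento": 1.0, "mejora": 0.8, "crisis": -1.0,
            "caída": -0.7, "incertidumbre": -0.5, "optimismo": 0.9}

# Scraped items as (date, text) pairs; in practice these come from the scraper.
articles = [
    ("2024-03-01", "El optimismo sobre el crecimiento vuelve a los mercados"),
    ("2024-03-01", "La incertidumbre política frena la mejora del empleo"),
    ("2024-03-02", "Crisis y caída de la inversión extranjera"),
]

daily_scores = defaultdict(list)
for date, text in articles:
    words = text.lower().split()
    score = sum(POLARITY.get(w, 0.0) for w in words)  # article-level sentiment
    daily_scores[date].append(score)

# Daily index: average sentiment of the day's articles.
index = {d: sum(s) / len(s) for d, s in daily_scores.items()}
print(index)
```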

Keywords: natural language processing, periodic indicator, sentiment analysis, web scraping

Procedia PDF Downloads 117
16446 Periodic Topology and Size Optimization Design of Tower Crane Boom

Authors: Wu Qinglong, Zhou Qicai, Xiong Xiaolei, Zhang Richeng

Abstract:

In order to achieve the layout and size optimization of the web members of a tower crane boom, a truss topology and cross-section size optimization method based on the continuum is proposed, considering three typical working conditions. Firstly, the optimization model is established by replacing the web members with web plates, and the web plates are divided into several sub-domains so that the periodic soft kill option (SKO) method can be carried out for the topology optimization of the slender boom. After obtaining the optimized topology of the web plates, the optimized layout of the web members is formed by extracting the principal stress distribution. Finally, using the web member radius as the design variable, the boom compliance as the objective and the material volume of the boom as the constraint, the cross-section size optimization mathematical model is established. The size optimization criterion is deduced from the mathematical model by the Lagrange multiplier method and the Kuhn-Tucker conditions. By comparing the original boom with the optimized boom, it is shown that this optimization method can effectively lighten the boom and improve its performance.

Keywords: tower crane boom, topology optimization, size optimization, periodic, SKO, optimization criterion

Procedia PDF Downloads 542
16445 The Effect of Technology-Facilitated Lesson Study on Teachers' Computer-Assisted Language Learning Competencies

Authors: Yi-Ning Chang

Abstract:

With the rapid advancement of technology, it has become crucial for educators to adeptly integrate technology into their teaching and develop robust Computer-Assisted Language Learning (CALL) competency. Addressing this need, the present study adopted a technology-based Lesson Study approach and assessed its impact on the CALL competency and professional capabilities of EFL teachers. Additionally, the study delved into teachers' perceptions of the benefits derived from participating in the creation of technologically integrated lesson plans. The iterative process of technology-based Lesson Study facilitated ample peer discussion, enabling teachers to flexibly design and implement lesson plans that incorporate various technological tools. This 15-week study included 10 in-service teachers from a university of science and technology in central Taiwan. The collected data included pre- and post-lesson-planning scores, pre- and post-TPACK survey scores, classroom observation forms, the designed lesson plans, and reflective essays. The pre- and post-lesson-planning and TPACK survey scores were analyzed using paired-sample t tests; the reflective essays were analyzed using content analysis. The findings revealed that the teachers' lesson planning ability and CALL competencies improved. Teachers perceived a better understanding of integrating technology with teaching subjects, more effective teaching skills, and a deeper understanding of technology. Pedagogical implications and future studies are also discussed.

Keywords: CALL, language learning, lesson study, lesson plan

Procedia PDF Downloads 18
16444 Understanding Learning Styles of Hong Kong Tertiary Students for Engineering Education

Authors: K. M. Wong

Abstract:

Engineering education is crucial to technological innovation and advancement worldwide, as it generates young talents who are able to integrate scientific principles and design practical solutions for real-world problems. Graduates of engineering curricula are expected to demonstrate an extensive set of learning outcomes as required by international accreditation agreements for engineering academic qualifications, such as the Washington Accord and the Sydney Accord. On the other hand, students have different learning preferences for receiving, processing and internalizing knowledge and skills. If the learning environment is suited to the learning styles of the students, there is a higher chance that the students can achieve the intended learning outcomes. With proper identification of the learning styles of the students, corresponding teaching strategies can then be developed for more effective learning. This research was an investigation of the learning styles of tertiary students studying higher diploma programmes in Hong Kong. Data from over 200 students in engineering programmes were collected and analysed to identify the learning characteristics of the students. A small-scale longitudinal study was then started to gather the academic results of the students throughout their two-year engineering studies. Preliminary results suggested that the sample students were reflective, sensing, visual, and sequential learners. Observations from the analysed data not only provided valuable information for teachers to design more effective teaching strategies but also provided data for further analysis alongside the students' academic results. The results generated from the longitudinal study shed light on areas of improvement for more effective engineering curriculum design for better teaching and learning.

Keywords: learning styles, learning characteristics, engineering education, vocational education, Hong Kong

Procedia PDF Downloads 255