Search results for: ordering time
16547 Influence of Flight Design on Discharging Profiles of Granular Material in Rotary Dryer
Authors: I. Benhsine, M. Hellou, F. Lominé, Y. Roques
Abstract:
During the manufacture of fertilizer, water is added for granulation purposes. The water content is then removed or reduced using rotary dryers, which are commonly used to dry wet granular materials and are usually fitted with lifting flights. The transport of granular material occurs as particles cascade from the lifting flights and fall into the air stream; each cascade consists of a lifting and a falling cycle. Lifting flights are thus of great importance for the transport of granular material along the dryer, and they also enhance the contact between solid particles and the air stream. Optimizing the drying process requires an understanding of the behavior of granular material inside a rotary dryer. Different approaches exist to study the movement of granular material inside the dryer; the most common are based on empirical formulations or on studying the movement of the bulk material. In the present work, we use the Discrete Element Method (DEM) to understand the behavior of each particle in the cross section of the dryer. In this paper, we focus on the hold-up, the cascade patterns, and the falling time and falling length of the particles leaving the flights. We use two-segment flights with three different profiles: a straight flight (180° between both segments), an angled flight (with an angle of 150°), and a right-angled flight (90°). The profile of the flight significantly affects the movement of the particles in the dryer. Changing the flight angle changes the flight capacity, which leads to a different discharging profile of the flight and thus affects the hold-up in the flight. When the angle of the flight is reduced, the range of the discharge angle increases, leading to a cascade pattern that is more uniform in time. The falling length and the falling time of the particles also increase up to a maximum value and then start decreasing.
Moreover, the results show an increase in the falling length and the falling time of up to 70% and 50%, respectively, when using a right-angled flight instead of a straight one.
Keywords: discrete element method, granular materials, lifting flight, rotary dryer
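The falling cycle of each cascade can be approximated with elementary free-fall kinematics. The sketch below is an illustrative simplification, not the DEM model of the paper: the drum radius, the release geometry (particle falling from the flight tip to the drum bottom), and the neglect of drag and the airstream are all assumptions.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def falling_time_and_length(drum_radius, discharge_angle_deg):
    # Particle leaves the flight tip at `discharge_angle_deg`, measured
    # from the top of the drum, and free-falls to the drum bottom; air
    # drag and the airstream are neglected in this sketch.
    a = math.radians(discharge_angle_deg)
    fall_height = drum_radius * (1.0 + math.cos(a))  # vertical falling length
    fall_time = math.sqrt(2.0 * fall_height / G)
    return fall_time, fall_height

# A wider discharge-angle range (as with the 90° flight) spreads the
# cascade: particles released near the top fall longer than late ones.
t_early, h_early = falling_time_and_length(1.0, 30.0)
t_late, h_late = falling_time_and_length(1.0, 120.0)
assert t_early > t_late and h_early > h_late
```

In the full DEM model, of course, the fall interacts with the air stream and the bed of material; this only illustrates why a wider discharge-angle range stretches the distribution of falling times and lengths.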
Procedia PDF Downloads 327
16546 Perceptual Organization within Temporal Displacement
Authors: Michele Sinico
Abstract:
The psychological present has an actual extension. When a sequence of instantaneous stimuli falls within this short interval of time, observers perceive a compresence of events in succession, and the temporal order depends on the qualitative relationships between the perceptual properties of the events. Two experiments were carried out to study the influence of perceptual grouping, with and without temporal displacement, on the duration of auditory sequences. The psychophysical method of adjustment was adopted. The first experiment investigated the effect of temporal displacement of a white noise on sequence duration. The second experiment investigated the effect of temporal displacement, along the pitch dimension, on the temporal shortening of the sequence. The results suggest that the temporal order of sounds, in the case of temporal displacement, is organized along the pitch dimension.
Keywords: time perception, perceptual present, temporal displacement, Gestalt laws of perceptual organization
Procedia PDF Downloads 251
16545 Empirical Acceleration Functions and Fuzzy Information
Authors: Muhammad Shafiq
Abstract:
In accelerated life testing approaches, lifetime data are obtained under conditions considered more severe than usual. Classical techniques are based on precise measurements and are used to model variation among the observations. In fact, there are two types of uncertainty in data: variation among the observations and fuzziness. Analysis techniques that do not consider fuzziness and are based only on precise lifetime observations lead to pseudo-results. This study examined the behavior of empirical acceleration functions using fuzzy lifetime data. The results showed an increased fuzziness in the transformed lifetimes as compared to the input data.
Keywords: acceleration function, accelerated life testing, fuzzy number, non-precise data
Procedia PDF Downloads 298
16544 Impact of the Operation and Infrastructure Parameters to the Railway Track Capacity
Authors: Martin Kendra, Jaroslav Mašek, Juraj Čamaj, Matej Babin
Abstract:
Railway transport is considered one of the most environmentally friendly modes of transport. With freight transport predicted to increase, some lines already face problems meeting the demanded capacity. An increase in track capacity can be achieved by constructive adjustments to the infrastructure. This contribution shows how travel time can be minimized and track capacity increased by changing some of the basic infrastructure and operation parameters, for example, the minimal curve radius of the track, the number of tracks, or the usable track length at stations. Calculation of the necessary parameter changes is based on the fundamental physical laws applied to train movement, and calculation of the occupation time depends on changes in controlling the traffic between the stations.
Keywords: curve radius, maximum curve speed, track mass capacity, reconstruction
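The link between the minimal curve radius and travel time can be illustrated with the standard speed-limit relation for curves: permissible speed grows with the square root of the radius. The sketch below is a rough illustration; the cant, gauge, and permitted unbalanced lateral acceleration values are assumptions, not figures from the contribution.

```python
import math

G = 9.81      # m/s^2
GAUGE = 1.5   # effective gauge used for cant geometry, m (assumed)
CANT = 0.15   # superelevation, m (assumed)
A_DEF = 0.65  # permitted unbalanced lateral acceleration, m/s^2 (assumed)

def max_curve_speed_kmh(radius_m):
    # v = sqrt((g*h/s + a_def) * r): the cant-compensated lateral
    # acceleration plus the permitted unbalanced part cap the speed.
    a_total = G * CANT / GAUGE + A_DEF
    return 3.6 * math.sqrt(a_total * radius_m)

# Enlarging the minimal curve radius raises the permissible speed and
# so shortens travel and occupation times on the section:
for r in (300, 500, 800):
    print(r, "m ->", round(max_curve_speed_kmh(r), 1), "km/h")
```

Because the speed scales with the square root of the radius, doubling a tight radius buys far more time than doubling an already generous one, which is why reconstruction effort concentrates on the tightest curves of a line.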
Procedia PDF Downloads 334
16543 Effect of Different Temperatures and Cold Storage on Pupaes Apanteles gelechiidivoris Marsh (Hymenoptera: Braconidae) Parasitoid of Tuta absoluta Meyrick (Lepidoptera: Gelechiidae)
Authors: Jessica Morales Perdomo, Daniel Rodriguez Caicedo, Fernando Cantor Rincon
Abstract:
Tuta absoluta, known as the tomato leaf miner, is one of the main pests of tomato crops in South America and the main pest in many European countries. Apanteles gelechiidivoris is a parasitoid of third-instar Tuta absoluta larvae. Our studies have demonstrated that this parasitoid can cause up to 80% mortality of T. absoluta larvae in the field. We investigated cold storage of A. gelechiidivoris pupae as a method for mass production of this parasitoid; such a storage method should not interfere with the biological characteristics of the parasitoid. In this study, we evaluated the effect of different temperatures (4, 8 and 12°C) and storage durations (7, 14, 21 or 28 days) on biological parameters of A. gelechiidivoris pupae and adults. The biological parameters evaluated were adult emergence time, lifespan, parasitism percentage, and sex ratio. We found that adult emergence was delayed when the parasitoid pupae were stored at 4°C and 8°C. The shortest adult emergence time was recorded when pupae were stored for seven days. The lowest adult emergence was found for pupae stored at 4°C and decreased significantly as the days of storage increased. We found high percentages of adult emergence when pupae were stored at 8°C and 12°C for seven days. Adult lifespan decreased with increasing days of cold storage; adults emerging from pupae stored at 8°C for seven and 14 days showed the longest lifespan (nine days). The lowest parasitism rate was recorded at 4°C at every time point, and the highest percentage of parasitism (80%) was found after seven days of storage at 8°C. The treatments had no effect on the adult sex ratio. The results suggest that A. gelechiidivoris pupae can be stored for up to 14 days at 8°C without affecting the efficacy of the parasitoid in the field.
Keywords: biological control, cold storage, massive rearing, quality control
Procedia PDF Downloads 373
16542 R Statistical Software Applied in Reliability Analysis: Case Study of Diesel Generator Fans
Authors: Jelena Vucicevic
Abstract:
Reliability analysis represents a very important task in different areas of work. In any industry, it is crucial for maintenance, efficiency, safety, and monetary costs. There are established ways to calculate reliability, unreliability, failure density, and failure rate. This paper introduces another way of calculating reliability, using the R statistical software. R is a free software environment for statistical computing and graphics; it compiles and runs on a wide variety of UNIX platforms, Windows, and macOS. The R programming environment is a widely used open-source system for statistical analysis and statistical programming. It includes thousands of functions for the implementation of both standard and new statistical methods, and it does not limit the user to these functions only. This program has many benefits over similar programs: it is free and, as open source, constantly updated; it has a built-in help system; and the R language is easy to extend with user-written functions. The significance of this work is the calculation of time to failure or reliability in a new way, using statistics. Another advantage of this approach is that no technical details are needed, and it can be applied to any part whose time to failure we need to know in order to plan appropriate maintenance, maximize usage, and minimize costs. In this case, the calculations have been made on diesel generator fans, but the same principle can be applied to any other part. The data for this paper came from a field engineering study of the time to failure of diesel generator fans. The ultimate goal was to decide whether or not to replace the working fans with a higher-quality fan to prevent future failures. Seventy generators were studied. For each one, the number of hours of running time from its first being put into service until fan failure, or until the end of the study (whichever came first), was recorded. The dataset consists of two variables: hours and status.
Hours gives the running time of each fan, and status records the event: 1 = failed, 0 = censored. Censored data represent cases that could not be tracked to the end, so the fan could still fail or survive. Obtaining the result using R was easy and quick. The program takes censored data into consideration and includes them in the results, which is not so easy in a hand calculation. For the purposes of this paper, the results from the R program have been compared to hand calculations in two different cases: censored data treated as failures and censored data treated as successes. In all three cases, the results are significantly different. If the user decides to use R for further calculations, it will give more precise results by working with censored data than a hand calculation.
Keywords: censored data, R statistical software, reliability analysis, time to failure
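The difference between the two hand calculations and a censoring-aware estimate can be reproduced with a small Kaplan-Meier sketch, which is the standard way survival routines in R handle censored observations. The fan lifetimes below are hypothetical, not the 70-fan dataset of the study.

```python
def km_reliability(data):
    # Kaplan-Meier estimator: a censored unit leaves the risk set
    # without counting as a failure -- this is what survival routines
    # in R (e.g. the survival package) do internally.
    events = sorted(data)            # (hours, status); status 1 = failed
    at_risk, s, curve = len(events), 1.0, []
    for hours, status in events:
        if status == 1:
            s *= (at_risk - 1) / at_risk
            curve.append((hours, s))
        at_risk -= 1
    return curve

# Hypothetical fan lifetimes, NOT the 70-fan dataset of the study:
fans = [(450, 1), (1150, 0), (1600, 1), (2070, 0), (2200, 1), (3000, 0)]
km_final = km_reliability(fans)[-1][1]     # survival after the last failure

# The two hand-calculation extremes bracket the censoring-aware estimate:
n = len(fans)
failures = sum(s for _, s in fans)
censored = n - failures
as_failures = (n - failures - censored) / n  # censored treated as failed
as_successes = (n - failures) / n            # censored treated as surviving
assert as_failures <= km_final <= as_successes
```

For this toy dataset the Kaplan-Meier estimate lands strictly between the two hand calculations, which is exactly the discrepancy the abstract reports between the R results and both manual treatments of the censored observations.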
Procedia PDF Downloads 401
16541 Lived Experiences of Physical Education Teachers in the New Normal: A Consensual Qualitative Research
Authors: Karl Eddie T. Malabanan
Abstract:
Due to the quick transmission and public health risk of coronavirus disease, schools and universities shifted to distance learning. Teachers everywhere were forced to shift gears instantly in order to respond to the needs of students and families using synchronous and asynchronous virtual teaching. This study aims to explore the lived experiences of physical education teachers engaged in remote learning and teaching during the COVID-19 pandemic, specifically the challenges they encountered. The participants include 12 physical education teachers who have taught in higher education institutions for at least five years. The researcher utilized a qualitative approach, specifically Consensual Qualitative Research (CQR). The results of this study showed five categories for the lived experiences of physical education teachers, with thirty-one subcategories. This study revealed that physical education teachers experienced very challenging situations during the pandemic. It also found that students faced challenges in the abrupt transition from traditional to virtual learning classes, but that they are tenacious and willing to face adversity. The researcher also found that teachers are mentally drained during this time; furthermore, one of the main focuses for teachers should be on improving their well-being. Lastly, to cope with the challenges, teachers employ socializing to relieve tension and anxiety.
Keywords: lived experiences, consensual qualitative research, pandemic, education
Procedia PDF Downloads 92
16540 The Analysis Fleet Operational Performance as an Indicator of Load and Haul Productivity
Authors: Linet Melisa Daubanes, Nhleko Monique Chiloane
Abstract:
The shovel-truck system is the most prevalent material handling system used in surface mining operations. Material handling entails the loading and hauling of material from production areas to dumping areas. The material handling process has operational delays that have a negative impact on the productivity of the load and haul fleet. Factors that may contribute to operational delays include shovel-truck mismatch, haul routes, machine breakdowns, and extreme weather conditions. The aim of this paper is to investigate the factors that contribute to operational delays affecting the productivity of the load and haul fleet at the mine. Productivity is the measure of the effectiveness of producing products from a given quantity of units, the ratio of output to inputs; it can be improved by producing more output with the same or fewer inputs and/or by introducing better working methods. Several key performance indicators (KPIs) for the evaluation of productivity are discussed in this study, including but not limited to hauling conditions, bucket fill factor, cycle time, and utilization. The research methodology of this study is a combination of on-site time studies and observations. Productivity can be optimized by managing the factors that affect the operational performance of the haulage fleet.
Keywords: cycle time, fleet performance, load and haul, surface mining
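The cycle time, bucket fill factor, and utilization KPIs combine into a simple hourly productivity figure, which shows why reducing operational delays moves the output directly. The payload, fill factor, and delay figures below are illustrative, not data from the mine in the study.

```python
def truck_productivity_tph(payload_t, fill_factor, cycle_time_s, utilization):
    # tonnes per hour = effective payload * trips per hour * utilization
    return payload_t * fill_factor * (3600.0 / cycle_time_s) * utilization

# Illustrative figures, not measurements from the mine in the study:
base = truck_productivity_tph(100, 0.85, 1800, 0.75)      # 30-min cycle, delays
improved = truck_productivity_tph(100, 0.85, 1500, 0.80)  # delays reduced
print(round(base, 1), "->", round(improved, 1))           # t/h before and after
```

Shaving five minutes of delay off the cycle and lifting utilization by five points raises hourly output by roughly a quarter in this sketch, which is the kind of leverage the time studies aim to quantify.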
Procedia PDF Downloads 196
16539 The Visualizer for Real-Time Analysis of Internet Trends
Authors: Radek Malinský, Ivan Jelínek
Abstract:
The current web has become a modern encyclopedia, where people share their thoughts and ideas on various topics around them. Such an encyclopedia is very useful for other people who are looking for answers to their questions. However, with the growing popularity of social networking and blogging and ever-expanding network services, there has also been a growing diversity of technologies, along with differing structures of individual websites. It is, therefore, difficult for a common Internet user to directly find a relevant answer. This paper presents a web application for the real-time end-to-end analysis of selected Internet trends, where a trend can be whatever people post online. The application integrates fully configurable tools for data collection and analysis using selected webometric algorithms, and for their chronological visualization to the user. It can be assumed that the application helps users evaluate the quality of various products that are mentioned online.
Keywords: trend, visualizer, web analysis, web 2.0
Procedia PDF Downloads 264
16538 Deep Learning Framework for Predicting Bus Travel Times with Multiple Bus Routes: A Single-Step Multi-Station Forecasting Approach
Authors: Muhammad Ahnaf Zahin, Yaw Adu-Gyamfi
Abstract:
Bus transit is a crucial component of transportation networks, especially in urban areas. Any intelligent transportation system must have accurate real-time information on bus travel times, since it minimizes waiting times for passengers at different stations along a route, improves service reliability, and significantly optimizes travel patterns. Bus agencies must enhance the quality of their information service to serve their passengers better and draw in more travelers, since people waiting at bus stops are frequently anxious about when the bus will arrive at their starting point and when it will reach their destination. To address this issue, different models have been developed recently for predicting bus travel times, but most of them focus on smaller road networks because of their relatively subpar performance on vast, high-density urban networks. This paper develops a deep learning-based architecture using a single-step multi-station forecasting approach to predict average bus travel times for numerous routes, stops, and trips on a large-scale network, using heterogeneous bus transit data collected from the GTFS database. Data were gathered over one week from multiple bus routes in Saint Louis, Missouri. In this study, a Gated Recurrent Unit (GRU) neural network was employed to predict the mean vehicle travel times for different hours of the day for multiple stations along multiple routes. The historical time steps and the prediction horizon were set to 5 and 1, respectively, which means that five hours of historical average travel time data were used to predict the average travel time for the following hour. Spatial and temporal information and the historical average travel times were taken from the dataset as model input parameters. The station distances and sequence numbers were used as adjacency matrices for the spatial inputs, and the time of day (hour) was considered for the temporal inputs.
Other inputs, including volatility information such as the standard deviation and variance of journey durations, were also included in the model to make it more robust. The model's performance was evaluated using the mean absolute percentage error (MAPE). The observed prediction errors for various routes, trips, and stations remained consistent throughout the day. The results showed that the developed model could predict travel times more accurately during peak traffic hours, with a MAPE of around 14%, and performed less accurately during the latter part of the day. In the context of a complicated transportation network in high-density urban areas, the model showed its applicability for real-time travel time prediction in public transportation and ensured the high quality of its predictions.
Keywords: gated recurrent unit, mean absolute percentage error, single-step forecasting, travel time prediction
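The MAPE metric used to evaluate the model can be sketched directly. The travel times below are made-up values for illustration, not predictions from the GRU model.

```python
def mape(actual, predicted):
    # Mean absolute percentage error, reported in percent:
    # the average of |actual - predicted| / |actual| over all samples.
    errors = [abs(a - p) / abs(a) for a, p in zip(actual, predicted)]
    return 100.0 * sum(errors) / len(errors)

# Hypothetical hourly average travel times (minutes) vs. predictions:
actual = [10.0, 12.0, 8.0, 11.0]
pred = [11.0, 10.8, 8.8, 9.9]
print(round(mape(actual, pred), 1), "%")
```

Because MAPE normalizes each error by the actual value, it lets prediction quality be compared across routes with very different travel times, which is why a single figure like "around 14%" is meaningful network-wide.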
Procedia PDF Downloads 72
16537 Examination Scheduling System with Proposed Algorithm
Authors: Tabrej Khan
Abstract:
Examination Scheduling System (ESS) is a scheduling system aimed at exam committees in academic institutes, helping them manage exams automatically. We present an algorithm for the Examination Scheduling System. Nowadays, many universities struggle to create examination schedules quickly and with fewer conflicts than manual methods produce. Our aim is to develop a computerized system that can be used for examination scheduling in an academic institute against the available resources (time, halls, invigilators, and instructors) without contradiction, while achieving fairness among students. ESS was developed using HTML, the C# language, Crystal Report, and ASP.NET through Microsoft Visual Studio 2010 as developing tools, with an integrated SQL Server database. This application can produce benefits such as reducing the time spent creating an exam schedule and achieving fairness among students.
Keywords: examination scheduling system (ESS), algorithm, ASP.NET, crystal report
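The core of such a system is conflict-free slot assignment, which can be sketched as greedy graph coloring: exams sharing students must land in different time slots. This is an illustrative baseline, not necessarily the proposed algorithm of the paper; the exam names and conflict sets are hypothetical stand-ins for enrolment data.

```python
def schedule_exams(conflicts, n_slots):
    # Greedy graph coloring: `conflicts` maps each exam to the set of
    # exams it shares students with; conflicting exams get different
    # slots. Exams are placed most-constrained first.
    slots = {}
    for exam in sorted(conflicts, key=lambda e: -len(conflicts[e])):
        used = {slots[other] for other in conflicts[exam] if other in slots}
        free = next((s for s in range(n_slots) if s not in used), None)
        if free is None:
            raise ValueError("no conflict-free slot for " + exam)
        slots[exam] = free
    return slots

# Hypothetical conflict graph derived from shared enrolments:
conflicts = {
    "Math": {"Physics", "Chemistry"},
    "Physics": {"Math"},
    "Chemistry": {"Math"},
    "History": set(),
}
plan = schedule_exams(conflicts, n_slots=3)
```

A real system would extend the feasibility check to halls and invigilators and add a fairness objective (e.g. spacing each student's exams apart), but the conflict-coloring core stays the same.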
Procedia PDF Downloads 404
16536 Modelling of Reactive Methodologies in Auto-Scaling Time-Sensitive Services With a MAPE-K Architecture
Authors: Óscar Muñoz Garrigós, José Manuel Bernabeu Aubán
Abstract:
Time-sensitive services are the base of the cloud services industry. Keeping service saturation low is essential for controlling response time. All auto-scalable services make use of reactive auto-scaling; however, reactive auto-scaling has received few in-depth studies. This presentation shows a model for reactive auto-scaling methodologies with a MAPE-K architecture. Queuing theory can compute different properties of static services but lacks some parameters related to the transition between models; our model uses queuing theory parameters to relate the transition between models. It associates MAPE-K related times, the sampling frequency, the cooldown period, the number of requests that an instance can handle per unit of time, the number of incoming requests at a time instant, and a function that describes the acceleration in the service's ability to handle more requests. This model is later used as a solution to horizontally auto-scale time-sensitive services composed of microservices, reevaluating the model's parameters periodically to allocate resources. The solution requires limiting the acceleration of the growth in the number of incoming requests to keep a constrained response time; business benefits determine such limits. The solution can add a dynamic number of instances and remains valid under different system sizes. The study includes performance recommendations to improve results according to the incoming load shape and business benefits. The exposed methodology is tested in a simulation. The simulator contains a load generator and a service composed of two microservices, where the frontend microservice depends on a backend microservice with a 1:1 request relation ratio. A common request takes 2.3 seconds to be computed by the service and is discarded if it takes more than 7 seconds.
Both microservices contain a load balancer that assigns requests to the least loaded instance and preemptively discards requests if they cannot finish in time, to prevent resource saturation. When load decreases, instances with lower load are kept in a backlog where no more requests are assigned. If the load grows and an instance in the backlog is required, it returns to the running state; if it finishes computing all its requests and is no longer required, it is permanently deallocated. A few load patterns are required to represent the worst-case scenarios for reactive systems; the following scenarios test response times, resource consumption, and business costs. The first scenario is a burst-load scenario: all methodologies will discard requests if the rapidness of the burst is high enough, so this scenario focuses on the number of discarded requests and the variance of the response time. The second scenario contains sudden load drops followed by bursts, to observe how the methodology behaves when releasing resources that are later required. The third scenario contains diverse growth accelerations in the number of incoming requests, to observe how approaches that add a different number of instances can handle the load with less business cost. The exposed methodology is compared against a multiple-threshold CPU methodology allocating/deallocating 10 or 20 instances, outperforming the competitor in all studied metrics.
Keywords: reactive auto-scaling, auto-scaling, microservices, cloud computing
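The reactive core of such a methodology can be sketched as a rule mapping the sampled incoming rate to an instance count. This is a deliberately simplified rule, not the paper's full MAPE-K model; the per-instance capacity and the headroom figure are assumptions.

```python
import math

def required_instances(incoming_rps, per_instance_rps, headroom=0.2):
    # Simplified reactive rule (not the paper's full MAPE-K model):
    # keep each instance below (1 - headroom) of its capacity so the
    # service stays unsaturated and response time stays bounded.
    effective = per_instance_rps * (1.0 - headroom)
    return max(1, math.ceil(incoming_rps / effective))

# A burst: the sampled incoming rate triples between two control loops.
before = required_instances(100, 40)   # ceil(100 / 32) -> 4
after = required_instances(300, 40)    # ceil(300 / 32) -> 10
assert after > before
```

What the paper's model adds on top of a rule like this is timing: the sampling frequency, cooldown period, and instance start-up delay bound how fast the burst can grow before requests are discarded anyway, which is exactly why the growth acceleration must be limited.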
Procedia PDF Downloads 93
16535 Theoretical Prediction on the Lifetime of Sessile Evaporating Droplet in Blade Cooling
Authors: Yang Shen, Yongpan Cheng, Jinliang Xu
Abstract:
Effective blade cooling is of great significance for improving the performance of a turbine, and mist cooling emerges as a promising alternative to traditional single-phase cooling. In mist cooling, the injected droplets evaporate rapidly and cool the blade surface through the absorbed latent heat, so the lifetime of an evaporating droplet becomes critical for the design of the blade's cooling passages. So far there have been extensive studies on droplet evaporation, but most apply an isothermal model. In fact, the surface cooling effect can greatly affect droplet evaporation and can significantly prolong the droplet evaporation lifetime. In our study, a new theoretical model for sessile droplet evaporation with the surface cooling effect is built in toroidal coordinates. Three evaporation modes are analyzed over the evaporation lifetime: the "Constant Contact Radius" (CCR) mode, the "Constant Contact Angle" (CCA) mode, and the "stick-slip" (SS) mode. The dimensionless number E0 is introduced to indicate the strength of the evaporative cooling; it is defined from the thermal properties of the liquid and the atmosphere. Our model accurately predicts the evaporation lifetime, as validated against available experimental data. The temporal variations of droplet volume, contact angle, and contact radius are then presented for the CCR, CCA, and SS modes, and the following conclusions are obtained.
1) The larger the dimensionless number E0, the longer the lifetime in all three evaporation cases. 2) The droplet volume over time still follows the "2/3 power law" in the CCA mode, as in the isothermal model without the cooling effect. 3) In the SS mode, a large transition contact angle reduces the evaporation time in the CCR mode and increases the time in the CCA mode, so the overall lifetime is increased. 4) A correction factor for predicting the instantaneous volume of the droplet is derived to predict the droplet lifetime accurately. These findings may be of great significance for exploring the dynamics and heat transfer of sessile droplet evaporation.
Keywords: blade cooling, droplet evaporation, lifetime, theoretical analysis
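Conclusion 2 can be checked numerically: in the CCA mode the "2/3 power law" means the droplet volume raised to the power 2/3 decays linearly in time. The initial volume and lifetime values in the sketch below are arbitrary.

```python
def droplet_volume(t, t_f, v0):
    # CCA-mode "2/3 power law": V(t)^(2/3) decays linearly in time,
    # i.e. V(t) = V0 * (1 - t/t_f)**1.5 for an evaporation lifetime t_f.
    return v0 * (1.0 - t / t_f) ** 1.5

v0, t_f = 1.0, 100.0  # arbitrary initial volume and lifetime
# V^(2/3) sampled at equal time steps should fall by equal increments:
vals = [droplet_volume(t, t_f, v0) ** (2.0 / 3.0) for t in (0.0, 25.0, 50.0, 75.0)]
diffs = [a - b for a, b in zip(vals, vals[1:])]
assert max(diffs) - min(diffs) < 1e-9  # constant decrement -> linear in t
```

Plotting V^(2/3) against time is also a convenient experimental check: data following the CCA mode collapse onto a straight line whose intercept with the time axis gives the lifetime.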
Procedia PDF Downloads 142
16534 Automated End-to-End Pipeline Processing Solution for Autonomous Driving
Authors: Ashish Kumar, Munesh Raghuraj Varma, Nisarg Joshi, Gujjula Vishwa Teja, Srikanth Sambi, Arpit Awasthi
Abstract:
Autonomous driving vehicles are revolutionizing the transportation system of the 21st century. This has been possible due to intensive research put into making a robust, reliable, and intelligent program that can perceive and understand its environment and make decisions based on that understanding. It is a very data-intensive task, with data coming from multiple sensors, and the amount of data directly reflects on the performance of the system. Researchers have to design the preprocessing pipeline for each dataset, with its particular sensor orientations and alignments, before the dataset can be fed to the model. This paper proposes a solution that unifies all the data from different sources into a uniform format using the intrinsic and extrinsic parameters of the sensors used to capture the data, allowing the same pipeline to use data from multiple sources at a time. This also enables easy adoption of new or in-house generated datasets. The solution also automates the complete deep learning pipeline from preprocessing to post-processing for various tasks, allowing researchers to design multiple custom end-to-end pipelines. Thus, the solution takes care of input and output data handling, saving the time and effort spent on it and allowing more time for model improvement.
Keywords: augmentation, autonomous driving, camera, custom end-to-end pipeline, data unification, lidar, post-processing, preprocessing
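Unifying data via intrinsic and extrinsic parameters usually amounts to projecting each sensor's measurements through a shared calibration, for example mapping lidar points into a camera image. A minimal sketch, with illustrative matrices rather than calibration values from any real dataset:

```python
import numpy as np

def lidar_to_image(points_lidar, T_cam_from_lidar, K):
    # Project Nx3 lidar points into the image plane: the 4x4 extrinsic
    # matrix moves points from the lidar frame to the camera frame, and
    # the 3x3 intrinsic matrix K maps camera rays to pixels.
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    cam = (T_cam_from_lidar @ pts_h.T)[:3]  # points in the camera frame
    uv = K @ cam
    return (uv[:2] / uv[2]).T               # Nx2 pixel coordinates

# Illustrative calibration (not from any dataset):
K = np.array([[700.0, 0.0, 640.0],
              [0.0, 700.0, 360.0],
              [0.0, 0.0, 1.0]])
T = np.eye(4)                               # identity extrinsics for the sketch
pts = np.array([[0.0, 0.0, 10.0]])          # one point 10 m along the optical axis
print(lidar_to_image(pts, T, K))            # lands on the principal point
```

Once every dataset supplies its own K and T in this common convention, the rest of the pipeline never needs to know which sensor rig produced the data, which is the unification the paper describes.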
Procedia PDF Downloads 123
16533 Robotic Arm Control with Neural Networks Using Genetic Algorithm Optimization Approach
Authors: Arbnor Pajaziti, Hasan Cana
Abstract:
In this paper, a structural genetic algorithm is used to optimize the neural network that controls the joint movements of a robotic arm. The robotic arm has also been modeled in 3D and simulated in real time in MATLAB. It is found that neural networks provide a simple and effective way to control robot tasks. Computer simulation examples are given to illustrate the significance of this method. By combining the genetic algorithm optimization method and neural networks for the given robotic arm with 5 D.O.F., the obtained results show that the overshoot time of the base joint movements was about 0.5 seconds without a controller and about 0.2 seconds with the neural network controller (optimized with the genetic algorithm), and that a population size of 150 gave the best results.
Keywords: robotic arm, neural network, genetic algorithm, optimization
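The combination of a genetic algorithm and a neural network can be sketched on a toy problem: evolving the weights of a tiny network with elitist selection, crossover, and mutation. The target function, network size, and GA settings below are stand-ins, not the 5-D.O.F. arm model or the population of 150 from the paper.

```python
import math
import random

random.seed(0)

# Toy stand-in: evolve the 4 weights of a 1-hidden-unit network so it
# approximates a known target law (instead of a joint controller).
def forward(w, x):
    return w[2] * math.tanh(w[0] * x + w[1]) + w[3]

def fitness(w, samples):
    # Sum of squared errors over the training samples (lower is better).
    return sum((forward(w, x) - y) ** 2 for x, y in samples)

samples = [(x / 10.0, 0.5 * math.sin(x / 10.0)) for x in range(-10, 11)]

pop = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(30)]
initial_best = min(fitness(w, samples) for w in pop)
for _ in range(60):
    pop.sort(key=lambda w: fitness(w, samples))
    elite = pop[:10]                   # elitism: best solutions survive
    children = []
    for _ in range(20):
        a, b = random.sample(elite, 2)
        cut = random.randrange(4)      # one-point crossover
        child = a[:cut] + b[cut:]
        child[random.randrange(4)] += random.gauss(0, 0.1)  # mutation
        children.append(child)
    pop = elite + children
final_best = min(fitness(w, samples) for w in pop)
assert final_best <= initial_best      # elitism guarantees no regression
```

In the paper's setting the fitness would instead be computed from the simulated arm response (e.g. penalizing overshoot time), but the evolutionary loop around the network weights has the same shape.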
Procedia PDF Downloads 523
16532 Developing an AI-Driven Application for Real-Time Emotion Recognition from Human Vocal Patterns
Authors: Sayor Ajfar Aaron, Mushfiqur Rahman, Sajjat Hossain Abir, Ashif Newaz
Abstract:
This study delves into the development of an artificial intelligence application designed for real-time emotion recognition from human vocal patterns. Utilizing advanced machine learning algorithms, including deep learning and neural networks, the paper highlights both the technical challenges and potential opportunities in accurately interpreting emotional cues from speech. Key findings demonstrate the critical role of diverse training datasets and the impact of ambient noise on recognition accuracy, offering insights into future directions for improving robustness and applicability in real-world scenarios.
Keywords: artificial intelligence, convolutional neural network, emotion recognition, vocal patterns
Procedia PDF Downloads 53
16531 The Application of Line Balancing Technique and Simulation Program to Increase Productivity in Hard Disk Drive Components
Authors: Alonggot Limcharoen, Jintana Wannarat, Vorawat Panich
Abstract:
This study aims to investigate the balancing of the number of operators (the line balancing technique) in a production line of hard disk drive components in order to increase efficiency. At present, the trend of using hard disk drives has continuously declined, limiting the company's revenue potential. It is important to improve and develop the production process to create market share and to be able to compete on value and quality, so an effective tool is needed to support such efforts. In this research, the Arena program was applied to analyze the process both before and after the improvement, and the proposed changes were verified in the model before being applied to the real process. There were 14 work stations with 35 operators altogether in the RA production process where this study was conducted. In the actual process, the average production time was 84.03 seconds per piece (timed 30 times at each work station), and a rating assessment using the Westinghouse principles gave a rating of 123%, with an assumed allowance of 5%. Consequently, the standard time was 108.53 seconds per piece. The takt time, calculated as the available working time in one day divided by customer demand, was 3.66 seconds per piece. From this, the proper number of operators was 30, meaning five operators should be removed to improve the production process. After that, a production model was created from the actual process using the Arena program; to confirm model reliability, the outputs of the simulation were compared with the actual process, and the agreement indicated that the model was reliable. Then, worker numbers and their job responsibilities were remodeled in the Arena program.
Lastly, the efficiency of the production process was enhanced from 70.82% to 82.63%, in line with the target.
Keywords: hard disk drive, line balancing, ECRS, simulation, Arena program
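The operator count follows directly from the standard time and the takt time reported in the abstract:

```python
import math

# Figures from the study: 108.53 s of standard work content per piece,
# and a takt time of 3.66 s per piece (available time / demand).
standard_time = 108.53  # seconds of work content per piece
takt_time = 3.66        # seconds available per piece to meet demand

# Minimum theoretical number of operators on the line:
min_operators = math.ceil(standard_time / takt_time)
print(min_operators)    # -> 30, i.e. 5 fewer than the current 35
```

With 29 operators the line could deliver at most 29 x 3.66 = 106.14 seconds of work per takt, less than the 108.53 seconds required, so 30 is the smallest feasible crew before rebalancing the work content itself.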
Procedia PDF Downloads 226
16530 Pollutants Removal from Synthetic Wastewater by the Combined Electrochemical Sequencing Batch Reactor
Authors: Amin Mojiri, Akiyoshi Ohashi, Tomonori Kindaichi
Abstract:
Synthetic domestic wastewater was treated via a combination of treatment methods, including electrochemical oxidation, adsorption, and a sequencing batch reactor (SBR). In the upper part of the reactor, an anode and a cathode (Ti/RuO2-IrO2) were arranged in parallel for the electrochemical oxidation procedure. Sodium sulfate (Na2SO4) at a concentration of 2.5 g/L was applied as the electrolyte. The voltage and current were fixed at 7.50 V and 0.40 A, respectively. Then, 15% of the reactor's working volume was filled with activated sludge, and 85% of the working volume was filled with synthetic wastewater. Powdered cockleshell (1.5 g/L) was added to the reactor for ion exchange. Response surface methodology was employed for statistical analysis, with reaction time and pH considered as independent factors. A total of 97.0% of biochemical oxygen demand, 99.9% of phosphorus, and 88.6% of cadmium were eliminated at the optimum reaction time (80.0 min) and pH (6.4).
Keywords: adsorption, electrochemical oxidation, metals, SBR
Procedia PDF Downloads 210
16529 Avatar Creation for E-Learning
Authors: M. Najib Osman, Hanafizan Hussain, Sri Kusuma Wati Mohd Daud
Abstract:
An avatar is used as a symbol of a user's identity in online communications such as Facebook, Twitter, online games, and portal communities between unknown people. Animated characters, or avatars, can engage learners in a way that draws them into the e-learning experience. Immersive learning is one of the most effective learning techniques, and animated characters can help create an immersive environment. E-learning is an ideal learning environment using modern means of information technology: through the effective integration of information technology and the curriculum, a new learning style can be achieved that fully reflects the central role of the students and thoroughly reforms the traditional teaching structure. Essential in any e-learning is the degree of interactivity for the learner, and whether the learner is able to study at any time or instead needs to be online or in a classroom with other learners at the same time (synchronous learning). Ideally, e-learning should engage the learners, allowing them to interact with the course materials and to obtain feedback on their progress and assistance whenever it is required. However, the degree of interactivity in e-learning depends on how the course has been developed, on the software used for its development, and on the way the material is delivered to the learner. Therefore, users' accessibility, allowing access to information at any time and place, and their positive attitudes towards e-learning, such as interacting with a good teacher in a natural and friendly environment, should be enhanced. This motivates learning enthusiasm, and it is the responsibility of educators to incorporate new technology into their ways of teaching. Keywords: avatar, e-learning, higher education, students' perception
Procedia PDF Downloads 411
16528 Quick off the Mark with Achilles Tendon Rupture
Authors: Emily Moore, Andrew Gaukroger, Matthew Solan, Lucy Bailey, Alexandra Boxall, Andrew Carne, Chintu Gadamsetty, Charlotte Morley, Katy Western, Iwona Kolodziejczyk
Abstract:
Introduction: Rupture of the Achilles tendon is common and has a long recovery period. Most cases are managed non-operatively. Foot and Ankle surgeons advise an ultrasound scan to check the gap between the torn ends; a large gap (with the ankle in equinus) is a relative indication for surgery. The definitive decision regarding surgical versus non-operative management can only be made once an ultrasound scan is undertaken and the patient is subsequently reviewed by a Foot and Ankle surgeon. To get to this point, the patient journey involves several hospital departments. In nearby trusts, patients reattend for a scan and go to the plaster room both before and after the ultrasound for removal and re-application of the cast. At a third visit to the hospital, the surgeon and patient discuss options for definitive treatment. It may take 2-3 weeks from the initial Emergency Department visit before the final treatment decision is made. This “wasted time” is ultimately added to the recovery period for the patient. In this hospital, Achilles rupture patients are seen in a weekly multidisciplinary OneStop Heel Pain clinic. This pathway was already efficient but subject to occasional frustrating delays if a key staff member was absent. A new pathway was introduced with the goal of reducing delays to a definitive treatment plan. Method: A retrospective series of Achilles tendon ruptures managed according to the 2019 protocol was identified. The time taken from the Emergency Department to have both an ultrasound scan and a specialist Foot and Ankle surgical review was calculated. Thirty consecutive patients were treated with our new pathway and prospectively followed; their times to scan and to specialist review were compared with those of 30 consecutive cases from the 2019 (pre-COVID) cohort. The new pathway includes: 1. A new contoured splint applied to the front of the injured limb, held with a bandage.
This can be removed and replaced (unlike a plaster cast) in the ultrasound department, removing the need for plaster room visits. 2. Urgent triage to a Foot and Ankle specialist. 3. Ultrasound scan for assessment of the rupture gap and a deep vein thrombosis check. 4. Early decision regarding surgery, with transfer to weight bearing in a prosthetic boot in equinus without waiting for the once-a-week clinic. 5. Extended oral VTE prophylaxis. Results: The time taken for a patient to have both an ultrasound scan and specialist review fell by more than 50%. All patients in the new pathway reached a definitive treatment decision within one week. There were no significant differences in patient demographics or in rates of surgical vs non-operative treatment. The mean time from the Emergency Department visit to specialist review and ultrasound scan fell from 8.7 days (old protocol) to 2.9 days (new pathway). The maximum time fell from 23 days (old protocol) to 6 days (new pathway). Conclusion: Teamwork and innovation have improved the experience for patients with an Achilles tendon rupture. The new pathway brings many advantages: reduced time in the Emergency Department, fewer hospital visits, less time using crutches and reduced overall recovery time. Keywords: orthopaedics, achilles rupture, ultrasound, innovation
Procedia PDF Downloads 123
16527 Geosynthetic Tubes in Coastal Structures a Better Substitute for Shorter Planning Horizon: A Case Study
Authors: A. Pietro Rimoldi, B. Anilkumar Gopinath, C. Minimol Korulla
Abstract:
Coastal engineering structures are conventionally designed for a shorter planning horizon, usually 20 years. During their design life, these structures are subjected to different offshore climatic externalities such as waves, tides and tsunamis, whose probabilities of occurrence vary. The impact of these externalities on the structures is of concern because it has a significant bearing on the capital and operating cost of the project. There can also be repeated short-interval occurrences of these externalities within the assumed planning horizon, which can cause heavy damage to conventional coastal structures, which are mainly made of rock. Replacing the damaged portion to prevent complete collapse is time consuming and expensive for hard rock structures, but if coastal structures are made of geosynthetic containment systems, such replacement is quickly possible in the period between two successive occurrences. In order to better understand and predict these occurrences, this study estimates the risk of encountering various externalities within the design life period based on the concept of the exponential distribution. This gives an idea of the frequency of occurrences, which in turn indicates whether replacement is necessary and, if so, at what time interval such replacements have to be effected. To validate this theoretical finding, a pilot project has been taken up in the field so that the impact of the externalities can be studied for both a hard rock and a geosynthetic tube structure. The paper brings out the salient features of a case study pertaining to a project in which geosynthetic tubes have been used for the reformation of a seawall adjacent to a conventional rock structure on the Alappuzha coast, Kerala, India. The effectiveness of the geosystem in combating the impact of the short-term externalities is brought out. Keywords: climatic externalities, exponential distribution, geosystems, planning horizon
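The abstract does not publish its formula, but the encounter risk it describes is conventionally computed from exponentially distributed inter-arrival times: the probability that an event with mean return period T occurs at least once within a design life L is 1 − e^(−L/T). A minimal sketch under that assumption:

```python
import math

def encounter_risk(design_life_years: float, return_period_years: float) -> float:
    """Probability that an externality with mean return period T occurs
    at least once within the planning horizon L, assuming exponentially
    distributed inter-arrival times (i.e. Poisson arrivals)."""
    return 1.0 - math.exp(-design_life_years / return_period_years)

# e.g. a 50-year storm event over the conventional 20-year planning horizon
risk = encounter_risk(20, 50)
print(round(risk, 2))  # 0.33 -- roughly a one-in-three chance of encounter
```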
Procedia PDF Downloads 228
16526 The Relationship between the Content of Inner Human Experience and Well-Being: An Experience Sampling Study
Authors: Xinqi Guo, Karen R. Dobkins
Abstract:
Background and Objectives: Humans are probably the only animals whose minds are constantly filled with thoughts, feelings and emotions. Previous studies have investigated the human mind along different dimensions, including the proportion of time spent not being present, its representational format, its personal relevance, its temporal locus, and affect valence. The current study aims at characterizing the human mind by employing the Experience Sampling Method (ESM), a self-report research procedure for studying daily experience. This study emphasizes answering the following questions: 1) How do the contents of inner experience vary across demographics? 2) Are certain types of inner experience correlated with level of mindfulness and mental well-being (e.g., are people who spend more time being present happier, and are more mindful people more at-present)? 3) Does being prompted to report one's inner experience increase mindfulness and mental well-being? Methods: Participants were recruited from the subject pool of UC San Diego or from social media. They began by filling out two questionnaires, 1) the Five Facet Mindfulness Questionnaire-Short Form and 2) the Warwick-Edinburgh Mental Well-being Scale, along with demographic information. They then took part in the ESM procedure by responding to prompts containing questions about their real-time inner experience: whether they were 'at-present', 'mind-wandering', or 'zoned-out'. The temporal locus, clarity, affect valence, and personal importance of the thought they had the moment before the prompt were also assessed. A mobile app, 'RealLife Exp', delivered these prompts at random 3 times/day for 6 days during wake-time. After the 6 days, participants completed questionnaires (1) and (2) again, and their changes in score were compared to a control group who did not participate in the ESM procedure (yet completed (1) and (2) one week apart). Results: Results are currently preliminary as we continue to collect data.
So far, there is a trend that participants are present, mind-wandering and zoned-out about 53%, 23% and 24% of wake-time, respectively. Participants' thoughts are rated as clearer and more neutral when they are present vs. mind-wandering. Mind-wandering thoughts are 66% about the past and consist of 80% inner speech. Discussion and Conclusion: This study investigated the subjective account of the human mind with a tool of high ecological validity, and it broadens the understanding of the relationship between the contents of mind and well-being. Keywords: experience sampling method, meta-memory, mindfulness, mind-wandering
Procedia PDF Downloads 132
16525 GC and GCxGC-MS Composition of Volatile Compounds from Cuminum cyminum and Carum carvi by Using Techniques Assisted by Microwaves
Authors: F. Benkaci-Ali, R. Mékaoui, G. Scholl, G. Eppe
Abstract:
The new method of accelerated steam distillation assisted by microwave (ASDAM) combines microwave heating and steam distillation, performed at atmospheric pressure with a very short extraction time; isolation and concentration of volatile compounds are performed in a single stage. ASDAM has been compared with ASDAM with cryogrinding of seeds (CG) and with the conventional techniques of hydrodistillation assisted by microwave (HDAM) and hydrodistillation (HD) for the extraction of essential oil from aromatic herbs, namely caraway and cumin seeds. The essential oils extracted by ASDAM for 1 min were quantitatively (yield) and qualitatively (aromatic profile) comparable to those obtained by ASDAM-CG (1 min) and HD (for 3 h). The accelerated microwave extraction with cryogrinding inhibits numerous enzymatic reactions such as hydrolysis of oils. Microwave radiation constitutes an adequate means of extraction from the point of view of yield and high content of major components, and it considerably reduces energy consumption and especially heating time, one of the essential parameters of artifact formation. ASDAM and ASDAM-CG are green techniques that yield an essential oil with higher amounts of the more valuable oxygenated compounds, comparable to the biosynthesized compounds, and allow substantial savings in terms of time, energy and plant material. Keywords: microwave, steam distillation, caraway, cumin, cryogrinding, GC-MS, GCxGC-MS
Procedia PDF Downloads 258
16524 Monitoring CO2 and H2S Emission in Live Austrian and UK Concrete Sewer Pipes
Authors: Anna Romanova, Morteza A. Alani
Abstract:
Corrosion of concrete sewer pipes induced by sulfuric acid is an acknowledged problem and a ticking time-bomb for sewer operators. Whilst the chemical reaction of the corrosion process is well understood, the indirect roles of other parameters found in the sewer environment are rarely reflected on. This paper reports on field studies undertaken in Austria and the United Kingdom, where the parameters of temperature, pH, H2S and CO2 were monitored over a period of time. The study establishes that (i) effluent temperature and pH have similar daily patterns and peak times when examined on a minutes scale, (ii) H2S and CO2 have an identical hourly pattern, and (iii) the instant or time-shifted relation of H2S to effluent temperature is governed by the root mean square value of CO2. Keywords: concrete corrosion, carbon dioxide, hydrogen sulphide, sewer pipe, sulfuric acid
Procedia PDF Downloads 307
16523 Scheduling of Bus Fleet Departure Time Based on Mathematical Model of Number of Bus Stops for Municipality Bus Organization
Authors: Ali Abdi Kordani, Hamid Bigdelirad, Sid Mohammad Boroomandrad
Abstract:
Operating an urban bus transit system plays a major role in transporting passengers in cities. Many factors are involved in planning and operating such a system, one of which is selecting an optimized number of stops and scheduling the departure of the bus fleet. In this paper, we introduce a methodology for selecting the number of stops and scheduling them properly. Selecting the right number of stops improves accessibility, reduces travel time and ultimately increases public preference for this transportation mode. The results revealed that the number of stops should be reduced from 33 to 25. Also, according to the scheduling and the economic analysis conducted, the number of buses should decrease from 17 to 11 for the Bus Organization to operate in the most appropriate state. Keywords: number of optimized stops, organizing bus system, scheduling, urban transit
Procedia PDF Downloads 123
16522 Weighted Data Replication Strategy for Data Grid Considering Economic Approach
Authors: N. Mansouri, A. Asadi
Abstract:
A Data Grid is a geographically distributed environment that deals with data-intensive applications in scientific and enterprise computing. Data replication is a common method used to achieve efficient and fault-tolerant data access in Grids. In this paper, a dynamic data replication strategy called Enhanced Latest Access Largest Weight (ELALW) is proposed. This strategy is an enhanced version of the Latest Access Largest Weight strategy. However, replication should be used wisely because the storage capacity of each Grid site is limited; thus, it is important to design an effective strategy for the replica replacement task. ELALW replaces replicas based on the number of future requests, the size of the replica, and the number of copies of the file. It also improves access latency by selecting the best replica when various sites hold replicas. The proposed replica selection chooses the best replica location from among the many replicas based on response time, which can be determined by considering the data transfer time, the storage access latency, the replica requests waiting in the storage queue, and the distance between nodes. Simulation results using OptorSim show that our replication strategy achieves better overall performance than other strategies in terms of job execution time, effective network usage and storage resource usage. Keywords: data grid, data replication, simulation, replica selection, replica placement
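The replacement and selection criteria above can be sketched as follows. This is an illustrative sketch only: the abstract does not publish ELALW's actual formulas, so the scoring functions, weights and field names here are assumptions in the spirit of the description.

```python
from dataclasses import dataclass

@dataclass
class Replica:
    future_requests: int   # predicted number of future requests for this file
    size_mb: float         # size of the replica
    copies: int            # number of copies held across the Grid

def retention_value(r: Replica) -> float:
    # Replacement criterion: a replica is cheaper to evict when it is
    # rarely requested, large, and widely duplicated elsewhere.
    return r.future_requests / (r.size_mb * r.copies)

def best_source(response_times_ms: dict) -> str:
    # Selection criterion: pick the holding site with the lowest estimated
    # response time (transfer time + storage latency + queueing + distance).
    return min(response_times_ms, key=response_times_ms.get)

# Evict the lowest-valued replica first when a site runs out of storage
eviction_order = sorted([Replica(2, 500, 4), Replica(90, 100, 1)],
                        key=retention_value)
print(best_source({"siteA": 120.0, "siteB": 45.0}))  # siteB
```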
Procedia PDF Downloads 260
16521 The Efficacy of Box Lesion+ Procedure in Patients with Atrial Fibrillation: Two-Year Follow-up Results
Authors: Oleg Sapelnikov, Ruslan Latypov, Darina Ardus, Samvel Aivazian, Andrey Shiryaev, Renat Akchurin
Abstract:
OBJECTIVE: The MAZE procedure is one of the most effective surgical methods in atrial fibrillation (AF) treatment, and many modifications of it exist. In our study, we conducted a clinical analysis of the "Box lesion+" approach during the MAZE procedure over a two-year follow-up. METHODS: We studied the results of the open-heart on-pump procedures performed in our hospital from 2017 to 2018. Thirty-two (32) patients with AF were included in this study: fifteen (15) had concomitant coronary bypass grafting and seventeen (17) had mitral valve repair. Mean age was 62.3±8.7 years, with a prevalence of men (56.1%). Mean duration of AF was 4.75±5.44 and 7.07±8.14 years in the two groups, respectively. In all cases, we performed an endocardial Cryo-MAZE procedure with one-time myocardial revascularization or mitral-valve surgery. All patients underwent pulmonary vein (PV) isolation and ablation of the mitral isthmus with additional isolation of the LA posterior wall (Box lesion+ procedure). Mean follow-up was 2 years. RESULTS: All cases were performed without any complications. Additional isolation of the posterior wall did not significantly prolong the operative or artificial circulation time: the Cryo-MAZE procedure itself lasted 20±2.1 min, the whole operation 192±24 min, and artificial circulation 103±12 min. According to the design of the study, we performed clinical investigation of the patients at 12 months and at 2 years from the initial procedure. The proportion of AF-free patients was 81.8% at 12 months and 75.8% at two years of follow-up. CONCLUSIONS: Isolation of the left atrial posterior wall and the perimitral area may considerably improve the efficacy of surgical treatment, as demonstrated by the significant decrease in AF recurrences during the whole period of follow-up. Keywords: atrial fibrillation, cryoablation, left atrium isolation, open heart procedure
Procedia PDF Downloads 128
16520 Impact of the Time Interval in the Numerical Solution of Incompressible Flows
Authors: M. Salmanzadeh
Abstract:
In this paper, we deal with incompressible Couette flow, which admits an exact analytical solution of the Navier-Stokes equations. Couette flow is perhaps the simplest of all viscous flows, while at the same time retaining many of the physical characteristics of a more complicated boundary-layer flow. The numerical technique we employ for the solution of the Couette flow is the Crank-Nicolson implicit method. Parabolic partial differential equations lend themselves to a marching solution; in addition, the use of an implicit technique allows a much larger marching step size than would be possible with an explicit solution. Hence, in the present paper we have the opportunity to explore some aspects of CFD different from those discussed in the other papers. Keywords: incompressible Couette flow, numerical method, partial differential equation, Crank-Nicolson implicit
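The scheme the abstract describes can be sketched on the classic startup Couette problem, du/dt = ν d²u/dy² with a fixed lower wall and an impulsively started upper plate. The grid size, time step and iteration count below are illustrative assumptions, not the paper's values; note the step Δt gives r = νΔt/(2Δy²) = 2, far beyond the explicit stability limit, which is exactly the advantage of the implicit marching scheme.

```python
import numpy as np

N = 21                     # grid points across the channel, y in [0, 1]
dy = 1.0 / (N - 1)
nu = 1.0                   # nondimensional viscosity
dt = 0.01                  # implicit scheme tolerates this large marching step
r = nu * dt / (2 * dy**2)  # Crank-Nicolson diffusion number (= 2 here)

u = np.zeros(N)
u[-1] = 1.0                # upper plate impulsively set in motion

# Tridiagonal Crank-Nicolson system for the N-2 interior points
main = (1 + 2*r) * np.ones(N - 2)
off = -r * np.ones(N - 3)
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

for _ in range(2000):      # march in time toward the steady state
    # Explicit half of the scheme (old time level), boundaries included
    rhs = r*u[:-2] + (1 - 2*r)*u[1:-1] + r*u[2:]
    rhs[-1] += r * u[-1]   # new-time-level boundary term from the moving plate
    u[1:-1] = np.linalg.solve(A, rhs)

# The steady Couette solution is the linear profile u(y) = y
y = np.linspace(0.0, 1.0, N)
print(np.max(np.abs(u - y)) < 1e-6)  # True
```

A production code would replace `np.linalg.solve` with the Thomas algorithm, which solves the tridiagonal system in O(N) per step.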
Procedia PDF Downloads 536
16519 The Explanation for Dark Matter and Dark Energy
Authors: Richard Lewis
Abstract:
The following assumptions of the Big Bang theory are challenged and found to be false: the cosmological principle, the assumption that all matter formed at the same time, and the assumption regarding the cause of the cosmic microwave background radiation. The evolution of the universe is described based on the conclusion that the universe is finite with a space boundary. This conclusion is reached by ruling out the possibility of an infinite universe or a universe which is finite with no boundary. In a finite universe, the centre of the universe can be located with reference to our home galaxy (the Milky Way) using the speed relative to the Cosmic Microwave Background (CMB) rest frame and Hubble's law. This places our home galaxy at a distance of approximately 26 million light years from the centre of the universe. Because we are making observations from a point relatively close to the centre of the universe, the universe appears to be isotropic and homogeneous, but this is not the case. The CMB is coming from a source located within the event horizon of the universe; there is sufficient mass in the universe to create an event horizon at the Schwarzschild radius. Galaxies form over time due to the energy released by the expansion of space. Conservation of energy must consider total energy, which is mass (+ve) plus energy (+ve) plus spacetime curvature (-ve), so that the total energy of the universe is always zero. The predominant locus of galaxy formation moves over time from the centre of the universe towards the boundary, so that today the majority of new galaxy formation is taking place beyond our horizon of observation at 14 billion light years. Keywords: cosmology, dark energy, dark matter, evolution of the universe
Procedia PDF Downloads 141
16518 An Agent-Service Oriented Framework for Online Contracts in Virtual Organizations
Authors: Zahra Raeisi, Reza Akbari
Abstract:
Contracting is known as one of the important tasks in virtual organization creation, and it is a costly process in terms of time and effort. One way to cut the time and effort is to conduct contracts electronically; online contracting enables us to form virtual organizations (VOs) dynamically. This work presents an agent-service oriented framework for online contracting in virtual organizations. The proposed framework considers the main aspects and steps of the traditional contracting process and uses the efficiency of service- and agent-based methodologies to provide a flexible and efficient way to establish contracts electronically in a VO. Keywords: service oriented architecture, online contracts, agent-oriented architecture, virtual organization
Procedia PDF Downloads 504