Search results for: capability approach
11348 Drone Swarm Routing and Scheduling for Off-shore Wind Turbine Blades Inspection
Authors: Mohanad Al-Behadili, Xiang Song, Djamila Ouelhadj, Alex Fraess-Ehrfeld
Abstract:
In offshore wind farms, turbine blade inspection accessibility under various sea states is very challenging and greatly affects the downtime of wind turbines. Maintenance of any offshore system is not an easy task due to restricted logistics and accessibility. The multirotor unmanned helicopter is of increasing interest in inspection applications due to its manoeuvrability and payload capacity, and these advantages increase when many drones are deployed simultaneously in a swarm. Hence this paper proposes a drone swarm framework for inspecting offshore wind turbine blades and nacelles so as to reduce downtime. One of the big challenges of this task is that, when operating a drone swarm, an individual drone may not have enough power to fly and communicate throughout a mission, and its small size rules out mid-flight refuelling. Once the drone's power is drained, no signals are transmitted and the communication links become intermittent. Vessels equipped with 5G masts and small power units are utilised as platforms for drones to recharge or swap batteries. The research aims at designing a smart energy management system which provides automated vessel and drone routing and recharging plans. To achieve this goal, a novel mathematical optimisation model is developed with the main objective of minimising the number of drones and vessels (which carry the charging stations) and the downtime of the wind turbines. A number of constraints are considered: each wind turbine must be inspected once and only once by one drone; each drone can inspect at most one wind turbine after recharging before flying back to the charging station; collisions must be avoided during flight; and all wind turbines in the wind farm must be inspected within the given time window. We have developed a real-time Ant Colony Optimisation (ACO) algorithm to generate real-time, near-optimal solutions to the drone swarm routing problem.
The schedule will generate efficient, real-time solutions indicating the inspection tasks, time windows, and the optimal routes for the drones to access the turbines. Experiments are conducted to evaluate the quality of the solutions generated by ACO.
Keywords: drone swarm, routing, scheduling, optimisation model, ant colony optimisation
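The abstract above applies Ant Colony Optimisation to drone routing but does not give its formulation. A minimal toy ACO for a single-drone inspection tour over turbine positions might look like the following (all function names, parameter values, and the tour-length objective are illustrative assumptions, not the authors' algorithm):

```python
import math
import random

def aco_route(coords, n_ants=20, n_iters=50, alpha=1.0, beta=2.0, rho=0.5, seed=0):
    """Toy Ant Colony Optimisation for a closed inspection tour.

    coords: list of (x, y) turbine positions.
    Returns (best_visit_order, best_tour_length).
    """
    rng = random.Random(seed)
    n = len(coords)
    dist = [[math.dist(a, b) for b in coords] for a in coords]
    tau = [[1.0] * n for _ in range(n)]            # pheromone trails
    best_order, best_len = None, float("inf")
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            order = [rng.randrange(n)]
            unvisited = set(range(n)) - {order[0]}
            while unvisited:
                i = order[-1]
                cand = sorted(unvisited)
                # probability ~ pheromone^alpha * (1/distance)^beta
                weights = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                           for j in cand]
                nxt = rng.choices(cand, weights)[0]
                order.append(nxt)
                unvisited.remove(nxt)
            length = sum(dist[order[k]][order[(k + 1) % n]] for k in range(n))
            tours.append((order, length))
            if length < best_len:
                best_order, best_len = order, length
        # evaporate, then deposit pheromone proportional to tour quality
        tau = [[(1 - rho) * t for t in row] for row in tau]
        for order, length in tours:
            for k in range(n):
                i, j = order[k], order[(k + 1) % n]
                tau[i][j] += 1.0 / length
    return best_order, best_len
```

A real implementation would add the paper's battery, recharging-vessel, and time-window constraints on top of this basic tour construction.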
Procedia PDF Downloads 266
11347 Feedback Matrix Approach for Relativistic Runaway Electron Avalanches Dynamics in Complex Electric Field Structures
Authors: Egor Stadnichuk
Abstract:
Relativistic runaway electron avalanches (RREA) are a widely accepted source of thunderstorm gamma-radiation. In regions of very high electric field strength, RREA can multiply via relativistic feedback, which is caused both by positron production and by the reversal of runaway-electron bremsstrahlung gamma-rays. In complex multilayer thunderstorm electric field structures, an additional reactor feedback mechanism appears due to gamma-ray exchange between separate strong-field regions with different electric field directions. Studying this reactor mechanism in conjunction with relativistic feedback by Monte Carlo simulation, or by direct solution of the kinetic Boltzmann equation, requires a significant amount of computational time. In this work, a theoretical approach to studying feedback mechanisms in RREA physics is developed. It is based on the construction of a matrix of feedback operators. With the feedback matrix, the problem of avalanche dynamics in complex electric structures is reduced to the problem of finding eigenvectors and eigenvalues. A method for calculating the matrix elements is proposed. The proposed concept was used to study the dynamics of RREAs in multilayer thunderclouds.
Keywords: terrestrial gamma-ray flashes, thunderstorm ground enhancement, relativistic runaway electron avalanches, gamma-rays, high-energy atmospheric physics, TGF, TGE, thunderstorm, relativistic feedback, reactor feedback, reactor model
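The abstract above reduces avalanche dynamics to an eigenvalue problem for a feedback matrix. The final step can be sketched numerically as follows; the interpretation of the matrix entries is an illustrative assumption, not the paper's actual feedback operators:

```python
import numpy as np

def rrea_growth(feedback_matrix):
    """Dominant eigenvalue and eigenvector of a feedback matrix.

    Entry (i, j) is assumed here to hold the mean number of secondary
    avalanches seeded in field region i per avalanche in region j (a
    hypothetical discretisation). A dominant eigenvalue above 1 would
    indicate self-sustaining avalanche multiplication; the corresponding
    eigenvector gives the asymptotic distribution over regions.
    """
    w, v = np.linalg.eig(np.asarray(feedback_matrix, dtype=float))
    k = np.argmax(np.abs(w))
    return w[k].real, v[:, k].real
```

For example, a two-region cloud with cross-feeding and self-feeding coefficients would be analysed by passing the 2x2 coefficient matrix to `rrea_growth` and checking whether the returned growth factor exceeds 1.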
Procedia PDF Downloads 172
11346 Synthetic Access to Complex Metal Carbonates and Hydroxycarbonates via Sol-Gel Chemistry
Authors: Schirin Hanf, Carlos Lizandara-Pueyo, Timmo P. Emmert, Ivana Jevtovikj, Roger Gläser, Stephan A. Schunk
Abstract:
Metal alkoxides are very versatile precursors for a broad array of complex functional materials. However, metal alkoxides, especially transition metal alkoxides, tend to form oligomeric structures due to the very strong M–O–M binding motif. This hinders their facile application in sol-gel processes and complicates access to complex carbonate or oxidic compounds after hydrolysis of the precursors. Therefore, the development of a synthetic alternative that grants access to carbonates and hydroxycarbonates from simple metal alkoxide precursors via hydrolysis is key to this project. Our approach involves the reaction of metal alkoxides with unsaturated isoelectronic molecules, such as carbon dioxide. A stoichiometric insertion of CO₂ into the alkoxide M–O bond then takes place and leads to the formation of soluble metal alkyl carbonates. This strategy is a very elegant approach to solubilizing metal alkoxide precursors to make them accessible for sol-gel chemistry. After hydrolysis of the metal alkyl carbonates, crystalline metal carbonates and hydroxycarbonates can be obtained, which were then utilized for the synthesis of Cu/Zn-based bulk catalysts for methanol synthesis. Using these catalysts, a catalytic activity comparable to commercially available MeOH catalysts could be reached. Based on these results, a complement to the traditional precipitation techniques usually utilized for the synthesis of bulk methanol catalysts has been found, based on an alternative solubilization strategy.
Keywords: metal alkoxides, metal carbonates, metal hydroxycarbonates, CO₂ insertion, solubilization
Procedia PDF Downloads 187
11345 A Copula-Based Approach for the Assessment of Severity of Illness and Probability of Mortality: An Exploratory Study Applied to Intensive Care Patients
Authors: Ainura Tursunalieva, Irene Hudson
Abstract:
Continuous improvement of both the quality and safety of health care is an important goal in Australia and internationally. The intensive care unit (ICU) receives patients with a wide variety and severity of illnesses. Accurately identifying patients at risk of developing complications or dying is crucial to increasing healthcare efficiency. Thus, it is essential for clinicians and researchers to have a robust framework capable of evaluating the risk profile of a patient, and ICU scoring systems provide such a framework. The Acute Physiology and Chronic Health Evaluation III (APACHE III) and the Simplified Acute Physiology Score II (SAPS II) are ICU scoring systems frequently used for assessing the severity of acute illness. These scoring systems collect multiple risk factors for each patient, including physiological measurements, and then render the assessment outcomes of the individual risk factors into a single numerical value; a higher score indicates a more severe patient condition. Furthermore, the Mortality Probability Model II (MPM II) uses logistic regression based on independent risk factors to predict a patient’s probability of mortality. An important overlooked limitation of SAPS II and MPM II is that, to date, they do not include interaction terms between a patient’s vital signs. This is a prominent oversight, as it is likely there is an interplay among vital signs: the co-existence of certain conditions may pose a greater health risk than when these conditions exist independently. One barrier to including such interaction terms in predictive models is dimensionality, which makes variable selection difficult. We propose an innovative scoring system which takes into account the dependence structure among a patient’s vital signs, such as systolic and diastolic blood pressures, heart rate, pulse interval, and peripheral oxygen saturation. Copulas will capture the dependence among normally distributed and skewed variables, as some of the vital sign distributions are skewed.
The estimated dependence parameter will then be incorporated into the traditional scoring systems to adjust the points allocated to the individual vital sign measurements. The same dependence parameter will also be used to create an alternative copula-based model for predicting a patient’s probability of mortality. The new copula-based approach will accommodate not only a patient’s trajectories of vital signs but also the joint dependence probabilities among the vital signs. We hypothesise that this approach will produce more stable assessments and lead to more time-efficient and accurate predictions. We will use two data sets: (1) 250 ICU patients admitted once to the Chui Regional Hospital (Kyrgyzstan) and (2) 37 ICU patients’ agitation-sedation profiles collected by the Hunter Medical Research Institute (Australia). Both the traditional scoring approach and our copula-based approach will be evaluated using the Brier score to indicate overall model performance, the concordance (or c) statistic to indicate discriminative ability (the area under the receiver operating characteristic (ROC) curve), and goodness-of-fit statistics for calibration. We will also report discrimination and calibration values and establish visualisations of the copulas and of high-dimensional regions of risk interrelating two or three vital signs, in so-called higher-dimensional ROCs.
Keywords: copula, intensive care unit scoring system, ROC curves, vital sign dependence
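The abstract above estimates a dependence parameter between vital signs via a copula. One standard way to do this for a Gaussian copula, which works for skewed margins because it uses only ranks, is the Kendall's-tau relation rho = sin(pi*tau/2); the sketch below is an illustrative estimator, not the authors' model:

```python
import math
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's rank correlation between two vital-sign series (no ties assumed)."""
    s = sum(1 if (xi - xj) * (yi - yj) > 0 else -1
            for (xi, yi), (xj, yj) in combinations(zip(x, y), 2))
    n = len(x)
    return 2 * s / (n * (n - 1))

def gaussian_copula_rho(x, y):
    """Gaussian-copula dependence parameter via rho = sin(pi * tau / 2).

    Rank-based, so it is unaffected by skewed marginal distributions,
    which the abstract notes is the case for some vital signs.
    """
    return math.sin(math.pi * kendall_tau(x, y) / 2)
```

For example, systolic and diastolic blood pressure series would be passed as `x` and `y`, and the returned rho could then adjust the points allocated to each measurement in the scoring system.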
Procedia PDF Downloads 152
11344 Mechanisms for the Art of Food: Tourism with Thainess and a Multi-Stakeholder Participation Approach
Authors: Jutamas Wisansing, Thanakarn Vongvisitsin, Udom Hongchatikul
Abstract:
Food could be used to open up a dialogue about local heritage. Contributing to the world’s sustainable consumption mission, this research aims to explore the linkages between agriculture, sense of place, and the performing arts. Thailand and its destination marketing campaign ‘Discover Thainess’ were selected as a working principle, enabling a case example of how the three elements could be conceptualized. The model offered an integrated institutional arrangement in which diverse entities could be formed to design how Thainess (local heritage) could be interpreted and embedded into an art of food. Using a case study research approach, three areas (Chiangmai, Samutsongkram and Ban Rai Gong King), representing three different scales of tourism development, were selected. Based on a theoretical analysis, a working model was formulated. Action research was then designed to test how the model could be materialized. Brainstorming elicitation and in-depth interviews were employed to reflect on how each element could be integrated. The results of this study offer an innovative view of how food tourism could be more profoundly interpreted and how tourism development could enhance value creation for agriculture-based communities. The outcomes of the research present a co-creative multi-stakeholder model and a value creation method spanning the whole supply chain of Thai gastronomy. The findings were eventually incorporated into a ‘gastro-diplomacy’ strategy for Thai tourism.
Keywords: community-based tourism, gastro-diplomacy, gastronomy tourism, sustainable tourism development
Procedia PDF Downloads 308
11343 Engineering Topology of Construction Ecology in Urban Environments: Suez Canal Economic Zone
Authors: Moustafa Osman Mohammed
Abstract:
Integrating sustainability outcomes draws attention to construction ecology in the design review of urban environments, to comply with the Earth system, which is composed of integral physical, chemical and biological components. Naturally, the exchange patterns of industrial ecology have consistent and periodic cycles that preserve energy and material flows in the Earth system. When engineering topology affects internal and external processes in system networks, it determines the valence of the first-level spatial outcome (project compatibility success). These instrumentalities depend on the related second-level outcome (participant security satisfaction). The construction ecology approach feeds back energy from resource flows between biotic and abiotic components across the Earth’s ecosystems. These spatial outcomes provide an innovation, as they entail a wide range of interactions that state, regulate and feed back “topology” to flow as an “interdisciplinary equilibrium” of ecosystems. The interrelation dynamics of ecosystems perform a process in a certain location within an appropriate time, characterizing their unique structure in “equilibrium patterns”, such as the biosphere, and collecting a composite structure of many distributed feedback flows. These interdisciplinary systems regulate their dynamics within complex structures, and these dynamic mechanisms regulate the physical and chemical properties of the ecosystem, enabling a gradual, prolonged, incremental pattern that develops into a stable structure. The engineering topology of construction ecology for integrated sustainability outcomes offers an interesting tool for ecologists and engineers in the simulation paradigm, as an initial form of development structure within compatible computer software. This approach argues from ecology, resource savings, static load design, financial and other pragmatic reasons, while from an artistic/architectural perspective these are not decisive.
The paper describes an attempt to unify analytic and analogical spatial modeling in developing urban environments as a relational setting, using optimization software, applied as an example of integrated industrial ecology in which the construction process is based on a topology optimization approach.
Keywords: construction ecology, industrial ecology, urban topology, environmental planning
Procedia PDF Downloads 130
11342 A Comparative Analysis of Da’wah Methodology Applied by the Two Variant Factions of Jama’atu Izalatil Bid’ah Wa-Iqamatis Sunnah in Nigeria
Authors: Aminu Alhaji Bala
Abstract:
The Jama’atu Izalatil Bid’ah Wa-Iqamatis Sunnah is a Da’wah organization and reform movement launched in Jos, Nigeria, in 1978 as a purely reform movement under the leadership of the late Shaykh Isma’ila Idris. The organization started full-fledged preaching sessions at national, state and local government levels immediately after its formation. The contributions of this organization to Da’wah activities in Nigeria are paramount. The organization conducted its preaching under the council of preaching, with the help of the executives, elders and patrons of the movement. Teaching and preaching have been recognized as the major programs of the society. Its preaching activities are conducted at ward, local, state and national levels throughout the states of Nigeria and beyond. It has also engaged in establishing mosques and schools, and offers sermons during Friday congregations and Eid days throughout its mosques, where the sermon is translated into the vernacular language; this has attracted many Muslims who do not understand Arabic to patronize its activities. The organization, however, split into two factions due to different approaches to Da’wah methodology and some seemingly selfish interests among its leaders. It is against this background that this research was conducted, using an analytical method to compare and contrast the Da’wah methodology applied by the two factions of the organization. The research discusses the formation and Da’wah activities of the organization, and compares and contrasts the Da’wah approach and methodology of the two factions. The research findings reveal that the different approaches and methods applied by these factions are one of the main reasons for their split, in addition to other selfish interests among the leaders.
Keywords: activities, Da’wah, methodology, organization
Procedia PDF Downloads 223
11341 3D Object Retrieval Based on Similarity Calculation in 3D Computer Aided Design Systems
Authors: Ahmed Fradi
Abstract:
Nowadays, recent technological advances in the acquisition, modeling, and processing of three-dimensional (3D) object data have led to the creation of models stored in huge databases, which are used in various domains such as computer vision, augmented reality, the game industry, medicine, CAD (computer-aided design), 3D printing, etc. On the other hand, industry currently benefits from powerful modeling tools enabling designers to easily and quickly produce 3D models. The great ease of acquisition and modeling of 3D objects makes it possible to create large 3D model databases, which then become difficult to navigate. Therefore, the indexing of 3D objects appears as a necessary and promising solution for managing this type of data: extracting model information, retrieving an existing model, or calculating the similarity between 3D objects. The objective of the proposed research is to develop a framework allowing easy and fast access to 3D objects in a CAD model database, with a specific indexing algorithm to find objects similar to a reference model. Our main objectives are to study existing methods for calculating the similarity of 3D objects (essentially shape-based methods), specifying the characteristics of each method as well as the differences between them, and then to propose a new approach for indexing and comparing 3D models, which is suitable for our case study and based on some of the previously studied methods. Our proposed approach is finally illustrated by an implementation and evaluated in a professional context.
Keywords: CAD, 3D object retrieval, shape-based retrieval, similarity calculation
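The abstract above surveys shape-based similarity methods without naming one. A classic example of the family is the D2 shape distribution: a histogram of distances between random point pairs sampled from the model, compared between models. The sketch below is illustrative (sampling counts, bin counts, and the L1-based similarity are assumed choices, not the paper's method):

```python
import math
import random

def d2_descriptor(points, n_pairs=2000, n_bins=16, seed=0):
    """D2 shape distribution: normalised histogram of distances between
    random point pairs sampled from the model (here, a point cloud).
    Normalising bins by the maximum sampled distance makes the descriptor
    scale-invariant.
    """
    rng = random.Random(seed)
    dists = [math.dist(rng.choice(points), rng.choice(points))
             for _ in range(n_pairs)]
    dmax = max(dists) or 1.0
    hist = [0] * n_bins
    for d in dists:
        hist[min(int(n_bins * d / dmax), n_bins - 1)] += 1
    return [h / n_pairs for h in hist]

def d2_similarity(a, b):
    """Similarity in [0, 1] from the L1 distance between two D2 histograms."""
    return 1.0 - 0.5 * sum(abs(x - y) for x, y in zip(a, b))
```

In a retrieval setting, each CAD model in the database would be indexed by its descriptor once, and a query model's descriptor compared against the index to rank candidates.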
Procedia PDF Downloads 262
11340 Rank-Based Chain-Mode Ensemble for Binary Classification
Authors: Chongya Song, Kang Yen, Alexander Pons, Jin Liu
Abstract:
In the field of machine learning, ensembles are commonly employed to improve performance over multiple base classifiers. However, true predictions are often canceled out by false ones during consensus, due to a phenomenon called the “curse of correlation”: strong interference among the predictions produced by the base classifiers. In addition, existing practices are still not able to effectively mitigate the problem of imbalanced classification. Based on the analysis of our experimental results, we conclude that both problems are caused by inherent deficiencies in the consensus approach. We therefore create an enhanced ensemble algorithm which adopts a designed rank-based chain-mode consensus to overcome the two problems. To evaluate the proposed ensemble algorithm, we employ the well-known benchmark data set NSL-KDD (the improved version of the KDDCup99 data set, produced by the University of New Brunswick) to make comparisons between the proposed and 8 common ensemble algorithms. In particular, each compared ensemble classifier uses the same 22 base classifiers, so that the differences in the improvements in accuracy and reliability over the base classifiers can be truly revealed. As a result, the proposed rank-based chain-mode consensus proves to be a more effective ensemble solution than the traditional consensus approach, outperforming the 8 ensemble algorithms by 20% on almost all compared metrics, which include accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve.
Keywords: consensus, curse of correlation, imbalance classification, rank-based chain-mode ensemble
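The abstract does not specify the chain-mode scheme, but one ingredient it names, rank-based consensus, can be sketched generically: replacing each base classifier's raw scores by within-classifier ranks before averaging, so one over-confident classifier cannot dominate the vote. This is an illustrative device, not the authors' algorithm:

```python
def rank_consensus(scores_per_classifier):
    """Rank-based consensus for binary classification.

    scores_per_classifier: list of score lists, one per base classifier,
    each giving the positive-class score of every sample. Scores are
    converted to within-classifier ranks, then ranks are averaged across
    classifiers. Returns the mean rank per sample (higher = more likely
    positive); a decision threshold on the mean rank yields labels.
    """
    n = len(scores_per_classifier[0])
    totals = [0.0] * n
    for scores in scores_per_classifier:
        order = sorted(range(n), key=lambda i: scores[i])
        for rank, i in enumerate(order):        # rank 0 = lowest score
            totals[i] += rank
    return [t / len(scores_per_classifier) for t in totals]
```

Because ranks discard score magnitudes, correlated but miscalibrated confidence among base classifiers (the "curse of correlation" the abstract describes) has less leverage than in a plain probability average.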
Procedia PDF Downloads 138
11339 Integrated Coastal Management for the Sustainable Development of Coastal Cities: The Case of El-Mina, Tripoli, Lebanon
Authors: G. Ghamrawi, Y. Abunnasr, M. Fawaz, S. Yazigi
Abstract:
Coastal cities are constantly exposed to environmental degradation and economic regression, fueled by rapid and uncontrolled urban growth as well as continuous resource depletion. This is the case of the city of El-Mina in Tripoli (Lebanon), where a lack of awareness of the need to preserve social, ecological, and historical assets, coupled with increasing development pressures, is threatening the socioeconomic status of the city’s residents, the quality of life, and accessibility to the coast. To address these challenges, a holistic coastal urban design and planning approach was developed to analyze the environmental, political, legal, and socioeconomic context of the city. This approach investigates the potential of balancing urban development with the protection and enhancement of cultural, ecological, and environmental assets under an integrated coastal zone management (ICZM) approach. The analysis of El-Mina’s different sectors adopted several tools, including direct field observation, interviews with stakeholders, and analysis of available data, historical maps, and previously proposed projects. The findings from the analysis were mapped and graphically represented, allowing the recognition of character zones that become the design intervention units. Consequently, the thesis proposes an urban, city-scale intervention that identifies six different character zones (the historical fishing port, Abdul Wahab island, the abandoned Port Said, Hammam el Makloub, the sand beach, and the new developable area) and proposes context-specific design interventions that capitalize on the main characteristics of each zone. Moreover, the intervention builds on the institutional framework of ICZM as well as other studies previously conducted for the coast, and adopts nature-based solutions with hybrid systems to provide better environmental design solutions for developing the coast.
This enables the realization of an all-inclusive, well-connected shoreline with easy and free access to the sea: a developed shoreline with an active local economy and an improved urban environment.
Keywords: blue-green infrastructure, coastal cities, hybrid solutions, integrated coastal zone management, sustainable development, urban planning
Procedia PDF Downloads 156
11338 Artificial Cells Capable of Communication by Using Polymer Hydrogel
Authors: Qi Liu, Jiqin Yao, Xiaohu Zhou, Bo Zheng
Abstract:
The first artificial cell was produced by Thomas Chang in the 1950s when he was trying to make a mimic of red blood cells. Since then, many different types of artificial cells have been constructed via one of two approaches: a so-called bottom-up approach, which aims to create a cell from scratch, and a top-down approach, in which genes are sequentially knocked out from an organism until only the minimal genome required for sustaining life remains. In this project, the bottom-up approach was used to build a new cell-free expression system: an artificial cell mimic capable of protein expression and of communicating with other cells. Artificial cells constructed by the bottom-up approach are usually lipid vesicles, polymersomes, hydrogels, or aqueous droplets containing the nucleic acids and transcription-translation machinery. However, lipid-vesicle-based artificial cells capable of communication present several issues for cell-communication research: (1) lipid vesicles normally lose important functions such as protein expression within a few hours; (2) the lipid membrane allows the permeation of only small molecules, limiting the types of molecules that can be sensed and released to the surrounding environment for chemical communication; and (3) lipid vesicles are prone to rupture due to imbalances in osmotic pressure. To address these issues, hydrogel-based artificial cells were constructed in this work. To construct the artificial cell, a polyacrylamide hydrogel was functionalized with an Acrylate PEG Succinimidyl Carboxymethyl Ester (ACLT-PEG2000-SCM) moiety on the polymer backbone. Proteinaceous factors can then be immobilized on the polymer backbone by the reaction between the primary amines of proteins and the N-hydroxysuccinimide esters (NHS esters) of ACLT-PEG2000-SCM; the plasmid template and ribosomes were encapsulated inside the hydrogel particles.
Because the artificial cell can continuously express protein given a supply of nutrients and energy, artificial cell-artificial cell and artificial cell-natural cell communication can be achieved by combining the artificial cell vector with designed plasmids. The plasmids were designed with reference to the quorum sensing (QS) system of bacteria, which relies largely on cognate acyl-homoserine lactone (AHL)/transcription pairs. In one communication pair, the “sender” is the artificial or natural cell that produces the AHL signalling molecule, by synthesizing the corresponding signal synthase that catalyzes the conversion of S-adenosyl-L-methionine (SAM) into AHL, while the “receiver” is the artificial or natural cell that senses the quorum sensing signalling molecule from the sender and in turn expresses the gene of interest. In the experiment, GFP was first immobilized inside the hydrogel particle to show that the functionalized hydrogel particles could be used for protein binding. After that, successful artificial cell-artificial cell and artificial cell-natural cell communication was demonstrated, observed as an increase in the recorded fluorescence signal. The hydrogel-based artificial cell designed in this work can help in studying complex communication systems in bacteria, and it can be further developed for therapeutic applications.
Keywords: artificial cell, cell-free system, gene circuit, synthetic biology
Procedia PDF Downloads 152
11337 Measuring the Extent of Equalization in Fiscal Transfers in India: An Index-Based Approach
Authors: Ragini Trehan, D.K. Srivastava
Abstract:
In the post-planning era, India’s fiscal transfers from the central to state governments are solely determined by the Finance Commissions (FCs). While in some well-established federations, such as Australia, Canada, and Germany, equalization serves as the guiding principle of fiscal transfers and is constitutionally mandated, in India it is not explicitly mandated, and FCs attempt to implement it indirectly through a combination of a formula-based share in the divisible pool of central taxes supplemented by a set of grants. In this context, it is important to measure the extent of equalization achieved through FC transfers, with a view to improving the design of such transfers. This study uses an index-based methodology for measuring the degree of equalization achieved through FC transfers, covering the period from FC12 to the first year of FC15, spanning 2005-06 to 2020-21. The ‘Index of Equalization’ shows that the extent of equalization has remained low, in the range of 30% to 37%, for the four Commission periods under review. The highest degree of equalization, at 36.7%, was witnessed in the FC12 period, and the lowest, at 29.5%, was achieved during the FC15(1) period. The equalizing efficiency of recommended transfers also shows a consistent fall, from 11.4% in the FC12 period to 7.5% by the FC15(1) period. Further, considering progressivity in fiscal transfers as a special case of equalizing transfers, this study shows that the scheme of per capita total transfers, when determined using the equalization approach, is more progressive and is characterized by minimal deviations compared to the profile of transfers recommended by recent FCs.
Keywords: fiscal transfers, index of equalization, equalizing efficiency, fiscal capacity, expenditure needs, Finance Commission, tax effort
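The abstract above does not state the formula behind its Index of Equalization. To fix intuition for what such an index measures, here is one hypothetical construction, the percentage reduction in the spread of per-capita fiscal capacity after transfers; this is explicitly NOT the authors' formula, only an illustration of an index bounded between 0% (no equalization) and 100% (full equalization):

```python
def equalization_index(capacity_before, transfers):
    """Hypothetical equalization index (an illustration, not the paper's).

    capacity_before: per-capita fiscal capacity of each state pre-transfer.
    transfers: per-capita transfer received by each state.
    Returns the percentage reduction in mean absolute deviation of
    per-capita capacity achieved by the transfers.
    """
    after = [c + t for c, t in zip(capacity_before, transfers)]

    def spread(xs):
        m = sum(xs) / len(xs)
        return sum(abs(x - m) for x in xs) / len(xs)

    s0, s1 = spread(capacity_before), spread(after)
    return 100.0 * (1.0 - s1 / s0) if s0 else 0.0
```

Under this toy definition, a transfer scheme that halves the dispersion of fiscal capacity scores 50%, in the same spirit as the 30%-37% range the study reports for FC transfers.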
Procedia PDF Downloads 74
11336 Systems Approach on Thermal Analysis of an Automatic Transmission
Authors: Sinsze Koo, Benjin Luo, Matthew Henry
Abstract:
In order to increase the performance of an automatic transmission, the automatic transmission fluid is required to be warmed up to an optimal operating temperature. In a conventional vehicle, cold starts result in friction losses in the gearbox and engine. The stop-and-go nature of city driving dramatically affects the warm-up of the engine oil and automatic transmission fluid and delays the time needed to reach an optimal operating temperature. This temperature phenomenon impacts both engine and transmission performance, and also increases fuel consumption and CO2 emissions. The aim of this study is to develop know-how of the thermal behavior in order to identify thermal impacts and functional principles in automatic transmissions. Thermal behavior was studied using models and simulations, developed using GT-Suite, of one-dimensional thermal and flow transport. The powertrain of a conventional vehicle was modeled in order to emphasize the thermal phenomena occurring in the various components and how they impact automatic transmission performance. The simulation demonstrates the thermal model of a transmission fluid cooling system and its component parts during warm-up after a cold start. The results of these analyses will support future designs of transmission systems and components in an attempt to obtain better fuel efficiency and transmission performance, and could identify ways to improve existing thermal management techniques with a priority on fuel efficiency.
Keywords: thermal management, automatic transmission, hybrid, and systematic approach
Procedia PDF Downloads 377
11335 Design and Optimization of Open Loop Supply Chain Distribution Network Using Hybrid K-Means Cluster Based Heuristic Algorithm
Authors: P. Suresh, K. Gunasekaran, R. Thanigaivelan
Abstract:
Radio frequency identification (RFID) technology has been attracting considerable attention with the expectation of improved supply chain visibility for consumer goods, apparel, and pharmaceutical manufacturers, as well as retailers and government procurement agencies. It is also expected to improve the consumer shopping experience by making it more likely that the products they want to purchase are available. Recent announcements from some key retailers have brought interest in RFID to the forefront. A modified K-Means Cluster based Heuristic approach, a Hybrid Genetic Algorithm (GA)-Simulated Annealing (SA) approach, a Hybrid K-Means Cluster based Heuristic-GA, and a Hybrid K-Means Cluster based Heuristic-GA-SA for the Open Loop Supply Chain Network problem are proposed. The study incorporated a uniform crossover operator and a combined crossover operator in the GAs for solving the open loop supply chain distribution network problem. The algorithms are tested on 50 randomly generated data sets and compared with each other. The results of the numerical experiments show that the Hybrid K-Means Cluster based Heuristic-GA-SA, when tested on the 50 randomly generated data sets, shows superior performance to the other methods for solving the open loop supply chain distribution network problem.
Keywords: RFID, supply chain distribution network, open loop supply chain, genetic algorithm, simulated annealing
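The hybrid heuristics above all start from a K-Means clustering stage, which groups demand points so that later GA/SA stages can optimize within and between clusters. The clustering stage alone can be sketched as follows (an illustrative plain k-means over 2D customer coordinates; the GA and SA refinement stages, and the authors' actual data model, are omitted):

```python
import math
import random

def kmeans(points, k, n_iters=50, seed=0):
    """Plain k-means over customer coordinates.

    Returns (centroids, clusters): candidate distribution-center sites and
    the customer points assigned to each. An empty cluster keeps its old
    centroid. This is only the clustering stage of the hybrid heuristic.
    """
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(n_iters):
        # assign each point to its nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[j].append(p)
        # move each centroid to the mean of its cluster
        centroids = [
            (sum(x for x, _ in cl) / len(cl), sum(y for _, y in cl) / len(cl))
            if cl else centroids[j]
            for j, cl in enumerate(clusters)
        ]
    return centroids, clusters
```

In the hybrid schemes, the cluster assignment produced here would seed the GA population (and, in the GA-SA variant, the SA starting solution) rather than being used directly.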
Procedia PDF Downloads 165
11334 Effects of Fe Addition and Process Parameters on the Wear and Corrosion Characteristics of Icosahedral Al-Cu-Fe Coatings on Ti-6Al-4V Alloy
Authors: Olawale S. Fatoba, Stephen A. Akinlabi, Esther T. Akinlabi, Rezvan Gharehbaghi
Abstract:
The performance of material surfaces under wear and corrosion environments cannot be ensured by conventional surface modifications and coatings; therefore, different industrial sectors need an alternative technique for enhanced surface properties. Titanium and its alloys possess poor tribological properties, which limits their use in certain industries. This paper focuses on the effect of hybrid Al-Cu-Fe coatings on a grade five titanium alloy using the laser metal deposition (LMD) process. Icosahedral Al-Cu-Fe quasicrystals are a relatively new class of materials which exhibit an unusual atomic structure and useful physical and chemical properties. A 3 kW continuous wave ytterbium laser system (YLS), attached to a KUKA robot which controls the movement of the cladding process, was utilized for the fabrication of the coatings. The titanium-cladded surfaces were investigated for hardness, corrosion and tribological behaviour at different laser processing conditions. The samples were cut into corrosion coupons and immersed in a 3.65% NaCl solution at 28 °C, using Electrochemical Impedance Spectroscopy (EIS) and Linear Polarization (LP) techniques. The cross-sectional view of the samples was analysed. It was found that the geometrical properties of the deposits, such as the width, height and heat-affected zone (HAZ) of each sample, remarkably increased with increasing laser power due to the laser-material interaction. A higher amount of aluminum and titanium was observed in the formation of the composite. The indentation testing reveals that, for both scanning speeds of 0.8 m/min and 1 m/min, the mean hardness value decreases with increasing laser power. The low coefficient of friction, excellent wear resistance and high microhardness were attributed to the formation of hard intermetallic compounds (TiCu, Ti2Cu, Ti3Al, Al3Ti) produced through in situ metallurgical reactions during the LMD process.
The load-bearing capability of the substrate was improved due to the excellent wear resistance of the coatings. The cladded layer showed a uniform crack-free surface due to optimized laser process parameters, which led to the refinement of the coatings.
Keywords: Al-Cu-Fe coating, corrosion, intermetallics, laser metal deposition, Ti-6Al-4V alloy, wear resistance
Procedia PDF Downloads 178
11333 An Assessment of the Role of Actors in the Medical Waste Management Policy-Making Process of Bangladesh
Authors: Md Monirul Islam, Shahaduz Zaman, Mosarraf H. Sarker
Abstract:
Context: Medical waste management (MWM) is a critical sector in Bangladesh due to its impact on human health and the environment. There is a need to assess the current policies and identify the role of policy actors in the policy formulation and implementation process. Research Aim: The study aimed to evaluate the role of policy actors in the medical waste management policy-making process in Bangladesh, identify policy gaps, and provide actionable recommendations for improvement. Methodology: The study adopted a qualitative research method and conducted key informant interviews. The data collected were analyzed using the thematic coding approach through Atlas.ti software. Findings: The study found that policies are formulated at higher administrative levels and implemented in a top-down approach. Higher-level institutions predominantly contribute to policy development, while lower-level institutions focus on implementation. However, due to negligence, ignorance, and lack of coordination, medical waste management receives insufficient attention from the actors. The study recommends immediate strategies, a comprehensive action plan, regular policy updates, and inter-ministerial meetings to enhance medical waste management practices and interventions. Theoretical Importance: The research contributes to evaluating the role of policy actors in medical waste management policymaking and implementation in Bangladesh. It identifies policy gaps and provides actionable recommendations for improvement. Data Collection: The study used key informant interviews as the data collection method. Thirty-six participants were interviewed, including influential policymakers and representatives of various administrative spheres. Analysis Procedures: The data collected were analyzed using the inductive thematic analysis approach. Question Addressed: The study aimed to assess the role of policy actors in medical waste management policymaking and implementation in Bangladesh. 
Conclusion: In conclusion, the study provides insights into the current medical waste management policy in Bangladesh, the role of policy actors in policy formulation and implementation, and the need for improved strategies and policy updates. The findings of this study can guide future policy-making efforts to enhance medical waste management practices and interventions in Bangladesh.
Keywords: key informant, medical waste management, policy maker, qualitative study
Procedia PDF Downloads 81
11332 Characterization of Articular Cartilage Based on the Response of Cartilage Surface to Loading/Unloading
Authors: Z. Arabshahi, I. Afara, A. Oloyede, H. Moody, J. Kashani, T. Klein
Abstract:
Articular cartilage is a fluid-swollen tissue of synovial joints that functions by providing a lubricated surface for articulation and facilitating load transmission. The biomechanical function of this tissue is highly dependent on the integrity of its ultrastructural matrix. Any alteration of the articular cartilage matrix, whether by injury or by degenerative conditions such as osteoarthritis (OA), compromises its functional behaviour. The assessment of articular cartilage is therefore important in the early stages of the degenerative process to prevent or reduce further joint damage and its associated socio-economic impact. Accordingly, there has been increasing research interest in the functional assessment of articular cartilage. This study developed a characterization parameter for articular cartilage assessment based on the response of the cartilage surface to loading/unloading. This is because the response of articular cartilage to compressive loading is significantly depth-dependent, where the superficial zone and the underlying matrix respond differently to deformation. In addition, the alteration of the cartilage matrix in the early stages of degeneration is often characterized by PG loss in the superficial layer. In this study, it is hypothesized that the response of the superficial layer differs between normal and proteoglycan-depleted tissue. To establish the viability of this hypothesis, samples of visually intact and artificially proteoglycan-depleted bovine cartilage were subjected to compression at a constant rate to 30 percent strain using a ring-shaped indenter with an integrated ultrasound probe, and then unloaded. The response of the articular surface, which was indirectly loaded, was monitored using ultrasound during loading/unloading (deformation/recovery). It was observed that the rate of cartilage surface response to loading/unloading was different for normal and PG-depleted cartilage samples. 
Principal Component Analysis was performed to assess the capability of the cartilage surface response to loading/unloading to distinguish between normal and artificially degenerated cartilage samples. The classification analysis of this parameter showed an overlap between normal and degenerated samples during loading, while there was a clear distinction between them during unloading. This study showed that the cartilage surface response to loading/unloading has the potential to be used as a parameter for cartilage assessment.
Keywords: cartilage integrity parameter, cartilage deformation/recovery, cartilage functional assessment, ultrasound
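The PCA step described above can be sketched with a plain SVD-based projection. The response curves below are synthetic stand-ins (the study's ultrasound measurements are not reproduced here): a "normal" group that recovers faster than a "degenerated" one, with invented amplitudes and noise.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project rows of X onto the leading principal components (SVD-based)."""
    Xc = X - X.mean(axis=0)                   # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T         # sample coordinates in PC space
    explained = S**2 / np.sum(S**2)           # variance fraction per component
    return scores, explained[:n_components]

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
# hypothetical surface-response curves: "normal" recovers faster than
# "degenerated" tissue; shapes and noise level are invented for the sketch
normal = np.exp(-5.0 * t) + 0.01 * rng.standard_normal((10, 50))
degenerated = np.exp(-2.0 * t) + 0.01 * rng.standard_normal((10, 50))
X = np.vstack([normal, degenerated])

scores, explained = pca_scores(X)
# the two groups separate along the first principal component
```

In this toy setting the first component captures the recovery-rate difference, mirroring the clear normal/degenerated separation the study reports during unloading.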
Procedia PDF Downloads 192
11331 Bio-Remediation of Lead-Contaminated Water Using Adsorbent Derived from Papaya Peel
Authors: Sahar Abbaszadeh, Sharifah Rafidah Wan Alwi, Colin Webb, Nahid Ghasemi, Ida Idayu Muhamad
Abstract:
Toxic heavy metal discharges into the environment due to rapid industrialization are a serious pollution problem that has drawn global attention to their adverse impacts on both the structure of ecological systems and human health. Lead, a toxic element that bio-accumulates through the food chain, regularly enters water bodies from the discharges of industries such as plating, mining, battery manufacture and paint manufacture. The application of conventional methods to decrease and remove Pb(II) ions from wastewater is often restricted by technical and economic constraints. Therefore, the use of various agro-wastes as low-cost bioadsorbents is attractive, since they are abundantly available and cheap. In this study, activated carbon of papaya peel (AC-PP), a locally available agricultural waste, was employed to evaluate its Pb(II) uptake capacity from single-solute solutions in sets of batch-mode experiments. To assess the surface characteristics of the adsorbents, scanning electron microscopy (SEM) coupled with energy-dispersive X-ray (EDX) analysis and Fourier transform infrared spectroscopy (FT-IR) were utilized. The amount of Pb(II) removed was determined by atomic absorption spectrometry (AAS). The effects of pH, contact time, initial Pb(II) concentration and adsorbent dosage were investigated. A pH value of 5 was observed to be the optimum solution pH. The optimum initial concentration of Pb(II) in the solution for AC-PP was found to be 200 mg/l, where the amount of Pb(II) removed was 36.42 mg/g. At an agitation time of 2 h, the adsorption processes using a 100 mg dosage of AC-PP reached equilibrium. The experimental results exhibit the high capacity and metal affinity of the modified papaya peel waste, with a removal efficiency of 93.22%. The evaluation results show that the equilibrium adsorption of Pb(II) was best expressed by the Freundlich isotherm model (R² > 0.93). 
The experimental results confirmed that AC-PP can potentially be employed as an alternative adsorbent for Pb(II) uptake from industrial wastewater in the design of an environmentally friendly yet economical wastewater treatment process.
Keywords: activated carbon, bioadsorption, lead removal, papaya peel, wastewater treatment
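The Freundlich fit reported above (R² > 0.93) is commonly obtained by linearizing qe = K_F·Ce^(1/n) in log-log space. A minimal sketch with illustrative, hypothetical equilibrium data (the paper's actual measurements are not reproduced):

```python
import numpy as np

# Hypothetical equilibrium data (Ce in mg/L, qe in mg/g), invented for
# illustration; not the paper's measurements.
Ce = np.array([10.0, 25.0, 50.0, 100.0, 200.0])
qe = np.array([6.2, 10.8, 16.5, 24.9, 36.4])

# Freundlich isotherm: qe = K_F * Ce**(1/n)
# log-linearized:      log10(qe) = log10(K_F) + (1/n) * log10(Ce)
slope, intercept = np.polyfit(np.log10(Ce), np.log10(qe), 1)
K_F, n = 10.0**intercept, 1.0 / slope

# coefficient of determination of the linearized fit
pred = intercept + slope * np.log10(Ce)
ss_res = np.sum((np.log10(qe) - pred) ** 2)
ss_tot = np.sum((np.log10(qe) - np.log10(qe).mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
```

A Freundlich exponent n > 1, as recovered here, is conventionally read as favourable adsorption on a heterogeneous surface.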
Procedia PDF Downloads 286
11330 Numerical Simulation of Lifeboat Launching Using Overset Meshing
Authors: Alok Khaware, Vinay Kumar Gupta, Jean Noel Pederzani
Abstract:
Lifeboat launching from a marine vessel or offshore platform is one of the important areas of research in offshore applications. With the advancement of computational fluid dynamics (CFD) technology to solve fluid-induced motions coupled with a six-degree-of-freedom (6DOF) rigid body dynamics solver, it is now possible to predict the motion of a lifeboat precisely in different challenging conditions. Traditionally, a dynamic remeshing approach is used to solve this kind of problem, but remeshing has bottlenecks in maintaining good mesh quality in transient moving-mesh cases. In the present study, an overset method with higher-order interpolation is used to simulate a lifeboat launched from an offshore platform into calm water, and the volume of fluid (VOF) method is used to track the free surface. An overset mesh consists of a set of overlapping component meshes, which allows complex geometries to be meshed with less effort. A good-quality mesh with local refinement is generated at the beginning of the simulation and stays unchanged throughout the simulation. Overset mesh accuracy depends on a precise interpolation technique; the present study includes a robust and accurate least-squares interpolation method, and the results obtained with the overset mesh show good agreement with experiment.
Keywords: computational fluid dynamics, free surface flow, lifeboat launching, overset mesh, volume of fluid
Procedia PDF Downloads 277
11329 Prediction of Slaughter Body Weight in Rabbits: Multivariate Approach through Path Coefficient and Principal Component Analysis
Authors: K. A. Bindu, T. V. Raja, P. M. Rojan, A. Siby
Abstract:
The multivariate path coefficient approach was employed to study the effects of various production and reproduction traits on the slaughter body weight of rabbits. Information on 562 rabbits maintained at the university rabbit farm attached to the Centre for Advanced Studies in Animal Genetics and Breeding, Kerala Veterinary and Animal Sciences University, Kerala State, India, was utilized. The manifest variables used in the study were age and weight of dam, birth weight, litter size at birth and weaning, and weights at the first, second and third months. The linear multiple regression analysis was performed with the slaughter weight as the dependent variable and the remaining traits as independent variables. The model explained 48.60 percent of the total variation present in the market weight of the rabbits. Even though the model was significant, the standardized beta coefficients for the independent variables, viz., age and weight of the dam, birth weight and litter sizes at birth and weaning, were less than one, indicating their negligible influence on the slaughter weight. However, the standardized beta coefficient of the second-month body weight was the highest, followed by that of the first-month weight, indicating their major role in determining the market weight. All the other factors exert their influence indirectly, only through these two variables. Hence it was concluded that the slaughter body weight can be predicted using the first- and second-month body weights. Principal components were also developed so as to achieve more accuracy in the prediction of the market weight of rabbits.
Keywords: component analysis, multivariate, slaughter, regression
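Standardized beta coefficients of the kind compared above are simply OLS slopes after z-scoring all variables. A minimal sketch on synthetic weights whose structure merely mimics the described setting (all numbers invented, not the farm data):

```python
import numpy as np

def standardized_betas(X, y):
    """OLS on z-scored variables; the slopes are standardized beta coefficients."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)
    yz = (y - y.mean()) / y.std()
    A = np.column_stack([np.ones(len(yz)), Xz])
    beta, *_ = np.linalg.lstsq(A, yz, rcond=None)
    return beta[1:]  # drop the intercept, which is ~0 after standardization

rng = np.random.default_rng(1)
m = 200
# hypothetical traits, invented for the sketch
birth = rng.normal(50.0, 5.0, m)                 # birth weight
m1 = 4.0 * birth + rng.normal(0.0, 20.0, m)      # first-month weight
m2 = 2.0 * m1 + rng.normal(0.0, 30.0, m)         # second-month weight
slaughter = 0.1 * birth + 0.3 * m1 + 1.2 * m2 + rng.normal(0.0, 50.0, m)

betas = standardized_betas(np.column_stack([birth, m1, m2]), slaughter)
# the second-month weight should carry the largest standardized beta
```

Because the variables are z-scored, the betas are directly comparable across traits, which is what lets the abstract rank the second-month weight above the others.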
Procedia PDF Downloads 165
11328 A Comparative Soft Computing Approach to Supplier Performance Prediction Using GEP and ANN Models: An Automotive Case Study
Authors: Seyed Esmail Seyedi Bariran, Khairul Salleh Mohamed Sahari
Abstract:
In multi-echelon supply chain networks, optimal supplier selection significantly depends on the accuracy of suppliers' performance prediction. Different multi-criteria decision-making methods such as ANN, GA, fuzzy logic, AHP, etc. have previously been used to predict supplier performance, but the "black-box" characteristic of these methods remains a major concern to be resolved. Therefore, the primary objective of this paper is to implement an artificial intelligence-based gene expression programming (GEP) model and compare its prediction accuracy with that of an ANN. A full factorial design with a 95% confidence interval is initially applied to determine the appropriate set of criteria for supplier performance evaluation. A train-test approach is then applied to the ANN and GEP separately. The training results are used to find the optimal network architecture, and the testing data determine the prediction accuracy of each method based on the root mean square error (RMSE) and the coefficient of determination (R²). The results of a case study conducted in Supplying Automotive Parts Co. (SAPCO), with more than 100 local and foreign supply chain members, revealed that, in comparison with the ANN, gene expression programming has a significant advantage in predicting supplier performance, as shown by the respective RMSE and R² values. Moreover, using GEP, a mathematical function was also derived that resolves the black-box issue of the ANN in modeling the performance prediction.
Keywords: supplier performance prediction, ANN, GEP, automotive, SAPCO
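The two comparison measures named above are straightforward to compute on held-out predictions. A small sketch with hypothetical supplier scores (the SAPCO data are not reproduced; the two prediction vectors are invented):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error between observed and predicted values."""
    diff = np.asarray(y_true, float) - np.asarray(y_pred, float)
    return float(np.sqrt(np.mean(diff ** 2)))

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

# hypothetical held-out supplier scores and two models' predictions
actual = [78, 85, 62, 90, 71]
gep    = [76, 86, 64, 88, 73]   # closer to actual -> lower RMSE, higher R^2
ann    = [72, 90, 58, 95, 66]
```

The model with lower RMSE and higher R² on the test set wins the comparison, which is the criterion the abstract applies to GEP versus ANN.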
Procedia PDF Downloads 419
11327 Gas Network Noncooperative Game
Authors: Teresa Azevedo Perdicoúlis, Paulo Lopes dos Santos
Abstract:
The conceptualisation of the network optimisation problem as a noncooperative game sets up a holistic interactive approach that brings together different network features (e.g., compressor stations, sources, and pipelines, in the gas context) whose optimisation objectives differ, so that a single optimisation procedure becomes possible without having to feed results from diverse software packages into each other. A mathematical model of this type, where independent entities take action, offers the ideal modularity and subsequent problem decomposition with a view to designing a decentralised algorithm to optimise the operation and management of the network. In a game framework, compressor stations and sources are understood as players which communicate through network connectivity constraints, i.e., the pipeline model. That is, in a scheme similar to tâtonnement, the players propose their best settings and then interact to check for network feasibility. The resulting degree of network infeasibility informs the players about the 'quality' of their settings, and this two-phase iterative scheme is repeated until a global optimum is obtained. Due to network transients, the optimisation needs to be assessed at different points of the control interval. For this reason, the proposed approach to optimisation has two stages: (i) the first stage computes along the period of optimisation in order to fulfil the requirement just mentioned; (ii) the second stage is initialised with the solution found at the first stage and computes at the end of the period of optimisation to rectify that solution. The viability of the proposed scheme is demonstrated on an abstract prototype and three example networks.
Keywords: connectivity matrix, gas network optimisation, large-scale, noncooperative game, system decomposition
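The tâtonnement-like loop — players propose settings, infeasibility is fed back, repeat — can be illustrated on a toy quadratic game. The cost curvatures, demand, and price mechanism below are invented for the sketch and are not the paper's gas model:

```python
import numpy as np

# Two "players" (e.g. sources) with hypothetical quadratic operating costs
# 0.5 * k_i * x_i**2, coupled by a network demand constraint x1 + x2 = d.
# A coordination price p is adjusted from the degree of infeasibility,
# in the spirit of a tatonnement scheme.
k = np.array([1.0, 2.0])   # invented cost curvatures
d = 10.0                   # demand the network must satisfy
p, step = 0.0, 0.5

for _ in range(100):
    x = p / k                  # each player's best response to the price
    unmet = d - x.sum()        # network (in)feasibility fed back to players
    if abs(unmet) < 1e-9:
        break
    p += step * unmet          # raise the price while demand is unmet

# at equilibrium the cheaper player supplies the larger share
```

The loop converges to the settings where marginal costs equalise, which is the fixed point the two-phase scheme in the abstract iterates towards.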
Procedia PDF Downloads 152
11326 The Role of the University of Zululand in Documenting and Disseminating Indigenous Knowledge, in KwaZulu-Natal, South Africa
Authors: Smiso Buthelezi, Petros Dlamini, Dennis Ocholla
Abstract:
The study assesses the University of Zululand's practices for documenting, sharing, and accessing indigenous knowledge. It was guided by two research objectives: to determine how indigenous knowledge (IK) is developed at the University of Zululand, and how it is documented there. The study adopted both interpretive and positivist research paradigms and, accordingly, used both qualitative and quantitative research methods. The qualitative approach collected data from academic and non-academic staff members: interviews were conducted with 18 academic staff members and 5 support staff members. The quantitative approach collected data from IK-related theses and dissertations in the University of Zululand Institutional Repository between 2009 and 2019. The results revealed that many departments across the University of Zululand were involved in creating IK-related content, with the Department of African Languages noted as the most involved. Moreover, documentation of IK-related content at the University of Zululand takes place frequently but is not widely known. The creation and documentation of indigenous knowledge by different departments faced several challenges, commonly a lack of interest among IK owners in sharing their knowledge, the local language as a barrier, and a shortage of proper tools for recording and capturing IK. One of the study's recommendations is that an indigenous knowledge systems (IKS) policy be put in place at the University of Zululand.
Keywords: knowledge creation, SECI model, information and communication technology, indigenous knowledge
Procedia PDF Downloads 113
11325 Video Compression Using Contourlet Transform
Authors: Delara Kazempour, Mashallah Abasi Dezfuli, Reza Javidan
Abstract:
Video compression is used for channels with limited bandwidth and for storage devices of limited capacity. One of the most popular approaches in video compression is the usage of different transforms. The discrete cosine transform is one such method, but it suffers from problems such as blocking and noise artifacts and high distortion, which adversely affect the compression ratio. The wavelet transform is another approach that balances compression and quality better than cosine transforms, but its ability to represent curve curvature is limited. Because of the importance of compression and the problems of the cosine and wavelet transforms, the contourlet transform has become popular in video compression. In the newly proposed method, we used the contourlet transform for video image compression. The contourlet transform can preserve details of the image better than the previous transforms because it is multi-scale and oriented, and it can capture discontinuities such as edges. With this approach we lose less data than with previous approaches. The contourlet transform captures the discrete spatial structure and is useful for representing two-dimensional smooth images. It produces compressed images with a high compression ratio along with texture and edge preservation. Finally, the results show that for the majority of the images, the mean square error and peak signal-to-noise ratio of the new contourlet-based method are improved compared to the wavelet transform, but for most of the images, these parameters are better for the cosine transform than for the contourlet-based method.
Keywords: video compression, contourlet transform, discrete cosine transform, wavelet transform
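The two quality measures used in the comparison can be computed as follows. The frames below are random toy data, not the study's test videos:

```python
import numpy as np

def mse(original, reconstructed):
    """Mean square error between two frames."""
    diff = original.astype(np.float64) - reconstructed.astype(np.float64)
    return float(np.mean(diff ** 2))

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB for `peak`-valued (e.g. 8-bit) frames."""
    m = mse(original, reconstructed)
    return float('inf') if m == 0 else float(10.0 * np.log10(peak ** 2 / m))

# toy 8-bit "frames": a reference block and a lossily reconstructed one
rng = np.random.default_rng(2)
frame = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)
noise = rng.integers(-4, 5, size=(64, 64))
noisy = np.clip(frame.astype(int) + noise, 0, 255).astype(np.uint8)
```

Lower MSE corresponds to higher PSNR, so the two metrics always rank a pair of codecs consistently on the same frame.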
Procedia PDF Downloads 444
11324 Disaster Management Approach for Planning an Early Response to Earthquakes in Urban Areas
Authors: Luis Reynaldo Mota-Santiago, Angélica Lozano
Abstract:
Determining appropriate measures to face earthquakes is a challenge for practitioners. In the literature, some analyses consider disaster scenarios while disregarding important field characteristics. Sometimes, software that estimates the number of victims and infrastructure damage is used. Other times, historical information on previous events is used, or the scenarios' information is assumed to be available even though it usually is not in practice. Humanitarian operations start immediately after an earthquake strikes, and the first hours of relief efforts are important; local efforts are critical to assess the situation and deliver relief supplies to the victims. One preparation action is prepositioning stockpiles, most of them at central warehouses placed away from damage-prone areas, which requires large facilities and budgets. Usually, the decisions in the first 12 hours after the disaster, the standard relief time (SRT), are the location of temporary depots and the design of distribution paths. The motivation for this research was the delay in the reaction time of the early relief efforts, which led to the late arrival of aid in some areas after the magnitude 7.1 earthquake in Mexico City in 2017. Hence, a preparation approach for planning the immediate response to earthquake disasters is proposed, intended for local governments, considering their capabilities for planning and for responding during the SRT, in order to reduce the start-up time of immediate response operations in urban areas. The first steps are the generation and analysis of disaster scenarios, which allow estimating the relief demand before and in the early hours after an earthquake. 
The scenarios can be based on historical data and/or the seismic hazard analysis of an Atlas of Natural Hazards and Risk, as a way to address the limited or null available information. The following steps include the decision processes for: a) locating local depots (places for prepositioning stockpiles) and aid-giving facilities as close as possible to risk areas; and b) designing the vehicle paths for aid distribution (from local depots to the aid-giving facilities), which can be used at the beginning of the response actions. This approach speeds up the delivery of aid in the early moments of the emergency, which could reduce the suffering of the victims while allowing additional time to integrate a broader and more streamlined response (according to new information) from national and international organizations into these efforts. The proposed approach is applied to two case studies in Mexico City, in areas that were affected by the 2017 earthquake and received limited aid. The approach generates disaster scenarios in an easy way and plans a faster early response with a small quantity of stockpiles, which can be managed in the early hours of the emergency by local governments. Considering long-term storage, the estimated quantities of stockpiles require a limited budget to maintain and a small storage space. These stockpiles are also useful for addressing other kinds of emergencies in the area.
Keywords: disaster logistics, early response, generation of disaster scenarios, preparation phase
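The depot-location step (a) can be sketched as a greedy covering heuristic: repeatedly open the candidate site that reaches the most still-uncovered demand points. The coordinates, sites, and radius below are hypothetical, and the paper's actual location model may differ:

```python
import math

def greedy_depots(demand_points, candidate_sites, radius):
    """Greedy cover: repeatedly open the candidate depot covering the most
    still-uncovered demand points within `radius` (straight-line distance)."""
    uncovered = set(range(len(demand_points)))
    opened = []
    while uncovered:
        best, best_cover = None, set()
        for j, site in enumerate(candidate_sites):
            cover = {i for i in uncovered
                     if math.dist(site, demand_points[i]) <= radius}
            if len(cover) > len(best_cover):
                best, best_cover = j, cover
        if best is None:   # some demand point is out of reach of every site
            break
        opened.append(best)
        uncovered -= best_cover
    return opened, uncovered

# hypothetical affected blocks and feasible depot sites (km coordinates)
demand = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5)]
sites = [(0.5, 0.5), (5.5, 5.0), (9, 9)]
opened, unreachable = greedy_depots(demand, sites, radius=1.5)
```

A heuristic of this kind is cheap enough to re-run as scenario information is updated in the first hours, which matches the SRT constraint the approach targets.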
Procedia PDF Downloads 110
11323 The Functional-Engineered Product-Service System Model: An Extensive Review towards a Unified Approach
Authors: Nicolas Haber
Abstract:
The study addresses the design process of integrated product-service offerings as a means of answering environmental sustainability concerns by replacing stand-alone physical artefacts with comprehensive solutions relying on functional results rather than conventional product sales. Views regarding this transformation, however, are dissimilar and differentiated: the study discusses the importance and requirements of product-service systems before analysing the theoretical studies concerning their design and development processes. Based on this, a framework built on a design science approach is proposed, where the distinct approaches from the literature are merged into a unified structure serving as a generic methodology for designing product-service systems. Each stage of this model is then developed to present a holistic design proposal called the Functional Engineered Product-Service System (FEPSS) model. Product-service systems are portrayed as customisable solutions tailored to specific settings and defined circumstances, and the approaches adopted to guide the design process are diversified. A thorough analysis of the design strategies and development processes, however, allowed the extraction of a design backbone valid in varied situations and contexts, whether product-oriented, use-oriented or result-oriented. The goal is to guide manufacturers towards an easier adoption of these integrated offerings, given their inherent environmental benefits, by proposing a robust all-purpose design process.
Keywords: functional product, integrated product-service offerings, product-service systems, sustainable design
Procedia PDF Downloads 294
11322 Coarse-Grained Computational Fluid Dynamics-Discrete Element Method Modelling of the Multiphase Flow in Hydrocyclones
Authors: Li Ji, Kaiwei Chu, Shibo Kuang, Aibing Yu
Abstract:
Hydrocyclones are widely used to classify particles by size in industries such as mineral processing and chemical processing. The particles to be handled usually have a broad range of size distributions, and sometimes density distributions, which have to be properly considered; this causes challenges in the modelling of hydrocyclones. The combined approach of Computational Fluid Dynamics (CFD) and the Discrete Element Method (DEM) makes it convenient to model particle size/density distributions. However, its direct application to hydrocyclones is computationally prohibitive because billions of particles are involved. In this work, a CFD-DEM model based on the coarse-grained (CG) concept is developed to model the solid-fluid flow in a hydrocyclone. The DEM is used to model the motion of discrete particles by applying Newton's laws of motion. Here, a particle assembly containing a certain number of particles with the same properties is treated as one CG particle. The CFD is used to model the liquid flow by numerically solving the local-averaged Navier-Stokes equations, facilitated with the Volume of Fluid (VOF) model to capture the air core. The results are analyzed in terms of fluid and solid flow structures, and particle-fluid, particle-particle and particle-wall interaction forces. Furthermore, the calculated separation performance is compared with measurements. The results obtained from the present study indicate that this approach can offer an alternative way to examine the flow and performance of hydrocyclones.
Keywords: computational fluid dynamics, discrete element method, hydrocyclone, multiphase flow
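The coarse-graining idea — one CG parcel standing in for many identical primary particles — can be sketched as a property mapping. This uses a common mass- and volume-conserving convention, which may differ from the authors' exact scaling rules:

```python
import math

def coarse_grain(d_p, rho_p, n_per_parcel):
    """Map n_per_parcel identical primary particles (diameter d_p, density
    rho_p) onto one CG parcel, conserving total mass and solid volume.
    This is one common CG convention, not necessarily the paper's."""
    m_p = rho_p * math.pi * d_p**3 / 6.0        # primary particle mass
    d_cg = d_p * n_per_parcel ** (1.0 / 3.0)    # volume-equivalent parcel size
    return {"d_cg": d_cg, "m_cg": n_per_parcel * m_p, "rho_cg": rho_p}

# e.g. 1000 hypothetical 100-micron silica-like grains -> one 1 mm parcel
parcel = coarse_grain(d_p=1e-4, rho_p=2650.0, n_per_parcel=1000)
```

Tracking one parcel instead of a thousand grains is what makes the billion-particle hydrocyclone problem tractable, at the cost of calibrating the parcel's interaction forces.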
Procedia PDF Downloads 408
11321 Lagrangian Approach for Modeling Marine Litter Transport
Authors: Sarra Zaied, Arthur Bonpain, Pierre Yves Fravallo
Abstract:
The permanent supply of marine litter implies its accumulation in the oceans, which causes the presence of increasingly compact waste layers. Its spatio-temporal distribution is never homogeneous and depends mainly on the hydrodynamic characteristics of the environment and on the size and location of the wastes. As part of optimizing the collection of marine plastic wastes, it is important to measure and monitor their evolution over time. To this end, many research studies have been dedicated to describing waste behavior in order to identify accumulation zones in the oceans. Several models have therefore been developed to understand the mechanisms that drive the accumulation and displacement of marine litter. These models can accurately simulate the drift of wastes to study their behavior and stranding. However, these works study waste behavior over long periods of time rather than at the time of waste collection. This work investigates the transport of floating marine litter (FML) to provide basic information that can help in optimizing waste collection, by proposing a model for predicting waste behavior during collection. The proposed study is based on a Lagrangian modeling approach that uses the main factors influencing the dynamics of the waste. The performance of the proposed method was assessed on real data collected from the Copernicus Marine Environment Monitoring Service (CMEMS). Evaluation results in the Java Sea (Indonesia) show that the proposed model can effectively predict the position and velocity of marine wastes during collection.
Keywords: floating marine litter, Lagrangian transport, particle-tracking model, wastes drift
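A minimal sketch of the Lagrangian particle-tracking idea (forward-Euler advection in a prescribed current). The uniform current below is a stand-in for the CMEMS current fields the study actually samples:

```python
import numpy as np

def advect(positions, velocity_field, dt, steps):
    """Forward-Euler Lagrangian tracking: x_{t+1} = x_t + u(x_t) * dt."""
    track = [positions.copy()]
    for _ in range(steps):
        positions = positions + velocity_field(positions) * dt
        track.append(positions.copy())
    return np.array(track)

# hypothetical uniform surface current: 0.5 m/s eastward, 0.05 m/s northward;
# a real run would interpolate gridded CMEMS currents at each position instead
def current(p):
    return np.tile([0.5, 0.05], (len(p), 1))

start = np.array([[0.0, 0.0], [100.0, 50.0]])      # two debris particles (m)
track = advect(start, current, dt=60.0, steps=10)  # ten 1-minute steps
```

Higher-order integrators and windage/Stokes-drift terms are usual refinements, but the structure — sample the flow at each particle, step the position — stays the same.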
Procedia PDF Downloads 191
11320 Experimental and Computational Investigations on the Mitigation of Air Pollutants Using Pulsed Radio Waves
Authors: Gangadhara Siva Naga Venkata Krishna Satya Narayana Swamy Undi
Abstract:
Particulate matter (PM) pollution in ambient air is a major environmental health risk factor contributing to disease and mortality worldwide. Current air pollution control methods have limitations in reducing real-world ambient PM levels. This study demonstrates the efficacy of using pulsed radio wave technology as a distinct approach to lower outdoor particulate pollution. Experimental data were compared with computational models to evaluate the efficiency of pulsed waves in coagulating and settling PM. Results showed 50%+ reductions in PM2.5 and PM10 concentrations at the city scale, with particle removal rates exceeding gravity settling by over 3X. Historical air quality data further validated the significant PM reductions achieved in test cases. Computational analyses revealed the underlying coagulation mechanisms induced by the pulsed waves, supporting the feasibility of this strategy for ambient particulate control. The pulsed electromagnetic technology displayed robustness in sustainably managing PM levels across diverse urban and industrial environments. Findings highlight the promise of this advanced approach as a next-generation solution to mitigate particulate air pollution and associated health burdens globally. The technology's scalability and energy efficiency can help address a key gap in current efforts to improve ambient air quality.
Keywords: particulate matter, mitigation technologies, clean air, ambient air pollution
Procedia PDF Downloads 51
11319 Digital Design and Practice of The Problem Based Learning in College of Medicine, Qassim University, Saudi Arabia
Authors: Ahmed Elzainy, Abir El Sadik, Waleed Al Abdulmonem, Ahmad Alamro, Homaidan Al-Homaidan
Abstract:
Problem-based learning (PBL) is an educational modality which stimulates critical and creative thinking. PBL has been practiced in the College of Medicine, Qassim University, Saudi Arabia, since 2002 with offline, face-to-face activities. Crucial technological changes towards paperless work were therefore needed. The aim of the present study was to design and implement the digitalization of the PBL activities and to evaluate its impact on students' and tutors' performance. This approach promoted the involvement of all stakeholders once they became aware of the techniques of using online tools. IT support, learning resource facilities, and the required multimedia were prepared. Student and staff perception surveys reflected their satisfaction with these remarkable changes. The students were interested in the new digitalized materials and educational design, which facilitated the conduct of PBL sessions and provided sufficient time for discussion and peer sharing of knowledge. It enabled the tutors to supervise and track students' activities on the Learning Management System. It could be concluded that the digitalization of PBL activities promoted the students' performance and engagement and enabled a better evaluation of PBL materials, as well as prompt feedback from students and staff. These positive findings encouraged the college to implement the digitalization approach in other educational activities, such as Team-Based Learning, as an additional opportunity for further development.
Keywords: multimedia in PBL, online PBL, problem-based learning, PBL digitalization
Procedia PDF Downloads 120