Search results for: software process engineering
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20599

13609 Amazonian Native Biomass Residue for Sustainable Development of Isolated Communities

Authors: Bruna C. Brasileiro, José Alberto S. Sá, Brigida R. P. Rocha

Abstract:

Development of the Amazon region has historically been tied to large-scale projects associated with economic cycles. These cycles originated from policies implemented by successive governments that exploited the region's resources without improving the local population's quality of life. The development strategies imposed were based on vertical, State-centred planning that neither knew nor showed interest in knowing local needs and potentials. The future of this region is a challenge that depends on a development model based on human progress and on the intelligent, selective, and environmentally safe exploitation of natural resources, grounded in renewable, non-polluting energy generation sources, a differentiating factor for attracting new investment in a context of global energy and environmental crisis. In this process, the planning and support of the Brazilian State, local government, and selective international partnerships are essential. The use of residual biomass enables sustainable development by integrating the production chain with the energy generation process, which could improve employment conditions and income for riverside communities. Therefore, this research discusses how the use of local residual biomass (açaí seeds) could be an important instrument of sustainable development for isolated communities located in the Alcobaça Sustainable Development Reserve (SDR), Tucuruí, Pará State. In this region, the most accessible energy sources for those who can pay are fossil fuels, which account for about 54% of final energy consumption. Integrating the açaí production chain with a renewable energy source can reduce environmental impact, fossil fuel use, and carbon dioxide emissions.

Keywords: Amazon, biomass, renewable energy, sustainability

Procedia PDF Downloads 296
13608 Knowledge, Attitude and Practice of Anemia among Females Attending Bolan Medical Complex Quetta, Balochistan

Authors: A. Abdullah, N. ul Haq, A. Nasim

Abstract:

Objectives: This study aimed to assess the knowledge, attitude, and practice (KAP) of anemia among females attending Bolan Medical Complex, Quetta, Balochistan. Methods: A quantitative cross-sectional study was conducted using a questionnaire containing three dimensions: knowledge (15 questions), attitude (5 questions), and practice (4 questions). All females attending Bolan Medical Complex, Quetta, Balochistan were approached for the study. Descriptive statistics were used to describe the demographic and KAP-related characteristics of the females regarding anemia. All data were analyzed using the SPSS (Statistical Package for the Social Sciences) software, version 20.0. Results: Data were collected from six hundred and thirteen (613) participants. The largest share of respondents (n=180, 29.4%) fell in the age group of 29-33 years. Participants showed adequate knowledge regarding anemia (n=564, 91.9%) and attitude (n=516, 84.0%), whereas adequate practice was found in 71.3% (n=437). Multivariate analysis revealed a negative correlation between attitude and practice (-0.040) and a significant correlation (p=0.001) between knowledge and attitude. Occupation and reason for diagnosis were not predictive of better KAP. Conclusions: Knowledge, attitude, and practice regarding anemia show a satisfactory response in this study. Furthermore, the study findings imply a need for health promotion among females. Improving nutritional knowledge and anemia-related information can result in better control and management.
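A minimal sketch of the kind of correlation analysis between KAP dimension scores described above; the questionnaire structure is taken from the abstract, but all data and the use of a Spearman test are illustrative assumptions, not the authors' analysis.

```python
# Illustrative sketch (hypothetical data): correlating KAP dimension scores computed
# from a questionnaire with 15 knowledge, 5 attitude and 4 practice items.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 613  # sample size reported in the abstract

# Hypothetical per-participant scores (each item scored 0/1)
knowledge = rng.integers(0, 2, size=(n, 15)).sum(axis=1)
attitude = rng.integers(0, 2, size=(n, 5)).sum(axis=1)
practice = rng.integers(0, 2, size=(n, 4)).sum(axis=1)

for a, b, label in [(knowledge, attitude, "knowledge-attitude"),
                    (attitude, practice, "attitude-practice")]:
    rho, p = spearmanr(a, b)
    print(f"{label}: rho={rho:.3f}, p={p:.4f}")
```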

Keywords: anemia, knowledge attitude and practice, females, college

Procedia PDF Downloads 182
13607 VIAN-DH: Computational Multimodal Conversation Analysis Software and Infrastructure

Authors: Teodora Vukovic, Christoph Hottiger, Noah Bubenhofer

Abstract:

The development of VIAN-DH aims at bridging two linguistic approaches: conversation analysis/interactional linguistics (IL), so far a dominantly qualitative field, and computational/corpus linguistics with its quantitative and automated methods. Contemporary IL investigates the systematic organization of conversations and interactions composed of speech, gaze, gestures, and body positioning, among others. This highly integrated multimodal behaviour is analysed on the basis of video data, with the aim of uncovering so-called “multimodal gestalts”, patterns of linguistic and embodied conduct that recur in specific sequential positions and are employed for specific purposes. Multimodal analyses (and other disciplines using videos) have so far depended on time- and resource-intensive manual transcription of each component of the video materials. Automating these tasks requires advanced programming skills, which are often not within the scope of IL. Moreover, the use of different tools makes the integration and analysis of different formats challenging. Consequently, IL research often deals with relatively small samples of annotated data, which are suitable for qualitative analysis but not sufficient for making generalized empirical claims derived quantitatively. VIAN-DH aims to create a workspace where the many annotation layers required for the multimodal analysis of videos can be created, processed, and correlated on one platform. VIAN-DH will provide a graphical interface that operates state-of-the-art tools for automating parts of the data processing. The integration of tools that already exist in computational linguistics and computer vision facilitates data processing for researchers lacking programming skills, speeds up the overall research process, and enables the processing of large amounts of data. The main features to be introduced are automatic speech recognition for the transcription of language, automatic image recognition for the extraction of gestures and other visual cues, and grammatical annotation for adding morphological and syntactic information to the verbal content. In the ongoing instance of VIAN-DH, we focus on gesture extraction (pointing gestures, in particular), making use of existing models created for sign language and adapting them for this specific purpose. In order to view and search the data, VIAN-DH will provide a unified format and enable the import of the main existing formats of annotated video data and the export to other formats used in the field, while integrating different data source formats so that they can be combined in research. VIAN-DH will adapt querying methods from corpus linguistics to enable parallel search of many annotation levels, combining token-level and chronological search for various types of data. VIAN-DH strives to bring crucial and potentially revolutionary innovation to the field of IL (one that can also extend to other fields using video materials). It will allow large amounts of data to be processed automatically, and quantitative analyses to be implemented and combined with the qualitative approach. It will facilitate the investigation of correlations between linguistic patterns (lexical or grammatical) and conversational aspects (turn-taking or gestures). Users will be able to automatically transcribe and annotate visual, spoken and grammatical information from videos, to correlate those different levels, and to perform queries and analyses.
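To make the idea of parallel querying across annotation layers concrete, the sketch below represents time-aligned speech tokens and gestures and searches them jointly. It is not the VIAN-DH codebase; the data structures, labels and timings are hypothetical.

```python
# Minimal sketch: time-aligned annotation layers (speech tokens, gestures) and a
# parallel query across them, e.g. pointing gestures overlapping a demonstrative.
from dataclasses import dataclass

@dataclass
class Annotation:
    layer: str      # e.g. "token", "gesture"
    label: str      # e.g. "this", "pointing"
    start: float    # seconds
    end: float

def overlaps(a: Annotation, b: Annotation) -> bool:
    return a.start < b.end and b.start < a.end

# Hypothetical output of ASR and gesture recognition
annotations = [
    Annotation("token", "this", 1.2, 1.4),
    Annotation("token", "one", 1.4, 1.6),
    Annotation("gesture", "pointing", 1.1, 1.7),
]

tokens = [a for a in annotations if a.layer == "token" and a.label in {"this", "that"}]
gestures = [a for a in annotations if a.layer == "gesture" and a.label == "pointing"]
hits = [(t, g) for t in tokens for g in gestures if overlaps(t, g)]
print(hits)
```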

Keywords: multimodal analysis, corpus linguistics, computational linguistics, image recognition, speech recognition

Procedia PDF Downloads 94
13606 Navigating the Assessment Landscape in English Language Teaching: Strategies, Challenges and Best Practices

Authors: Saman Khairani

Abstract:

Assessment is a pivotal component of the teaching and learning process, serving as a critical tool for evaluating student progress, diagnosing learning needs, and informing instructional decisions. In the context of English Language Teaching (ELT), effective assessment practices are essential to promote meaningful learning experiences and foster continuous improvement in language proficiency. This paper delves into various assessment strategies, explores associated challenges, and highlights best practices for assessing student learning in ELT. The paper begins by examining the diverse forms of assessment, including formative assessments that provide timely feedback during the learning process and summative assessments that evaluate overall achievement. Additionally, alternative methods such as portfolios, self-assessment, and peer assessment play a significant role in capturing various aspects of language learning. Aligning assessments with learning objectives is crucial. Educators must ensure that assessment tasks reflect the desired language skills, communicative competence, and cultural awareness. Validity, reliability, and fairness are essential considerations in assessment design. Challenges in assessing language skills—such as speaking, listening, reading, and writing—are discussed, along with practical solutions. Constructive feedback, tailored to individual learners, guides their language development. In conclusion, this paper synthesizes research findings and practical insights, equipping ELT practitioners with the knowledge and tools necessary to design, implement, and evaluate effective assessment practices. By fostering meaningful learning experiences, educators contribute significantly to learners’ language proficiency and overall success.

Keywords: ELT, formative, summative, fairness, validity, reliability

Procedia PDF Downloads 42
13605 A Simple Chemical Approach to Regenerating Strength of Thermally Recycled Glass Fibre

Authors: Sairah Bashir, Liu Yang, John Liggat, James Thomason

Abstract:

Glass fibre is currently used as reinforcement in over 90% of all fibre-reinforced composites produced. The high rigidity and chemical resistance of these composites are required for optimum performance but unfortunately result in poor recyclability; when such materials are no longer fit for purpose, they are frequently deposited in landfill sites. Recycling technologies, for example thermal treatment, can be employed to address this issue; temperatures typically between 450 and 600 °C are required to allow degradation of the rigid polymeric matrix and subsequent extraction of the fibrous reinforcement. However, due to the severe thermal conditions used in the recycling procedure, the glass fibres become too weak for reprocessing in second-life composite materials. In addition, more stringent legislation is being put in place regarding the disposal of composite waste, so it is becoming increasingly important to develop long-term recycling solutions for such materials. In particular, the development of a cost-effective method to regenerate the strength of thermally recycled glass fibres would have a positive environmental effect, as a reduced volume of composite material would be destined for landfill. This research study has demonstrated the positive impact of sodium hydroxide (NaOH) and potassium hydroxide (KOH) solutions, prepared at relatively mild temperatures and at concentrations of 1.5 M and above, on the strength of heat-treated glass fibres. As a result, alkaline treatments can potentially be applied to glass fibres recycled from composite waste to allow their reuse in second-life materials. The strength recovery process is being optimised by varying reaction parameters such as the molarity of the alkaline solution and the treatment time. It is believed that deep V-shaped surface flaws commonly exist on severely damaged fibre surfaces and are effectively removed, forming smooth U-shaped structures, following alkaline treatment. Although such surface flaws have long been believed to be present on glass fibres, they had not previously been observed directly; in this investigation they have now been detected through analytical techniques such as AFM (atomic force microscopy) and SEM (scanning electron microscopy). Reaction conditions such as the molarity of the alkaline solution affect the degree of etching of the glass fibre surface, and therefore the extent to which fibre strength is recovered. A novel method for determining the etching rate of glass fibres after alkaline treatment has been developed, and the data acquired can be correlated with strength. By varying reaction conditions such as the alkaline solution temperature and molarity, the activation energy of the glass etching process and the reaction order, respectively, can be calculated. The promising results obtained from the NaOH and KOH treatments have opened an exciting route to strength regeneration of thermally recycled glass fibres, and the alkaline treatment process is being further optimised in order to produce recycled fibres with properties that match original glass fibre products. The reuse of such glass filaments indicates that closed-loop recycling of glass fibre reinforced composite (GFRC) waste can be achieved. In fact, the development of a closed-loop recycling process for GFRC waste is already underway in this research study.
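The activation-energy calculation mentioned above is conventionally done with an Arrhenius fit of etch rate against temperature. The sketch below shows that calculation; all rate and temperature values are assumed for illustration, not the authors' measurements.

```python
# Illustrative sketch: estimating the activation energy of the glass etching process
# from etch rates at several solution temperatures via the Arrhenius relation
# k = A * exp(-Ea / (R * T)), i.e. ln(k) = ln(A) - (Ea/R) * (1/T).
import numpy as np

R = 8.314  # J/(mol*K)
T = np.array([313.0, 333.0, 353.0])          # K, assumed treatment temperatures
rate = np.array([0.8e-3, 3.1e-3, 10.5e-3])   # assumed etch rates (e.g. um/h)

slope, intercept = np.polyfit(1.0 / T, np.log(rate), 1)
Ea = -slope * R
print(f"Estimated activation energy: {Ea / 1000:.1f} kJ/mol")
```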

Keywords: glass fibers, glass strengthening, glass structure and properties, surface reactions and corrosion

Procedia PDF Downloads 243
13604 The Application of Enzymes on Pharmaceutical Products and Process Development

Authors: Reginald Anyanwu

Abstract:

Enzymes are biological molecules that regulate the rates of almost all chemical reactions taking place within cells and have been widely used for product innovation. They are vital for life and serve a wide range of important functions in the body, such as aiding digestion and metabolism. The present study aimed to find out the extent to which these biological molecules have been utilized by the pharmaceutical, food and beverage, and biofuel industries in commercial and scale-up applications. Taking into account the escalating business opportunities in this vertical, biotech firms have also been entering the enzymes industry, especially in food. The aim of the study, therefore, was to find out how biocatalysis can be successfully deployed and how enzyme applications can improve industrial processes. To achieve the purpose of the study, the researcher focused on the analytical tools that are critical for the scale-up implementation of enzyme immobilization, in order to ascertain the extent to which product yield can be increased at minimum logistical burden and maximum market profitability for the environment and the user. The researcher collected data from four pharmaceutical companies located in Anambra and Imo states of Nigeria. Questionnaire items were distributed to these companies. The researcher also made personal observations on the applicability of these biological molecules to innovative products, given the shifting trend toward the consumption of healthy, high-quality food. In conclusion, it was found that enzymes have been widely used for product innovation, although there are variations in their applications. It was also found that key contenders in the enzymes market have lately been making heavy investments in the development of innovative product solutions. It was recommended that the application of enzymes to innovative products should be widely practiced.

Keywords: enzymes, pharmaceuticals, process development, quality food consumption, scale-up applications

Procedia PDF Downloads 131
13603 Investigation of Glacier Activity Using Optical and Radar Data in Zardkooh

Authors: Mehrnoosh Ghadimi, Golnoush Ghadimi

Abstract:

Precise monitoring of glacier velocity is critical in determining glacier-related hazards. Zardkooh Mountain, in the Zagros mountainous region of Iran, was studied in terms of its glacial activity rate. In this study, we assessed the ability of optical and radar imagery to derive glacier-surface velocities in mountainous terrain. We processed Landsat 8 data for the optical analysis and Sentinel-1A data for the radar analysis. We used methods that are commonly applied to measure glacier surface movements, such as cross-correlation of optical and radar satellite images, SAR tracking techniques, and multiple aperture InSAR (MAI). We also assessed the time series of glacier surface displacement using our modified method, Enhanced Small Baseline Subset (ESBAS). ESBAS has been implemented in the StaMPS software, with several aspects of the processing chain modified, including filtering prior to phase unwrapping, topographic correction within three-dimensional phase unwrapping, reduction of atmospheric noise, and removal of the ramp caused by ionospheric turbulence and/or orbit errors. Our findings indicate an average surface velocity rate of 32 mm/yr in the Zardkooh mountainous areas.
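The core idea behind cross-correlation feature tracking of glacier displacement can be illustrated with a small phase-correlation example. The sketch below is not the authors' processing chain; it uses a synthetic image patch and a simulated shift.

```python
# Minimal sketch: estimating the pixel offset between two co-registered image patches
# by phase correlation, as used in cross-correlation displacement tracking.
import numpy as np

def phase_correlation_offset(a: np.ndarray, b: np.ndarray):
    """Return the (row, col) shift that best aligns patch b to patch a."""
    fa, fb = np.fft.fft2(a), np.fft.fft2(b)
    cross_power = fa * np.conj(fb)
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.abs(np.fft.ifft2(cross_power))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap offsets larger than half the patch size to negative shifts
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(1)
patch = rng.random((64, 64))
shifted = np.roll(patch, shift=(3, -2), axis=(0, 1))  # simulate surface motion
print(phase_correlation_offset(shifted, patch))       # expected (3, -2)
```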

Keywords: active rock glaciers, Landsat 8, Sentinel-1A, Zagros mountainous region

Procedia PDF Downloads 67
13602 Effect of Naphtha Addition to a Cyclic Steam Stimulation Process in Reducing Heavy Oil Viscosity Using a Two-Level Factorial Design

Authors: Nora A. Guerrero, Adan Leon, María I. Sandoval, Romel Perez, Samuel Munoz

Abstract:

The addition of solvents in cyclic steam stimulation is a technique that has shown an impact on the improved recovery of heavy oils. With this technique, it is possible to reduce the steam/oil ratio in the last stages of the process, at which time this ratio increases significantly. The mobility of the improved crude oil increases due to structural changes in its components, which are at the same time reflected in decreases in density and viscosity. In the present work, the effect of variables such as temperature, time, and the weight percentage of naphtha was evaluated using a 2³ factorial design of experiments. From the results of the analysis of variance (ANOVA) and the Pareto diagram, it was possible to identify the effect on viscosity reduction. The crude oil-steam-naphtha interaction was represented experimentally in a batch reactor on a Colombian heavy oil of 12.8° API and 3500 cP. The temperature, reaction time, and naphtha percentage were 270-300 °C, 48-66 hours, and 3-9% by weight, respectively. The results showed a decrease in density, with values in the range of 0.9542 to 0.9414 g/cm³, while the viscosity reduction was on the order of 55 to 70%. On the other hand, simulated distillation results, according to ASTM D7169, revealed significant conversions of the 315 °C+ fraction. From the spectroscopic techniques of nuclear magnetic resonance (NMR), infrared (FTIR), and ultraviolet-visible (UV-Vis) spectroscopy, it was determined that the increase in the yield of light fractions in the improved crude is due to the breakdown of alkyl chains. The methodology for cyclic steam injection with naphtha and laboratory-scale characterization can be considered a practical tool in improved recovery processes.
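A brief sketch of how main effects are estimated in a two-level, three-factor (2³) design like the one described above. The coded run matrix follows the standard construction; the response values are assumed for illustration and are not the authors' data.

```python
# Illustrative sketch: main-effect estimation in a 2^3 factorial design
# (temperature, time, naphtha wt%) with viscosity reduction as the response.
import itertools
import numpy as np

runs = np.array(list(itertools.product([-1, 1], repeat=3)))  # 8 runs, coded levels
viscosity_reduction = np.array([55, 58, 57, 61, 60, 65, 64, 70])  # % (assumed data)

factors = ["temperature", "time", "naphtha_wt%"]
for j, name in enumerate(factors):
    high = viscosity_reduction[runs[:, j] == 1].mean()
    low = viscosity_reduction[runs[:, j] == -1].mean()
    print(f"Main effect of {name}: {high - low:+.2f} % viscosity reduction")
```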

Keywords: viscosity reduction, cyclic steam stimulation, factorial design, naphtha

Procedia PDF Downloads 157
13601 Study on the Seismic Response of Slope under Pulse-Like Ground Motion

Authors: Peter Antwi Buah, Yingbin Zhang, Jianxian He, Chenlin Xiang, Delali Atsu Y. Bakah

Abstract:

Near-fault ground motions with velocity pulses are considered to cause significant damage to structures and slopes compared with ordinary ground motions without velocity pulses. Double-pulse pulse-like ground motions are also known to be stronger than single-pulse ones. This study numerically justifies this perspective by studying the dynamic response of a homogeneous rock slope subjected to four pulse-like and two non-pulse-like ground motions using the Fast Lagrangian Analysis of Continua in 3 Dimensions (FLAC3D) software. Two of the pulse-like ground motions have only a single pulse. The results show that near-fault ground motions with velocity pulses can cause a higher dynamic response than regular ground motions. The amplification of the peak ground acceleration (PGA) in the horizontal direction increases with slope elevation. The seismic response of the slope under double-pulse ground motion is stronger than that under single-pulse ground motion. The PGV amplification factor under the non-pulse-like records is also smaller than under the pulse-like records. The velocity pulse intensifies the earthquake damage to the slope, producing a stronger dynamic response.

Keywords: velocity pulses, dynamic response, PGV magnification effect, elevation effect, double pulse

Procedia PDF Downloads 150
13600 A Model-Based Approach for Energy Performance Assessment of a Spherical Stationary Reflector/Tracking Absorber Solar Concentrator

Authors: Rosa Christodoulaki, Irene Koronaki, Panagiotis Tsekouras

Abstract:

The aim of this study is to analyze the energy performance of a spherical Stationary Reflector / Tracking Absorber (SRTA) solar concentrator. This type of collector consists of a segment of a spherical mirror placed in a stationary position facing the sun and a cylindrical absorber that tracks the sun by a simple pivoting motion about the centre of curvature of the reflector. The energy analysis is performed through the development of a dynamic simulation model in the TRNSYS software that calculates the annual heat production and the efficiency of the SRTA solar concentrator. The effects of solar concentrator design features and characteristics, such as the reflector material, the reflector diameter, the receiver type, the solar radiation level and the concentration ratio, are discussed in detail. Moreover, the energy performance curve of the SRTA solar concentrator is drawn for various temperature differences between the mean fluid temperature and the ambient temperature and for various radiation intensities. The results are shown in diagrams, visualizing the effect of the solar, optical and thermal parameters on the overall performance of the SRTA solar concentrator throughout the year. The analysis indicates that the SRTA solar concentrator can operate efficiently under a wide range of operating conditions.
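The performance curve described above is commonly expressed with a steady-state efficiency correlation of the form eta = eta0 - a1·dT/G - a2·dT²/G. The sketch below evaluates such a curve; the coefficients are assumed for illustration and are not TRNSYS output or measured SRTA values.

```python
# Minimal sketch: a collector efficiency curve evaluated for several temperature
# differences dT = Tm - Ta and radiation intensities G (all coefficients assumed).
eta0, a1, a2 = 0.70, 2.5, 0.015  # assumed optical efficiency and loss coefficients

def efficiency(dT: float, G: float) -> float:
    """Collector efficiency for temperature difference dT [K] and irradiance G [W/m^2]."""
    return eta0 - a1 * dT / G - a2 * dT**2 / G

for G in (600.0, 800.0, 1000.0):
    etas = [efficiency(dT, G) for dT in (20.0, 40.0, 60.0)]
    print(f"G={G:.0f} W/m^2:", [f"{e:.2f}" for e in etas])
```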

Keywords: concentrating solar collector, energy analysis, stationary reflector, tracking absorber

Procedia PDF Downloads 188
13599 Application of Rapidly Exploring Random Tree Star-Smart and G2 Quintic Pythagorean Hodograph Curves to the UAV Path Planning Problem

Authors: Luiz G. Véras, Felipe L. Medeiros, Lamartine F. Guimarães

Abstract:

This work approaches the automatic planning of paths for Unmanned Aerial Vehicles (UAVs) through the application of the Rapidly Exploring Random Tree Star-Smart (RRT*-Smart) algorithm. RRT*-Smart samples positions of a navigation environment through a tree-type graph. The algorithm consists of randomly expanding a tree from an initial position (root node) until one of its branches reaches the final position of the path to be planned. The algorithm ensures the planning of the shortest path as the number of iterations tends to infinity. When a new node is inserted into the tree, each neighbour of the new node is connected to it if and only if the length of the path between the root node and that neighbour, with this new connection, is less than the current length of the path between those two nodes. RRT*-Smart uses an intelligent sampling strategy to plan shorter routes in a smaller number of iterations. This strategy is based on the creation of samples/nodes near the convex vertices of the obstacles in the navigation environment. The planned paths are smoothed through the application of G2 quintic Pythagorean hodograph curves. The smoothing process converts a route into a dynamically viable one based on the kinematic constraints of the vehicle. This smoothing method models the hodograph components of a curve with polynomials that obey the Pythagorean theorem. Its advantage is that the obtained structure allows the curve length to be computed exactly, without the need for quadrature techniques to resolve the integrals.
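The insert-and-rewire rule described above can be illustrated with a stripped-down RRT*-style sketch. This is not the authors' implementation: it omits obstacles, steering limits and the smart-sampling bias, and all coordinates and parameters are assumed.

```python
# Simplified sketch of the RRT* insert and rewire step: a new node is wired to the
# neighbour giving the cheapest path from the root, and neighbours are rewired
# through the new node when that shortens their own path from the root.
import math
import random

class Node:
    def __init__(self, x, y, parent=None, cost=0.0):
        self.x, self.y, self.parent, self.cost = x, y, parent, cost

def dist(a, b):
    return math.hypot(a.x - b.x, a.y - b.y)

def extend(tree, sample, radius=1.5):
    nearest = min(tree, key=lambda n: dist(n, sample))
    new = Node(sample.x, sample.y, nearest, nearest.cost + dist(nearest, sample))
    neighbours = [n for n in tree if dist(n, new) <= radius]
    # Choose the parent that minimises the cost from the root to the new node
    for n in neighbours:
        if n.cost + dist(n, new) < new.cost:
            new.parent, new.cost = n, n.cost + dist(n, new)
    tree.append(new)
    # Rewire: reconnect neighbours through the new node when that is cheaper
    for n in neighbours:
        if new.cost + dist(new, n) < n.cost:
            n.parent, n.cost = new, new.cost + dist(new, n)
    return new

random.seed(0)
tree = [Node(0.0, 0.0)]  # root at the start position
for _ in range(200):
    extend(tree, Node(random.uniform(0, 10), random.uniform(0, 10)))
closest = min(tree, key=lambda n: dist(n, Node(9.0, 9.0)))
print(f"Cost from root to the node nearest (9, 9): {closest.cost:.2f}")
```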

Keywords: path planning, path smoothing, Pythagorean hodograph curve, RRT*-Smart

Procedia PDF Downloads 159
13598 Impact of Map Generalization in Spatial Analysis

Authors: Lin Li, P. G. R. N. I. Pussella

Abstract:

When representing spatial data and their attributes on different types of maps, the scale plays a key role in the process of map generalization. The process consists of two main operators: selection and omission. Once data are selected, they undergo several geometric modification processes, such as elimination, simplification, smoothing, exaggeration, displacement, aggregation and size reduction. As a result of these operations at different levels of data, the geometry of spatial features, such as length, sinuosity, orientation, perimeter and area, is altered. This is worst in the case of small-scale maps, since the cartographer does not have enough space to represent all the features on the map. When GIS users want to analyze a set of spatial data, they often retrieve a data set and perform the analysis without considering very important characteristics such as the scale, the purpose of the map and the degree of generalization. Further, GIS users use and compare different maps with different degrees of generalization. Sometimes, GIS users go beyond the scale of the source map using the zoom-in facility and violate the basic cartographic rule that it is not suitable to create a larger-scale map from a smaller-scale map. The main objective of this study is to discuss the effect of map generalization on GIS analysis. Three digital maps with different scales, 1:10000, 1:50000 and 1:250000, prepared by the Survey Department of Sri Lanka, the national mapping agency of Sri Lanka, were used. Features common to all three maps were used, and an overlay analysis was performed by repeating the analysis with different data combinations. Road, river and land-use data sets were used for the study. A simple model, to find the best place for a wildlife park, was used to identify the effects. The results show remarkable effects at different degrees of generalization. Different locations with different geometries were obtained as outputs from this analysis. The study suggests that there should be reasonable methods to overcome this effect. As a solution, it would be very reasonable to bring all the data sets to a common scale before doing the analysis.
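A small example of how one generalization operator, line simplification, alters feature geometry such as length. This sketch uses a synthetic line and the shapely package rather than the Sri Lankan map data.

```python
# Illustrative sketch: effect of line simplification at increasing tolerances
# on the length of a sinuous feature (requires shapely).
import math
from shapely.geometry import LineString

# A sinuous "river" sampled densely
river = LineString([(x / 10.0, math.sin(x / 10.0 * 3.0)) for x in range(0, 101)])
print(f"Original length: {river.length:.3f}")

for tolerance in (0.05, 0.2, 0.5):  # larger tolerance ~ smaller target scale
    generalized = river.simplify(tolerance, preserve_topology=True)
    print(f"tolerance={tolerance}: vertices={len(generalized.coords)}, "
          f"length={generalized.length:.3f}")
```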

Keywords: generalization, GIS, scales, spatial analysis

Procedia PDF Downloads 321
13597 An In-Depth Experimental Study of Wax Deposition in Pipelines

Authors: Arias M. L., D’Adamo J., Novosad M. N., Raffo P. A., Burbridge H. P., Artana G.

Abstract:

Shale oils are highly paraffinic and, consequently, can create wax deposits that foul pipelines during transportation. Several factors must be considered when designing pipelines or treatment programs that prevent wax deposition, including the chemical species in the crude oil, flow rates, pipe diameters and temperature. This paper describes the wax deposition study carried out within the framework of Y-TEC's flow assurance projects, as part of the process of achieving a better understanding of wax deposition issues. Laboratory experiments were performed on a medium-size wax deposition loop, 1 inch in diameter and 15 m long, equipped with a solids detection system, an online microscope to visualize crystals, and temperature and pressure sensors along the loop pipe. A baseline test was performed with diesel containing no paraffin or additives. Tests were undertaken with different temperatures of the circulating and cooling fluids at different flow conditions. A solution formed by adding paraffin to the diesel was then considered, and tests varying flow rate and cooling rate were run again. Viscosity, density, WAT (wax appearance temperature) by DSC (differential scanning calorimetry), pour point and cold finger measurements were carried out to determine the physical properties of the working fluids. The results obtained in the loop were analyzed through momentum balance and heat transfer models. To determine possible paraffin deposition scenarios, the temperature and pressure output signals of the loop were studied and compared with static laboratory WAT methods. Finally, we scrutinized the effect of adding a chemical inhibitor to the working fluid on the dynamics of the wax deposition process in the loop.
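One simple way to connect the heat-transfer analysis to deposition risk is to compare an estimated inner-wall temperature with the WAT. The sketch below is a back-of-the-envelope series-resistance balance with assumed values; it is not the Y-TEC loop model.

```python
# Back-of-the-envelope sketch: flag wax deposition risk when the estimated inner-wall
# temperature falls below the WAT. Steady radial heat balance:
# q = h_i*(T_bulk - T_wall) = (T_wall - T_cool)/R_out.
def inner_wall_temperature(T_bulk, T_cool, h_inner, R_out):
    return (h_inner * T_bulk + T_cool / R_out) / (h_inner + 1.0 / R_out)

WAT = 32.0                    # deg C, wax appearance temperature from DSC (assumed)
T_bulk, T_cool = 45.0, 15.0   # deg C, circulating and cooling fluid temperatures (assumed)
h_inner = 250.0               # W/(m^2*K), inner-film coefficient (assumed)
R_out = 1.0 / 400.0           # (m^2*K)/W, combined wall + coolant-side resistance (assumed)

T_wall = inner_wall_temperature(T_bulk, T_cool, h_inner, R_out)
print(f"Inner wall temperature: {T_wall:.1f} C -> deposition risk: {T_wall < WAT}")
```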

Keywords: paraffin deposition, flow assurance, chemical inhibitors, flow loop

Procedia PDF Downloads 92
13596 Enhancement of Material Removal Rate of Complex Featured Surfaces in Vibratory Finishing

Authors: Kunal Ahluwalia, Ampara Aramcharoen, Chan Wai Luen, Swee Hock Yeo

Abstract:

The different process engineering applications of vibratory finishing technology have led to its versatile use in the development of aviation components. The most noteworthy applications of vibratory finishing include deburring and imparting the required surface finish. In this paper, vibratory finishing is used to study its effectiveness in removing laser shock peened (LSP) layers from titanium workpieces. A vibratory trough operating at a frequency of 25 Hz and an amplitude of 3.5 mm, and titanium specimens (Ti-6Al-4V, Grade 5) of dimensions 50 x 50 x 10 mm³, were utilized for the experiments. A vibrating fixture operating at 200 Hz was used to provide vibration to the test piece and was immersed in the vibratory trough. The introduction of the vibrating fixture into the vibratory finishing setup increased the efficiency of removal of the complex-featured layer and gave a smoother surface finish compared with the conventional vibratory finishing setup, in which the fixture does not vibrate.

Keywords: laser shock peening, material removal, surface roughness, vibrating fixture, vibratory finishing

Procedia PDF Downloads 209
13595 Development of Colorimetric Based Microfluidic Platform for Quantification of Fluid Contaminants

Authors: Sangeeta Palekar, Mahima Rana, Jayu Kalambe

Abstract:

In this paper, a microfluidic-based platform for the quantification of contaminants in water is proposed. The proposed system uses microfluidic channels with an embedded environment for contaminant detection in water. Microfluidics-based platforms represent an evident stage of innovation for fluid analysis, with different applications offering minimal effort and simplicity of fabrication. A polydimethylsiloxane (PDMS)-based microfluidic channel is fabricated using a soft lithography technique. Vertical and horizontal connections for fluid dispensing with the microfluidic channel are explored. The principle of colorimetry, which incorporates the use of Griess reagent for the detection of nitrite, has been adopted. Nitrite has high water solubility and water retention, due to which it has a greater potential to stay in groundwater, endangering aquatic life along with human health; it is therefore taken as a case study in this work. The developed platform also compares the detection methodologies, using photodetectors to measure absorbance and image sensors to measure colour change for the quantification of contaminants such as nitrite in water. The utilization of image processing techniques offers the advantage of operational flexibility, as the same system can be used to identify other contaminants present in water by introducing minor software changes.
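The colorimetric quantification described above typically rests on a linear Beer-Lambert calibration between absorbance and concentration. The sketch below shows that step with hypothetical calibration data; it is not the authors' calibration.

```python
# Illustrative sketch: quantifying nitrite from Griess-reagent absorbance readings
# via a linear calibration curve A = k*C + b.
import numpy as np

# Hypothetical calibration standards: nitrite concentration (mg/L) vs absorbance
conc = np.array([0.0, 0.2, 0.5, 1.0, 2.0])
absorbance = np.array([0.02, 0.11, 0.26, 0.50, 0.98])

k, b = np.polyfit(conc, absorbance, 1)  # fit A = k*C + b

def nitrite_from_absorbance(A: float) -> float:
    """Invert the calibration curve to estimate concentration in mg/L."""
    return (A - b) / k

sample_A = 0.37  # absorbance of an unknown sample (assumed)
print(f"Estimated nitrite concentration: {nitrite_from_absorbance(sample_A):.2f} mg/L")
```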

Keywords: colorimetric, fluid contaminants, nitrite detection, microfluidics

Procedia PDF Downloads 186
13594 Development of a Small-Group Teaching Method for Enhancing the Learning of Basic Acupuncture Manipulation Optimized with the Theory of Motor Learning

Authors: Wen-Chao Tang, Tang-Yi Liu, Ming Gao, Gang Xu, Hua-Yuan Yang

Abstract:

This study developed a method for teaching acupuncture manipulation in small groups, optimized with the theory of motor learning. Sixty acupuncture students and their teacher participated in the research. Motion videos of their manipulations using the lifting-thrusting method were recorded. These videos were analyzed using Simi Motion software to acquire the movement parameters of the thumb tip. The velocity curves along the Y axis were used to generate small teaching groups, clustered by a self-organizing map (SOM) and K-means. Ten groups were generated. Targeted instruction based on the comparative results of each group, together with the videos of the teacher and students, was provided to the members of each group. According to the theory and research of motor learning, factors and technologies such as video instruction, observational learning, external focus and summary feedback were integrated into this teaching method. These efforts are intended to improve and enhance the effectiveness of current acupuncture teaching methods within limited classroom teaching time and extracurricular training.
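A minimal sketch of the clustering step described above: grouping velocity curves into ten teaching groups with K-means. The curves here are synthetic, not the Simi Motion measurements, and the SOM pre-clustering stage is omitted.

```python
# Minimal sketch: K-means clustering of thumb-tip velocity curves into ten groups
# (requires scikit-learn).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
n_students, n_samples = 60, 200

# Synthetic velocity curves along the Y axis: sinusoids with varying frequency/amplitude
t = np.linspace(0, 2 * np.pi, n_samples)
curves = np.stack([
    rng.uniform(0.5, 2.0) * np.sin(rng.uniform(1.0, 3.0) * t) + rng.normal(0, 0.1, n_samples)
    for _ in range(n_students)
])

labels = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(curves)
for g in range(10):
    print(f"Group {g}: {np.sum(labels == g)} students")
```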

Keywords: acupuncture, group teaching, video instruction, observational learning, external focus, summary feedback

Procedia PDF Downloads 166
13593 Energy Production with Closed Methods

Authors: Bujar Ismaili, Bahti Ismajli, Venhar Ismaili, Skender Ramadani

Abstract:

In Kosovo, the problem with the electricity supply is huge, and supply does not meet consumer demand. Most of the energy is produced by older thermal power plants, which are regarded as major environmental polluters. Our experiment is based on the production of electricity using a closed method that avoids environmental pollution while using, as fuel, waste that is otherwise considered a pollutant. The experiment was carried out in the village of Godanc, municipality of Shtime, Kosovo. A production line was designed to provide electricity and central heating at the same time. The benefits are electricity generation and heat release for heating at minimal expense, with no gases released into the atmosphere. Coal, plastic, waste from wood processing, and agricultural wastes were used as raw materials. The method allows gas to be released through pipes and filters during top-to-bottom combustion of the raw material in the boiler, followed by filtration of the gas through wood-processing waste (sawdust). The final product of this process is gas, which passes through the carburettor; this enables the gas combustion process, drives the internal combustion engine and the generator, and produces electricity without releasing gases into the atmosphere. The results obtained show that the system provides energy stability without environmental pollution from toxic substances and waste, and at low production cost. The final results show that, in the case of coal fuel, more electricity and a higher heat release were obtained, followed by plastic waste, which also gave good results. The results obtained during these experiments prove that the current problems of electricity and heating shortages can be addressed at lower cost while maintaining a clean environment and proper waste management.

Keywords: energy, heating, atmosphere, waste, gasification

Procedia PDF Downloads 222
13592 Data Structure Learning Platform to Aid in Higher Education IT Courses (DSLEP)

Authors: Estevan B. Costa, Armando M. Toda, Marcell A. A. Mesquita, Jacques D. Brancher

Abstract:

The advances in technology in the last five years have allowed improvements in the educational area, such as the increase in the development of educational software. One of the techniques that emerged in this period is called gamification, the use of video game mechanics outside their original context. Recent studies involving this technique have provided positive results in the application of these concepts in many areas, such as marketing, health and education. In education, there are studies covering everything from elementary to higher education, with many variations to suit educators' methodologies. Within higher education, focusing on IT courses, data structures are an important subject taught in many of these courses, as they are the basis for many systems. Based on the above, this paper presents the development of an interactive web learning environment, called DSLEP (Data Structure Learning Platform), to aid students in higher education IT courses. The system includes the basic concepts covered in this subject, such as stacks, queues, lists, arrays and trees, and was implemented to ease the insertion of new structures. It was also implemented with gamification concepts, such as points, levels and leaderboards, to engage students in the search for knowledge and stimulate self-learning.
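For readers unfamiliar with the structures the platform covers, a minimal example of one of them, a stack, is sketched below. This is a generic illustration of the concept, not code from DSLEP.

```python
# Minimal sketch: a stack (LIFO) with the push/pop/peek operations a learner
# would practise in an environment like DSLEP.
class Stack:
    def __init__(self):
        self._items = []

    def push(self, item):
        """Add an item on top of the stack."""
        self._items.append(item)

    def pop(self):
        """Remove and return the top item; raises IndexError if the stack is empty."""
        return self._items.pop()

    def peek(self):
        """Return the top item without removing it."""
        return self._items[-1]

    def is_empty(self) -> bool:
        return not self._items

s = Stack()
for value in (1, 2, 3):
    s.push(value)
print(s.pop(), s.peek(), s.is_empty())  # 3 2 False
```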

Keywords: gamification, interactive learning environment, data structures, e-learning

Procedia PDF Downloads 479
13591 Modeling of Hydraulic Networking of a Water Supply Subsystem: The Case of Addis Ababa

Authors: Solomon Weldegebriel Gebrelibanos

Abstract:

Water is one of the most important substances in human life, and its cost and availability strongly affect people's welfare. Water comes from rainfall and runoff: part of it is stored in rivers, ponds and large water bodies, including seas and oceans, while the remaining portion infiltrates into the ground and is stored in aquifers. Water serves human beings in various ways, including irrigation, water supply, hydropower and so on. Water supply is the main pillar of water services. The water supply for Addis Ababa comes from Legedadi, Akaki, and Gefersa. The objective of the study is to measure the performance of the water supply distribution system in Addis Ababa city. The water supply distribution model was developed with computer-aided design software. The model can analyze operational changes, water losses, and the performance of the network. The two design criteria employed to analyze the network system are velocity and pressure. The results show that customers receive water at high pressure with low demand. The water distribution system is older than its expected service life and has considerable leakage. Hence, the study recommends that installing pressure-reducing valves and new distribution facilities can improve the performance of the water supply system.
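A short sketch of the kind of per-pipe check against the two design criteria named above, velocity and pressure head loss, using the standard Hazen-Williams formula. All pipe data and the velocity design range are assumed values, not the Addis Ababa model.

```python
# Minimal sketch: Hazen-Williams head loss h_f = 10.67*L*Q^1.852 / (C^1.852 * D^4.87)
# (SI units) and a velocity check for a single pipe.
import math

def hazen_williams_headloss(Q, L, D, C=130.0):
    """Head loss [m] for flow Q [m^3/s] in a pipe of length L [m] and diameter D [m]."""
    return 10.67 * L * Q**1.852 / (C**1.852 * D**4.87)

def velocity(Q, D):
    """Mean flow velocity [m/s]."""
    return Q / (math.pi * D**2 / 4.0)

Q, L, D = 0.02, 500.0, 0.15   # assumed: 20 L/s through 500 m of 150 mm pipe
v = velocity(Q, D)
hf = hazen_williams_headloss(Q, L, D)
print(f"velocity = {v:.2f} m/s, head loss = {hf:.2f} m")
print("velocity within assumed 0.6-2.0 m/s design range:", 0.6 <= v <= 2.0)
```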

Keywords: distribution, model, pressure, velocity

Procedia PDF Downloads 113
13590 Development of a Portable Intrinsically Safe Explosion-Proof Dual Gas Detector

Authors: Sangguk Ahn, Youngyu Kim, Jaheon Gu, Gyoutae Park

Abstract:

In this paper, we developed a dual gas leak instrument to detect hydrocarbon (HC) and carbon monoxide (CO) gases. For the two kinds of gases, it is necessary to design a compact sensor structure. It is then important to design the sensing circuits for measuring, amplifying and filtering, after which the instrument must be programmed using robust, systematic and modular coding methods. Central to these steps, improvement of the accuracy and of the initial response time is of vital importance. To manufacture a distinguished gas leak detector, we applied an intrinsically safe explosion-proof structure to the lithium-ion battery, the main circuits, a pump with a motor, the colour LCD interface and the sensing circuits. In software, to enhance measuring accuracy, we used numerical analysis such as Lagrange and Neville interpolation. The performance test was conducted using standard methane at seven different concentrations, in comparison with three other products. We aim to improve risk prevention and the efficiency of gas safety management by distributing the device to the gas safety field. Acknowledgment: This study was supported by the Small and Medium Business Administration under the research theme of ‘Commercialized Development of a portable intrinsically safe explosion-proof type dual gas leak detector’ (task number S2456036).
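One of the numerical methods named above, Neville's interpolation, is sketched below as it might be used to map a raw sensor reading onto a concentration between calibration points. The calibration values and the mV-to-%LEL mapping are assumptions for illustration, not the instrument's actual calibration.

```python
# Illustrative sketch: Neville's interpolation scheme applied to a hypothetical
# sensor calibration (sensor output in mV -> methane concentration in %LEL).
def neville(xs, ys, x):
    """Evaluate the interpolating polynomial through (xs, ys) at x using Neville's scheme."""
    p = list(ys)
    n = len(xs)
    for level in range(1, n):
        for i in range(n - level):
            p[i] = ((x - xs[i + level]) * p[i] + (xs[i] - x) * p[i + 1]) / (xs[i] - xs[i + level])
    return p[0]

sensor_mv = [120.0, 240.0, 410.0, 650.0]   # assumed calibration points
conc_lel = [10.0, 25.0, 50.0, 100.0]

reading = 300.0  # raw reading from the sensor (assumed)
print(f"Estimated concentration: {neville(sensor_mv, conc_lel, reading):.1f} %LEL")
```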

Keywords: gas leak, dual gas detector, intrinsically safe, explosion proof

Procedia PDF Downloads 219
13589 Analysis of Inventory Control, Lot Size and Reorder Point for Engro Polymers and Chemicals

Authors: Ali Akber Jaffri, Asad Naseem, Javeria Khan

Abstract:

The purpose of this study is to determine the safety stock, maximum inventory level, reorder point and reorder quantity by rearranging lot sizes for supplier and customer in the MRO (maintenance, repair and operations) warehouse of Engro Polymers & Chemicals. To achieve this aim, physical inspection and Excel-based analysis were carried out on the customer and supplier data provided by the company. Initially, we rearranged the current lot sizes and units of measure in the SAP software. Owing to the change in lot sizes, the new quantities for safety stock, maximum inventory, reorder point and reorder quantity had to be determined according to the company's demand. With the proposed system, extra cost was saved by reducing the receiving time from the vendor and the issuance time to the customer, easing material handling in the MRO warehouse, and reducing human effort. The information requirements identified in this study can be utilized in calculating the Economic Order Quantity.
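For reference, the classical Economic Order Quantity and reorder point calculations the abstract refers to are sketched below. All demand, cost and lead-time figures are assumed, not Engro's data.

```python
# Illustrative sketch: EOQ = sqrt(2*D*S/H), reorder point = d*L + safety stock.
import math

annual_demand = 1200.0   # units/year (assumed)
ordering_cost = 50.0     # cost per order (assumed)
holding_cost = 4.0       # carrying cost per unit per year (assumed)
daily_demand = annual_demand / 365.0
lead_time_days = 14.0    # (assumed)
safety_stock = 25.0      # units (assumed)

eoq = math.sqrt(2.0 * annual_demand * ordering_cost / holding_cost)
reorder_point = daily_demand * lead_time_days + safety_stock
max_inventory = safety_stock + eoq

print(f"EOQ = {eoq:.0f} units, reorder point = {reorder_point:.0f} units, "
      f"max inventory = {max_inventory:.0f} units")
```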

Keywords: carrying cost, economic order quantity, fast moving, lead time, lot size, MRO, maximum inventory, ordering cost, physical inspection, reorder point

Procedia PDF Downloads 228
13588 Risk Mitigation of Data Causality Analysis Requirements under the AI Act

Authors: Raphaël Weuts, Mykyta Petik, Anton Vedder

Abstract:

Artificial Intelligence has the potential to create, and already creates, enormous value in healthcare. Prescriptive systems might be able to make the use of healthcare capacity more efficient. Such systems might, however, entail interpretations that exclude the effect of confounders, which brings risks with it. Those risks might be mitigated by regulation that prevents systems entailing such risks from coming to market. One modality of regulation is legislation, and the European AI Act is an example of a regulatory instrument that might mitigate these risks. To assess the risk-mitigation potential of the AI Act for those risks, this research focusses on a case study of a hypothetical application of medical device software that entails the aforementioned risks. The AI Act refers to the harmonised norms of already existing legislation, here the European Medical Device Regulation. The issue at hand is a causal link between a confounder and the value the algorithm optimises for by proxy. The research identifies where the AI Act already addresses confounders (i.a. feedback loops in systems that continue to learn after being placed on the market). The research also identifies where the current proposal by Parliament leaves legal uncertainty about the necessity to check for confounders that do not influence the input of the system when the system does not continue to learn after being placed on the market. The authors propose an amendment to Article 15 of the AI Act that would require high-risk systems to be developed in such a way as to mitigate the risks from those aforementioned confounders.

Keywords: AI Act, healthcare, confounders, risks

Procedia PDF Downloads 247
13587 Impact of Machining Parameters on the Surface Roughness of Machined PU Block

Authors: Louis Denis Kevin Catherine, Raja Aziz Raja Ma’arof, Azrina Arshad, Sangeeth Suresh

Abstract:

Machining parameters are very important in determining the surface quality of any material. In the past decade, some new engineering materials were developed for the manufacturing industry, which created a need to investigate the impact of these parameters on their surface roughness. The polyurethane (PU) block is widely used in the automotive industry to manufacture parts such as checking fixtures, which are used to verify the dimensional accuracy of automotive parts. In this paper, design of experiments (DOE) was used to investigate the effect of the milling parameters on the PU block. Furthermore, an analysis of the chemical composition of the machined surface was carried out using a scanning electron microscope (SEM). It was found that the surface roughness of the PU block is severely affected when the PU undergoes flood machining instead of dry machining. In addition, the step-over and the silicon content were found to be the most significant parameters influencing the surface quality of the PU block.

Keywords: polyurethane (PU), design of experiment (DOE), scanning electron microscope (SEM), surface roughness

Procedia PDF Downloads 508
13586 Lead Removal from Ex-Mining Pond Water by Electrocoagulation: Kinetics, Isotherm, and Dynamic Studies

Authors: Kalu Uka Orji, Nasiman Sapari, Khamaruzaman W. Yusof

Abstract:

Exposure of galena (PbS), tealite (PbSnS2), and other associated minerals during mining activities releases lead (Pb) and other heavy metals into the mine water through oxidation and dissolution. Heavy metal pollution has become an environmental challenge. Lead, for instance, can cause toxic effects on human health, including brain damage. Ex-mining pond water has been reported to contain lead concentrations as high as 69.46 mg/L. Conventional treatment does not easily remove lead from water. A promising and emerging treatment technology for lead removal is the electrocoagulation (EC) process. However, some of the problems associated with EC are systematic reactor design, selection of optimal EC operating parameters, and scale-up, among others. This study investigated an EC process for the removal of lead from synthetic ex-mining pond water using a batch reactor and Fe electrodes. The effects of various operating parameters on lead removal efficiency were examined. The results indicated that the maximum removal efficiency of 98.6% was achieved at an initial pH of 9, a current density of 15 mA/cm², an electrode spacing of 0.3 cm, a treatment time of 60 minutes, liquid motion by magnetic stirring (LM-MS), and electrode arrangement BP-S. The experimental data were further modeled and optimized using a 2-level, 4-factor full factorial design, a response surface methodology (RSM). The four factors optimized were current density, electrode spacing, electrode arrangement, and liquid motion driving mode (LM). Based on the regression model and the analysis of variance (ANOVA) at 0.01%, the results showed that increases in current density and LM-MS increased the removal efficiency, while the reverse was the case for electrode spacing. The model predicted an optimal lead removal efficiency of 99.962% with an electrode spacing of 0.38 cm, alongside other settings. Applying the predicted parameters, a lead removal efficiency of 100% was achieved. The electrode and energy consumptions were 0.192 kg/m³ and 2.56 kWh/m³, respectively. Meanwhile, the adsorption kinetic studies indicated that the overall lead adsorption system follows the pseudo-second-order kinetic model. The adsorption was also spontaneous and endothermic, with increased randomness, and higher process temperatures enhance the adsorption capacity. Furthermore, the adsorption isotherm fitted the Freundlich model better than the Langmuir model, describing adsorption on a heterogeneous surface and showing good adsorption efficiency by the Fe electrodes. Adsorption of Pb²⁺ onto the Fe electrodes was a complex reaction involving more than one mechanism. The overall results proved that EC is an efficient technique for lead removal from synthetic mining pond water. The findings of this study would have application in the scale-up of EC reactors and in the design of water treatment plants for feed-water sources that contain lead, using the electrocoagulation method.
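The pseudo-second-order fit referred to above is usually done with the linearised form t/qt = 1/(k2·qe²) + t/qe. The sketch below shows that fit on hypothetical kinetic data; it is not the study's data set.

```python
# Illustrative sketch: pseudo-second-order kinetic fit. Plotting t/qt versus t gives
# slope 1/qe and intercept 1/(k2*qe^2).
import numpy as np

t = np.array([5.0, 10.0, 20.0, 30.0, 45.0, 60.0])   # min (assumed)
qt = np.array([4.1, 6.5, 8.9, 10.1, 11.0, 11.5])    # mg/g adsorbed at time t (assumed)

slope, intercept = np.polyfit(t, t / qt, 1)
qe = 1.0 / slope                       # equilibrium adsorption capacity (mg/g)
k2 = 1.0 / (intercept * qe**2)         # rate constant (g/(mg*min))
print(f"qe = {qe:.2f} mg/g, k2 = {k2:.4f} g/(mg*min)")
```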

Keywords: ex-mining water, electrocoagulation, lead, adsorption kinetics

Procedia PDF Downloads 138
13585 Numerical Analysis and Influence of the Parameters on Slope Stability

Authors: Fahim Kahlouche, Alaoua Bouaicha, Sihem Chaîbeddra, Sid-Ali Rafa, Abdelhamid Benouali

Abstract:

The design of a structure sometimes requires its construction on rough or sloping ground. Besides the problem of landslide stability, the behaviour of the foundations bearing the structure is influenced by the destabilizing effect of the ground's slope. This article focuses on the analysis of the stability of a slope exposed to loading, by introducing the different factors influencing the slope's behaviour on the one hand, and the influence of this slope on the foundation's behaviour on the other hand. The study is based on elastoplastic modelling using FLAC 2D. This software is based on the finite difference method, one of the older methods for the numerical resolution of systems of differential equations with initial and boundary conditions, and was developed for geotechnical simulation. The aim of the simulation is to demonstrate the notable effect of the shear modulus G, the cohesion C, the inclination (slope) angle β, and the distance between the foundation and the head of the slope on the stability of the slope as well as on the stability of the foundation. In our simulation, the slope consists of homogeneous ground. The foundation is considered rigid; therefore, loading is applied as vertical forces on the nodes that represent the contact between the foundation and the ground.
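To illustrate how cohesion and slope angle, two of the parameters studied above, drive stability, the sketch below evaluates the classical infinite-slope factor of safety. It is a simple analytical check, not the FLAC 2D model, and all soil values are assumed.

```python
# Simple sketch: infinite-slope factor of safety
# FS = (c + gamma*z*cos(beta)^2*tan(phi)) / (gamma*z*sin(beta)*cos(beta)).
import math

def infinite_slope_fs(c, phi_deg, gamma, z, beta_deg):
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    resisting = c + gamma * z * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

gamma, z, phi = 18.0, 4.0, 30.0   # kN/m^3, m, degrees (assumed)
for c in (0.0, 10.0):             # kPa
    for beta in (20.0, 35.0):     # degrees
        fs = infinite_slope_fs(c, phi, gamma, z, beta)
        print(f"c={c} kPa, beta={beta} deg -> FS={fs:.2f}")
```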

Keywords: slope, shallow foundation, numeric method, FLAC 2D

Procedia PDF Downloads 272
13584 Machinability Analysis in Drilling Flax Fiber-Reinforced Polylactic Acid Bio-Composite Laminates

Authors: Amirhossein Lotfi, Huaizhong Li, Dzung Viet Dao

Abstract:

Interest in natural fibre-reinforced composites (NFRC) is growing steadily, both in academic research and in industrial applications, thanks to their abundant advantages such as low cost, biodegradability, eco-friendly nature and relatively good mechanical properties. However, their widespread use is still considered challenging because of their non-homogeneous structure and the limited knowledge of their machinability characteristics and of the parameter settings required to avoid defects associated with the machining process. The present work investigates the effect of the cutting tool geometry and material on drilling-induced delamination, thrust force and hole quality when drilling a fully biodegradable flax/poly(lactic acid) composite laminate. Three drills with different geometries and materials were used at different drilling conditions to evaluate the machinability of the fabricated composites. The experimental results indicated that the choice of cutting tool, in terms of material and geometry, has a noticeable influence on the cutting thrust force and consequently on drilling-induced damage. The lowest thrust force and best hole quality were observed using the high-speed steel (HSS) drill, whereas the carbide drill (with a point angle of 130°) resulted in the highest thrust force. The carbide drill presented higher wear resistance and stability in the variation of thrust force with the number of holes drilled, while the HSS drill showed the lower thrust force during the drilling process. Finally, within the selected cutting range, the delamination damage increased noticeably with feed rate and moderately with spindle speed.

Keywords: natural fiber reinforced composites, delamination, thrust force, machinability

Procedia PDF Downloads 122
13583 Deasphalting of Crude Oil by Extraction Method

Authors: A. N. Kurbanova, G. K. Sugurbekova, N. K. Akhmetov

Abstract:

Asphaltenes are the heavy fraction of crude oil. In the oilfield, asphaltenes are known for their ability to plug wells, surface equipment and the pores of geologic formations. The present research is devoted to the deasphalting of crude oil as the initial stage of oil refining. Solvent deasphalting was conducted by extraction with organic solvents (cyclohexane, carbon tetrachloride, chloroform). The metal content was analysed by ICP-MS, and the spectral features of the deasphalting products were characterised by FTIR. A high asphaltene content in crude oil reduces the efficiency of refining processes. Moreover, the high content of heteroatoms (e.g., S, N) in asphaltenes also causes problems: environmental pollution, corrosion and poisoning of catalysts. The main objective of this work is to study the effect of the deasphalting process on crude oil in order to improve its properties and the efficiency of refining processes. Solvent extraction experiments using organic solvents were carried out on crude oil from JSC “Pavlodar Oil Chemistry Refinery”. The experimental results show that the deasphalting process also leads to a decrease of Ni and V in the composition of the oil. One solution to the problem of removing metals, hydrogen sulfide and mercaptans from oils is absorption with chemical reagents directly in the oil residue, since asphaltic and resinous substances degrade the operational properties of oils and reduce the effectiveness of selective refining. Deasphalting of crude oil is necessary to separate the light fraction from the heavy, metal-bearing asphaltene part of the crude oil. For this reason, the oil is pretreated by deasphalting, because asphaltenes tend to form coke or consume large quantities of hydrogen. Removing asphaltenes leads to partial demetallization, i.e. removal of V/Ni-bearing asphaltenes and organic compounds with heteroatoms. Intramolecular complexes are relatively well researched, the porphyrin complexes of vanadium (as vanadyl) and nickel (Ni) being typical examples. As a result of the ICP-MS studies of V and Ni, the effect of the different deasphalting solvents on metal extraction was determined, and the best organic solvent was selected. The best deasphalting solvent proved to be cyclohexane (C6H12), which, according to ICP-MS, removes 51.2% of V and 66.4% of Ni. This paper also presents the results of a study of the physical and chemical properties and spectral characteristics of the oil by FTIR, with a view to establishing its hydrocarbon composition. The information about the character of the whole oil obtained by IR spectroscopy provides provisional physical and chemical characteristics. It can be useful in considering the origin and geochemical conditions of accumulation of the oil, as well as some technological challenges. The systematic analysis carried out in this study improves our understanding of the stability mechanism of asphaltenes. The role of deasphalted crude oil fractions in asphaltene stability is described.

Keywords: asphaltenes, deasphalting, extraction, vanadium, nickel, metalloporphyrins, ICP-MS, IR spectroscopy

Procedia PDF Downloads 229
13582 Preference and Perspective for Gift Over-packaging Solution: A Case Study of Consumers in Shanghai, China

Authors: Heping Wang

Abstract:

Social interaction has increased as a result of rapid economic expansion. Particularly in China, gift exchange has developed into a social tradition of showing gratitude. Most gifts, however, are lavishly presented or overpackaged to impress or demonstrate respect to the gift receiver. Overpackaging wastes enormous resources and produces a great deal of municipal solid waste (MSW), which can seriously harm the environment if it is not handled properly. The purpose of this study is to investigate consumers' perceptions, preferences, and perspectives regarding gift overpackaging in order to identify potential solutions for reducing gift overpackaging and achieving sustainable packaging objectives. The research was conducted by means of an online survey focusing on residents of Shanghai, China, and the data were quantitatively analyzed with SPSS software. According to the research, consumers' perception of excessive packaging is approximately 3.5 points out of 5, and this perception has a significant impact on consumers' behavioral intentions. The preferences of givers and receivers for gift packaging differ significantly in three aspects. Customers prefer incentives for eco-packaging when it comes to measures to reduce gift overpackaging. Finally, the study also identifies suitable gift packaging options for customers.

Keywords: gift packaging, consumer perception, consumer preference, consumer perspective, overpackaging, solutions

Procedia PDF Downloads 55
13581 Developing a Cultural Policy Framework for Small Towns and Cities

Authors: Raymond Ndhlovu, Jen Snowball

Abstract:

It has long been known that the cultural and creative industries (CCIs) have the potential to aid in the physical, social and economic renewal and regeneration of towns and cities, hence their importance when dealing with regional development. The CCIs can act as a catalyst for activity and investment in an area because the 'consumption' of cultural activities leads to the use of other, non-cultural activities and services, for example hospitality (restaurants and bars) as well as public transport. 'Consumption' of cultural activities also leads to employment creation and diversification. However, CCIs tend to be clustered, especially around large cities. There is, moreover, a case for developing CCIs around smaller towns and cities, because they do not rely on high-technology inputs and long supply chains, and their direct link to rural and isolated places makes them vital in regional development. Yet there is currently little research on how to craft cultural policy for regions with smaller towns and cities. Using the Sarah Baartman District (SBDM) in South Africa as an example, this paper describes the process of developing cultural policy for a region that has potential, and existing, cultural clusters, but currently no single coherent policy relating to CCI development. The SBDM was chosen as a case study because it has no large cities, has some CCI clusters, and has identified them as potential drivers of local economic development. The process of developing cultural policy is discussed in stages: identification of the resources present, including human resources and soft and hard infrastructure; identification of clusters; analysis of CCI labour markets and ownership patterns; opportunities and challenges from the point of view of CCIs and other key stakeholders; alignment of regional policy aims with provincial and national policy objectives; and, finally, design and implementation of a regional cultural policy.

Keywords: cultural and creative industries, economic impact, intrinsic value, regional development

Procedia PDF Downloads 220
13580 Detection of the Effectiveness of Training Courses and Their Limitations Using CIPP Model (Case Study: Isfahan Oil Refinery)

Authors: Neda Zamani

Abstract:

The present study aimed to investigate the effectiveness of training courses and their limitations using the CIPP model. The investigation was carried out at Isfahan Oil Refinery as a case study. In terms of purpose, the present paper is applied research; in terms of data gathering, it is descriptive field-survey research. The population of the study included participants in the training courses, their supervisors and experts of the training department. Probability-proportional-to-size (PPS) sampling was used. The sample comprised 195 course participants, 30 supervisors and 11 training experts. To collect the data, a researcher-designed questionnaire and a semi-structured interview were used. The content validity of the instruments was confirmed by training management experts, and the reliability was calculated as 0.92 using Cronbach's alpha. Descriptive statistics (tables, frequencies, frequency percentages and means) and inferential statistics (Mann-Whitney and Wilcoxon tests, and the Kruskal-Wallis test to determine the significance of the differences between the groups' opinions) were applied to analyze the data. The results indicated that all groups, i.e., participants, supervisors and training experts, firmly believe in the importance of the training courses; however, the course participants regard the content, teachers, atmosphere and facilities, training process, management process and product as being at a relatively appropriate level. The supervisors also regard the output as being at a relatively appropriate level, but the training experts regard the content, teachers and management processes as being at an appropriate, above-average level.

Keywords: training courses, limitations of training effectiveness, CIPP model, Isfahan oil refinery company

Procedia PDF Downloads 53