Search results for: computer terms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8696

7436 Comparison of Deep Brain Stimulation Targets in Parkinson's Disease: A Systematic Review

Authors: Hushyar Azari

Abstract:

Aim and background: Deep brain stimulation (DBS) is regarded as an important therapeutic choice for Parkinson's disease (PD). The two most common targets for DBS are the subthalamic nucleus (STN) and the globus pallidus internus (GPi). This review was conducted to compare the clinical effectiveness of these two targets. Methods: A systematic literature search in the electronic databases Embase, Cochrane Library, and PubMed was restricted to English-language publications from 2010 to 2021. Specified MeSH terms were searched in all databases. Studies that evaluated the Unified Parkinson's Disease Rating Scale (UPDRS) III were selected if they met the following criteria: (1) compared both GPi and STN DBS; (2) had a follow-up period of at least three months; (3) included at least five participants in each group; (4) were conducted after 2010. Study quality was assessed using the Modified Jadad Scale. Results: 3577 potentially relevant articles were identified; of these, 3569 were excluded based on title and abstract, duplicate removal, and unsuitability. Eight articles satisfied the inclusion criteria and were scrutinized (458 PD patients). According to the Modified Jadad Scale, the majority of the included studies were of low evidence quality, which is a limitation of this review. Five studies reported no statistically significant between-group difference in UPDRS III score improvements. At the same time, some results concerning pain, action tremor, rigidity, and urinary symptoms indicated that STN DBS might be the better choice. Regarding adverse effects, GPi was superior. Conclusion: Larger randomized clinical trials with longer follow-up periods and control groups are needed to decide which target is more efficient for deep brain stimulation in Parkinson's disease and imposes fewer adverse effects on patients. Meanwhile, STN seems the more reasonable choice according to the results of this systematic review.

Keywords: brain stimulation, globus pallidus, Parkinson's disease, subthalamic nucleus

Procedia PDF Downloads 179
7435 Automatic Detection of Defects in Ornamental Limestone Using Wavelets

Authors: Maria C. Proença, Marco Aniceto, Pedro N. Santos, José C. Freitas

Abstract:

A methodology based on wavelets is proposed for the automatic location and delimitation of defects in limestone plates. Natural defects include dark colored spots, crystal zones trapped in the stone, areas of abnormal contrast colors, cracks or fracture lines, and fossil patterns. Although some of these may or may not be considered defects depending on the intended use of the plate, the goal is to pair each stone with a map of defects that can be overlaid on a computer display. These defect layers constitute a database that allows the preliminary selection of matching tiles of a particular variety, with specific dimensions, for a requirement of N square meters, to be done on a desktop computer rather than by a two-hour search in the storage park with human operators manipulating stone plates as large as 3 m x 2 m and weighing about one ton. Accident risks and work times are reduced, with a consequent increase in productivity. The base of the algorithm is wavelet decomposition executed on two instances of the original image, to detect both hypotheses, dark and clear defects. The existence and/or size of these defects is the gauge for classifying the quality grade of the stone products. The parameter tuning possible within the wavelet framework corresponds to different levels of accuracy in drawing the contours and selecting the defect size, which allows the map of defects to be used to cut a selected stone into tiles with minimum waste, according to the dimensions of the defects allowed.
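The core of the method, per the abstract, is wavelet decomposition whose detail coefficients light up at local contrast changes. The following minimal sketch (a single-level 2-D Haar transform on a synthetic plate; the image size, defect placement, and threshold rule are illustrative assumptions, not parameters from the paper) shows the principle:

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2-D Haar wavelet decomposition.
    Returns the approximation (LL) and the combined detail energy
    (LH^2 + HL^2 + HH^2), which highlights local contrast changes
    such as cracks, spots, or fracture lines."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh**2 + hl**2 + hh**2

def defect_mask(img, k=3.0):
    """Flag detail coefficients whose energy exceeds mean + k*std."""
    _, energy = haar_dwt2(img.astype(float))
    return energy > energy.mean() + k * energy.std()

# Synthetic 8x8 plate: uniform stone (200) with one dark spot (40)
plate = np.full((8, 8), 200.0)
plate[3:5, 3:5] = 40.0
mask = defect_mask(plate, k=1.0)
```

In practice the decomposition is run on two instances of the image so that both dark and clear defects are caught; the same mask-building logic would apply to each pass.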

Keywords: automatic detection, defects, fracture lines, wavelets

Procedia PDF Downloads 248
7434 Information Requirements for Vessel Traffic Service Operations

Authors: Fan Li, Chun-Hsien Chen, Li Pheng Khoo

Abstract:

Operators of a vessel traffic service (VTS) center provide three types of services to vessels: information service, navigational assistance, and traffic organization. To provide these services, operators monitor vessel traffic through a computer interface and give navigational advice based on information integrated from multiple sources, including the automatic identification system (AIS), radar, and closed-circuit television (CCTV). This information is therefore crucial to VTS operation. However, exactly what information VTS operators need in order to offer these services efficiently and properly is unclear. The aim of this study is to investigate the information requirements for VTS operation. To achieve this aim, field observation was carried out to elicit those requirements. The study revealed that the most frequent and important tasks were handling arrival vessel reports, potential conflict control, and abeam vessel reports. Current location and vessel name were used in all tasks. Hazardous cargo information was particularly required when operators handled arrival vessel reports. The speed, course, and distance of two or several vessels were used only in potential conflict control. The information requirements identified in this study can be utilized in designing a human-computer interface that takes into consideration what information should be displayed and when, and might further serve as the foundation of a decision support system for VTS.
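Potential conflict control, the one task in which the speed, course, and distance of vessel pairs were used, typically rests on a closest-point-of-approach (CPA) computation. The sketch below is a generic CPA calculation on a flat-sea approximation, not code from the study; the head-on scenario and units (nautical miles, knots, hours) are illustrative:

```python
import math

def cpa(p1, c1, s1, p2, c2, s2):
    """Closest point of approach between two vessels.
    p: (x, y) position, c: course in degrees (clockwise from north),
    s: speed. Returns (time_to_cpa, distance_at_cpa)."""
    def vel(course, speed):
        r = math.radians(course)
        return (speed * math.sin(r), speed * math.cos(r))
    v1, v2 = vel(c1, s1), vel(c2, s2)
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]       # relative position
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]     # relative velocity
    dv2 = dvx * dvx + dvy * dvy
    t = 0.0 if dv2 == 0 else max(0.0, -(dx * dvx + dy * dvy) / dv2)
    return t, math.hypot(dx + dvx * t, dy + dvy * t)

# Head-on: vessel 2 is 10 nm due north sailing south at 10 kn,
# vessel 1 sails north at 10 kn -> CPA after 0.5 h at distance ~0.
t, d = cpa((0, 0), 0, 10, (0, 10), 180, 10)
```

A decision support system would raise a potential-conflict alert whenever the distance at CPA falls below a chosen safety threshold within a chosen time horizon.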

Keywords: vessel traffic service, information requirements, hierarchy task analysis, field observation

Procedia PDF Downloads 250
7433 Hybrid Rocket Motor Performance Parameters: Theoretical and Experimental Evaluation

Authors: A. El-S. Makled, M. K. Al-Tamimi

Abstract:

A mathematical model to predict the performance parameters (thrust, chamber pressure, fuel mass flow rate, mixture ratio, and regression rate during firing time) of a hybrid rocket motor (HRM) is evaluated. The internal ballistic (IB) hybrid combustion model assumes that the solid fuel surface regression rate is controlled only by heat transfer (convective and radiative) from the flame zone to the solid fuel burning surface. A laboratory HRM was designed, manufactured, and tested for low-thrust-profile space missions (10-15 N) and for validating the mathematical model (computer program). The materials selected for this experimental work were polymethyl methacrylate (PMMA) and polyethylene (PE) as solid fuel grains and gaseous oxygen (GO2) as the oxidizer. The variation of the operational parameters with time was determined systematically and experimentally in firings of up to 20 seconds, and an average combustion efficiency of 95% of theory was achieved, which was the goal of these experiments. The comparison between the recorded firing data and the predicted analytical parameters shows good agreement, with an error that does not exceed 4.5% over the whole firing time. The current mathematical (computer) code can be used as a powerful tool for the analytical design of HRM parameters.
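A single time step of such an internal-ballistics model can be sketched as follows; the regression-rate law r = a * Gox^n is the standard hybrid-combustion form, but the coefficients and grain geometry below are illustrative placeholders, not values from the paper:

```python
import math

# Illustrative internal-ballistics step for a cylindrical-port HRM.
# All numerical values are assumptions for the sketch, not the paper's data.
a, n = 2.0e-5, 0.62          # regression-rate law r = a * Gox**n (SI units)
mdot_ox = 0.010              # oxidizer mass flow, kg/s
rho_fuel = 1180.0            # PMMA density, kg/m^3
L_grain = 0.20               # grain length, m
r_port = 0.006               # current port radius, m

A_port = math.pi * r_port**2                      # port cross-section, m^2
Gox = mdot_ox / A_port                            # oxidizer mass flux, kg/(m^2 s)
rdot = a * Gox**n                                 # fuel regression rate, m/s
A_burn = 2 * math.pi * r_port * L_grain           # burning surface area, m^2
mdot_fuel = rho_fuel * rdot * A_burn              # fuel mass flow, kg/s
OF = mdot_ox / mdot_fuel                          # mixture ratio
```

In the full model this step would be iterated over the firing time, enlarging the port radius by rdot*dt at each step, so that mass flux, regression rate, and mixture ratio evolve together as in the recorded firings.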

Keywords: hybrid combustion, internal ballistics, hybrid rocket motor, performance parameters

Procedia PDF Downloads 311
7432 AI-Based Techniques for Online Social Media Network Sentiment Analysis: A Methodical Review

Authors: A. M. John-Otumu, M. M. Rahman, O. C. Nwokonkwo, M. C. Onuoha

Abstract:

Online social media networks have long served as a primary arena for group conversations, gossip, and text-based information sharing and distribution. Natural language processing techniques are widely applied to classify this text and support unbiased decision-making, yet properly classifying such textual information in a given context remains difficult. We therefore conducted a systematic review of the literature on sentiment classification and the AI-based techniques used for it, in order to better understand how to design and develop a robust and more accurate sentiment classifier that can correctly distinguish, with a high level of accuracy, between hate speech and inverted compliments in social media text of a given context. We evaluated over 250 articles from digital sources like ScienceDirect, ACM, Google Scholar, and IEEE Xplore and whittled the number of studies down to 31. Findings revealed that deep learning approaches such as CNN, RNN, BERT, and LSTM outperformed various machine learning techniques in terms of accuracy. A large dataset is also necessary for developing a robust sentiment classifier and can be obtained from sources like Twitter, movie reviews, Kaggle, SST, and SemEval Task 4. Hybrid deep learning techniques like CNN+LSTM, CNN+GRU, and CNN+BERT outperformed both single deep learning techniques and machine learning techniques. The Python programming language proved preferable to Java for sentiment analyzer development due to its simplicity and AI library support. Based on the important findings of this study, we make recommendations for future research.

Keywords: artificial intelligence, natural language processing, sentiment analysis, social network, text

Procedia PDF Downloads 115
7431 Robot Control by ERPs of Brain Waves

Authors: K. T. Sun, Y. H. Tai, H. W. Yang, H. T. Lin

Abstract:

This paper presents a technique for robot control by event-related potentials (ERPs) of brain waves. Based on the proposed technique, people with severe physical disabilities can freely browse the outside world. A specific ERP component, N2P3, was found and used to control the movement of a robot and the view of its camera through the designed brain-computer interface (BCI). Users were only required to watch the stimulus of the attended button on the BCI; the evoked potential of the target button, N2P3, had the greatest amplitude among all control buttons. An experimental scenario was constructed in which the robot had to walk to a specific position, move its camera view to read the instructions for the mission, and then complete the task. Twelve volunteers participated in the experiment, and the results showed that the correct rate of BCI control reached 80% and the average execution time for completing the mission was 353 seconds. This research makes four main contributions: (1) finding an efficient ERP component, N2P3, for BCI control; (2) embedding the robot's viewpoint image into the user interface for robot control; (3) designing an experimental scenario and conducting the experiment; and (4) evaluating the performance of the proposed system to assess its practicability.
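The selection rule described (the attended button's averaged ERP shows the largest amplitude) can be illustrated with synthetic data; the waveform shape, trial count, and noise level below are assumptions for the sketch, not recordings from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ERP-based target selection: only the attended button's epochs
# contain an evoked deflection; averaging suppresses background EEG.
n_buttons, n_trials, n_samples = 4, 20, 50
target = 2

epochs = rng.normal(0.0, 1.0, (n_buttons, n_trials, n_samples))
erp_wave = 3.0 * np.sin(np.linspace(0, np.pi, n_samples))
epochs[target] += erp_wave            # evoked response only for the target

avg = epochs.mean(axis=1)             # average epochs per button
scores = np.ptp(avg, axis=1)          # peak-to-peak amplitude per button
decoded = int(np.argmax(scores))      # button with the largest deflection
```

The averaging step is what makes single low-amplitude components like N2P3 usable: uncorrelated background activity shrinks roughly with the square root of the number of trials, while the evoked response does not.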

Keywords: severe physical disabilities, robot control, event-related potentials (ERPs), brain-computer interface (BCI), brain waves

Procedia PDF Downloads 369
7430 Life Cycle Assessment of Mass Timber Structure, Construction Process as System Boundary

Authors: Mahboobeh Hemmati, Tahar Messadi, Hongmei Gu

Abstract:

Today, life cycle assessment (LCA) is a leading method for mitigating the environmental impacts of the building sector. In this paper, LCA is used to quantify the greenhouse gas (GHG) emissions during the construction phase of the largest mass timber residential structure in the United States, Adohi Hall, a 200,000-square-foot, 708-bed complex on the campus of the University of Arkansas. The energy used for building operation is the most dominant source of emissions in the building industry. Lately, however, efforts to reduce the emissions of building operation have been successful, and attention has shifted to embodied carbon, which is now more noticeable in the building life cycle. Unfortunately, most studies have focused on the manufacturing stage, and only a few have addressed the construction process to date. In particular, little data is available on the environmental impacts associated with the construction of mass timber. This study therefore presents an assessment of the environmental impact of the construction processes of the newly built mass timber building mentioned above. The system boundary of this study covers modules A4 and A5 of the building LCA standard EN 15978: module A4 includes material and equipment transportation, and module A5 covers the construction and installation process. The research evolves through two stages: first, quantifying the materials and equipment deployed in the building, and second, determining the embodied carbon associated with running the construction equipment and with transporting materials to, and installing them on, the site. The global warming potential (GWP) of the building is the primary metric considered. The outcomes of this study bring to the fore a better understanding of emission hotspots during the construction process. Moreover, a comparative analysis of the mass timber construction process with that of a theoretically similar steel building will enable an effective assessment of the environmental efficiency of mass timber.
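For module A4, the GWP contribution reduces to mass times distance times an emission factor, summed over shipments. A toy sketch (the emission factor and the shipment list are illustrative placeholders, not the Adohi Hall inventory):

```python
# Module A4 (transport) GWP sketch: emissions = mass * distance * factor.
# All quantities below are hypothetical example values.
EF_TRUCK = 0.062   # kg CO2e per tonne-km (assumed truck factor)
shipments = [
    # (material, mass in tonnes, distance in km)
    ("CLT panels", 900.0, 1200.0),
    ("glulam columns", 150.0, 1200.0),
    ("steel connectors", 40.0, 300.0),
]
gwp_a4 = sum(mass * dist * EF_TRUCK for _, mass, dist in shipments)
print(f"Module A4 GWP: {gwp_a4 / 1000:.1f} t CO2e")
```

Module A5 is handled analogously, replacing tonne-kilometres with equipment operating hours multiplied by fuel consumption and a fuel emission factor.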

Keywords: construction process, GWP, LCA, mass timber

Procedia PDF Downloads 166
7429 Speeding Up Lenia: A Comparative Study Between Existing Implementations and CUDA C++ with OpenGL Interop

Authors: L. Diogo, A. Legrand, J. Nguyen-Cao, J. Rogeau, S. Bornhofen

Abstract:

Lenia is a system of cellular automata with continuous states, space and time, which surprises not only with the emergence of interesting life-like structures but also with its beauty. This paper reports ongoing research on a GPU implementation of Lenia using CUDA C++ and OpenGL interoperability. We demonstrate how CUDA, as a low-level GPU programming paradigm, allows optimizing the performance and memory usage of the Lenia algorithm. A comparative analysis through experimental runs with existing implementations shows that the CUDA implementation outperforms the others by one order of magnitude or more. Cellular automata hold significant interest due to their ability to model complex phenomena in systems with simple rules and structures. They allow exploring emergent behavior such as self-organization and adaptation, and find applications in various fields, including computer science, physics, biology, and sociology. Unlike classic cellular automata, which rely on discrete cells and values, Lenia generalizes the concept of cellular automata to continuous space, time and states, thus providing additional fluidity and richness in the emerging phenomena. In the current literature, there are many implementations of Lenia utilizing various programming languages and visualization libraries. However, each implementation also presents certain drawbacks, which serve as motivation for further research and development. In particular, speed is a critical factor when studying Lenia, for several reasons. Rapid simulation allows researchers to observe the emergence of patterns and behaviors in more configurations, on bigger grids and over longer periods, without long waiting times. They thereby enable the exploration and discovery of new species within the Lenia ecosystem more efficiently. Moreover, faster simulations are beneficial when we include additional time-consuming algorithms such as computer vision or machine learning to evolve and optimize specific Lenia configurations.
We developed a Lenia implementation for GPU using the C++ and CUDA programming languages, and CUDA/OpenGL interoperability for immediate rendering. The goal of our experiment is to benchmark this implementation against the existing ones in terms of speed, memory usage, configurability and scalability. In our comparison we focus on the most important Lenia implementations, selected for their prominence, accessibility and widespread use in the scientific community. The implementations include MATLAB, JavaScript, ShaderToy GLSL, Jupyter, Rust and R. The list is not exhaustive but provides a broad view of the principal current approaches and their respective strengths and weaknesses. Our comparison primarily considers computational performance and memory efficiency, as these factors are critical for large-scale simulations, but we also investigate ease of use and configurability. The experimental runs conducted so far demonstrate that the CUDA C++ implementation outperforms the other implementations by one order of magnitude or more. The benefits of using the GPU become apparent especially with larger grids and convolution kernels. However, our research is still ongoing. We are currently exploring the impact of several software design choices and optimization techniques, such as convolution with Fast Fourier Transforms (FFT), various GPU memory management scenarios, and the trade-off between speed and accuracy using single versus double precision floating point arithmetic. The results will give valuable insights into the practice of parallel programming of the Lenia algorithm, and all conclusions will be thoroughly presented in the conference paper. The final version of our CUDA C++ implementation will be published on GitHub and made freely accessible to the ALife community for further development.
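One of the optimizations mentioned, convolution via the Fast Fourier Transform, can be sketched on the CPU with NumPy; the kernel shell and growth-function parameters below are generic Lenia choices, not the authors' configuration:

```python
import numpy as np

def lenia_step(world, kernel_fft, dt=0.1, mu=0.15, sigma=0.015):
    """One Lenia update: convolve with the kernel via FFT, apply a
    Gaussian growth mapping, and clip states to [0, 1]."""
    u = np.real(np.fft.ifft2(np.fft.fft2(world) * kernel_fft))
    growth = 2.0 * np.exp(-((u - mu) ** 2) / (2 * sigma ** 2)) - 1.0
    return np.clip(world + dt * growth, 0.0, 1.0)

# Ring-shaped kernel (standard Lenia bump function), normalized and
# FFT-shifted so that index (0, 0) is the kernel center.
N, R = 64, 12
y, x = np.ogrid[-N // 2:N // 2, -N // 2:N // 2]
r = np.hypot(x, y) / R
rr = np.clip(r, 1e-9, 1 - 1e-9)                 # avoid division by zero
K = np.where(r < 1, np.exp(4 - 1.0 / (rr * (1 - rr))), 0.0)
K /= K.sum()
K_fft = np.fft.fft2(np.fft.ifftshift(K))

rng = np.random.default_rng(1)
world = lenia_step(rng.random((N, N)), K_fft)
```

The FFT turns the O(N^2 * k^2) spatial convolution into O(N^2 log N) work per step, which is why it matters for large grids and kernels; on the GPU the same structure maps onto batched cuFFT calls.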

Keywords: artificial life, cellular automaton, GPU optimization, Lenia, comparative analysis

Procedia PDF Downloads 41
7428 Electro-Hydrodynamic Effects Due to Plasma Bullet Propagation

Authors: Panagiotis Svarnas, Polykarpos Papadopoulos

Abstract:

Atmospheric-pressure cold plasmas continue to gain interest for various applications due to their unique properties: cost-efficient production, high chemical reactivity, low gas temperature, adaptability, etc. Numerous designs have been proposed for the production of these plasmas in terms of electrode configuration, driving voltage waveform, and working gas(es). However, to exploit most of the advantages of these systems, the majority of designs are based on dielectric-barrier discharges (DBDs) in either the filamentary or the glow regime. A special category of DBD-based atmospheric-pressure cold plasmas comprises the so-called plasma jets, where a carrier noble gas is guided by the dielectric barrier (usually a hollow cylinder) and left to flow into the atmospheric air, where a complicated hydrodynamic interplay takes place. Although it is now well established that these plasmas are generated by ionizing waves reminiscent in many ways of streamer propagation, they exhibit distinct characteristics that are better captured by the terms 'guided streamers' or 'plasma bullets'. These 'bullets' travel with supersonic velocities both inside the dielectric barrier and in the channel formed by the noble gas during its penetration into the air. The present work is devoted to interpreting the electro-hydrodynamic effects that take place downstream of the dielectric barrier opening, i.e., in the noble gas-air mixing area where plasma bullets propagate under the influence of local electric fields in regions of variable noble gas concentration. Herein, we focus on the role of the local space charge and of the residual ionic charge left behind after bullet propagation in modifying the gas flow field. The study communicates both experimental and numerical results, coupled in a comprehensive manner.
The plasma bullets are produced by a custom device having a quartz tube as the dielectric barrier and two external ring-type electrodes driven by a sinusoidal high voltage at 10 kHz. Helium is fed to the tube, and schlieren photography is employed to map the flow field downstream of the tube orifice. The mixture mass conservation equation, momentum conservation equation, energy conservation equation in terms of temperature, and helium transfer equation are solved simultaneously, revealing the physical mechanisms that govern the experimental results. Namely, we deal with electro-hydrodynamic effects due mainly to momentum transfer from atomic ions to neutrals. The atomic ions are left behind as residual charge after bullet propagation and gain energy from the locally created electric field. The electro-hydrodynamic force is eventually evaluated.
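The order of magnitude of the force in question follows from the residual ion charge density and the local field, since the electro-hydrodynamic body force density is f = rho_q * E = n_i * e * E. A back-of-the-envelope sketch (the ion density and field strength are assumed orders of magnitude for illustration, not measurements from this work):

```python
# Electro-hydrodynamic body force density estimate: f = n_i * e * E
e = 1.602e-19      # elementary charge, C
n_ion = 1e17       # residual ion density, m^-3 (assumed)
E = 1e5            # local electric field, V/m (assumed)

f_ehd = n_ion * e * E   # body force density, N/m^3
```

In the numerical model this force density enters the momentum conservation equation as a source term, which is how momentum transfer from atomic ions to neutrals modifies the helium flow field seen in the schlieren images.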

Keywords: atmospheric-pressure plasmas, dielectric-barrier discharges, schlieren photography, electro-hydrodynamic force

Procedia PDF Downloads 139
7427 Automatic Detection of Sugarcane Diseases: A Computer Vision-Based Approach

Authors: Himanshu Sharma, Karthik Kumar, Harish Kumar

Abstract:

The major problem in crop cultivation is the occurrence of multiple crop diseases. During the growth stage, timely identification of crop diseases is paramount to ensuring high yield, lowering production costs, and minimizing pesticide usage. In most cases, crop diseases produce observable characteristics and symptoms, and surveyors usually diagnose them as they walk through the fields. However, surveyor inspections tend to be biased and error-prone due to the monotonous nature of the task and the subjectivity of individuals. In addition, visual inspection of each leaf or plant is costly, time-consuming, and labour-intensive. Furthermore, plant pathologists and experts, who can often identify a disease from its symptoms at an early stage, are not readily available in remote regions. This study therefore specifically addresses early detection of the leaf scald, red rot, and eyespot diseases of sugarcane. It proposes a computer vision-based approach using a convolutional neural network (CNN) for automatic identification of crop diseases. To facilitate this, images of sugarcane diseases were first taken from Google, without modifying the scene or background or controlling the illumination, to build the training dataset. The testing dataset was then developed from images collected in real time from sugarcane fields in India. The image dataset was pre-processed for feature extraction and selection. Finally, the CNN-based Visual Geometry Group (VGG) model was deployed on the training and testing datasets to classify the images into diseased and healthy sugarcane plants, and the model's performance was measured with several parameters: accuracy, sensitivity, specificity, and F1-score. The promising results of the proposed model lay the groundwork for automatic early detection of sugarcane disease. The proposed research directly sustains an increase in crop yield.
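The evaluation metrics named at the end follow directly from the confusion matrix. A short sketch with illustrative counts (not results from the paper), taking diseased plants as the positive class:

```python
# Hypothetical confusion-matrix counts for a diseased/healthy classifier:
# tp = diseased correctly flagged, fn = diseased missed,
# fp = healthy flagged as diseased, tn = healthy correctly passed.
tp, fn, fp, tn = 45, 5, 8, 42

accuracy    = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)             # recall on diseased plants
specificity = tn / (tn + fp)             # recall on healthy plants
precision   = tp / (tp + fp)
f1 = 2 * precision * sensitivity / (precision + sensitivity)
```

Reporting sensitivity and specificity separately matters here: a disease detector with high accuracy but low sensitivity would miss infected plants, which is the costly failure mode for early intervention.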

Keywords: automatic classification, computer vision, convolutional neural network, image processing, sugarcane disease, visual geometry group

Procedia PDF Downloads 116
7426 Real-Time Finger Tracking: Evaluating YOLOv8 and MediaPipe for Enhanced HCI

Authors: Zahra Alipour, Amirreza Moheb Afzali

Abstract:

In the field of human-computer interaction (HCI), hand gestures play a crucial role in facilitating communication by expressing emotions and intentions. The precise tracking of the index finger and the estimation of joint positions are essential for developing effective gesture recognition systems. However, various challenges, such as anatomical variations, occlusions, and environmental influences, hinder optimal functionality. This study investigates the performance of the YOLOv8m model for hand detection using the EgoHands dataset, which comprises diverse hand gesture images captured in various environments. Over three training processes, the model demonstrated significant improvements in precision (from 88.8% to 96.1%) and recall (from 83.5% to 93.5%), achieving a mean average precision (mAP) of 97.3% at an IoU threshold of 0.7. We also compared YOLOv8m with MediaPipe and an integrated YOLOv8 + MediaPipe approach. The combined method outperformed the individual models, achieving an accuracy of 99% and a recall of 99%. These findings underscore the benefits of model integration in enhancing gesture recognition accuracy and localization for real-time applications. The results suggest promising avenues for future research in HCI, particularly in augmented reality and assistive technologies, where improved gesture recognition can significantly enhance user experience.

Keywords: YOLOv8, mediapipe, finger tracking, joint estimation, human-computer interaction (HCI)

Procedia PDF Downloads 5
7425 Investigation of Enterotoxigenic Staphylococcus aureus in Kitchen of Catering

Authors: Çiğdem Sezer, Aksem Aksoy, Leyla Vatansever

Abstract:

This study was conducted to evaluate the public health risk posed by, and to identify, enterotoxigenic Staphylococcus aureus in the kitchen of a catering operation. Samples were taken with swabs from equipment surfaces in the salad, meat, and bakery sections and investigated for S. aureus with classical cultural methods. A 10 x 10 cm area was defined on each surface (salad cutting and chopping surfaces, knives, meat grinder, meat chopping surface), and samples were taken from this area with sterile swabs moistened with physiological saline (FTS). In total, 50 samples were obtained. Under aseptic conditions, the surface of Baird-Parker agar (with egg yolk tellurite) was seeded with the swabs. After 24-48 hours of incubation at 37°C, black colonies 1-1.5 mm in diameter surrounded by a zone indicating lecithinase activity were identified as S. aureus after Gram staining, catalase, coagulase, glucose and mannitol fermentation, and thermonuclease tests. Genotypic characterization (Staphylococcus genus- and S. aureus species-specific) of the isolates was performed by PCR. An ELISA test was applied to the isolates for the identification of staphylococcal enterotoxins (SET) A, B, C, D, and E in the bacterial cultures. Measurements were taken at 450 nm in an ELISA reader using a Ridascreen Total Set ELISA test kit (r-biopharm R4105, Enterotoxin A, B, C, D, E), and the results were calculated according to the manufacturer's instructions. From the 50 samples, 97 presumptive S. aureus isolates were obtained, of which 60 were confirmed by PCR analysis. According to the ELISA test, only 1 of the 60 isolates was enterotoxigenic; this strain was identified on a salad chopping and cutting surface. The identification of S. aureus in the kitchen of a catering operation indicates a significant source of contamination, which is especially important during the preparation of salads consumed raw.
Such foods can be a potential source of food-borne poisoning and pose a significant risk to consumers.

Keywords: Staphylococcus aureus, enterotoxin, catering, kitchen, health

Procedia PDF Downloads 402
7424 Computer-Aided Classification of Liver Lesions Using Contrasting Features Difference

Authors: Hussein Alahmer, Amr Ahmed

Abstract:

Liver cancer is one of the common diseases that cause death. Early detection is important for diagnosis and for reducing mortality. Improvements in medical imaging and image processing techniques have significantly enhanced the interpretation of medical images, and computer-aided diagnosis (CAD) systems based on these techniques play a vital role in the early detection of liver disease, helping to reduce the liver cancer death rate. This paper presents an automated CAD system consisting of three stages: first, automatic liver segmentation and lesion detection; second, feature extraction; and finally, classification of liver lesions into benign and malignant using the novel contrasting feature-difference approach. Several intensity and texture features are extracted from both the lesion area and its surrounding normal liver tissue, and the difference between the features of the two areas is used as the new lesion descriptor. Machine learning classifiers are then trained on the new descriptors to automatically classify liver lesions as benign or malignant. The experimental results show promising improvements. Moreover, the proposed approach can overcome the problems of varying intensity and texture ranges between patients, demographics, and imaging devices and settings.
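The key idea, subtracting the features of the surrounding tissue from those of the lesion, can be sketched in a few lines; the statistics and pixel values below are illustrative, not the paper's actual feature set:

```python
import numpy as np

def region_features(pixels):
    """Simple intensity statistics for a region (a stand-in for the
    paper's richer intensity/texture feature vector)."""
    p = np.asarray(pixels, dtype=float)
    return np.array([p.mean(), p.std(), np.median(p)])

def contrast_descriptor(lesion, surround):
    """Descriptor = features(lesion) - features(surrounding liver)."""
    return region_features(lesion) - region_features(surround)

# The same lesion-vs-liver contrast under two scanner calibrations
# that differ by a constant intensity offset of +100:
lesion_a, liver_a = [60, 62, 58], [100, 102, 98]
lesion_b, liver_b = [160, 162, 158], [200, 202, 198]
d_a = contrast_descriptor(lesion_a, liver_a)
d_b = contrast_descriptor(lesion_b, liver_b)
```

Because an additive intensity shift (e.g. a different scanner calibration) moves both regions equally, the difference descriptor is unchanged, which illustrates the invariance across patients and imaging settings that the abstract claims.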

Keywords: CAD system, difference of feature, fuzzy c means, lesion detection, liver segmentation

Procedia PDF Downloads 325
7423 Fixed Points of Contractive-Like Operators by a Faster Iterative Process

Authors: Safeer Hussain Khan

Abstract:

In this paper, we prove a strong convergence result using a recently introduced iterative process with contractive-like operators. This improves and generalizes corresponding results in the literature in two ways: the iterative process is faster, and the operators are more general. In the end, we indicate that the results can also be proved with the iterative process with error terms.
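The abstract does not spell out the process, but Khan's "faster" scheme is commonly given as a Picard-Mann-type hybrid iteration; the sketch below assumes that form and applies it to a simple real-valued contraction:

```python
def picard_mann(T, x0, alpha=0.5, tol=1e-12, max_iter=1000):
    """Picard-Mann-type hybrid iteration (assumed form of the scheme):
        y_n     = (1 - alpha) * x_n + alpha * T(x_n)
        x_{n+1} = T(y_n)
    Stops when successive iterates are within tol."""
    x = x0
    for _ in range(max_iter):
        y = (1 - alpha) * x + alpha * T(x)
        x_next = T(y)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Contraction T(x) = x/2 + 1 with unique fixed point x* = 2
fp = picard_mann(lambda x: x / 2 + 1, 0.0)
```

For this contraction one update contracts the error by a factor of 0.375 (versus 0.5 for plain Picard iteration), illustrating in miniature what "faster" means for such hybrid schemes.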

Keywords: contractive-like operator, iterative process, fixed point, strong convergence

Procedia PDF Downloads 433
7422 Chemical Life Cycle Alternative Assessment as a Green Chemical Substitution Framework: A Feasibility Study

Authors: Sami Ayad, Mengshan Lee

Abstract:

The Sustainable Development Goals (SDGs) were designed as a blueprint to achieve peace, prosperity, and, overall, a better and more sustainable future for the Earth and all its people, and such a blueprint is needed more than ever. The SDGs face many hurdles that may prevent them from becoming a reality; one such hurdle, arguably, is the chemical pollution and the unintended chemical impacts generated through the production of the goods and resources we consume. Chemical alternatives assessment has proven to be a viable solution for chemical pollution management in terms of filtering out hazardous chemicals in favor of greener alternatives. However, current substitution practice lacks the crucial quantitative datasets (exposures and life cycle impacts) needed to ensure that no unintended trade-offs occur in the substitution process. A Chemical Life Cycle Alternative Assessment (CLiCAA) framework is proposed as a reliable and replicable alternative to Life Cycle Based Alternative Assessment (LCAA), as it integrates chemical molecular structure analysis and the Chemical Life Cycle Collaborative (CLiCC) web-based tool to fill the data gaps that the former frameworks suffer from. The CLiCAA framework consists of four filtering layers, the first two mandatory and the final two optional assessment and data extrapolation steps. Each layer covers relevant impact categories for each chemical, ranging from human to environmental impacts, which are assessed and aggregated into unique scores for overall comparison with little to no data. A feasibility study will demonstrate the efficiency and accuracy of CLiCAA while bridging both cancer potency and exposure limit data, hoping to provide the necessary categorical impact information to every firm possible, especially those disadvantaged in terms of research and resource management.
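The aggregation of per-category impacts into unique, comparable scores can be sketched as a weighted sum; the categories, weights, and scores below are hypothetical illustrations, not CLiCC outputs or the framework's actual weighting scheme:

```python
# Hypothetical aggregation of normalized per-category impact scores
# (0 = negligible impact, 1 = severe) into one comparable score.
weights = {"human_toxicity": 0.4, "ecotoxicity": 0.3, "gwp": 0.3}

def aggregate(scores):
    """Weighted sum over the impact categories."""
    return sum(weights[c] * scores[c] for c in weights)

candidate = {"human_toxicity": 0.2, "ecotoxicity": 0.5, "gwp": 0.3}
incumbent = {"human_toxicity": 0.9, "ecotoxicity": 0.4, "gwp": 0.2}
better = aggregate(candidate) < aggregate(incumbent)  # lower impact wins
```

Making the weights explicit is what keeps a layered filtering scheme replicable: two assessors with the same category scores and the same weights must reach the same substitution ranking.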

Keywords: chemical alternative assessment, LCA, LCAA, CLiCC, CLiCAA, chemical substitution framework, cancer potency data, chemical molecular structure analysis

Procedia PDF Downloads 92
7421 An Application of E-Learning Technology for Students with Deafness and Hearing Impairment

Authors: Eyup Bayram Guzel

Abstract:

There is growing awareness that technology offers unique and promising advantages: providing up-to-date educational materials, promoting teaching and learning, and enabling enhanced communication environments for people with disabilities; this study concentrates on students with deafness and hearing impairments. Creating an e-learning environment in which teachers and students work in collaboration to achieve better educational outcomes is the foremost reason for conducting this research. This study examined special education teachers' perspectives on an application of e-learning software called Multimedia Builder with students with deafness and hearing impairments. Initial and follow-up interviews were conducted with 15 special education teachers within the scope of a qualitative case study, and a grounded theory approach was used to analyse and interpret the data. The results revealed that the Multimedia Builder software supported improvements in reading, sign language, and vocabulary, developments in computer and ICT usage, and audio-visual learning achievements for students with deafness and hearing impairments. The implications of the study encourage the use of e-learning tools and strategies to promote unique and comprehensive learning experiences for the targeted students and their teachers.

Keywords: e-learning, special education, deafness and hearing impairment, computer-ICT usage

Procedia PDF Downloads 438
7420 Cryoinjuries in Sperm Cells: Effect of Adaptation of Steps in Cryopreservation Protocol for Boar Semen upon Post-Thaw Sperm Quality

Authors: Aftab Ali

Abstract:

Cryopreservation of semen is one of the key factors for a successful breeding business, along with other factors. To achieve high fertility in boar, one should know how spermatozoa respond to the different treatments applied during cryopreservation. The project is focused on cryopreservation and its effects on sperm quality parameters in both boar and bull semen. Semen samples from A, B, C, and D were subjected to different thawing conditions and analyzed after each treatment in the study. Parameters such as sperm cell motility, viability, acrosome integrity, DNA integrity, and phospholipase C zeta were measured by established methods: motility was assessed using computer-assisted sperm analysis, phospholipase C zeta using luminometry, while viability, acrosome integrity, and DNA integrity were analyzed using flow cytometry. Thawing conditions were noted to have an effect on sperm quality parameters, with motility being the most critical parameter. The results further indicated that the most critical steps during cryopreservation of boar semen are when sperm cells are subjected to freezing and thawing. The findings of the present study provide insight that boar semen cryopreservation is still suboptimal in comparison to bull semen cryopreservation. Thus, there is a need for more research to improve the fertilizing potential of cryopreserved boar semen.

Keywords: cryopreservation, computer-assisted sperm analysis, flow cytometry, luminometry

Procedia PDF Downloads 148
7419 Analyzing the Sociolinguistic Profile of the Algerian Community in the UK in terms of French Language Use: The Case of Émigré Ph.D. Students

Authors: Hadjer Chellia

Abstract:

The present study reports on second language use among Algerian international students in the UK. In Algeria, French has an important status in Algerian verbal repertoires for colonial reasons; this has triggered many language conflicts and debates among policy makers in Algeria. In higher education, the sociolinguistic profile of Algerian students of English is characterised by the use of French as a sign of prestige. What may leave room for debate is the effect of crossing borders to the UK through international mobility programmes, a transition which could add complexity, since French is not so significant a language in the UK context. In this respect, the micro-objective is to explore the fate of French use among Ph.D. students in the UK, a newly established group, vis-à-vis English. To fulfil the purpose of the present inquiry, the research employs multiple approaches, in which semi-structured interviews are the primary source of data on participants' attitudes about French use, targeting both their pre-migratory and current experience. Web-based questionnaires were set up to access a larger population, and focus group sessions are a further means of scrutiny in this piece of work to explore actual linguistic behaviours. Preliminary findings from both interviews and questionnaires reveal that students' current experience, particularly living in the UK, affects their pre-migratory attitudes towards the French language and its use. The overall findings are expected to bring manifold contributions to the field, among which is identifying factors that influence language use among newly established émigré communities. The research is also relevant to international students' experience of study abroad in terms of language use, in the guise of the internationalization of higher education and mobility and exchange programmes. It could contribute to the sociolinguistics of the Algerian diaspora, the dispersed residence of non-native communities, not to mention its significance for the Algerian research field abroad.

Keywords: Algerian diaspora, French language, language maintenance, language shift, mobility

Procedia PDF Downloads 343
7418 The Integrated Urban Regeneration Implemented through the Reuse, Enhancement and Transformation of Disused Industrial Areas

Authors: Sara Piccirillo

Abstract:

Integrated urban regeneration represents a great opportunity to deliver correct management of the territory if implemented through the reuse, enhancement, and transformation of abandoned industrial areas, according to sustainability strategies. In environmental terms, recycling abandoned sites by demolishing buildings and regenerating the urban areas means promoting adaptation to climate change and a new sensitivity towards city living. The strategic vision of 'metabolism' can be implemented through diverse actions on urban settlements, and planning certainly plays a primary role. Planning an urban transformation in a sustainable way is more than desirable: it is necessary to introduce innovative urban soil management actions to mitigate the environmental costs associated with current land use and to promote projects for the recovery and renaturalization of urban or non-agricultural soils. However, freeing up these areas through systematic demolition of the disused heritage opens new questions about the environmental costs of the inevitable impacts caused by waste disposal. Mitigating these impacts requires serious reflection on the recycling supply chains aimed at the production and reuse of secondary raw materials in the construction industry. Recent developments in the R&D of recycled materials are gradually becoming more pivotal in light of environmental issues such as the increasing difficulty of exploiting natural quarries and strict regulations for the management and disposal of waste sites. Therefore, this contribution, set as a critical essay, reconstructs the regulatory background of the material recycling chain up to the 'end of waste' stage, at both the national and regional scale. This extended approach to urban design practice goes beyond the cultural dimension that has relegated urban regeneration to pure design only. It redefines its processes through an interdisciplinary system that affects human, environmental and financial resources.

Keywords: waste management, C&D waste, recycling, urban transformation

Procedia PDF Downloads 213
7417 A Study to Identify Resistant Hypertension and Role of Spironolactone in its Management

Authors: A. Kumar, D. Himanshu, Ak Vaish, K. Usman , A. Singh, R. Misra, V. Atam, S. P. Verma, S. Singhal

Abstract:

Introduction: Resistant and uncontrolled hypertension pose a great challenge in terms of higher risk of morbidity and mortality and, not least, difficulty in diagnosis and management. Our study examines the importance of two crucial aspects of hypertension management, drug compliance and optimum dosing, as well as the effect of spironolactone on blood pressure in cases of resistant hypertension. Methodology: A prospective study was carried out among patients referred as cases of resistant hypertension to the hypertension clinic at Gandhi Memorial and associated hospitals, Lucknow, India, from August 2013 to July 2014. A total of 122 subjects with uncontrolled BP on ≥3 antihypertensives were selected. After ruling out secondary resistance, and with appropriate lifestyle modifications, the effects of adherence and optimum dosing were assessed by monitoring BP. Only those whose blood pressure remained uncontrolled were truly resistant; these patients were given spironolactone to assess its effect on BP over the next 12 weeks. Results: Mean baseline BP of all 122 patients was 150.4±7.2 mmHg systolic and 92.1±5.7 mmHg diastolic. After promoting adherence to the regimen, there was a reduction of 4.20±3.65 mmHg in systolic and 2.08±4.74 mmHg in diastolic blood pressure, with 26 patients achieving the target blood pressure goal. A further reduction of 6.66±5.99 mmHg in systolic and 2.59±3.67 mmHg in diastolic BP was observed after optimizing the drug doses, with another 66 patients achieving the target blood pressure goal. Only 30 patients were truly resistant hypertensives and were prescribed spironolactone. Over 12 weeks, a mean reduction of 20.62±3.65 mmHg in systolic and 10.08±6.46 mmHg in diastolic BP was observed. Of these 30, BP was controlled in 24 patients. The side effects observed were hyperkalemia in 2 patients and breast tenderness in 2 patients. Conclusion: Improper adherence and suboptimal regimens appear to be important reasons for uncontrolled hypertension. By maintaining proper adherence to an optimum regimen, the target BP goal can be reached in many patients without adding much to the regimen. Spironolactone is effective in patients with resistant hypertension in terms of blood pressure reduction, with minimal side effects.

Keywords: resistant, hypertension, spironolactone, blood pressure

Procedia PDF Downloads 278
7416 Control of Oil Content of Fried Zucchini Slices by Partial Predrying and Process Optimization

Authors: E. Karacabey, Ş. G. Özçelik, M. S. Turan, C. Baltacıoğlu, E. Küçüköner

Abstract:

The main concern about deep-fat-fried food is the high final oil content absorbed during frying and/or during the cooling period, since a diet high in oil is considered unhealthy by consumers. Different methods have been evaluated to decrease the oil content of fried foodstuffs. One promising method is partial drying of the food material before frying. The present study aimed to control and decrease the final oil content of zucchini slices by means of partial drying and to optimize the process conditions. Conventional oven drying was used to decrease the moisture content of zucchini slices to a certain extent. Process performance in terms of oil uptake was evaluated by comparing the oil content of predried and then fried zucchini slices with that of directly fried ones. For the predrying and frying processes, the controlled variables were oven temperature and weight loss, and frying oil temperature and time, respectively. Zucchini slices were also directly fried for sensory evaluations revealing the preferred properties of the final product in terms of surface color, moisture content, texture and taste. The properties of the directly fried zucchini slices scoring highest in the sensory evaluation were determined and used as targets in the optimization procedure. Response surface methodology was used for process optimization: the properties determined in the sensory evaluation were selected as targets, while oil content was minimized. Results indicated that the final oil content of zucchini slices could be reduced from 58% to 46% by controlling the conditions of the predrying and frying processes. As a result, it is suggested that predrying is one option to reduce the oil content of fried zucchini slices for a healthier diet. This project (113R015) has been supported by TUBITAK.
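The response-surface step can be sketched generically: fit a second-order polynomial to oil-content observations over two process variables, then search the fitted surface for the minimum within the experimental region. The variables, model coefficients, and data below are illustrative assumptions, not the study's measurements.

```python
import numpy as np

# Hedged sketch of response-surface fitting for a frying optimization:
# a second-order polynomial in predrying weight loss (w, %) and frying
# time (t, s) is fit to oil-content observations. Data are synthetic.
rng = np.random.default_rng(0)
w = rng.uniform(0, 30, 40)        # predrying weight loss, %
t = rng.uniform(60, 300, 40)      # frying time, s
oil = 58 - 0.4 * w + 0.01 * t + 0.005 * w**2 + rng.normal(0, 0.3, 40)

# design matrix for the full quadratic model
X = np.column_stack([np.ones_like(w), w, t, w**2, t**2, w * t])
coef, *_ = np.linalg.lstsq(X, oil, rcond=None)

# locate the predicted minimum on a grid (the stationary point may lie
# outside the feasible region, so a bounded grid search is safer)
W, T = np.meshgrid(np.linspace(0, 30, 61), np.linspace(60, 300, 61))
pred = (coef[0] + coef[1]*W + coef[2]*T + coef[3]*W**2
        + coef[4]*T**2 + coef[5]*W*T)
i = np.unravel_index(np.argmin(pred), pred.shape)
best_w, best_t = W[i], T[i]
```

For these synthetic data, the fitted surface places the oil-content minimum at high predrying weight loss and short frying time, matching the direction of the study's conclusion that predrying reduces oil uptake.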

Keywords: health process, optimization, response surface methodology, oil uptake, conventional oven

Procedia PDF Downloads 366
7415 Pedagogical Variation with Computers in Mathematics Classrooms: A Cultural Historical Activity Theory Analysis

Authors: Joanne Hardman

Abstract:

South Africa’s crisis in mathematics attainment is well documented. To meet the need to develop students’ mathematical performance in schools, the government has launched various initiatives using computers to improve mathematical attainment. While it is clear that computers can change pedagogical practices, there is a dearth of qualitative studies indicating exactly how pedagogy is transformed with Information Communication Technologies (ICTs) in a teaching activity. Consequently, this paper addresses the following question: how, and along which dimensions of an activity, does pedagogy alter with the use of computer drill-and-practice software in four disadvantaged grade 6 mathematics classrooms in the Western Cape province of South Africa? The paper draws on Cultural Historical Activity Theory (CHAT) to develop a view of pedagogy as socially situated. Four ideal pedagogical types are identified: Reinforcement pedagogy, which has the reinforcement of specialised knowledge as its object; Collaborative pedagogy, which has the development of metacognitive engagement with specialised knowledge as its object; Directive pedagogy, which has the development of technical task skills as its object; and finally, Defensive pedagogy, which has student regulation as its object. Face-to-face lessons were characterised predominantly as Reinforcement and Collaborative pedagogy, and most computer lessons were characterised mainly as either Defensive or Directive.

Keywords: computers, cultural historical activity theory, mathematics, pedagogy

Procedia PDF Downloads 281
7414 Verification of a Simple Model for Rolling Isolation System Response

Authors: Aarthi Sridhar, Henri Gavin, Karah Kelly

Abstract:

Rolling Isolation Systems (RISs) are simple and effective means of mitigating earthquake hazards to equipment in critical and precious facilities, such as hospitals, network colocation facilities, supercomputer centers, and museums. The RIS works by isolating components from ground acceleration, reducing the inertial forces felt by the subsystem. The RIS consists of two platforms with counter-facing concave surfaces (dishes) in each corner. Steel balls lie inside the dishes and allow relative motion between the top and bottom platforms. Previously, a mathematical model for the dynamics of RISs was developed using Lagrange's equations (LE) and experimentally validated. A new mathematical model was developed using Gauss's Principle of Least Constraint (GPLC) and verified by comparing impulse response trajectories of the GPLC and LE models in terms of the peak displacements and accelerations of the top platform. Mathematical models of the RIS are tedious to derive because of the non-holonomic rolling constraints imposed on the system. However, using Gauss's Principle of Least Constraint to find the equations of motion removes some of the obscurity and yields a system that can be easily extended. Though the GPLC model requires more state variables, its equations of motion are far simpler. The non-holonomic constraint is enforced in terms of accelerations and therefore requires additional constraint stabilization methods to avoid the possibility that numerical integration causes the system to become unstable. The GPLC model allows the incorporation of more physical aspects of the RIS, such as the contribution of the vertical velocity of the platform to the kinetic energy and the mass of the balls. This mathematical model of the RIS is a tool to predict the motion of the isolation platform, and the ability to statistically quantify the expected responses of the RIS is critical in implementing earthquake hazard mitigation.
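In general notation (not necessarily the symbols used by the authors), Gauss's Principle of Least Constraint states that the constrained accelerations minimize a mass-weighted distance to the unconstrained accelerations, with the non-holonomic rolling constraint written at acceleration level:

```latex
\min_{\ddot{q}}\; G(\ddot{q}) = \tfrac{1}{2}\bigl(\ddot{q}-a\bigr)^{\mathsf T} M \bigl(\ddot{q}-a\bigr)
\quad \text{subject to} \quad A(q,\dot{q})\,\ddot{q} = b(q,\dot{q}),
\qquad a = M^{-1}Q,
```

whose closed-form solution (the Udwadia-Kalaba equation) is

```latex
\ddot{q} = a + M^{-1/2}\bigl(A M^{-1/2}\bigr)^{+}\bigl(b - A a\bigr),
```

where \((\cdot)^{+}\) denotes the Moore-Penrose pseudoinverse. Because the constraint \(A\ddot{q}=b\) is imposed only on accelerations, drift in position and velocity must be suppressed separately, which is the constraint stabilization the abstract refers to.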

Keywords: earthquake hazard mitigation, earthquake isolation, Gauss’s Principle of Least Constraint, nonlinear dynamics, rolling isolation system

Procedia PDF Downloads 250
7413 Review and Evaluation of Trending Canonical Correlation Analyses-Based Brain Computer Interface Methods

Authors: Bayar Shahab

Abstract:

The fast development of technology that has advanced neuroscience and human-computer interaction has enabled solutions to problems of this new era like no other time in history. The brain-computer interface (BCI) has opened the door to several new research areas and has provided solutions to critical issues such as helping a paralyzed patient interact with the outside world, controlling a robot arm, playing games in VR with the brain, and driving a wheelchair or even a car; neurotechnology has also enabled the rehabilitation of lost memory, etc. This review presents the state-of-the-art methods and improvements of canonical correlation analysis (CCA), a feature-extraction method for SSVEP-based BCIs. These are the methods used to extract EEG signal features or, put differently, the features of interest sought in EEG analyses. Each of the methods, from oldest to newest, is discussed while comparing their advantages and disadvantages. This provides context and helps researchers understand the most advanced methods available in this field, with their pros and cons, along with their mathematical representations and usage. This work makes a vital contribution to the existing field of study. It differs from other similar recently published works by providing the following: (1) stating most of the prominent methods used in this field in a hierarchical way; (2) explaining the pros and cons of each method and their performance; (3) presenting the gaps that exist at the end of each method, which can open the understanding and doors to new research and/or improvements.
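The core of standard CCA-based SSVEP detection, which the reviewed methods extend, can be sketched as follows: the multichannel EEG segment is compared against sine/cosine reference sets at each candidate stimulus frequency, and the frequency with the largest canonical correlation is chosen. This is a generic illustration with synthetic data, not any specific method from the review.

```python
import numpy as np

def max_canonical_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y,
    computed via QR decompositions and an SVD."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(X)
    Qy, _ = np.linalg.qr(Y)
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def ssvep_classify(eeg, candidate_freqs, fs, n_harmonics=2):
    """Pick the stimulus frequency whose sine/cosine reference set is most
    correlated (in the CCA sense) with the EEG segment."""
    t = np.arange(eeg.shape[0]) / fs
    scores = []
    for f in candidate_freqs:
        refs = np.column_stack(
            [fn(2 * np.pi * f * k * t)
             for k in range(1, n_harmonics + 1)
             for fn in (np.sin, np.cos)])
        scores.append(max_canonical_corr(eeg, refs))
    return candidate_freqs[int(np.argmax(scores))]

# synthetic 3-channel EEG with a 10 Hz SSVEP component buried in noise
fs, dur = 250, 2.0
t = np.arange(int(fs * dur)) / fs
rng = np.random.default_rng(1)
eeg = (np.outer(np.sin(2 * np.pi * 10 * t), [1.0, 0.7, 0.4])
       + rng.normal(0, 1.0, (t.size, 3)))
detected = ssvep_classify(eeg, [8.0, 10.0, 12.0], fs)
```

The reviewed extensions (filter banks, individual templates, spatial weighting, etc.) modify either the reference signals or the correlation step of this basic scheme.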

Keywords: BCI, CCA, SSVEP, EEG

Procedia PDF Downloads 145
7412 Analysis of Extreme Case of Urban Heat Island Effect and Correlation with Global Warming

Authors: Kartikey Gupta

Abstract:

Global warming and environmental degradation are at their peak today, with the years after 2000 A.D. including 15 of the hottest years on record in terms of average temperatures. In India, much of the standard temperature-measuring equipment is located in 'developed' urban areas, giving an incomplete picture of the climate across many rural areas, which comprise most of the landmass. This study showcases data collected by the author over 3 years at Vatsalya's Children's Village, on the outskirts of Jaipur, Rajasthan, India, in the midst of semi-arid topography, where consistently large temperature differences of up to 15.8 degrees Celsius from local Jaipur weather only 30 kilometers away are stunning yet alarming, encouraging analysis of where the natural climatic pattern is heading due to rapid, unrestricted urbanization. The record-breaking data presented in this project enforce the need to discuss causes and recovery techniques. This research further explores how, and to what extent, urban growth is causing phenomenal disturbances in the natural meteorological pattern. Detailed observations using a standardized ambient weather station at the study site, compared with the closest airport weather data, show striking differences in temperatures, wind patterns and even rainfall quantity, especially during high-pressure days. Winter-time lows dip to 8 degrees below freezing with heavy frost and ice, while only 30 km away minimum figures barely touch single-digit temperatures. Human activity is having an unprecedented effect on climatic patterns in record-breaking trends, which is a warning of what may follow in the next 15-25 years for the next generation living in cities; a serious exploration of possible solutions is a must.

Keywords: climate change, meteorology, urban heat island, urbanization

Procedia PDF Downloads 85
7411 Theoretical-Methodological Model to Study Vulnerability of Death in the Past from a Bioarchaeological Approach

Authors: Geraldine G. Granados Vazquez

Abstract:

Every human being is exposed to the risk of dying, and some are more susceptible than others depending on the cause. The cause can be thought of as the hazard of dying that a group or individual faces, making this irreversible damage the condition of vulnerability. Risk is a dynamic concept: it depends on environmental, social, economic and political conditions, so vulnerability can only be evaluated in terms of relative parameters. This research focuses on building a model that evaluates the risk, or propensity, of death in past urban societies in connection with the everyday life of individuals, considering that death can be a consequence of two coexisting issues: hazard and the deterioration of resistance to destruction. One of the most important discussions in bioarchaeology concerns health and life conditions in ancient groups, and researchers are looking for more flexible models to evaluate these topics. Accordingly, this research proposes a theoretical-methodological model that assesses the vulnerability of death in past urban groups. The model aims to evaluate the risk of death considering both the sociohistorical context and the intrinsic biological features of these groups. The model comprises four areas in which to assess vulnerability: the first three use statistical or quantitative analysis, while the fourth, embodiment, is based on qualitative analysis. The four areas and their techniques are: a) Demographic dynamics. From the distribution of age at the time of death, mortality is analysed using life tables, from which four aspects may be inferred: population structure, fertility, mortality-survival, and productivity-migration. b) Frailty. Selective mortality and heterogeneity in frailty can be assessed through the relationship between individual characteristics and age at death. Two indicators used in contemporary populations to evaluate stress are height and linear enamel hypoplasias: height estimates may account for the individual's nutrition and health history in specific groups, while enamel hypoplasias record the individual's first years of life. c) Inequality. Space reflects the various sectors of society, also in ancient cities; in general terms, spatial analysis uses measures of association to show the relationship between frailty variables and space. d) Embodiment. Everyone's story leaves some evidence on the body, even in the bones. That leads us to think about individuals' dynamic relations in terms of time and space; consequently, micro-analysis of persons assesses vulnerability from everyday life, where symbolic meaning also plays a major role. In sum, using some Mesoamerican examples as study cases, this research demonstrates that not only the intrinsic characteristics related to the age and sex of individuals are conducive to vulnerability, but also the social and historical context that determines their state of frailty before death. An attenuating factor for past groups is that some basic aspects, such as the role they played in everyday life, escape our comprehension and are still under discussion.
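The demographic-dynamics layer (a) can be illustrated with a minimal life-table computation from ages at death; the age intervals and death counts below are invented for illustration, not drawn from any Mesoamerican assemblage.

```python
# A minimal abridged life table built from counts of deaths per age
# interval, the kind of summary the model's first layer relies on.
# Age groups and death counts are illustrative.
age_groups = [(0, 5), (5, 15), (15, 30), (30, 45), (45, 60)]
deaths = [12, 5, 9, 14, 10]           # Dx: deaths per age interval

total = sum(deaths)
survivors = total                     # lx: number alive entering interval
table = []
for (lo, hi), dx in zip(age_groups, deaths):
    qx = dx / survivors               # probability of dying in interval
    table.append({"age": (lo, hi), "dx": dx, "lx": survivors,
                  "qx": round(qx, 3)})
    survivors -= dx
```

From the lx and qx columns, mortality-survival profiles can be compared across groups; skews in the age distribution then feed the inferences about population structure, fertility and migration.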

Keywords: bioarchaeology, frailty, Mesoamerica, vulnerability

Procedia PDF Downloads 225
7410 Kansei Engineering Applied to the Design of Rural Primary Education Classrooms: Design-Based Learning Case

Authors: Jimena Alarcon, Andrea Llorens, Gabriel Hernandez, Maritza Palma, Lucia Navarrete

Abstract:

The research is funded by the Government of Chile and focuses on defining a design for rural primary classrooms that stimulates creativity. The relevance of the study lies in its capacity to define adequate educational spaces for the implementation of the design-based learning (DBL) methodology. This methodology promotes creativity and teamwork, generating a meaningful learning experience for students based on appreciation of their environment and the generation of projects that contribute positively to their communities; it is an inquiry-based form of learning that integrates design thinking and the design process into the classroom. The main goal of the study is to define the design characteristics of rural primary school classrooms associated with the implementation of the DBL methodology. Along with the change in learning strategies, it is necessary to change the educational spaces in which they develop. The hypothesis is that a change in classroom space and equipment based on the emotions of the students will lead to better learning results under the new methodology. In this case, the pedagogical dynamics require substantial interaction between the participants, as well as an environment favorable to creativity. Methods from Kansei engineering are used to identify the associated emotional variables. The study involved 50 students between 6 and 10 years old (average age of seven years), 48% male and 52% female. Virtual three-dimensional scale models and semantic differential tables were used. To define the semantic differential, self-administered surveys were carried out, each consisting of eight separate questions in two groups: question A to identify desirable emotions and question B related to the emotions experienced, each with a maximum of three alternative answers. Data were tabulated with IBM SPSS Statistics version 19. Terms referring to emotions were grouped into the twenty concepts with the highest presence in the surveys. To select terms from the semantic differential, the expected frequency (N) of a chi-square test (χ²) calculated for the classroom space was taken as the lower limit, and all terms above the expected N cut point were included in the tables relating emotion and space. The chi-square contrast was significant, indicating that the observed frequencies are not random. The most representative terms depend on the variable under study: a) the definition of textures and the color of vertical surfaces are associated with emotions such as tranquility, attention, concentration and creativity; and b) the distribution of classroom equipment is associated with happiness, distraction, creativity and freedom. The main findings are linked to the generation of classrooms according to diverse DBL team dynamics. Kansei engineering is an appropriate methodology for identifying the emotions students want to feel in the classroom space.
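The term-selection step can be sketched as follows, under the assumption that the cut point is the expected frequency of a uniform chi-square null; the emotion terms and counts below are invented for illustration.

```python
# Sketch of the term-selection step: terms whose observed survey
# frequency exceeds the expected frequency under a uniform null
# (a chi-square-style cut point) are kept for the emotion/space
# tables. Terms and counts are invented.
observed = {"tranquility": 34, "creativity": 29, "boredom": 6,
            "freedom": 22, "anger": 4, "happiness": 25}

expected = sum(observed.values()) / len(observed)   # uniform null
chi2 = sum((o - expected) ** 2 / expected for o in observed.values())
selected = [term for term, o in observed.items() if o > expected]
```

A chi-square statistic this far above its critical value indicates the term frequencies are not random, which is the justification the abstract gives for keeping only the over-represented terms.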

Keywords: creativity, design-based learning, education spaces, emotions

Procedia PDF Downloads 142
7409 High Input Driven Factors in Idea Campaigns in Large Organizations: A Case Depicting Best Practices

Authors: Babar Rasheed, Saad Ghafoor

Abstract:

Introduction: Idea campaigns are commonly held across organizations to generate employee engagement and are specifically designed to identify and solve prevalent issues. It is argued that numerous organizations fail to achieve their desired goals despite arranging such campaigns and investing heavily in them. There are, however, practices that organizations use to achieve a higher degree of effectiveness, and these practices may be explored by research to make them usable for other organizations. Purpose: The aim of this research is to surface the idea management practices of a leading electric company with global operations. The study involves a large, multi-site organization, which is taken to face added challenges in managing ideas from employees in comparison to smaller organizations. The study aims to highlight how the idea management team strategizes the campaign, sets its terms and rewards, follows up with employees and, lastly, evaluates and awards ideas. Methodology: The study is conducted in a leading electric appliance corporation that has a large number of employees and operates in numerous regions of the world. A total of 7 interviews were carried out, involving the chief innovation officer, the innovation manager and members of the idea management and evaluation teams. The interviews were carried out either on Skype or in person, based on the availability of the interviewee. Findings: As this is a working paper and the study is under way, it is anticipated that valuable information will be obtained about how idea management systems are governed and how idea campaigns are carried out. The findings may be particularly useful for innovation consultants as resources they can use to promote idea campaigning. The best practices highlighted as a result are, in any case, the most valuable output of this study.

Keywords: employee engagement, motivation, idea campaigns, large organizations, best practices, employees input, organizational output

Procedia PDF Downloads 173
7408 Regularized Euler Equations for Incompressible Two-Phase Flow Simulations

Authors: Teng Li, Kamran Mohseni

Abstract:

This paper presents an inviscid regularization technique for incompressible two-phase flow simulations. The technique is known as the observable method, based on the understanding that any feature smaller than the actual resolution (physical or numerical), e.g., the size of the wire in hotwire anemometry or the grid size in numerical simulations, cannot be captured or observed. Unlike most regularization techniques, which apply to the numerical discretization, the observable method is employed at the PDE level during the derivation of the equations. Difficulties in the simulation and analysis of realistic fluid flow often result from discontinuities (or near-discontinuities) in the calculated fluid properties or state. Accurately capturing these discontinuities is especially crucial when simulating flows involving shocks, turbulence or sharp interfaces. Over the past several years, the properties of this regularization technique have been investigated, showing its capability of simultaneously regularizing shocks and turbulence. The observable method has been applied to direct numerical simulations of shocks and turbulence, where the discontinuities are successfully regularized and flow features are well captured. In the current paper, the observable method is extended to two-phase interfacial flows. Multiphase flows share a feature with shocks and turbulence: the nonlinear irregularity caused by the nonlinear terms in the governing equations, namely the Euler equations. In direct numerical simulation of two-phase flows, interfaces are usually treated as a smooth transition of properties from one fluid phase to the other. However, in high Reynolds number or low viscosity flows, the nonlinear terms generate smaller scales that sharpen the interface, causing discontinuities. Many numerical methods for two-phase flows fail in the high Reynolds number case, while others depend on numerical diffusion from the spatial discretization. The observable method regularizes this nonlinear mechanism by filtering the convective terms, and this process is inviscid. The filtering effect is controlled by an observable scale, which is usually about a grid length. A single rising bubble and the Rayleigh-Taylor instability are studied, in particular, to examine the performance of the observable method. A pseudo-spectral method, which introduces no numerical diffusion, is used for spatial discretization, and a Total Variation Diminishing (TVD) Runge-Kutta method is applied for time integration. The observable incompressible Euler equations are solved for these two problems. In the rising bubble problem, the terminal velocity and shape of the bubble are examined and compared with experiments and other numerical results. For the Rayleigh-Taylor instability, the shape of the interface is studied for different observable scales, and the spike and bubble velocities, as well as positions (under a proper observable scale), are compared with other simulation results. The results indicate that this regularization technique can potentially regularize the sharp interface in two-phase flow simulations.

Keywords: Euler equations, incompressible flow simulation, inviscid regularization technique, two-phase flow

Procedia PDF Downloads 502
7407 Modeling of Leaks Effects on Transient Dispersed Bubbly Flow

Authors: Mohand Kessal, Rachid Boucetta, Mourad Tikobaini, Mohammed Zamoum

Abstract:

The leakage problem for two-component fluid flow is modeled as a transient, one-dimensional, homogeneous bubbly flow, taking into account the effect of a leak located at the midpoint of the pipeline. The corresponding three conservation equations are solved numerically by an improved method of characteristics. The obtained results are discussed in terms of their physical impact on the flow parameters.
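The characteristic-method treatment of the leak node can be sketched as follows. This is a hedged illustration in generic water-hammer notation (C+ and C- compatibility constants CP and CM, characteristic impedance B, orifice coefficient k), not the authors' exact bubbly-flow scheme, and all numerical values are invented.

```python
import math

# Hedged sketch of a method-of-characteristics update at a leak node:
# the C+ and C- compatibility equations carry head/flow information from
# the upstream and downstream grid points, and an orifice law
# Q_leak = k * sqrt(H) closes the system at the leak.

def leak_node(CP, CM, B, k):
    """Solve  C+ : H = CP - B*Q1,   C- : H = CM + B*Q2,
    continuity: Q1 - Q2 = k*sqrt(H).  Returns (H, Q1, Q2)."""
    # Eliminating Q1 and Q2 gives 2H + B*k*sqrt(H) - (CP + CM) = 0,
    # a quadratic in s = sqrt(H); take the positive root.
    s = (-B * k + math.sqrt((B * k) ** 2 + 8 * (CP + CM))) / 4
    H = s * s
    Q1 = (CP - H) / B        # flow arriving from upstream
    Q2 = (H - CM) / B        # flow leaving downstream
    return H, Q1, Q2

# illustrative values: with k = 0 the node reduces to an ordinary
# interior point, H = (CP + CM) / 2
H, Q1, Q2 = leak_node(CP=120.0, CM=80.0, B=50.0, k=0.002)
```

Repeating this solve at the midpoint node each time step, with standard characteristic updates elsewhere, reproduces the head drop and flow mismatch a leak imposes on the transient.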

Keywords: fluid transients, pipelines leaks, method of characteristics, leakage problem

Procedia PDF Downloads 478