Search results for: Yash Virani
17 Assessment of the Indices Affecting the Conversion of Rural to Urban Settlements, Case Study: Torqabe and Shandiz Rural Districts in Iran
Authors: Fahimeh Khatami, Elham Sanagar Darbani, Behnosh Khir Khah, R. Khatami
Abstract:
Rural settlement is one of the residential forms that develops in particular natural areas, and the interaction between its internal and external forces causes developments and changes that differ across time and space. Over time, historical developments and social and economic changes in the political system drive the development and rapid growth of rural settlements into urban ones. However, the criteria for recognizing the transition of rural settlements to cities differ from one country to another. One of the problems in modern planning is inattention to the indicators and criteria for converting these settlements into cities. This research is applied and compilational in type, and library and field methods are used in it. Qualitative and quantitative indicators were compiled while collecting documents and studies from rural districts such as Dehnow, Virani, Abardeh, Zoshk, Nowchah, and Jaqarq in the tourism area of Mashhad. The instrument used in this research is a questionnaire; the quantitative variables were analyzed with the Morris and McGranahan method to evaluate the importance of each factor and the development of the settlements, and the villages capable of converting to cities were identified. According to the scalogram curve obtained from the analysis, it was found that among the villages mentioned, the Virani and Nowchah rural districts have the capacity to convert to cities; the Zoshk rural district will convert to a city in the future, and the Dehnow, Abardeh, and Jaqarq rural districts will not.
Keywords: rural settlements, city, indicators, Torqabe and Shandiz rural districts
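The Morris/McGranahan-style composite used to rank the villages can be sketched as follows. This is a minimal illustration: the indicator names and values below are invented for the sketch, not the study's survey data.

```python
def morris_index(villages):
    """Composite development score in the spirit of the Morris /
    McGranahan method: scale each indicator to 0-100 across all
    settlements, then average the scaled values per settlement."""
    n = len(next(iter(villages.values())))
    columns = [[vals[i] for vals in villages.values()] for i in range(n)]
    lows = [min(c) for c in columns]
    highs = [max(c) for c in columns]
    scores = {}
    for name, vals in villages.items():
        scaled = [100.0 * (x - lo) / (hi - lo) if hi > lo else 0.0
                  for x, lo, hi in zip(vals, lows, highs)]
        scores[name] = sum(scaled) / n
    return scores

# hypothetical indicators per village: population, service count, literacy
villages = {
    "Virani":  [5200, 18, 0.82],
    "Nowchah": [4800, 16, 0.79],
    "Zoshk":   [3100, 10, 0.71],
    "Dehnow":  [1500,  4, 0.60],
}
scores = morris_index(villages)
```

Settlements whose composite score clears a chosen threshold would then be flagged as candidates for conversion to city status.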
Procedia PDF Downloads 267
16 Scalable UI Test Automation for Large-scale Web Applications
Authors: Kuniaki Kudo, Raviraj Solanki, Kaushal Patel, Yash Virani
Abstract:
This research concerns optimizing UI test automation for large-scale web applications. The test target is the HHAexchange homecare management web application, which connects providers, state Medicaid programs, managed care organizations (MCOs), and caregivers through one platform with large-scale functionality. This study focuses on user interface automation testing for the web application. The quality assurance team must execute many manual user interface test cases during development to confirm there are no regression bugs. The team automated 346 test cases, and the UI automation test execution time exceeded 17 hours. The business requirement was to reduce the execution time in order to release high-quality products quickly, so the quality assurance automation team modernized the test automation framework to optimize the execution time. The base of the web UI automation test environment is Selenium, and the test code is written in Python. Adopting a compiled language for test code leads to an inefficient workflow when introducing scalability into a traditional test automation environment, so a scripting language was adopted to introduce scalability efficiently. The scalability is implemented mainly with AWS serverless technology, specifically the Elastic Container Service. Scalability here means the ability to automatically provision computers for test automation and to increase or decrease the number of computers running the tests. This scalable mechanism allows test cases to run in parallel, so test execution time decreases dramatically. Introducing scalable test automation does more than reduce execution time: it can also expose challenging bugs such as race conditions, since test cases can be executed at the same time.
If API and unit tests are implemented, test strategies can be adopted more efficiently alongside this scalability testing. However, in web applications, API and unit testing cannot in practice cover 100% of functional testing, since they do not reach front-end code. This study applied a scalable UI automation testing strategy to the large-scale homecare management system and confirmed both the optimization of test case execution time and the detection of a challenging bug. The study first describes the detailed architecture of the scalable test automation environment, then reports the actual reduction in execution time and an example of detecting a challenging issue.
Keywords: AWS, Elastic Container Service, scalability, serverless, UI automation test
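The fan-out the abstract describes, one runner per slice of the suite, can be sketched locally with a thread pool standing in for the ECS containers. The test-case function below is a placeholder, not the team's actual Selenium code.

```python
from concurrent.futures import ThreadPoolExecutor
import time

def run_test_case(case_id):
    # placeholder for one Selenium UI test; a real case would drive a
    # browser session against the application under test
    time.sleep(0.01)
    return case_id, "passed"

def run_suite(case_ids, workers):
    """Run the suite with `workers` parallel runners, each standing in
    for one ECS container; results come back as {case_id: status}."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(run_test_case, case_ids))

results = run_suite(range(20), workers=5)
```

With N workers the wall-clock time approaches 1/N of the serial time, which is the effect the team observed when moving from a single machine to elastic containers.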
Procedia PDF Downloads 105
15 Concentration of Zinc Micronutrients in Breast Milk Based on Determinants of Mother and Baby in Kassi-Kassi Health Center
Authors: Andi Tenri Ayu Rahman, Citrakesumasari, Devintha Virani
Abstract:
Breast milk is a complex biological fluid, a mix of macronutrients and micronutrients considered the perfect food for babies. Zinc has a role in various biological functions and in physical growth. This research aims to determine the average zinc (Zn) micronutrient content of breast milk by determinants of the infant (birth weight) and the mother (nutritional status and food intake), and to describe mothers' breastfeeding patterns. The research is observational and analytic with a cross-sectional study design. The population was 41 mothers at the Kassi-Kassi health center within one month. The research sample comprised mothers who gave birth at term and breastfed their babies; random sampling was used to select 37 participants. Samples of breast milk were analyzed in the laboratory using Atomic Absorption Spectrophotometry (AAS). The research found that across the samples (n=37), the average zinc content of the breast milk was 0.88±0.54 mg/L, with the highest values in the group of low-birth-weight babies (1.13±0.67 mg/L), mothers with normal nutritional status (0.981±0.514 mg/L), and mothers with low zinc intake (0.94±0.54 mg/L). Regarding breastfeeding pattern, 67.6% of the sample had prior breastfeeding experience and 81.1% breastfed more than eight times a day. In summary, the highest average zinc content of breast milk was found in the group of low-birth-weight babies, mothers with normal nutritional status, and mothers with a relatively low intake pattern.
Keywords: zinc, breastmilk, mother, baby
Procedia PDF Downloads 191
14 Impact of Wheel-Housing on Aerodynamic Drag and Effect on Energy Consumption of a Bus
Authors: Amitabh Das, Yash Jain, Mohammad Rafiq B. Agrewale, K. C. Vora
Abstract:
The wheel and underbody aerodynamics of a vehicle contribute to the formation of drag forces, which is detrimental to fuel (energy) consumption during operation at high velocities. This paper deals with CFD simulation of the flow around the wheels of a bus with different wheel-housing geometries and patterns. Based on benchmarking, a bus model is selected and analyzed. The aerodynamic drag coefficient is obtained and the turbulence around the wheels is observed using ANSYS Fluent CFD simulation for different combinations of wheel-housing: at the front wheels, at the rear wheels, and at both the front and rear wheels. The drag force is recorded, and the corresponding influence on the energy consumption of an electric bus is evaluated mathematically. A comparison is drawn between the energy consumption of the bus body without wheel-housing and with wheel-housing. The results show a significant reduction in drag coefficient and energy consumption.
Keywords: wheel-housing, CFD simulation, drag coefficient, energy consumption
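The mathematical step from drag force to energy consumption can be made concrete. The drag coefficients, frontal area, and speed below are illustrative placeholders, not the paper's simulated values.

```python
def drag_energy_kwh(cd, frontal_area_m2, speed_kmh, distance_km, rho=1.225):
    """Energy spent overcoming aerodynamic drag over a constant-speed
    trip: F_d = 0.5 * rho * Cd * A * v^2, then E = F_d * distance."""
    v = speed_kmh / 3.6                                  # m/s
    f_drag = 0.5 * rho * cd * frontal_area_m2 * v * v    # N
    joules = f_drag * distance_km * 1000.0               # N*m
    return joules / 3.6e6                                # kWh

# hypothetical bus, with and without a wheel-housing treatment
baseline = drag_energy_kwh(cd=0.70, frontal_area_m2=8.0,
                           speed_kmh=100, distance_km=100)
housed = drag_energy_kwh(cd=0.60, frontal_area_m2=8.0,
                         speed_kmh=100, distance_km=100)
```

Because drag energy is linear in Cd at fixed speed and area, a given percentage reduction in drag coefficient translates into the same percentage reduction in drag energy over the trip.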
Procedia PDF Downloads 183
13 Crumb Rubber Modified Asphalt
Authors: Maanav M. Patel, Aarsh S. Mistry, Yash A. Dhaduk
Abstract:
Nowadays, only a small percentage of waste tyres are landfilled. Recycled tyre rubber is being used in new tyres, in tyre-derived fuel, in civil engineering applications and products, in moulded rubber products, in agricultural uses, in recreational and sports applications, and in rubber-modified asphalt applications. The benefits of using rubber-modified asphalts are being more widely experienced and recognized, and the incorporation of tyres into asphalt is likely to increase. The technology with the most evidence of success, demonstrated by roads built in the last 40 years, is the rubberised asphalt mixture obtained through the so-called 'wet process', which involves the use of Recycled Tyre Rubber Modified Bitumen (RTR-MB). Since the 1960s, asphalt mixtures produced with RTR-MBs have been used in different parts of the world as solutions to different quality problems and, despite some downsides, in the majority of cases they have been demonstrated to enhance the performance of road pavement. The present study investigates the experimental performance of bitumen modified with 15% crumb rubber by weight, varying the rubber size. Four size categories of crumb rubber will be used: coarse (1 mm - 600 μm), medium (600 μm - 300 μm), fine (300 μm - 150 μm), and superfine (150 μm - 75 μm). Common laboratory tests will be performed on the bitumen modified with the various sizes of crumb rubber, and the results analyzed. The Marshall stability method is adopted for the mix design.
Keywords: bitumen, CRMB, Marshall stability test, pavement
Procedia PDF Downloads 142
12 To Design a Full Stack Online Educational Website Using HTML, CSS and JavaScript
Authors: Yash Goyal, Manish Korde, Juned Siddiqui
Abstract:
Today, online education has gained popularity because people can complete their coursework on their own time. Virtual learning has been widely used by many educators, especially in higher education institutions, due to its benefits to students and faculty. A good knowledge of teaching theory and instructional design systems is required to deliver meaningful learning. However, most educational websites are not designed to adapt to all screen sizes. Making the website accessible on all screen sizes is our main objective, so we have created a website that is readily accessible across all screen sizes and accepts all common payment methods. Educational website interfaces are generally simple and unexciting, so we have made the user interface attractive and user-friendly. It is not enough for a website to be user-friendly; it should also be familiar to admins and reduce the admin's workload. We reviewed many popular websites and found that they had issues with responsiveness, overly simple interfaces, security measures, payment methods, etc. To overcome these limitations, we have created a website that takes care of security issues: there is only one admin ID, and the site can be controlled only from it. Once a user has successfully completed payment, the admin sends a username and password by email to that user individually, so there is no fraud in course payments.
Keywords: responsive, accessible, attractive, interface, objective, security
Procedia PDF Downloads 101
11 Empowering Certificate Management with Blockchain Technology
Authors: Yash Ambekar, Kapil Vhatkar, Prathamesh Swami, Kartikey Singh, Yashovardhan Kaware
Abstract:
The rise of online courses and certifications has created new opportunities for individuals to enhance their skills. However, this digital transformation has also given rise to counterfeit certificates. To address this multifaceted issue, we present a comprehensive certificate management system founded on blockchain technology and strengthened by smart contracts. Our system comprises three pivotal components: certificate generation, authenticity verification, and a user-centric digital locker for certificate storage. Blockchain technology underpins the entire system, ensuring the immutability and integrity of each certificate. The inclusion of a cryptographic hash for each certificate is a fundamental aspect of our design: any alteration of the certificate's data will yield a distinct hash, a powerful indicator of potential tampering. Furthermore, our system includes a secure, cloud-storage-based digital locker that empowers users to efficiently manage and access all their certificates in one place. The project also provides features for certificate revocation and updating, enhancing the system's flexibility and security. Hence, the blockchain and smart-contract-based certificate management system offers a robust, one-stop solution to the escalating problem of counterfeit certificates in the digital era.
Keywords: blockchain technology, smart contracts, counterfeit certificates, authenticity verification, cryptographic hash, digital locker
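The tamper-evidence property the abstract relies on can be illustrated off-chain with a plain SHA-256 digest. The field names below are hypothetical, and a real deployment would anchor the digest in a smart contract rather than compare against a local value.

```python
import hashlib
import json

def certificate_hash(cert: dict) -> str:
    """Canonical JSON serialization so identical data always yields
    the same bytes, then SHA-256 over those bytes."""
    payload = json.dumps(cert, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def verify(cert: dict, anchored_hash: str) -> bool:
    # in the real system, `anchored_hash` would be read from the chain
    return certificate_hash(cert) == anchored_hash

cert = {"name": "A. Student", "course": "Distributed Systems",
        "issued": "2024-01-15"}
anchored = certificate_hash(cert)
tampered = dict(cert, name="B. Imposter")
```

Changing any field, even a single character, produces a different digest, which is exactly the signal a verifier checks against the on-chain record.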
Procedia PDF Downloads 45
10 Using Machine Learning to Classify Human Fetal Health and Analyze Feature Importance
Authors: Yash Bingi, Yiqiao Yin
Abstract:
Reduction of child mortality is an ongoing struggle and a commonly used measure of progress in the medical field. The under-5 mortality count is around 5 million worldwide, with many of the deaths being preventable. In light of this issue, cardiotocograms (CTGs) have emerged as a leading tool for determining fetal health. By emitting ultrasound pulses and reading the responses, CTGs help healthcare professionals assess the overall health of the fetus and the risk of child mortality. However, interpreting CTG results is time-consuming and inefficient, especially in underdeveloped areas where an expert obstetrician is hard to come by. Using a support vector machine (SVM) and oversampling, this paper proposes a model that classifies fetal health with an accuracy of 99.59%. To further explain the CTG measurements, an algorithm based on Randomized Input Sampling for Explanation (RISE) of black-box models was created, called Feature Alteration for explanation of Black Box models (FAB), and its findings were compared to Shapley Additive Explanations (SHAP) and Local Interpretable Model-Agnostic Explanations (LIME). This allows doctors and medical professionals to classify fetal health with high accuracy and to determine which features were most influential in the process.
Keywords: machine learning, fetal health, gradient boosting, support vector machine, Shapley values, local interpretable model-agnostic explanations
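A minimal sketch of the classification step, assuming scikit-learn is available and using naive random oversampling in place of whatever resampling scheme the paper used. The two-feature toy data stands in for the CTG measurements; the class labels are illustrative.

```python
import random
from collections import Counter
from sklearn.svm import SVC

def oversample(X, y, seed=0):
    """Naive random oversampling: duplicate minority-class rows until
    every class matches the majority-class count."""
    rng = random.Random(seed)
    counts = Counter(y)
    target = max(counts.values())
    Xb, yb = list(X), list(y)
    for cls, n in counts.items():
        idx = [i for i, lab in enumerate(y) if lab == cls]
        for _ in range(target - n):
            j = rng.choice(idx)
            Xb.append(X[j])
            yb.append(cls)
    return Xb, yb

# toy data standing in for CTG features; 0 = "suspect", 1 = "normal"
X = [[0.10, 0.20], [0.20, 0.10], [0.15, 0.25],
     [0.90, 0.80], [0.80, 0.90], [0.85, 0.95], [0.95, 0.85], [0.90, 0.95]]
y = [0, 0, 0, 1, 1, 1, 1, 1]

X_bal, y_bal = oversample(X, y)
clf = SVC(kernel="rbf").fit(X_bal, y_bal)
```

Balancing the classes before fitting keeps the SVM from favoring the majority class, which matters in medical data where the pathological class is usually the rare one.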
Procedia PDF Downloads 143
9 Development of Microsatellite Markers for Genetic Variation Analysis in House Cricket, Acheta domesticus
Authors: Yash M. Gupta, Kittisak Buddhachat, Surin Peyachoknagul, Somjit Homchan
Abstract:
The house cricket, Acheta domesticus, is one of the commonly found species of field crickets. Although it is very commonly used as food and feed, genomic information on the house cricket is still missing for genetic investigation. DNA sequencing technology has evolved over the decades and has revolutionized molecular marker development for genetic analysis. In the present study, we sequenced the whole genome of A. domesticus using Illumina HiSeq X Ten sequencing technology to search for simple sequence repeats (SSRs) in the DNA and develop polymorphic microsatellite markers for population genetic analysis. A total of 112,157 SSRs with primer pairs were identified; 91 randomly selected SSRs were used to check DNA amplification, of which nine primers were polymorphic. These microsatellite markers showed cross-amplification with three other species of crickets: Gryllus bimaculatus, Gryllus testaceus, and Brachytrupes portentosus. The nine polymorphic microsatellite markers were used to assess genetic variation in forty-five individuals of the A. domesticus Phitsanulok population, Thailand. Across the nine loci, the number of alleles ranged from 5 to 15, and the observed heterozygosity ranged from 0.4091 to 0.7556. These microsatellite markers will facilitate population genetic analysis in future studies of A. domesticus populations. Moreover, the transferability of these SSR markers will enable researchers to conduct genetic studies of other closely related species.
Keywords: cross-amplification, microsatellite markers, observed heterozygosity, population genetics, simple sequence repeats
Procedia PDF Downloads 138
8 Using Machine Learning to Build a Real-Time COVID-19 Mask Safety Monitor
Authors: Yash Jain
Abstract:
The US Centers for Disease Control and Prevention has recommended wearing masks to slow the spread of the virus. This research uses a video feed from a camera to perform real-time classification of whether a person is wearing a mask correctly, wearing a mask incorrectly, or not wearing a mask at all. A mask detection network was trained using two distinct datasets from the open-source website Kaggle: the first, titled 'Face Mask Detection', and the second, titled 'Face Mask Dataset', which provided the data in YOLO format for training the TinyYoloV3 model. Based on the Kaggle data, two machine learning models were implemented and trained: a TinyYoloV3 real-time model and a two-stage neural network classifier. The two-stage classifier first identifies distinct faces within the image and then classifies the state of the mask on each face: worn correctly, worn incorrectly, or no mask at all. The TinyYoloV3 model was used for the live feed as well as for comparison against the two-stage classifier, and was trained using the Darknet neural network framework. The two-stage classifier attained a mean average precision (mAP) of 80%, while the TinyYoloV3 real-time detection model attained a mean average precision (mAP) of 59%. Overall, both models were able to correctly classify the no-mask, mask, and incorrectly worn mask scenarios.
Keywords: datasets, classifier, mask detection, real-time, TinyYoloV3, two-stage neural network classifier
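The shape of the two-stage classifier can be sketched with stand-in stages. The face detector and mask classifier below are stubs for illustration; the real system plugs in trained networks at the same two seams.

```python
LABELS = ("no_mask", "mask", "mask_incorrect")

def classify_frame(frame, face_detector, mask_classifier):
    """Stage 1 finds face boxes in the frame; stage 2 crops each face
    and labels its mask state. Both stages are injected so trained
    models can replace the stubs below."""
    results = []
    for (x, y, w, h) in face_detector(frame):
        crop = [row[x:x + w] for row in frame[y:y + h]]
        results.append(((x, y, w, h), mask_classifier(crop)))
    return results

# stub stages: a fixed box, and a brightness rule standing in for a CNN
def stub_detector(frame):
    return [(1, 1, 2, 2)]

def stub_classifier(crop):
    mean = sum(sum(row) for row in crop) / sum(len(row) for row in crop)
    return LABELS[1] if mean > 0.5 else LABELS[0]

frame = [[0.0, 0.0, 0.0, 0.0],
         [0.0, 0.9, 0.8, 0.0],
         [0.0, 0.7, 0.9, 0.0],
         [0.0, 0.0, 0.0, 0.0]]
out = classify_frame(frame, stub_detector, stub_classifier)
```

Keeping detection and classification as separate injectable stages is what lets the two-stage design trade latency for accuracy against a single-shot detector like TinyYoloV3.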
Procedia PDF Downloads 161
7 Effectiveness of Simulation Resuscitation Training to Improve Self-Efficacy of Physicians and Nurses at Aga Khan University Hospital in Advanced Cardiac Life Support Courses: Quasi-Experimental Study Design
Authors: Salima R. Rajwani, Tazeen Ali, Rubina Barolia, Yasmin Parpio, Nasreen Alwani, Salima B. Virani
Abstract:
Introduction: Nurses and physicians have a critical role in initiating lifesaving interventions during cardiac arrest. Timely delivery of high-quality cardiopulmonary resuscitation (CPR), together with advanced resuscitation skills and the management of cardiac arrhythmias, is a key dimension of a code during cardiac arrest; the chances of patient survival decrease if healthcare professionals are unable to initiate CPR in time. Moreover, traditional training does not prepare physicians and nurses to a competent level, and their knowledge declines over time. In this regard, simulation training has been proven effective in promoting resuscitation skills: as a teaching-learning strategy it improves knowledge and skill performance during resuscitation through experiential learning, without compromising patient safety in real clinical situations. The purpose of the study is to evaluate the effectiveness of simulation training in Advanced Cardiac Life Support courses using a self-efficacy tool. Methods: The study uses a quantitative, non-randomized quasi-experimental design. It examined the effectiveness of simulation through self-efficacy in two instructional methods: Medium-Fidelity Simulation (MFS) and the Traditional Training Method (TTM). The sample size was 220. Data were analyzed using SPSS. Results: 153 students participated in the study (CG: n = 77; EG: n = 77). Pre- and post-test scores were compared between arms (F = 1.69, p = 0.195, df = 1); there was no significant difference between arms, and no significant interaction between arms in the pre- and post-test (F = 0.298, p = 0.586, df = 1). However, self-efficacy scores were significantly higher within the experimental group in the post-test of the advanced cardiac life support resuscitation course compared with the TTM (p < 0.0001, F = 143.316): experimental post-test mean 45.01 (SD 9.29) versus pre-test mean 31.15 (SD 12.76), compared with TTM post-test mean 29.68 (SD 14.12) versus pre-test mean 42.33 (SD 11.39). Conclusion: The standardized simulation-based training, conducted in a safe learning environment in Advanced Cardiac Life Support courses, increased self-efficacy, knowledge, and skills. Physicians and nurses benefited in self-confidence, early identification of life-threatening scenarios, early initiation of CPR, delivery of high-quality CPR, timely administration of medication and defibrillation, appropriate airway management, rhythm analysis and interpretation, Return of Spontaneous Circulation (ROSC), team dynamics, debriefing, and teaching-learning strategies that will improve patient survival in actual resuscitation.
Keywords: advanced cardiac life support, cardiopulmonary resuscitation, return of spontaneous circulation, simulation
Procedia PDF Downloads 80
6 Analysis of an IncResU-Net Model for R-Peak Detection in ECG Signals
Authors: Beatriz Lafuente Alcázar, Yash Wani, Amit J. Nimunkar
Abstract:
Cardiovascular diseases (CVDs) are the leading cause of death globally, and around 80% of sudden cardiac deaths are due to arrhythmias, or irregular heartbeats. The majority of these pathologies are revealed by either short-term or long-term alterations in electrocardiogram (ECG) morphology. The ECG is the main diagnostic tool in cardiology: a non-invasive, pain-free procedure that measures the heart's electrical activity and allows the detection of abnormal rhythms and underlying conditions. A cardiologist can diagnose a wide range of pathologies based on alterations in the ECG's form, but human interpretation is subjective and prone to error. Moreover, ECG records can be quite prolonged, which further complicates visual diagnosis and delays disease detection. In this context, deep learning methods have arisen as a promising strategy to extract relevant features and eliminate individual subjectivity in ECG analysis. They facilitate the computation of large datasets and can provide early and precise diagnoses; the cardiology field is therefore one of the areas that can benefit most from the implementation of deep learning algorithms. In the present study, a deep learning algorithm is trained following a novel approach, using a combination of different databases as the training set. The goal of the algorithm is to detect R-peaks in ECG signals. Its performance is further evaluated on ECG signals with different origins and features to test the model's ability to generalize. The model's performance in detecting R-peaks in both clean and noisy ECGs is presented: it is able to detect R-peaks in the presence of various types of noise, and even on data it has not been trained on.
It is expected that this approach will increase the effectiveness and capacity of cardiologists to detect divergences from the normal cardiac activity of their patients.
Keywords: arrhythmia, deep learning, electrocardiogram, machine learning, R-peaks
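The detection task itself (not the IncResU-Net) can be illustrated with a classical peak picker on a synthetic trace, assuming SciPy is available. The sampling rate, beat spacing, and thresholds below are invented for the sketch.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 250                                  # Hz, assumed sampling rate
t = np.arange(0, 10, 1 / fs)              # 10 s of signal
beat_times = np.arange(0.5, 10, 0.8)      # 12 beats, roughly 75 bpm

# baseline wander plus narrow Gaussian spikes standing in for R-waves
ecg = 0.1 * np.sin(2 * np.pi * 1.0 * t)
for bt in beat_times:
    ecg += np.exp(-((t - bt) ** 2) / (2 * 0.006 ** 2))

# classical picker: amplitude threshold plus a refractory distance
peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.4 * fs))
```

A learned model earns its keep where such fixed thresholds fail, e.g. under heavy noise or unusual morphologies, which is exactly the generalization the study evaluates.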
Procedia PDF Downloads 186
5 Effect of Lithium Bromide Concentration on the Structure and Performance of Polyvinylidene Fluoride (PVDF) Membrane for Wastewater Treatment
Authors: Poojan Kothari, Yash Madhani, Chayan Jani, Bharti Saini
Abstract:
The requirements for quality drinking and industrial water are increasing while water resources are depleting. Moreover, a large amount of wastewater is being generated and dumped into water bodies without treatment. These factors have made improving water treatment efficiency and water reuse an important agenda. Membrane technology for wastewater treatment is an advanced process that has become increasingly popular in the past few decades. There are many traditional methods for tertiary treatment, such as chemical coagulation and adsorption; however, recent developments in membrane technology have led to better-quality membranes manufactured at reduced cost. This, along with the high cost of conventional treatment processes, high separation efficiency, and the relative simplicity of membrane treatment, has made it an economically viable option for municipal and industrial purposes. Ultrafiltration (UF) polymeric membranes can be used for wastewater treatment and drinking water applications. The proposed work focuses on the preparation of one such UF membrane, polyvinylidene fluoride (PVDF) doped with LiBr, for wastewater treatment. Most polymeric membranes are hydrophobic in nature; this property repels water, so solute particles occupy the pores and shorten the membrane's lifetime. Modifying the membrane by adding a small amount of a salt such as LiBr helps attain membrane characteristics suitable for wastewater treatment. The membrane characteristics are investigated by measuring porosity, contact angle, and wettability to determine the hydrophilic nature of the membrane, as well as morphology (surface and structure). Pure water flux, solute rejection, and membrane permeability are determined by permeation experiments.
A study of the membrane characteristics at various concentrations of LiBr allows its effectiveness to be compared.
Keywords: lithium bromide (LiBr), morphology, permeability, polyvinylidene fluoride (PVDF), solute rejection, wastewater treatment
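The porosity measurement mentioned above is commonly done gravimetrically and reduces to a one-line calculation; the weights and dimensions here are hypothetical lab readings, not the study's data.

```python
def membrane_porosity(wet_g, dry_g, area_cm2, thickness_cm, rho_w=1.0):
    """Gravimetric porosity: volume of water held in the pores (from
    the wet-dry weight difference, with water density rho_w in g/cm^3)
    divided by the total membrane volume."""
    pore_volume = (wet_g - dry_g) / rho_w       # cm^3 of absorbed water
    total_volume = area_cm2 * thickness_cm      # cm^3 of membrane
    return pore_volume / total_volume

# hypothetical readings for a 5 cm x 5 cm PVDF coupon, 200 um thick
porosity = membrane_porosity(wet_g=1.20, dry_g=1.00,
                             area_cm2=25.0, thickness_cm=0.02)
```

Comparing this figure across LiBr dopant concentrations, alongside contact angle and flux, is how the membranes' effectiveness would be ranked.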
Procedia PDF Downloads 146
4 Advanced Technology for Natural Gas Liquids (NGL) Recovery Using Residue Gas Split
Authors: Riddhiman Sherlekar, Umang Paladia, Rachit Desai, Yash Patel
Abstract:
The competitive oil and gas market challenges today's plant designers to achieve designs that meet client expectations despite shrinking budgets, safety requirements, and operating flexibility demands. Natural gas liquids (NGLs) have three main industrial uses: as fuels, as petrochemical feedstock, or as refinery blends that can be further processed and sold as straight-run cuts such as naphtha, kerosene, and gas oil. NGL extraction is not a chemical reaction; it involves separating heavier hydrocarbons from the main gas stream through pressure and temperature reduction, which, depending upon the degree of NGL extraction, may involve a cryogenic process. Previous technologies (short-cycle dry desiccant absorption, Joule-Thomson or low-temperature refrigeration, and lean oil absorption) have achieved ethane recoveries of only 40 to 45%, which is unsatisfactory in the current market downturn. A new technology is suggested here for boosting recoveries up to 95% for ethane-plus and up to 99% for propane-plus components. Cryogenic plants conventionally reboil the demethanizer with part of the inlet feed gas, an inlet feed split; if the two stream temperatures differ, work is lost in the mixing operation unless the designer has access to a proprietary design. The concept introduced in this process reboils the demethanizer with the residue gas, or a residue gas split. The innovation is that it does not use the typical inlet gas feed split to reboil the demethanizer or deethanizer column, but instead uses an open heat pump scheme, with the residue gas compressor providing the heat pump effect. The heat pump stream is then further cooled and fed into the top section of the column as a cold reflux. By its nature, this design offers the opportunity to operate at full ethane rejection or recovery.
The scheme is also very adaptable to revamping existing facilities. This advancement not only enhances recovery but also provides operational flexibility, optimizes heat exchange, reduces equipment cost, and opens a future for innovative designs while keeping execution costs low.
Keywords: deethanizer, demethanizer, residue gas, NGL
Procedia PDF Downloads 265
3 Biomechanical Modeling, Simulation, and Comparison of Human Arm Motion to Mitigate Astronaut Task during Extra Vehicular Activity
Authors: B. Vadiraj, S. N. Omkar, B. Kapil Bharadwaj, Yash Vardhan Gupta
Abstract:
During manned exploration of space, missions will require astronaut crewmembers to perform extravehicular activities (EVAs) for a variety of tasks. These EVAs take place after long periods of operations in space, in and around unique vehicles, space structures, and systems. Considering the remoteness and time spans over which these vehicles operate, EVA system operations should use common worksites, tools, and procedures as much as possible to increase training efficiency and operational proficiency. All of these preparations need to be based on studies of astronaut motion. Until now, development and training activities associated with planned EVAs in the Russian and U.S. space programs have relied almost exclusively on physical simulators; such experimental tests are expensive and time-consuming. During the past few years, a strong increase has been observed in the use of computer simulations, due to fast developments in computer hardware and simulation software. Based on this idea, an effort was initiated to develop a computational simulation system that models human dynamic motion for EVA. This study focuses on simulating an astronaut moving orbital replaceable units into or out of worksites. Our physics-based methodology helps fill the gap in quantitative analysis of astronaut EVA by providing a multi-segment human arm model. The simulation work described in this study improves on the realism of previous efforts by incorporating joint stops to account for the physiological limits of range of motion. To demonstrate the utility of this approach, the human arm model is simulated virtually using ADAMS/LifeMOD® software. The kinematics of the astronaut's task is studied through joint angles and torques, and the simulation results are validated against a numerical simulation based on the Newton-Euler method.
Torques determined using the mathematical model are compared among the subjects to assess the smoothness and consistency of the task performed. We conclude that, given the uncertain nature of exploration-class EVA, a virtual model developed using a multibody dynamics approach offers significant advantages over traditional human modeling approaches.
Keywords: extravehicular activity, biomechanics, inverse kinematics, human body modeling
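As a minimal check of the Newton-Euler validation idea, the static case (zero joint velocities and accelerations) of a planar two-link arm reduces to gravity torques. The segment masses and lengths below are illustrative placeholders, not the subjects' anthropometry.

```python
import math

def static_arm_torques(theta1, theta2, m1=2.0, m2=1.5,
                       L1=0.3, r1=0.15, r2=0.15, g=9.81):
    """Static gravity torques (N*m) at the shoulder and elbow of a
    planar two-link arm; the Newton-Euler equations reduce to these
    terms when joint velocities and accelerations are zero.
    theta1: shoulder angle from horizontal; theta2: elbow angle
    relative to the upper arm; r1, r2: centers of mass."""
    c1 = math.cos(theta1)
    c12 = math.cos(theta1 + theta2)
    tau_elbow = g * m2 * r2 * c12
    tau_shoulder = g * (m1 * r1 + m2 * L1) * c1 + tau_elbow
    return tau_shoulder, tau_elbow
```

A horizontal arm loads both joints maximally, while a vertical arm unloads them to zero, which is the kind of sanity check used before trusting the full dynamic simulation.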
Procedia PDF Downloads 3412 Detailed Analysis of Mechanism of Crude Oil and Surfactant Emulsion
Authors: Riddhiman Sherlekar, Umang Paladia, Rachit Desai, Yash Patel
Abstract:
A number of surfactants which exhibit ultra-low interfacial tension and an excellent microemulsion phase behavior with crude oils of low to medium gravity are not sufficiently soluble at optimum salinity to produce stable aqueous solutions. Such solutions often show phase separation after a few days at reservoir temperature, which does not suffice the purpose and the time is short when compared to the residence time in a reservoir for a surfactant flood. The addition of polymer often exacerbates the problem although the poor stability of the surfactant at high salinity remains a pivotal issue. Surfactants such as SDS, Ctab with large hydrophobes produce lowest IFT, but are often not sufficiently water soluble at desired salinity. Hydrophilic co-solvents and/or co-surfactants are needed to make the surfactant-polymer solution stable at the desired salinity. This study focuses on contrasting the effect of addition of a co-solvent in stability of a surfactant –oil emulsion. The idea is to use a co-surfactant to increase stability of an emulsion. Stability of the emulsion is enhanced because of creation of micro-emulsion which is verified both visually and with the help of particle size analyzer at varying concentration of salinity, surfactant and co-surfactant. A lab-experimental method description is provided and the method is described in detail to permit readers to emulate all results. The stability of the oil-water emulsion is visualized with respect to time, temperature, salinity of the brine and concentration of the surfactant. Nonionic surfactant TX-100 when used as a co-surfactant increases the stability of the oil-water emulsion. The stability of the prepared emulsion is checked by observing the particle size distribution. For stable emulsion in volume% vs particle size curve, the peak should be obtained for particle size of 5-50 nm while for the unstable emulsion a bigger sized particles are observed. 
UV-Visible spectroscopy is also used to quantify the fraction of oil that plays an important role in the formation of micelles in a stable emulsion. This is important because the study helps to decide the applicability of surfactant-based EOR methods for a reservoir that contains a specific type of crude. The use of a nonionic surfactant as a co-surfactant would also increase the efficiency of surfactant EOR. With the decline in oil discoveries during the last decades, it is believed that EOR technologies will play a key role in meeting the energy demand in years to come. Taking this into consideration, the work focuses on optimizing secondary recovery (water flooding) with the help of surfactants and/or co-surfactants by creating the desired conditions in the reservoir.
Keywords: co-surfactant, enhanced oil recovery, micro-emulsion, surfactant flooding
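The stability criterion described above, a dominant peak of the volume-% particle-size curve between 5 and 50 nm, can be sketched as a simple classifier over a particle-size-analyzer readout. This is a minimal illustration, not part of the study: the function name, the sample data, and the fixed 5-50 nm window are assumptions.

```python
import numpy as np

def classify_emulsion(sizes_nm, volume_pct, micro_range=(5.0, 50.0)):
    """Label an emulsion a likely stable micro-emulsion if the dominant
    peak of its volume-% particle-size distribution falls in micro_range
    (5-50 nm, per the abstract); otherwise label it unstable."""
    sizes = np.asarray(sizes_nm, dtype=float)
    vols = np.asarray(volume_pct, dtype=float)
    peak_size = sizes[np.argmax(vols)]  # size at the distribution peak
    lo, hi = micro_range
    label = "stable" if lo <= peak_size <= hi else "unstable"
    return label, peak_size

# Hypothetical readout with its peak at ~20 nm -> stable micro-emulsion
sizes = [5, 10, 20, 50, 100, 500]
vols = [2, 10, 45, 25, 10, 8]
label, peak = classify_emulsion(sizes, vols)
```

In practice the instrument reports a full distribution per sample; the same check can then be applied at each salinity/surfactant/co-surfactant concentration to map the stable region.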
Procedia PDF Downloads 250
1 Multi-Modality Brain Stimulation: A Treatment Protocol for Tinnitus
Authors: Prajakta Patil, Yash Huzurbazar, Abhijeet Shinde
Abstract:
Aim: To develop a treatment protocol for the management of tinnitus through multi-modality brain stimulation. Methodology: The present study included 33 adults with unilateral (31 subjects) or bilateral (2 subjects) chronic tinnitus, with or without hearing loss, independent of etiology. The treatment protocol comprised 5 consecutive sessions with a 6-month follow-up. Each session was divided into 3 parts: • Pre-treatment: a) informed consent; b) pitch and loudness matching. • Treatment: a bimanual paper-pen task with tinnitus masking for 30 minutes. • Post-treatment: a) pitch and loudness matching; b) directive counseling and obtaining feedback. The paper-pen task was performed bimanually and involved carrying out two different writing activities in different contexts; the difficulty of the activities was increased in successive sessions. Narrowband noise at the same frequency as the tinnitus was presented simultaneously at 10 dB SL for 30 minutes in the ear with tinnitus. Result: The perception of tinnitus disappeared in 4 subjects, while in the remaining subjects it was reduced to an intensity whose perception no longer troubled them, without causing residual facilitation. Across subjects, the intensity of tinnitus decreased by 45 dB on average; in a few subjects, it decreased by more than 45 dB. The approach produced statistically significant reductions in Tinnitus Functional Index and Tinnitus Handicap Inventory scores; pre- and post-treatment Tinnitus Handicap Inventory scores dropped from 90% to 0%. Discussion: Brain mapping (qEEG) studies report multiple parallel, overlapping neural subnetworks in non-auditory areas of the brain that exhibit abnormal, constant, spontaneous neural activity involved in the perception of tinnitus, with each subnetwork and area reflecting a specific aspect of the tinnitus percept.
The paper-pen task and directive counseling are designed and delivered, respectively, in a way that is assumed to induce normal, rhythmically constant, premeditated neural activity and to mask the abnormal, constant, spontaneous neural activity in the above-mentioned subnetworks and specific non-auditory areas. Counseling focused on breaking the vicious cycle that causes and maintains the presence of tinnitus. Diverting auditory attention alone is insufficient to reduce the perception of tinnitus; conscious awareness of tinnitus can be suppressed when individuals engage in cognitively demanding tasks of a non-auditory nature, such as the paper-pen task used in the present study. Carrying out this task requires selective, divided, sustained, simultaneous, and split attention acting cumulatively. The bimanual paper-pen task represents a top-down activity, reflecting the brain's ability to attend selectively to the bimanual writing activity as the relevant stimulus and to ignore tinnitus as the irrelevant stimulus. Conclusion: The study suggests that this novel treatment approach is cost-effective, time-saving, and efficient at eliminating tinnitus or reducing its intensity to a negligible level, thereby eliminating negative reactions toward tinnitus.
Keywords: multi-modality brain stimulation, neural subnetworks, non-auditory areas, paper-pen task, top-down activity
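The masking stimulus described in the protocol, narrowband noise matched to the tinnitus pitch, can be sketched by band-limiting white noise around the matched frequency. This is a minimal sketch under stated assumptions: the 200 Hz bandwidth, the 44.1 kHz sample rate, and the function name are illustrative, and the 10 dB SL presentation level would be set at playback, not in the waveform.

```python
import numpy as np

def narrowband_noise(center_hz, bandwidth_hz=200.0, duration_s=1.0, fs=44100):
    """Band-limit white noise around the matched tinnitus pitch by zeroing
    FFT bins outside [center - bw/2, center + bw/2], then normalize to
    unit peak amplitude (level in dB SL is applied at playback)."""
    n = int(duration_s * fs)
    rng = np.random.default_rng(0)  # fixed seed for reproducibility
    white = rng.standard_normal(n)
    spectrum = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    band = (freqs >= center_hz - bandwidth_hz / 2) & (freqs <= center_hz + bandwidth_hz / 2)
    spectrum[~band] = 0.0  # keep only the band around the matched pitch
    noise = np.fft.irfft(spectrum, n)
    return noise / np.max(np.abs(noise))

# e.g. tinnitus pitch matched at 4 kHz during pre-treatment
masker = narrowband_noise(center_hz=4000.0)
```

The matched pitch and loudness from the pre-treatment step would parameterize `center_hz` and the playback gain for each subject.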
Procedia PDF Downloads 147