Search results for: application delivery
1681 Silver Nanoparticles Synthesized in Plant Extract Against Acute Hepatopancreatic Necrosis of Shrimp: Estimated By Multiple Models
Authors: Luz del Carmen Rubí Félix Peña, Jose Adan Felix-Ortiz, Ely Sara Lopez-Alvarez, Wenceslao Valenzuela-Quiñonez
Abstract:
On a global scale, Mexico is the sixth largest producer of farmed white shrimp (Penaeus vannamei). The industry has suffered significant economic losses due to acute hepatopancreatic necrosis disease (AHPND) caused by a strain of Vibrio parahaemolyticus. The first control option is the application of antibiotics in feed, which alters the environment and bacterial communities and has produced greater virulence and resistance in pathogenic bacteria. An alternative treatment is silver nanoparticles (AgNPs) produced by green synthesis, which have shown antibacterial capacity by destroying the cell membrane or denaturing the cell. However, the doses at which these are effective are still unknown. The aim is to calculate the minimum inhibitory concentration (MIC) of biosynthesized AgNPs against a strain of V. parahaemolyticus using the Gompertz, Richards, and logistic models. Different formulations of AgNPs synthesized from Euphorbia prostrata (Ep) extracts were tested against the V. parahaemolyticus strain causing AHPND in white shrimp. Aqueous and ethanol extracts were obtained, and their phenol and flavonoid concentrations were quantified. In the antibiograms, AgNPs were formulated in ethanol extracts of Ep (20 and 30%). The inhibition halos in the well diffusion test were 18±1.7 and 17.67±2.1 mm against V. parahaemolyticus. A broth microdilution was performed with the inhibitory agents (aqueous and ethanolic extracts and AgNPs) and 20 μL of the V. parahaemolyticus inoculum. The MIC was 6.2-9.3 μg/mL for the AgNPs and 49-73 mg/mL for the ethanol extract. The Akaike information criterion (AIC) identified the Gompertz model as the best descriptor of the ethanol-extract data (AIC = 204.8 at 10%, 45.5 at 20%, and 204.8 at 30%), while the Richards model best described the AgNP ethanol-extract data, with AIC = -9.3 (10%) and -17.5 (20 and 30%).
The MICs calculated for Ep extracts with the modified Gompertz model were 20 mg/mL (10% and 20% extract) and 40 mg/mL (30%), while the winning Richards model gave 5 μg/mL (10% and 20%) and 8 μg/mL (30%) for the biosynthesized AgNPs. The Excel Solver tool was used to compute the models and the inhibition curves against V. parahaemolyticus.
Keywords: green synthesis, Euphorbia prostrata, phenols, flavonoids, bactericide
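The modified Gompertz dose-response fit and the AIC comparison described above can be sketched as follows. This is a minimal illustration only: the dose-response data, the parameter grids, and the 1%-of-control cutoff used to read off the MIC are assumptions for the example, not the study's actual values or fitting procedure.

```python
import math

def gompertz(c, b, m):
    """Modified Gompertz dose-response: growth fraction ~1 at low dose, ~0 past the MIC."""
    return math.exp(-math.exp(b * (c - m)))

def fit_gompertz(concs, growth):
    """Coarse grid search for the (slope b, midpoint m) pair minimizing the RSS."""
    best = (float("inf"), None, None)
    for b in (i / 10 for i in range(1, 51)):       # b in 0.1 .. 5.0
        for m in (j / 10 for j in range(1, 151)):  # m in 0.1 .. 15.0
            rss = sum((g - gompertz(c, b, m)) ** 2 for c, g in zip(concs, growth))
            if rss < best[0]:
                best = (rss, b, m)
    return best

def aic(rss, n, k=2):
    """Akaike information criterion for a k-parameter least-squares fit."""
    return n * math.log(rss / n) + 2 * k

# Hypothetical dose-response data: AgNP concentration (ug/mL) vs. fraction of control growth
concs = [0, 2, 4, 6, 8, 10]
growth = [1.00, 0.97, 0.85, 0.40, 0.05, 0.00]

rss, b, m = fit_gompertz(concs, growth)
# MIC taken here as the dose where predicted growth drops below 1% of the control:
# exp(-exp(b*(c - m))) < 0.01  =>  c > m + ln(ln(100)) / b
mic = m + math.log(math.log(100)) / b
print(f"b={b}, m={m}, AIC={aic(rss, len(concs)):.1f}, MIC ~ {mic:.1f} ug/mL")
```

Computing the AIC for each candidate model (Gompertz, Richards, logistic) on the same data then allows the model-selection step the abstract describes.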
Procedia PDF Downloads 107
1680 The Integrated Methodological Development of Reliability, Risk and Condition-Based Maintenance in the Improvement of the Thermal Power Plant Availability
Authors: Henry Pariaman, Iwa Garniwa, Isti Surjandari, Bambang Sugiarto
Abstract:
Availability of a complex system such as a thermal power plant is strongly influenced by the reliability of spare parts and by maintenance management policies. Reliability-centered maintenance (RCM) is an established method of analysis and the main reference for maintenance planning. It considers the consequences of failure, but it does not address the further risks associated with failures, such as downtime, loss of production, or high maintenance costs. Risk-based maintenance (RBM) provides support strategies that minimize the risks posed by failure and select maintenance tasks cost-effectively. Meanwhile, condition-based maintenance (CBM) focuses on condition monitoring, which allows maintenance or other actions to be planned and scheduled so that the risk of failure is avoided before time-based maintenance falls due. RCM, RBM, and CBM are used in thermal power plants individually or as the combinations RCM+RBM and RCM+CBM. Implementing the three techniques in an integrated maintenance scheme will increase the availability of thermal power plants compared with using the techniques individually or in pairs. This study uses reliability-, risk- and condition-based maintenance in an integrated manner to increase the availability of thermal power plants. The method generates a Maintenance Priority Index (MPI), which is the Risk Priority Number (RPN) multiplied by the Risk Index (RI), and a Failure Defense Task (FDT), which can generate condition monitoring and assessment tasks in addition to maintenance tasks. Both MPI and FDT, obtained from the development of a functional tree, failure mode and effects analysis, fault-tree analysis, and risk analysis (risk assessment and risk evaluation), were then used to develop and implement a maintenance, monitoring, and condition-assessment plan and schedule, and ultimately to perform an availability analysis.
The results of this study indicate that reliability-, risk- and condition-based maintenance methods, applied in an integrated manner, can increase the availability of thermal power plants.
Keywords: integrated maintenance techniques, availability, thermal power plant, MPI, FDT
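The prioritization step, MPI = RPN × RI, can be illustrated with a minimal sketch. The failure modes, FMEA ratings, and risk indices below are hypothetical placeholders for illustration, not values from the study.

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number from classical FMEA ratings (each typically on a 1-10 scale)."""
    return severity * occurrence * detection

def mpi(severity, occurrence, detection, risk_index):
    """Maintenance Priority Index per the abstract: RPN multiplied by the Risk Index (RI)."""
    return rpn(severity, occurrence, detection) * risk_index

# Hypothetical failure modes for a boiler feed pump: (name, S, O, D, RI)
failure_modes = [
    ("bearing wear",   7, 5, 4, 0.8),
    ("seal leakage",   5, 6, 3, 0.5),
    ("impeller crack", 9, 2, 6, 0.9),
]

# Rank failure modes by MPI, highest priority first
ranked = sorted(failure_modes, key=lambda f: mpi(*f[1:]), reverse=True)
for name, *ratings in ranked:
    print(f"{name}: MPI={mpi(*ratings):.1f}")
```

The highest-MPI failure modes would then be the first to receive maintenance or condition-monitoring tasks in the integrated plan.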
Procedia PDF Downloads 795
1679 Motor Control Recovery Minigame
Authors: Taha Enes Kon, Vanshika Reddy
Abstract:
This project focuses on developing a gamified mobile application to aid stroke rehabilitation by enhancing motor skills through interactive activities. The primary goal was to design a companion app for a passive haptic rehab glove, incorporating Google MediaPipe for gesture tracking and vibrotactile feedback. The app simulates farming activities, offering a fun and engaging experience while addressing the monotony of traditional rehabilitation methods. The prototype focuses on a single minigame, Flower Picking, which uses gesture recognition to interact with virtual elements, encouraging users to perform exercises that improve hand dexterity. The development process involved creating accessible and user-centered designs in Figma, integrating gesture recognition algorithms, and implementing Unity-based game mechanics. Real-time feedback and progressive difficulty levels ensured a personalized experience, motivating users to adhere to rehabilitation routines. The prototype achieved a gesture detection precision of 90%, effectively recognizing predefined gestures such as the Fist and OK symbols. Quantitative analysis highlighted a 40% increase in average session duration compared to traditional exercises, while qualitative feedback praised the app’s immersive design and ease of use. Despite its success, challenges included rigidity in gesture recognition, which required precise hand orientations, and limited gesture support. Future improvements include expanding gesture adaptability and incorporating additional minigames to target a broader range of exercises. The project demonstrates the potential of gamification in stroke rehabilitation, offering a scalable and accessible solution that complements clinical treatments, making recovery engaging and effective for users.
Keywords: stroke rehabilitation, haptic feedback, gamification, MediaPipe, motor control
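As a sketch of how Fist/OK detection over hand landmarks might work: the prototype uses Google MediaPipe for tracking, but the rule thresholds and the landmark coordinates below are invented for illustration and are not taken from the app.

```python
def dist(a, b):
    """Euclidean distance between two 2D landmark points."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def classify_gesture(lm):
    """Toy rule-based classifier over MediaPipe-style hand landmarks
    (index 0 = wrist, 4 = thumb tip, 8/12/16/20 = finger tips, 6/10/14/18 = PIP joints)."""
    wrist = lm[0]
    fingers = [(8, 6), (12, 10), (16, 14), (20, 18)]  # (tip, PIP) per finger
    # A finger counts as extended when its tip is farther from the wrist than its PIP joint
    extended = [dist(lm[t], wrist) > dist(lm[p], wrist) for t, p in fingers]
    if not any(extended):
        return "Fist"      # every tip curled back toward the wrist
    if dist(lm[4], lm[8]) < 0.05 and all(extended[1:]):
        return "OK"        # thumb-index pinch, remaining fingers out
    return "Unknown"

# Synthetic landmarks (normalized coordinates), only the indices the rules use
lm_fist = {0: (0.0, 0.0), 4: (0.15, 0.1),
           6: (0.0, 0.3), 8: (0.0, 0.2), 10: (0.1, 0.3), 12: (0.1, 0.2),
           14: (0.2, 0.3), 16: (0.2, 0.2), 18: (0.3, 0.3), 20: (0.3, 0.2)}
lm_ok = {0: (0.0, 0.0), 4: (0.2, 0.4),
         6: (0.1, 0.5), 8: (0.22, 0.42), 10: (0.0, 0.5), 12: (0.0, 0.8),
         14: (0.1, 0.5), 16: (0.1, 0.8), 18: (0.2, 0.45), 20: (0.2, 0.7)}

print(classify_gesture(lm_fist), classify_gesture(lm_ok))
```

A rule set like this is also where the reported rigidity would come from: hard distance thresholds demand precise hand orientations, which is exactly the limitation the abstract notes.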
Procedia PDF Downloads 7
1678 An Artificial Intelligence Framework to Forecast Air Quality
Authors: Richard Ren
Abstract:
Air pollution is a serious danger to international well-being and economies: it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (e.g., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework runs multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate the inaccuracies, weaknesses, and biases of any one individual model. Over time, the framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California, using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall model accuracy of 92%. The combined model predicts more accurately than any of the individual models, and it reliably forecasts season-based variations in air quality levels. The top air quality predictor variables were identified by measuring the mean decrease in accuracy.
This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms
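The core ensembling step, averaging the outputs of the three top-performing models, can be sketched as follows. The model names, per-sample probabilities, and validation scores are invented placeholders, not the study's actual models or data.

```python
def top_k_average(model_preds, val_scores, k=3):
    """Average the probability outputs of the k best models by validation score."""
    ranked = sorted(model_preds, key=lambda name: val_scores[name], reverse=True)[:k]
    n = len(model_preds[ranked[0]])
    return [sum(model_preds[m][i] for m in ranked) / k for i in range(n)]

# Hypothetical per-sample probabilities of "unhealthy air" from four candidate models
model_preds = {
    "logistic": [0.80, 0.30, 0.60],
    "forest":   [0.90, 0.20, 0.70],
    "nnet":     [0.85, 0.25, 0.65],
    "baseline": [0.50, 0.50, 0.50],
}
# Hypothetical held-out validation accuracies used to pick the top three
val_scores = {"logistic": 0.88, "forest": 0.92, "nnet": 0.90, "baseline": 0.60}

ensemble = top_k_average(model_preds, val_scores)
print(ensemble)  # averages of forest, nnet, and logistic; baseline is dropped
```

Re-ranking the models as new validation data arrives is one simple way to realize the self-adjustment the abstract describes.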
Procedia PDF Downloads 130
1677 Exploring the Design of Prospective Human Immunodeficiency Virus Type 1 Reverse Transcriptase Inhibitors through a Comprehensive Approach of Quantitative Structure Activity Relationship Study, Molecular Docking, and Molecular Dynamics Simulations
Authors: Mouna Baassi, Mohamed Moussaoui, Sanchaita Rajkhowa, Hatim Soufi, Said Belaaouad
Abstract:
The objective of this paper is to address the challenging task of targeting Human Immunodeficiency Virus type 1 Reverse Transcriptase (HIV-1 RT) in the treatment of AIDS. Reverse transcriptase inhibitors (RTIs) have limitations because reverse transcriptase mutations develop and lead to treatment resistance. In this study, a combination of statistical analysis and bioinformatics tools was adopted to develop a mathematical model that relates the structure of compounds to their inhibitory activities against HIV-1 reverse transcriptase. Our approach was based on a series of compounds recognized for their HIV-1 RT enzymatic inhibitory activities. These compounds were designed via software, with their descriptors computed using multiple tools. The most statistically promising model was chosen, and its domain of applicability was ascertained. Furthermore, compounds exhibiting biological activity comparable to existing drugs were identified as potential inhibitors of HIV-1 RT. The compounds were evaluated on their chemical absorption, distribution, metabolism, excretion, and toxicity properties and their adherence to Lipinski's rule. Molecular docking techniques were employed to examine the interaction between reverse transcriptase (wild type and mutant type) and the ligands, including a known drug available on the market. Molecular dynamics simulations were also conducted to assess the stability of the RT-ligand complexes. Our results reveal several of the new compounds as promising candidates for effectively inhibiting HIV-1 reverse transcriptase, matching the potency of the established drug; this necessitates further experimental validation. Beyond its immediate results, this study provides a methodological foundation for future efforts to discover and design new inhibitors targeting HIV-1 reverse transcriptase.
Keywords: QSAR, ADMET properties, molecular docking, molecular dynamics simulation, reverse transcriptase inhibitors, HIV type 1
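The drug-likeness screen the abstract mentions, Lipinski's rule of five, can be sketched as a simple filter. The candidate names and descriptor values below are hypothetical, not the paper's compounds.

```python
def passes_lipinski(mol_weight, logp, h_donors, h_acceptors):
    """Lipinski's rule of five: a compound is considered drug-like if it violates
    at most one of these thresholds: MW <= 500 Da, logP <= 5,
    <= 5 H-bond donors, <= 10 H-bond acceptors."""
    violations = sum([
        mol_weight > 500,
        logp > 5,
        h_donors > 5,
        h_acceptors > 10,
    ])
    return violations <= 1

# Hypothetical candidate descriptors: (MW in Da, logP, H-bond donors, H-bond acceptors)
candidates = {
    "cand_A": (431.5, 3.2, 2, 7),
    "cand_B": (612.8, 5.9, 4, 11),
}
drug_like = [name for name, d in candidates.items() if passes_lipinski(*d)]
print(drug_like)
```

In a QSAR pipeline this filter typically runs alongside the ADMET evaluation, before the surviving candidates are passed to docking and molecular dynamics.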
Procedia PDF Downloads 93
1676 Developing a Thermo-Sensitive Conductive Stretchable Film to Allow Cell Sheet Harvest after Mechanical and Electrical Treatments
Authors: Wei-Wen Hu, Yong-Zhi Zhong
Abstract:
Depositing conductive polypyrrole (PPy) onto an elastic polydimethylsiloxane (PDMS) substrate yields a highly stretchable conductive film, which can be used to construct a bioreactor that cyclically stretches and electrically stimulates surface cells. However, completely harvesting the stimulated muscle tissue to repair damaged muscle is a challenge. To address this concern, N-isopropylacrylamide (NIPAAm), the monomer of a temperature-sensitive polymer, was added during the polymerization of pyrrole on PDMS so that the resulting P(Py-co-NIPAAm)/PDMS owns both conductivity and thermo-sensitivity. Therefore, cells can be completely harvested as cell sheets after stimulation simply by reducing the temperature. Mouse skeletal myoblasts (C2C12 cells) were used to examine our hypothesis. Under electrical stimulation, C2C12 cells on P(Py-co-NIPAAm)/PDMS demonstrated the best myo-differentiation in an electric field of 1 V/cm. Regarding cyclic stretching, strains equal to or higher than 9% strongly aligned C2C12 cells perpendicular to the stretching direction. Western blotting demonstrated that cell sheets harvested by cooling retained more extracellular matrix (ECM) than cells collected by the traditional trypsin digestion method. Immunostaining of myosin heavy chain (MHC) indicated that both mechanical and electrical stimuli effectively increased the number of myotubes and the differentiation ratio, and the myotubes could be aligned by cyclic stretching. Stimulated cell sheets could be harvested by cooling, and the alignment of the myotubes was maintained.
These results suggested that the deposition of P(Py-co-NIPAAm) on PDMS can be applied to harvest intact cell sheets after cyclic stretching and electrical stimulation, which increases the feasibility of the bioreactor for tissue engineering and regenerative medicine.
Keywords: bioreactor, cell sheet, conductive polymer, cyclic stretching, electrical stimulation, muscle tissue engineering, myogenesis, thermosensitive hydrophobicity
Procedia PDF Downloads 97
1675 Screening of Plant Growth Promoting Rhizobacteria in the Rhizo- and Endosphere of Sunflower (Helianthus annuus), Their Role in Enhancing Growth and Yield-Attributing Traits, and Colonization Studies
Authors: A. Majeed, M.K. Abbasi, S. Hameed, A. Imran, T. Naqqash, M. K. Hanif
Abstract:
Plant growth-promoting rhizobacteria (PGPR) are free-living soil bacteria that aggressively colonize the rhizosphere/plant roots and enhance the growth and yield of plants when applied to seeds or crops. Root-associated (endophytic and rhizospheric) PGPR were isolated from sunflower (Helianthus annuus) grown in soils collected from 16 different sites of subdivision Dhirkot, Poonch, Azad Jammu & Kashmir, Pakistan. A total of 150 bacterial isolates were isolated, purified, and screened in vitro for their plant growth promoting (PGP) characteristics. The 11 most effective isolates were selected on the basis of biochemical assays (nitrogen fixation, phosphate solubilization, growth hormone production, biocontrol assay, and carbon substrate utilization, assessed by gas chromatography-mass spectrometry (GC-MS), spectrophotometry, high performance liquid chromatography (HPLC), fungal and bacterial dual plate assays, and the BIOLOG GN2/GP2 microplate assay, respectively) and were tested on the crop under controlled and field conditions. From the inoculation assay, the 4 most promising strains (on the basis of increased root/shoot weight, root/shoot length, seed oil content, and seed yield) were then selected for colonization studies by confocal laser scanning and transmission electron microscopy. 16S rRNA gene analysis showed that these bacterial isolates belong to the genera Pseudomonas, Enterobacter, Azospirillum, and Citrobacter. This study is clear evidence that such isolates have the potential for application as inoculants adapted to poor soils and local crops, minimizing the chemical fertilizers harmful to soil and the environment.
Keywords: PGPR, nitrogen fixation, phosphate solubilization, colonization
Procedia PDF Downloads 342
1674 Prioritizing Road Safety Based on the Quasi-Induced Exposure Method and Utilization of the Analytical Hierarchy Process
Authors: Hamed Nafar, Sajad Rezaei, Hamid Behbahani
Abstract:
Safety analysis of roads through accident rates, one of the most widely used tools, derives from the direct exposure method, which is based on vehicle-kilometers traveled and vehicle travel time. However, this approach has fundamental flaws in its theory: the required data, such as traffic volume and trip distance and duration, are difficult to obtain, and determining exposure for a specific time, place, and driver category is problematic. There is therefore a need for an algorithm for prioritizing road safety so that, with a new exposure method, the problems of the previous approaches are resolved. An efficient application may lead to more realistic comparisons, and the new method would be applicable to a wider range of time, place, and driver categories. Therefore, an algorithm was introduced to prioritize the safety of roads using the quasi-induced exposure method and the analytical hierarchy process. For this research, 11 provinces of Iran were chosen as case study locations. A rural accident database was created for these provinces, the validity of the quasi-induced exposure method for Iran's accident database was explored, and the involvement ratio for different characteristics of drivers and vehicles was measured. Results showed that the quasi-induced exposure method was valid for determining the real exposure in the provinces under study. Results also showed a significant difference between prioritizations based on the new and traditional approaches. This difference mostly stems from the perspective of the quasi-induced exposure method in determining exposure, the opinion of experts, and the quantity of accident data.
Overall, the results of this research showed that prioritization based on the new approach is more comprehensive and reliable than prioritization with the traditional approach, which depends on various parameters, including driver-vehicle characteristics.
Keywords: road safety, prioritizing, quasi-induced exposure, analytical hierarchy process
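The involvement ratio at the heart of the quasi-induced exposure method can be sketched as below. The at-fault and not-at-fault counts by age group are hypothetical, not the Iranian provincial data.

```python
def involvement_ratio(at_fault, not_at_fault):
    """Quasi-induced exposure: not-at-fault drivers in two-vehicle crashes proxy the
    on-road population, so a group's relative crash risk is its share of at-fault
    drivers divided by its share of not-at-fault drivers (ratio > 1 = over-involved)."""
    total_af = sum(at_fault.values())
    total_naf = sum(not_at_fault.values())
    return {g: (at_fault[g] / total_af) / (not_at_fault[g] / total_naf)
            for g in at_fault}

# Hypothetical crash counts by driver age group
at_fault = {"18-24": 300, "25-64": 500, "65+": 200}
not_at_fault = {"18-24": 150, "25-64": 700, "65+": 150}

ratios = involvement_ratio(at_fault, not_at_fault)
print(ratios)
```

Ratios like these, computed per road segment or driver-vehicle category, could then feed the analytical hierarchy process as one of the prioritization criteria.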
Procedia PDF Downloads 340
1673 A Low-Cost Foot Plantar Shoe for Gait Analysis
Authors: Zulkifli Ahmad, Mohd Razlan Azizan, Nasrul Hadi Johari
Abstract:
This paper presents the development and testing of a wearable sensor system for gait analysis. For validation, plantar surface measurement by force plate was prepared. In general gait analysis, the force plate captures barefoot whole-step data and does not allow analysis of repeated steps in normal walking and running. The measurements usually performed do not represent the whole of daily plantar pressure in the shoe insole and only obtain the ground reaction force. Force plate measurement is usually limited to a few steps, it is done indoors, and coupling information from both feet during walking is not easily obtained. Nowadays, pressure can be measured over a large number of steps, and in each part of the insole, by placing sensors within the insole. This method determines the plantar pressures of a shoe-wearing subject while standing, walking, or running. Inserting pressure sensors in the insole provides location-specific information, so the placement of the sensors determines which critical regions under the insole are captured. In this wearable shoe sensor project, the device consists of left and right shoe insoles, each with ten force-sensitive resistors (FSRs). An Arduino Mega was used as the microcontroller that reads the analog inputs from the FSRs. The readings were transmitted via Bluetooth, providing the force data in real time on a smartphone. BlueTerm, an Android application, was used as the interface to read the FSR values from the shoe-wearing subject. The subjects were two healthy men of different ages and weights, tested while standing, walking (1.5 m/s), jogging (5 m/s), and running (9 m/s) on a treadmill. The data obtained are saved on the Android device for analysis and comparison graphs.
Keywords: gait analysis, plantar pressure, force plate, wearable sensor
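The insole readout chain, ADC count to voltage to FSR resistance, can be sketched on the host side as follows. The voltage-divider wiring, the 10 kΩ fixed resistor, and the raw readings are assumptions for illustration; the paper does not state its divider values.

```python
VCC = 5.0          # supply voltage (V), matching the Arduino Mega's 5 V rail
R_FIXED = 10_000   # assumed fixed divider resistor (ohms), a common FSR wiring choice

def adc_to_voltage(adc, bits=10):
    """Arduino Mega ADC: a 10-bit reading 0..1023 maps linearly to 0..VCC volts."""
    return adc * VCC / (2 ** bits - 1)

def fsr_resistance(v_out):
    """Invert the divider: FSR on the high side, fixed resistor to ground,
    so v_out = VCC * R_FIXED / (R_FIXED + R_fsr)."""
    return R_FIXED * (VCC - v_out) / v_out

# Hypothetical raw readings from the ten insole sensors at heel strike
raw = [512, 700, 830, 610, 90, 40, 55, 300, 480, 760]
resistances = [fsr_resistance(adc_to_voltage(a)) for a in raw]

# An FSR's resistance drops as pressure rises, so the minimum marks the most loaded sensor
peak = resistances.index(min(resistances))
print(f"peak pressure under sensor {peak}")
```

Converting resistance to newtons would additionally require the FSR's force-resistance calibration curve, which is device-specific.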
Procedia PDF Downloads 454
1672 Paraplegic Dimensions of Asymmetric Warfare: A Strategic Analysis for Resilience Policy Plan
Authors: Sehrish Qayyum
Abstract:
In this age of constant technology, asymmetric warfare cannot be won outright. An attuned psychometric study confirms that screaming is sometimes more productive than active retaliation against strong adversaries. Asymmetric warfare is a game of nerves and thoughts, where the least vigorous participation carries large anticipated losses. It creates a condition of paraplegia, a partial but permanent immobility, which affects core warfare operations, reducing them to screams rather than active retaliation. When one's own power is doubted, that doubt can ruin all planning, even planning done with superlative cost-benefit analysis. A strategically calculated study of asymmetric warfare across three chronological periods, from early WWI to WWII, from WWII to the Cold War, and from the Cold War to the current era, shows that courage turns a battle of warriors into a battle of comrades. Asymmetric warfare has been the most difficult to fight and survive because it is unexpected and lethal despite preparations. Thought before action may be the best strategy, mixing Regional Security Complex Theory and the OODA loop to develop a Paraplegic Resilience Policy Plan (PRPP) to win asymmetric warfare. The PRPP may serve to control and halt the ongoing wave of terrorism, guerrilla warfare, and insurgencies. The PRPP, along with a strategic work plan, is based on psychometric analysis to deal with any possible war condition and tactic and to save millions of innocent lives, such as those lost in Christchurch, New Zealand in 2019, the November 2015 Paris attacks, and the Berlin market attack in 2016. Getting tangled in self-imposed epistemic dilemmas results in regret becoming the only mode of performance. This is a descriptive psychometric analysis of war conditions with a generic application of probability tests to find the best possible options and conditions for developing the PRPP for any adverse condition possible so far.
Innovation in technology begets innovation in planning, and the action plan serves as a rheostat approach to dealing with asymmetric warfare.
Keywords: asymmetric warfare, psychometric analysis, PRPP, security
Procedia PDF Downloads 137
1671 Applying Multiple Kinects in the Development of a Rapid 3D Mannequin Scan Platform
Authors: Shih-Wen Hsiao, Yi-Cheng Tsao
Abstract:
In the fields of reverse engineering and the creative industries, applying a 3D scanning process to obtain the geometric forms of objects is a mature and common technique. For instance, organic objects such as faces and non-organic objects such as products can be scanned to acquire geometric information for further application. However, although the data resolution of 3D scanning devices is increasing and complementary applications are increasingly abundant, the penetration of 3D scanning among the public is still limited by the relatively high price of the devices. On the other hand, Kinect, released by Microsoft, is known for its powerful functions, considerably lower price, and complete technology and database support. Related studies can therefore be done with Kinect at acceptable cost and data precision. Because Kinect uses an optical mechanism to extract depth information, limitations arise from the straight-line path of the light. Thus, when applying a single Kinect for 3D scanning, multiple angles are required sequentially to obtain the complete 3D information of the object. An integration process that combines the 3D data from different angles by certain algorithms is also required. This sequential scanning process costs much time, and the complex integration process often encounters technical problems. Therefore, this paper applies multiple Kinects simultaneously to develop a rapid 3D mannequin scan platform and proposes suggestions on the number and angles of the Kinects. A method of establishing the coordinate system based on the relation between the mannequin and the specifications of the Kinect is proposed, and a suggestion for the angles and number of Kinects is described.
An experiment applying multiple Kinects to the scanning of a 3D mannequin was constructed with the Microsoft API, and the results show that the scanning time and the technical threshold can be reduced for the fashion and garment design industries.
Keywords: 3D scan, depth sensor, fashion and garment design, mannequin, multiple Kinect sensor
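A minimal sketch of the multi-view registration idea: if each Kinect's mounting angle about the mannequin's vertical axis is known from the platform geometry, each depth point can be rotated into a shared frame and the views concatenated. The angles and points below are invented for illustration; a real pipeline would also need translation offsets and overlap refinement.

```python
import math

def rotate_y(point, deg):
    """Rotate a 3D point about the vertical (y) axis, bringing one Kinect's
    depth points into the common mannequin-centred frame."""
    a = math.radians(deg)
    x, y, z = point
    return (x * math.cos(a) + z * math.sin(a), y, -x * math.sin(a) + z * math.cos(a))

def merge_views(views):
    """views: list of (angle_deg, points) pairs, one per Kinect placed around
    the mannequin; returns one combined point cloud in the common frame."""
    cloud = []
    for angle, pts in views:
        cloud.extend(rotate_y(p, angle) for p in pts)
    return cloud

# Hypothetical three-Kinect setup at 120-degree spacing, one sample point per view
views = [(0, [(0.0, 1.0, 1.0)]),
         (120, [(0.0, 1.0, 1.0)]),
         (240, [(0.0, 1.0, 1.0)])]
cloud = merge_views(views)
print(len(cloud), cloud[1])
```

Because all views are captured simultaneously, this known-angle merge replaces the slow sequential scan-and-align loop a single Kinect would require.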
Procedia PDF Downloads 368
1670 Waste Utilization by Combustion in the Composition of Gel Fuels
Authors: Dmitrii Glushkov, Aleksandr G. Nigay, Olga S. Yashutina
Abstract:
In recent years, due to the intensive development of the Arctic and Antarctic areas, an urgent task is to develop technology for the effective utilization of solid and liquid combustible wastes in low-temperature environments. First, such technology will help prevent the dumping of waste into the World Ocean and reduce the risk of environmental damage to the Far North areas. Second, it will make it possible to prepare fuel compositions from the waste at the places where the waste is produced; such fuels can be used as energy resources, which will reduce utilization costs compared with transporting the waste to the mainland. In the present study, we suggest solving the waste utilization problem by preparing gel fuels based on solid and liquid combustible components with an added thickener. Fuels of this kind are easy to prepare, store, transport, and use (as energy resources). The main regularities and characteristics of the physical and chemical processes are established while varying the parameters of the gel fuels and heating sources over wide ranges. The results obtained support the prospects of the practical application of gel fuels for combustible waste utilization. The corresponding technology is characterized by positive environmental, operational, and economic effects. The composition of gel fuels can vary over a wide range. Preparing fuels from one type of combustible liquid, or from a mixture of several liquids with finely dispersed components added, makes it possible to obtain compositions with predictable rheological, energy, or environmental characteristics. Besides, gel fuels pose a lower fire hazard than common solid and liquid fuels, which makes them convenient to store and transport.
In such conditions, it is not necessary to transport combustible wastes from the Arctic and Antarctic territories to the mainland for processing, which is currently quite an expensive procedure. The research was funded by the Russian Science Foundation (project No. 18-13-00031).
Keywords: combustible liquid waste, gel fuel, ignition and combustion, utilization
Procedia PDF Downloads 123
1669 Examining the Contemporary Relevance of Mahatma Gandhi’s Thought: A Bulwark against Terrorism
Authors: Jayita Mukhopadhyay
Abstract:
Even though more than six decades have passed since the death of India’s iconic thinker and mass leader Mahatma Gandhi, a world besieged by terrorism may still take a leaf out of his philosophical discourse on non-violence and attempt to turn his theory into praxis to save mankind. The greatest soul the world has ever produced, a man of divine fire, an apostle of peace and non-violence, a revolutionary, a visionary, a social reformer and deliverer of the downtrodden, Father of the Nation: these and numerous other epithets have been used by eminent personalities and scholars to describe Mahatma Gandhi. Gandhi was a relentless fighter and mass mobiliser who awakened a sleeping giant, the common men and women of India; shook them out of their docile, fatalistic mould; invigorated them with his doctrine of ahimsa and satyagraha (non-violence and strict adherence to truth); instilled in them nationalist zeal and patriotic fervour; and turned them into determined, steadfast freedom fighters. Under his leadership, the national liberation movement gained new life and ultimately succeeded in ending the era of foreign domination. And he did all this while resisting his people's natural tendency to respond violently to the unspeakable violence and atrocities unleashed by a colonial British administration desperate to keep India in its empire. In this paper, an attempt will be made to unravel Gandhi’s elucidation of the concept of non-violent resistance, along with non-cooperation and civil disobedience, and their actual application through political practices which succeeded in capturing the imagination of not only India’s teeming millions but the entire world.
The methodology of analytical study will be used: Gandhi’s own writings and those of noted scholars on Gandhi will be examined extensively to establish the contemporary relevance of his thought and his invaluable guidelines on coping with poverty, inequality, exploitation, repression and the marginalization of some sections of society, and the resultant radicalization of some disturbed members of the human race, the very conditions which spawn terrorism in today’s world.
Keywords: India, non cooperation, non violence, terrorism
Procedia PDF Downloads 326
1668 Development of an Experiment for Impedance Measurement of Structured Sandwich Sheet Metals by Using a Full Factorial Multi-Stage Approach
Authors: Florian Vincent Haase, Adrian Dierl, Anna Henke, Ralf Woll, Ennes Sarradj
Abstract:
Structured sheet metals and structured sandwich sheet metals are three-dimensional, lightweight structures with increased stiffness which are used in the automotive industry. The impedance, a measure of a structure's resistance to vibration, will be determined for plain sheets, structured sheets, and structured sandwich sheets. The aim of this paper is to generate an experimental design that minimizes the cost and duration of the experiments. Design of experiments will be used to reduce the large number of single tests required to determine the correlation between the impedance and its influencing factors. Full and fractional factorials are applied in order to systematize and plan the experiments. Their major advantages are high-quality results from a relatively small number of trials and the ability to determine the most important influencing factors, including their specific interactions. The developed full factorial experimental design for the study of plain sheets includes three factor levels. In contrast to the study of plain sheets, the impedance analysis of structured sheets and structured sandwich sheets is split into three phases. The first phase consists of preliminary tests which identify relevant factor levels. These factor levels are subsequently employed in main tests, which have the objective of identifying complex relationships between the parameters and the response variable. Post-tests can follow in case additional factor levels or other factors need to be studied. By using full and fractional factorial experimental designs, the required number of tests is reduced by half. In this paper, the benefits of applying design of experiments are presented.
Furthermore, a multi-stage approach is shown that takes unrealizable factor combinations into account and minimizes the number of experiments.
Keywords: structured sheet metals, structured sandwich sheet metals, impedance measurement, design of experiment
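The enumeration step behind a full factorial design can be sketched in a few lines. The factor names and levels below are hypothetical placeholders, not the parameters used in the study; the sketch only illustrates how three factors at three levels each yield 27 runs, the run count that a fractional design then reduces.

```python
from itertools import product

def full_factorial(levels):
    """Enumerate every run of a full factorial design as a list of dicts."""
    names = list(levels)
    return [dict(zip(names, combo)) for combo in product(*levels.values())]

# Hypothetical factors and levels (placeholders, not the study's actual parameters)
levels = {
    "sheet_type": ["plain", "structured", "sandwich"],
    "excitation_point": ["centre", "edge", "corner"],
    "clamping": ["free", "clamped", "mixed"],
}
runs = full_factorial(levels)
print(len(runs))  # 3 factors at 3 levels each -> 3**3 = 27 runs
```

A fractional factorial then keeps only a structured subset of these runs, chosen so that the main effects and the interactions of interest remain estimable.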
Procedia PDF Downloads 375
1667 Syntax and Words as Evolutionary Characters in Comparative Linguistics
Authors: Nancy Retzlaff, Sarah J. Berkemer, Trudie Strauss
Abstract:
In the last couple of decades, the digitalization of data of every kind was probably one of the major advances across all fields of study. It paves the way for analysing data even from disciplines with no initial computational necessity to do so. Linguistics, especially, has a rather manual tradition. Still, studies involving the history of language families show striking similarities to bioinformatic (phylogenetic) approaches. Alignments of words are a fairly well studied example of applying bioinformatics methods to historical linguistics. In this paper we consider not only alignments of strings, i.e., words in this case, but also alignments of syntax trees of selected Indo-European languages. Based on initial, crude alignments, a sophisticated scoring model is trained on both letters and syntactic features. The aim is to gain a better understanding of which features in two languages are related, i.e., most likely to have the same root. Initially, all words in two languages are pre-aligned with a basic scoring model that primarily aligns consonants and adjusts them before fitting in the vowels. Mixture models are subsequently used to filter ‘good’ alignments depending on the alignment length and the number of inserted gaps. Using these selected word alignments, it is possible to perform tree alignments of the given syntax trees and consequently find sentences that correspond well to each other across languages. The syntax alignments are then filtered for meaningful scores: ‘good’ scores contain evolutionary information and are therefore used to train the sophisticated scoring model. Further iterations of alignment and training steps are performed until the scoring model saturates, i.e., barely changes anymore.
An evaluation of the trained scoring model and its capacity to capture evolutionarily meaningful information will be given. An assessment of sentence alignment against possible phrase structure will also be provided. The method described here may have its flaws because of limited prior information. It may, however, offer a good starting point for studying languages where only little prior knowledge is available and a detailed, unbiased study is needed.
Keywords: alignments, bioinformatics, comparative linguistics, historical linguistics, statistical methods
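As a minimal illustration of the pre-alignment step, the sketch below implements a standard global (Needleman-Wunsch) word alignment with a flat scoring scheme; the consonant-weighted scores, mixture-model filtering and tree alignment of the actual method are not reproduced here, and the score values are illustrative assumptions.

```python
def align_score(a, b, match=2, mismatch=-1, gap=-2):
    """Global alignment score of two words via Needleman-Wunsch dynamic programming."""
    m, n = len(a), len(b)
    # dp[i][j] = best score aligning a[:i] with b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        dp[i][0] = dp[i - 1][0] + gap
    for j in range(1, n + 1):
        dp[0][j] = dp[0][j - 1] + gap
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            dp[i][j] = max(dp[i - 1][j - 1] + s,   # substitution or match
                           dp[i - 1][j] + gap,     # gap in b
                           dp[i][j - 1] + gap)     # gap in a
    return dp[m][n]

# Cognate candidates score higher than unrelated words
print(align_score("nacht", "night") > align_score("nacht", "ore"))  # True
```

An iterated version of this idea re-estimates the match/mismatch scores from alignments judged ‘good’, which is the training loop the abstract describes.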
Procedia PDF Downloads 154
1666 Comparison of Inexpensive Cell Disruption Techniques for an Oleaginous Yeast
Authors: Scott Nielsen, Luca Longanesi, Chris Chuck
Abstract:
Palm oil is obtained from the flesh and kernel of the fruit of oil palms and is the most productive and inexpensive oil crop. The global demand for palm oil is approximately 75 million metric tonnes, a 29% increase in global production since 2016. This expansion of oil palm cultivation has resulted in mass deforestation, vast destruction of biodiversity and increasing net greenhouse gas emissions. One possible alternative is to produce a saturated oil, similar to palm, from microbes such as oleaginous yeast. The yeasts can be cultured on sugars derived from second-generation sources and do not compete with tropical forests for land. One highly promising oleaginous yeast for this application is Metschnikowia pulcherrima. However, recent techno-economic modeling has shown that cell lysis and standard lipid extraction are major contributors to the cost of the oil. Typical cell disruption techniques for extracting either single-cell oils or proteins have been based around bead-beating, homogenization and acid lysis. However, these can have a detrimental effect on lipid quality and are energy-intensive. In this study, a vortex separator, which produces high shear with minimal energy input, was investigated as a potential low-energy method of lysing cells. It was compared to four more traditional methods (thermal lysis, acid lysis, alkaline lysis and osmotic lysis). For each method, yeast loadings of 1 g/L, 10 g/L and 100 g/L were also examined. The quality of the cell disruption was measured by comparison of optical cell density, cell counts and particle size distribution profiles over a 2-hour period. This study demonstrates that the vortex separator is highly effective at lysing the cells and could potentially be used as a simple apparatus for lipid recovery in an oleaginous yeast process.
The further development of this technology could reduce the overall cost of microbial lipids in the future.
Keywords: palm oil substitute, Metschnikowia pulcherrima, cell disruption, cell lysis
Procedia PDF Downloads 206
1665 Banking Union: A New Step towards Completing the Economic and Monetary Union
Authors: Marijana Ivanov, Roman Šubić
Abstract:
The single rulebook, together with the Single Supervisory Mechanism and the Single Resolution Mechanism as the two main pillars of the banking union, represents an important step towards completing the Economic and Monetary Union. It should provide a consistent application of common rules and administrative standards for the supervision, recovery and resolution of banks, with the final aim that the former practice of bail-outs is replaced with a bail-in system through which bank failures are resolved from banks' own funds, i.e., with minimal costs for taxpayers and the real economy. It has to reduce the financial fragmentation recorded in the crisis years as the result of divergent behavior in risk premiums, lending activities, and interest rates between the core and the periphery. In addition, it should strengthen the effectiveness of monetary transmission channels, in particular the credit channels and the overflow of liquidity on the single interbank money market. However, contrary to all the positive expectations related to the future functioning of the banking union, low and unbalanced economic growth rates remain a challenge for the maintenance of financial stability in the euro area, and this problem cannot be resolved by single supervision alone. In many countries bank assets exceed GDP several times over, and large banks are still a matter of concern because of their systemic importance for individual countries and the euro zone as a whole. The creation of the SSM and the SRM should increase the transparency of the banking system in the euro area and restore the confidence that was disturbed during the depression. It would provide a new opportunity to strengthen the economic and financial systems of the peripheral countries.
On the other hand, there is a potential threat that the future focus of the ECB, the resolution mechanism and other relevant institutions will be strongly oriented towards the large and significant banks (half of which operate in the core and most important euro area countries), while it is questionable to what extent the common resolution funds will be used to rescue less important institutions.
Keywords: banking union, financial integration, single supervision mechanism (SSM)
Procedia PDF Downloads 472
1664 The Effect of Multiple Environmental Conditions on Acacia senegal Seedling’s Carbon, Nitrogen, and Hydrogen Contents: An Experimental Investigation
Authors: Abdelmoniem A. Attaelmanan, Ahmed A. H. Siddig
Abstract:
This study was conducted in light of continual global climate changes projected to increase aridity, change soil fertility, and increase pollution. Plant growth and development largely depend on the combination of available water and nutrients in the soil, and changes in the climate and atmospheric chemistry can seriously affect these growth factors. Plant carbon (C), nitrogen (N), and hydrogen (H) play a fundamental role in the maintenance of ecosystem structure and function. Hashab (Acacia senegal), which produces gum arabic, supports dryland ecosystems in tropical zones through its potential to restore degraded soils; hence it is ecologically and economically important for the dry areas of sub-Saharan Africa. The study aims at investigating the effects of water stress (simulated drought) and poor soil type on Acacia senegal C, N, and H contents. Seven-day-old seedlings were assigned to the treatments in a split-plot design for four weeks, with irrigation interval (well-watered and water-stressed) as the main-plot factor and soil type (silt and sand soils) as the subplot factor. Seedlings' C%, N%, and H% were measured using a CHNS-O analyzer, applying the Standard Test Method. Irrigation interval and soil type had no effect on whole-seedling or leaf C%, N%, and H%; irrigation interval affected stem C% and H%; both irrigation interval and soil type affected root N%; and an interaction effect of water and soil was found on leaf and root N%. In synthesis, the application of well-watered irrigation with soil that is rich in N and other nutrients would yield the greatest seedling C, N, and H content, which will enhance growth and biomass accumulation and can play a crucial role in ecosystem productivity and services in dryland regions.
Keywords: Acacia senegal, Africa, climate change, drylands, nutrients biomass, Sub-Saharan, Sudan
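The water-by-soil interaction reported for root N% can be illustrated numerically. The cell means below are invented placeholders (the abstract reports only which effects were significant, not their values); the sketch merely shows how an interaction contrast in a 2×2 layout is computed.

```python
# Hypothetical cell means of root N% in the 2x2 irrigation-by-soil layout
means = {
    ("well_watered", "silt"): 2.1, ("well_watered", "sand"): 1.4,
    ("stressed", "silt"): 1.3, ("stressed", "sand"): 1.2,
}

# Interaction contrast: the soil effect under watering minus the soil effect under stress
soil_effect_watered = means[("well_watered", "silt")] - means[("well_watered", "sand")]
soil_effect_stressed = means[("stressed", "silt")] - means[("stressed", "sand")]
interaction = soil_effect_watered - soil_effect_stressed
print(round(interaction, 2))  # 0.6: the soil effect depends on irrigation, i.e., an interaction
```

A nonzero contrast of this kind is what a split-plot ANOVA tests when it reports a water-by-soil interaction.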
Procedia PDF Downloads 118
1663 Biomolecules Based Microarray for Screening Human Endothelial Cells Behavior
Authors: Adel Dalilottojari, Bahman Delalat, Frances J. Harding, Michaelia P. Cockshell, Claudine S. Bonder, Nicolas H. Voelcker
Abstract:
Endothelial Progenitor Cell (EPC) based therapies continue to be of interest for treating ischemic events, based on their proven role in promoting blood vessel formation and thus tissue re-vascularisation. Current strategies for the production of clinical-grade EPCs require the in vitro isolation of EPCs from peripheral blood followed by cell expansion to provide sufficient quantities of EPCs for cell therapy. This study aims to examine the use of different biomolecules to significantly improve the current strategy of EPC capture and expansion on collagen type I (Col I). In this study, four different biomolecules were immobilised on a surface and then investigated for their capacity to support EPC capture and proliferation. First, a cell microarray platform was fabricated by coating a glass surface with an epoxy-functional allyl glycidyl ether plasma polymer (AGEpp) to mediate biomolecule binding. The four candidate biomolecules tested were Col I, collagen type II (Col II), collagen type IV (Col IV) and vascular endothelial growth factor A (VEGF-A), which were arrayed on the epoxy-functionalised surface using a non-contact printer. The surrounding area between the printed biomolecules was passivated with polyethylene glycol-bisamine (A-PEG) to prevent non-specific cell attachment. EPCs were seeded onto the microarray platform and cell numbers quantified after 1 h (to determine capture) and 72 h (to determine proliferation). All of the extracellular matrix (ECM) biomolecules printed demonstrated an ability to capture EPCs within 1 h of cell seeding, with Col II exhibiting the highest level of attachment. Interestingly, Col IV exhibited the highest increase in EPC expansion after 72 h when compared to Col I, Col II and VEGF-A.
These results provide information for significant improvement in the capture and expansion of human EPCs for further application.
Keywords: biomolecules, cell microarray platform, cell therapy, endothelial progenitor cells, high throughput screening
Procedia PDF Downloads 292
1662 Multi-Stage Optimization of Local Environmental Quality by Comprehensive Computer Simulated Person as Sensor for Air Conditioning Control
Authors: Sung-Jun Yoo, Kazuhide Ito
Abstract:
In this study, a comprehensive computer simulated person (CSP) that integrates a computational human model (virtual manikin) and a respiratory tract model (virtual airway) was applied to the estimation of indoor environmental quality. Moreover, an inclusive prediction method was established by integrating computational fluid dynamics (CFD) analysis with the advanced CSP, which is combined with a physiologically-based pharmacokinetic (PBPK) model and an unsteady thermoregulation model for high-accuracy analysis targeting the micro-climate around the human body and the respiratory area. This comprehensive method can estimate not only contaminant inhalation but also the constant interaction in contaminant transfer between indoor spaces, i.e., a target area for indoor air quality (IAQ) assessment, and the respiratory zone for health risk assessment. This study focused on the use of the CSP as an indoor air/thermal quality sensor, i.e., the application of the comprehensive model to the assessment of IAQ and thermal environmental quality. A demonstrative analysis was performed in order to examine the applicability of the comprehensive model to a heating, ventilation and air conditioning (HVAC) control scheme. The CSP was located at the center of a simple model room with dimensions of 3 m × 3 m × 3 m. Formaldehyde generated from the floor material was assumed as the target contaminant, and flow field, sensible/latent heat and contaminant transfer analyses in the indoor space were conducted using CFD simulation coupled with the CSP. In this analysis, thermal comfort was evaluated by thermoregulatory analysis, and respiratory exposure risks, represented by the adsorption flux/concentration at the airway wall surface, were estimated by PBPK-CFD hybrid analysis.
These analysis results concerning IAQ and thermal comfort will be fed back to the HVAC control and could be used to find a suitable ventilation rate and the energy requirement of the air conditioning system.
Keywords: CFD simulation, computer simulated person, HVAC control, indoor environmental quality
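The idea of relating a ventilation rate to a contaminant concentration can be sketched with a well-mixed mass balance; this is a textbook simplification, not the coupled CFD-CSP method of the study, and the emission rate and air-change value below are assumed numbers.

```python
# Steady-state, well-mixed room: C = E / Q, with Q = room volume * air changes per hour
room_volume_m3 = 3 * 3 * 3        # the 3 m x 3 m x 3 m model room from the study
emission_ug_per_h = 540.0         # hypothetical formaldehyde emission from the floor
ach = 0.5                         # hypothetical ventilation rate, air changes per hour

Q_m3_per_h = room_volume_m3 * ach
C_ug_per_m3 = emission_ug_per_h / Q_m3_per_h
print(C_ug_per_m3)  # 40.0 ug/m^3; halving ACH would double the steady-state concentration
```

The CFD-CSP approach replaces this single well-mixed value with a resolved concentration field and the exposure actually reaching the respiratory zone.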
Procedia PDF Downloads 361
1661 The Effect of Discontinued Water Spray Cooling on the Heat Transfer Coefficient
Authors: J. Hrabovský, M. Chabičovský, J. Horský
Abstract:
Water spray cooling is a technique typically used in heat treatment and other metallurgical processes where controlled temperature regimes are required. Water spray cooling is used in static (without movement) or dynamic (with movement of the steel plate) regimes. The static regime is notable for the fixed position of the hot steel plate and the fixed spray nozzle; this regime is typical for quenching systems focused on heat treatment of the steel plate. The second application of spray cooling is the dynamic regime, notable for its static cooling sections and moving steel plate; this regime is used in rolling and finishing mills. The fixed position of the cooling sections with nozzles and the movement of the steel plate produce a nonhomogeneous water distribution on the steel plate. The length of the cooling sections and the placement of the water nozzles, in combination with the nonhomogeneity of water distribution, lead to discontinued or interrupted cooling conditions. The impact of the static and dynamic regimes on cooling intensity and the heat transfer coefficient during the cooling of steel plates is an important issue. Heat treatment of steel is accompanied by oxide scale growth, and the oxide scale layers can significantly modify the cooling properties and intensity. The combination of the static and dynamic (section) regimes with variable thickness of the oxide scale layer on the steel surface impacts the final cooling intensity. The influence of the oxide scale layers under the different cooling regimes was studied using experimental measurements and numerical analysis. The experimental measurements compared both types of cooling regimes as well as the cooling of scale-free and oxidized surfaces.
A numerical analysis was prepared to simulate the cooling process with different section conditions and samples with different oxide scale layers.
Keywords: heat transfer coefficient, numerical analysis, oxide layer, spray cooling
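For orientation, the quantity being measured can be stated as Newton's law of cooling, q = h·(T_s − T_w); this is the standard definition of the heat transfer coefficient, while the numbers below are illustrative assumptions, not the study's measured values.

```python
# Newton's law of cooling relates surface heat flux to the heat transfer coefficient h
h_W_per_m2K = 2500.0     # hypothetical heat transfer coefficient
T_surface_C = 900.0      # hypothetical steel plate surface temperature
T_water_C = 20.0         # hypothetical spray water temperature

q_W_per_m2 = h_W_per_m2K * (T_surface_C - T_water_C)
print(q_W_per_m2)  # 2200000.0 W/m^2 removed from the plate surface
```

An oxide scale layer acts as an added thermal resistance between the hot steel and the spray, which is why it changes the effective h that such experiments determine.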
Procedia PDF Downloads 410
1660 Factors Affecting Visual Environment in Mine Lighting
Authors: N. Lakshmipathy, Ch. S. N. Murthy, M. Aruna
Abstract:
The design of lighting systems for surface mines is not an easy task because of the unique environment and work procedures encountered in mines. The primary objective of this paper is to identify the major problems encountered in mine lighting applications and to provide guidance for their solution. In surface mining, the reflectance of surrounding surfaces is one of the important factors that improve vision during night hours, but due to the typical working conditions in mines it is very difficult to fulfill this requirement, and the orientation of the light at the work site is also a challenging task. For this reason, machine operators and other workers in a mine need to be able to orient themselves in a difficult visual environment. Haul roads keep changing in step with the mining activity, and other critical areas such as dumpyards and stackyards also change over time, making such areas difficult to illuminate. Mining is a hazardous occupation, with workers exposed to adverse conditions; apart from the need for hard physical labor, there is exposure to stress and environmental pollutants like dust, noise, heat, vibration, poor illumination, radiation, etc. Visibility is restricted when operating load-haul-dump and heavy earth moving machinery (HEMM) vehicles, resulting in a number of serious accidents. One of the leading causes of these accidents is the inability of the equipment operator to clearly see people, objects or hazards around the machine. Results indicate that blind spots are caused primarily by posts, the back of the operator's cab, and by lights and light brackets. Carefully designed and implemented lighting systems provide mine workers with improved visibility and contribute to improved safety, productivity and morale.
Properly designed lighting systems can improve visibility and safety while working in opencast mines.
Keywords: contrast, efficacy, illuminance, illumination, light, luminaire, luminance, reflectance, visibility
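A basic point-source calculation often used when laying out luminaires can serve as a sketch here; the inverse-square cosine law below is standard photometry, while the intensity and geometry values are assumed for illustration only.

```python
import math

def horizontal_illuminance(intensity_cd, mount_height_m, horiz_dist_m):
    """Horizontal illuminance (lux) from a point source: E = I * cos(theta) / d^2."""
    d = math.hypot(mount_height_m, horiz_dist_m)   # slant distance to the lit point
    cos_theta = mount_height_m / d                  # incidence-angle factor
    return intensity_cd * cos_theta / d ** 2

# Hypothetical tower-mounted floodlight over a haul road
print(horizontal_illuminance(10000.0, 10.0, 0.0))  # 100.0 lux directly below the mast
```

The rapid fall-off with horizontal distance from the mast is one reason constantly shifting haul roads and dumpyards are hard to keep adequately lit.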
Procedia PDF Downloads 360
1659 Developing Indicators in System Mapping Process Through Science-Based Visual Tools
Authors: Cristian Matti, Valerie Fowles, Eva Enyedi, Piotr Pogorzelski
Abstract:
The system mapping process can be defined as a knowledge service in which a team of facilitators, experts and practitioners facilitate a guided conversation, enable the exchange of information and support an iterative curation process. System mapping processes rely on science-based tools to introduce and simplify a variety of components and concepts of socio-technical systems through metaphors while facilitating an interactive dialogue process to enable the design of co-created maps. System maps then work as 'artifacts' that provide information and focus the conversation on specific areas around the defined challenge and the related decision-making process. Knowledge management facilitates the curation of the data gathered during the system mapping sessions through practices of documentation and subsequent knowledge co-production, for which common practices from data science are applied to identify new patterns, hidden insights, recurrent loops and unexpected elements. This study presents empirical evidence on the application of these techniques to explore mechanisms by which visual tools provide guiding principles for portraying system components, key variables and types of data through the lens of climate change. In addition, data science facilitates the structuring of elements that allow the analysis of layers of information through affinity and clustering analysis and, therefore, the development of simple indicators to support the decision-making process. This paper addresses methodological and empirical elements of the horizontal learning process that integrates system mapping through visual tools, interpretation, cognitive transformation and analysis. The process is designed to introduce practitioners to simple, iterative and inclusive processes that create actionable knowledge and enable a shared understanding of the system in which they are embedded.
Keywords: indicators, knowledge management, system mapping, visual tools
Procedia PDF Downloads 195
1658 Flow-Control Effectiveness of Convergent Surface Indentations on an Aerofoil at Low Reynolds Numbers
Authors: Neel K. Shah
Abstract:
Passive flow control on aerofoils has largely been achieved through the use of protrusions such as vane-type vortex generators. Consequently, innovative flow-control concepts should be explored in an effort to improve current component performance. Therefore, experimental research has been performed at The University of Manchester to evaluate the flow-control effectiveness of a vortex generator made in the form of a surface indentation. The surface indentation has a trapezoidal planform. A spanwise array of indentations has been applied in a convergent orientation around the maximum-thickness location of the upper surface of a NACA-0015 aerofoil. The aerofoil has been tested in a two-dimensional set-up in a low-speed wind tunnel at an angle of attack (AoA) of 3° and a chord-based Reynolds number (Re) of ~2.7 × 10⁵. The baseline model has been found to suffer from a laminar separation bubble at low AoA. The application of the indentations at 3° AoA has considerably shortened the separation bubble. The indentations achieve this by shedding up-flow pairs of streamwise vortices. Despite the considerable reduction in bubble length, the increase in leading-edge suction due to the shorter bubble is limited by the removal of surface curvature and the blockage (increase in surface pressure) caused locally by the convergent indentations. Furthermore, the up-flow region of the vortices, which locally weakens the pressure recovery around the trailing edge of the aerofoil by thickening the boundary layer, also contributes to this limitation. Due to the conflicting effects of the indentations, the changes in the pressure-lift and pressure-drag coefficients, i.e., cl,p and cd,p, are small. Nevertheless, the indentations have improved cl,p and cd,p beyond the uncertainty range, i.e., by ~1.30% and ~0.30%, respectively, at 3° AoA.
Wake measurements show that turbulence intensity and Reynolds stresses have increased considerably in the indented case, implying that the indentations increase the viscous drag on the model. In summary, the convergent indentations are able to reduce the size of the laminar separation bubble, but they are not highly effective in reducing cd,p at the tested Reynolds number.
Keywords: aerofoil flow control, laminar separation bubbles, low Reynolds-number flows, surface indentations
Procedia PDF Downloads 228
1657 Competitive Advantage Challenges in the Apparel Manufacturing Industries of South Africa: Application of Porter’s Factor Conditions
Authors: Sipho Mbatha, Anne Mastament-Mason
Abstract:
South Africa's global manufacturing competitiveness was ranked 22nd (out of 38 countries) before dropping to 24th in 2013, and is expected to drop further to 25th by 2018. This impacts negatively on South Africa's industrialisation project. For industrialisation to be achieved through labour-intensive industries like the Apparel Manufacturing Industries of South Africa (AMISA), South Africa needs to identify and respond to the factors negatively impacting the development of competitive advantage. This paper applied the factor conditions from Porter's Diamond Model (1990) to understand the various challenges facing the AMISA. The factor conditions highlighted in Porter's model fall into two groups: basic and advanced factors. Two AMISA associations representing over 10 000 employees were interviewed. The largest Clothing, Textiles and Leather (CTL) apparel retail group was also interviewed, along with a government department implementing the industrialisation policy. The paper points out that while the AMISA have the basic factor conditions necessary for competitive advantage in the clothing and textiles industries, advanced factor coordination has proven to be a challenging task for the AMISA, Higher Education Institutions (HEIs) and government. Poor infrastructure maintenance has contributed to high manufacturing costs and poor quick response, as a result of the lack of advanced technologies. The use of Porter's factor conditions as a tool to analyse the sector's competitive advantage challenges and opportunities has increased knowledge of the factors that limit the AMISA's competitiveness.
It is therefore argued that studies of the other Porter's Diamond Model factors, namely demand conditions; firm strategy, structure and rivalry; and related and supporting industries, can be used to analyse the situation of the AMISA for the purposes of improving competitive advantage.
Keywords: compliance rule, apparel manufacturing industry, factor conditions, advance skills and South African industrial policy
Procedia PDF Downloads 362
1656 Use of Smartphones in 6th and 7th Grade (Elementary Schools) in Istria: Pilot Study
Authors: Maja Ruzic-Baf, Vedrana Keteles, Andrea Debeljuh
Abstract:
Younger and younger children are now using smartphones, devices which have become 'a must have' such that the life of children would be almost 'unthinkable' without one. The devices are becoming ever lighter while offering an array of options and applications, as well as the unavoidable access to the Internet, without which they would be almost unusable. Features such as taking photographs, listening to music, searching for information on the Internet, accessing social networks, and using chat and messaging services are only some of those offered by 'smart' devices. They have replaced the alarm clock, home phone, camera, tablet and other devices, and their use and possession have become part of the everyday image of young people. Apart from the positive aspects, the use of smartphones also has some downsides. For instance, free time was once usually spent in nature, playing, doing sports or other activities enabling adequate psychophysiological growth and development in children. Increased usage of smartphones during classes to check statuses on social networks, message friends, or play online games is one of the possible negative aspects of their application. Considering that the age of the smartphone-using population is decreasing and that smartphones are no longer 'foreign' to children of pre-school age (smartphones are used at home, or in coffee shops or shopping centers while waiting for parents, often to play video games inappropriate to their age), particular attention must be paid to a very sensitive group: the teenagers who almost never separate from their 'pets'. This paper is divided into two sections, a theoretical and an empirical one.
The theoretical section gives an overview of the pros and cons of smartphone usage, while the empirical section presents the results of research conducted in three elementary schools on how pupils use smartphones, specifically during classes, during breaks, and to search for information on the Internet or check status updates and 'likes’ on the Facebook social network.
Keywords: education, smartphone, social networks, teenagers
Procedia PDF Downloads 454
1655 Photophysical Study of Pyrene Butyric Acid in Aqueous Ionic Liquid
Authors: Pratap K. Chhotaray, Jitendriya Swain, Ashok Mishra, Ramesh L. Gardas
Abstract:
Ionic liquids (ILs) are molten salts consisting predominantly of ions and remaining liquid below 100°C. The unparalleled growing interest in ILs rests on their nearly unlimited design flexibility. Using ILs as a co-solvent in binary as well as ternary mixtures with molecular solvents multiplies their utility. Since polarity is one of the most widely applied solvent concepts, representing a simple and straightforward means of characterizing and ranking solvent media, its study in binary mixtures of ILs is crucial for their widespread application and development. The primary approach to assessing solution-phase intermolecular interactions, which generally occur on picosecond to nanosecond time scales, is to exploit the optical response of a photophysical probe. Pyrene butyric acid (PBA) is used as the fluorescence probe owing to its high quantum yield, long lifetime and the strong solvent-polarity dependence of its fluorescence spectra. Propylammonium formate (PAF) is the IL used in this study. Both the UV absorbance spectra and the steady-state fluorescence intensity of PBA at different concentrations of aqueous PAF reveal that, as the PAF concentration increases, both the absorbance and the fluorescence intensity increase, indicating progressive solubilization of PBA. At around 50% IL concentration, all of the PBA molecules are solubilized, as the absorbance and fluorescence intensity no longer change. Furthermore, the ratio II/IV, where band II corresponds to the transition from S1 (ν = 0) to S0 (ν = 0) and band IV to the transition from S1 (ν = 0) to S0 (ν = 2) of PBA, indicates that adding water to PAF increases the polarity of the medium. Time-domain lifetime measurements show an increase in the lifetime of PBA at higher PAF concentrations, attributable to a decrease in the non-radiative rate constant as the viscosity rises.
The monoexponential decay suggests a homogeneous solvation environment, whereas the uneven full width at half maximum (FWHM) indicates that some heterogeneity may exist around the fluorophores even in the water-IL mixed solvents.
Keywords: fluorescence, ionic liquid, lifetime, polarity, pyrene butyric acid
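The time-domain lifetime analysis described in this abstract amounts to fitting the measured decay to a monoexponential model I(t) = A·exp(-t/τ). A minimal sketch of such a fit follows; it uses synthetic data rather than the authors' measurements, and the assumed lifetime (130 ns), time axis and noise level are illustrative only:

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(t, amplitude, tau):
    # Monoexponential fluorescence decay: I(t) = A * exp(-t / tau)
    return amplitude * np.exp(-t / tau)

# Synthetic decay trace with an assumed lifetime of 130 ns (illustrative values)
rng = np.random.default_rng(0)
t = np.linspace(0, 600, 300)                       # time axis, ns
signal = mono_exp(t, 1.0, 130.0) + rng.normal(0, 0.01, t.size)

params, _ = curve_fit(mono_exp, t, signal, p0=(1.0, 100.0))
amplitude_fit, tau_fit = params
print(f"fitted lifetime: {tau_fit:.1f} ns")
```

In practice, whether a single exponential suffices (homogeneous solvation) or a multi-exponential model is needed (heterogeneity) is judged from the fit residuals.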
Procedia PDF Downloads 460
1654 Entrepreneurial Leadership in a Startup Context: A Comparative Study on Two Egyptian Startup Businesses
Authors: Nada Basset
Abstract:
Problem Statement: The study examines the important role of leading change inside start-ups and highlights the challenges faced by an entrepreneur during the startup phase of the business. Research Methods/Procedures/Approaches: A qualitative research approach was taken, using the case study analysis method. A comparative study was made between two day-care nurseries in Greater Cairo. Non-probability purposive sampling was used, and a triangulation of semi-structured interviews, document analysis and participant observation was applied simultaneously. The in-depth case study analysis took place over a longitudinal study of four calendar months. Results/Findings: Findings demonstrated that leading change in an entrepreneurial setup must be initiated by the entrepreneur, who must also be the owner of the change process. Another important finding showed that the culture of change, although created by the entrepreneur, needs the support and engagement of followers, who should share the entrepreneur's value system and vision. Conclusions and Implications: An important implication is that during the first year of a start-up's lifecycle, special emphasis must be placed on the recruitment and selection of personnel, who play a role in setting the new start-up culture and helping it grow or shrink. Another conclusion is that the success of the change must be measured in both quantitative and qualitative terms: increasing revenues and customer attrition rates (as quantitative KPIs) must be aligned with qualitative KPIs such as customer satisfaction, employee satisfaction, organizational commitment and business reputation. Originality of Paper: The paper addresses change management in an entrepreneurial context, with an empirical application to an Egyptian start-up model providing a service to both adults and children.
This distinguishes the research, as the constructs measured combine the satisfaction levels of employees, decision-makers (parents of children) and users (children).
Keywords: leadership, change management, entrepreneurship, startup business
Procedia PDF Downloads 185
1653 Hydrological Analysis for Urban Water Management
Authors: Ranjit Kumar Sahu, Ramakar Jha
Abstract:
Urban water management is the practice of managing freshwater, waste water and storm water as components of a basin-wide management plan. It builds on existing water supply and sanitation considerations within an urban settlement by incorporating urban water management within the scope of the entire river basin. The pervasive problems generated by urban development have prompted the present work to study the spatial extent of urbanization in the Golden Triangle of Odisha connecting the cities of Bhubaneswar (20.2700° N, 85.8400° E), Puri (19.8106° N, 85.8314° E) and Konark (19.9000° N, 86.1200° E), and the patterns of periodic changes in urban development (systematic/random), in order to develop future plans for (i) urbanization promotion areas and (ii) urbanization control areas. Using remote sensing with USGS (U.S. Geological Survey) Landsat 8 maps, supervised classification of the urban sprawl was carried out for 1980-2014, with particular attention to the period after 2000. This work presents the following: (i) time series analysis of hydrological data (ground water and rainfall), (ii) application of SWMM (Storm Water Management Model) and other soft computing techniques for urban water management, and (iii) uncertainty analysis of model parameters (urban sprawl and correlation analysis). The outcome of the study, discussed briefly, shows drastic growth in urbanization and depletion of ground water levels in the area. Related outcomes, such as a declining trend of rainfall and a rise in sand mining in the local vicinity, are also discussed.
Research of this kind will (i) improve water supply and consumption efficiency, (ii) upgrade drinking water quality and waste water treatment, (iii) increase the economic efficiency of services to sustain operations and investments for water, waste water and storm water management, and (iv) engage communities to reflect their needs and knowledge for water management.
Keywords: Storm Water Management Model (SWMM), uncertainty analysis, urban sprawl, land use change
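The declining rainfall trend mentioned in the abstract is the kind of result typically checked with a non-parametric trend test on the annual series. A minimal sketch of the Mann-Kendall S statistic follows; the rainfall values are made-up illustrative numbers, not data from the study:

```python
import numpy as np

def mann_kendall_s(series):
    """Mann-Kendall S statistic: positive for an upward trend,
    negative for a downward trend, near zero for no trend."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        # Sum the signs of all pairwise differences x[j] - x[i], j > i
        s += np.sign(x[i + 1:] - x[i]).sum()
    return s

# Hypothetical annual rainfall series (mm) with a declining tendency
rainfall = [1450, 1420, 1480, 1390, 1360, 1400, 1310, 1290, 1330, 1250]
s = mann_kendall_s(rainfall)
print("S =", s)  # a negative S indicates a declining trend
```

For a formal significance test, S is normalized by its variance to a Z score and compared against the standard normal distribution.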
Procedia PDF Downloads 428
1652 Effect of Rehabilitative Nursing Program on Pain Intensity and Functional Status among Patients with Discectomy
Authors: Amal Shehata
Abstract:
Low back pain related to disc prolapse is localized in the lumbar area and may radiate to the lower extremities, starting from neurons near or around the spinal canal. Much of the population may be affected by disc prolapse within their lifetime, leading to lost productivity, disability and loss of function. The purpose of the study was to examine the effect of a rehabilitative nursing program on pain intensity and functional status among patients with discectomy. Design: A quasi-experimental design was utilized. Setting: The study was carried out at the neurosurgery department and outpatient clinics of Menoufia University and Teaching Hospitals in Menoufia governorate, Egypt. Instruments: Five instruments were used for data collection: a structured interviewing questionnaire, a functional assessment instrument, an observational checklist, a numeric rating scale and the Oswestry low back pain disability questionnaire. Results: There was a greater improvement in mean total knowledge score about the disease process, discectomy and the rehabilitation program in the study group (25.32%) than in the control group (7.32%). There was a highly statistically significant improvement in lumbar flexibility in the study group (80%) compared with the control group (30%) after the rehabilitation program. There was also a larger decrease in pain score in the study group (58% no pain) than in the control group (28% no pain) after the rehabilitation program. There was an improvement in the total disability score of the study group (zero %) regarding the effect of pain on activities of daily living after the rehabilitation program, compared with the control group (16%). Conclusion: Application of a rehabilitative nursing program for patients with discectomy had a proven positive effect on knowledge score, pain reduction, activities of daily living and functional abilities. Recommendation: A continuous rehabilitative nursing program should be carried out for all patients immediately after discectomy surgery on a regular basis.
Also, a colored illustrated booklet about the rehabilitation program should be available and distributed to all patients before surgery.
Keywords: discectomy, rehabilitative nursing program, pain intensity, functional status
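The study-versus-control comparison of outcome rates reported above (58% vs. 28% "no pain") is the kind of result usually tested with a chi-square test on a 2x2 contingency table. A minimal sketch follows; the group size of 50 patients per arm is an assumption, not a figure from the abstract, and the counts are derived from the reported percentages for illustration only:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table of "no pain" vs. "pain" after the program,
# assuming 50 patients per group and the abstract's 58% vs. 28% rates.
table = np.array([
    [29, 21],  # study group:   29 no pain, 21 with pain
    [14, 36],  # control group: 14 no pain, 36 with pain
])
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
```

Under these assumed sample sizes the difference between groups would be statistically significant at the conventional 0.05 level.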
Procedia PDF Downloads 142