Search results for: supply chain methodology
6855 Application of Groundwater Level Data Mining in Aquifer Identification
Authors: Liang Cheng Chang, Wei Ju Huang, You Cheng Chen
Abstract:
Investigation and research are key to the conjunctive use of surface water and groundwater resources. The hydrogeological structure is an important basis for groundwater analysis and simulation. Traditionally, the hydrogeological structure is determined manually from geological drill logs, well structures, groundwater levels, and so on. In Taiwan, a groundwater observation network has been built, and a large amount of groundwater-level observation data is available. The groundwater level is the state variable of the groundwater system, reflecting the combined response of the hydrogeological structure, groundwater injection, and extraction. This study applies analytical tools to the observation database to develop a methodology for identifying confined and unconfined aquifers. These tools include frequency analysis, cross-correlation analysis between rainfall and groundwater level, groundwater recession curve analysis, and a decision tree. The developed methodology is then applied to groundwater layer identification in two groundwater systems: the Zhuoshui River alluvial fan and the Pingtung Plain. The frequency analysis applies the Fourier transform to time-series groundwater-level observations to analyze the daily-frequency amplitude of the groundwater level caused by artificial groundwater extraction. The cross-correlation analysis between rainfall and groundwater level is used to obtain the groundwater replenishment time between infiltration and the peak groundwater level during wet seasons. The groundwater recession curve, the average rate of groundwater recession, is used to analyze the internal flux in the groundwater system and the flux caused by artificial activity. The decision tree combines the information obtained from the above analytical tools to derive the best estimate of the hydrogeological structure. 
The developed method achieves a training accuracy of 92.31% and a verification accuracy of 93.75% on the Zhuoshui River alluvial fan, and a training accuracy of 95.55% and a verification accuracy of 100% on the Pingtung Plain. These high accuracies indicate that the developed methodology is an effective tool for identifying hydrogeological structures.
Keywords: aquifer identification, decision tree, groundwater, Fourier transform
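For illustration, the daily-frequency amplitude analysis described in this abstract can be sketched as follows. This is a minimal sketch, not the authors' implementation; the synthetic series, sampling rate, and function name are assumptions.

```python
import numpy as np

def daily_amplitude(levels, samples_per_day=24):
    """Amplitude of the 1-cycle-per-day component of a groundwater-level
    series, via FFT. A strong daily peak suggests a pumping signal from
    artificial extraction; a weak one suggests its absence."""
    n = len(levels)
    spectrum = np.abs(np.fft.rfft(levels - np.mean(levels))) * 2.0 / n
    freqs = np.fft.rfftfreq(n, d=1.0 / samples_per_day)  # cycles per day
    return spectrum[np.argmin(np.abs(freqs - 1.0))]

# Synthetic example: 30 days of hourly levels, with and without a daily cycle.
t = np.arange(30 * 24)
pumped = 10.0 + 0.5 * np.sin(2 * np.pi * t / 24)  # daily extraction cycle
quiet = 10.0 + 0.01 * np.random.default_rng(0).standard_normal(t.size)
print(daily_amplitude(pumped) > daily_amplitude(quiet))  # True
```

In the methodology above, this spectral amplitude would be one input feature to the decision tree, alongside the rainfall cross-correlation lag and the recession rate.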
Procedia PDF Downloads 157
6854 Solving Extended Linear Complementarity Problems (XLCP) - Wood and Environment
Authors: Liberto Pombal, Christian Dieter Jaekel
Abstract:
The objective of this work is to establish theoretical and numerical conditions for solving extended linear complementarity problems (XLCP), with emphasis on the horizontal linear complementarity problem (HLCP). Two new strategies for solving complementarity problems are presented, using differentiable and penalized functions, which result in a natural formalization of the horizontal linear case. The computational results of all suggested strategies are also discussed in depth in this paper. In practice, the approach allows innovative solution and optimization of value-chain problems in Angola's industrial wood (forestry) sector.
Keywords: complementarity, box constrained, optimality conditions, wood and environment
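For orientation, the standard LCP (which the HLCP treated here generalizes) seeks z such that w = Mz + q with w ≥ 0, z ≥ 0, and zᵀw = 0. A minimal feasibility check on a toy problem (matrices and function name are illustrative, not the authors' penalized formulation):

```python
import numpy as np

def is_lcp_solution(M, q, z, tol=1e-9):
    """Check the standard LCP conditions: w = M z + q with
    w >= 0, z >= 0, and complementarity z . w = 0."""
    w = M @ z + q
    return bool((w >= -tol).all() and (z >= -tol).all() and abs(z @ w) <= tol)

M = np.array([[2.0, 0.0], [0.0, 2.0]])
q = np.array([-2.0, -4.0])
print(is_lcp_solution(M, q, np.array([1.0, 2.0])))  # True: w = [0, 0]
print(is_lcp_solution(M, q, np.array([0.0, 0.0])))  # False: w = q < 0
```

The penalized-function strategies in the paper would replace this feasibility check with a differentiable merit function to be minimized.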
Procedia PDF Downloads 56
6853 Ancient Iran Water Technologies
Authors: Akbar Khodavirdizadeh, Ali Nemati Babaylou, Hassan Moomivand
Abstract:
The history of techniques for accessing water has been one of the factors in the formation of human civilizations in the ancient world. The technique that makes surface water and groundwater accessible on the ground has been an ingenious part of human life. In this study, the water techniques of ancient Iran, based on the qanat, were examined, and the water supply systems of different regions of the ancient world were also studied and compared. Six groups of ancient regions were studied under their water supply systems: ancient Greece (Archaic, 750-480 BC, and Classical, 480-323 BC), Urartu at Tuspa (850-600 BC), Petra (168-106 BC), ancient Rome (265 BC), the ancient Americas (1450 BC), and ancient Iran. Past water technologies in these areas include: water transmission systems in early urban centers, water structures for water control, bridges for water transfer, waterways for water transfer, storage of rainfall, pipes of various materials (ceramic, lead, wood, and stone) for water transfer, flood control, water reservoirs, dams, channels, wells, and qanats. The central plateau of Iran is an arid, desert region. Archaeological, geomorphological, and paleontological studies of the central region of the Iranian plateau showed that without the use of qanats, urban civilization in this region would have been difficult or even impossible. The Zarch qanat is the most important qanat in the Yazd region: a plain qanat with a gallery length of 80 km, a mother well 85 m deep, and 2,115 well shafts. The main purpose of building the Zarch qanat was to access the groundwater source and bring it to the surface of the ground. 
Its structure and its technique for transferring water from the groundwater source to the surface distinguish it greatly from other water techniques of the ancient world. The results show that the study of ancient water technologies is very important for understanding the history of humanity's use of hydraulic techniques.
Keywords: ancient water technologies, groundwaters, qanat, human history, Ancient Iran
Procedia PDF Downloads 112
6852 Approaches to Reduce the Complexity of Mathematical Models for the Operational Optimization of Large-Scale Virtual Power Plants in Public Energy Supply
Authors: Thomas Weber, Nina Strobel, Thomas Kohne, Eberhard Abele
Abstract:
In the context of the energy transition in Germany, the importance of so-called virtual power plants in the energy supply continues to increase. The progressive dismantling of large power plants and the ongoing construction of many new decentralized plants result in great potential for optimization through synergies between the individual plants. These potentials can be exploited by mathematical optimization algorithms that calculate the optimal scheduling of decentralized power and heat generators and storage systems, including by linear or mixed-integer linear optimization. In this paper, procedures for reducing the number of decision variables to be calculated are explained and validated. The first combines n similar installation types into one aggregated unit. This aggregated unit is described by the same constraints and objective-function terms as a single plant, which reduces the number of decision variables per time step, and thus the complexity of the problem to be solved, by a factor of n. The exact operating mode of the individual plants can then be calculated in a second optimization such that the output of the individual plants matches the calculated output of the aggregated unit. Another way to reduce the number of decision variables in an optimization problem is to reduce the number of time steps to be calculated. This is useful when a high temporal resolution is not necessary for all time steps; for example, the volatility or forecast quality of environmental parameters may justify a high or low temporal resolution of the optimization. Both approaches are examined for the resulting calculation time as well as for optimality. 
Several optimization models for virtual power plants (combined heat and power plants, heat storage, power storage, gas turbines) with different numbers of plants are used as a reference for investigating both approaches with regard to calculation time and optimality.
Keywords: CHP, Energy 4.0, energy storage, MILP, optimization, virtual power plant
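The aggregation idea can be illustrated with a toy dispatch problem. This is a greedy sketch, not the paper's MILP: an aggregated unit needs one variable per time step (with its capacity bound scaled by n), and the per-unit schedule is recovered afterwards by splitting the total. All names and numbers here are assumptions.

```python
import numpy as np

def dispatch_individual(demand, n_units, p_max):
    """Naive per-unit dispatch: n decision variables per time step."""
    out = np.zeros((len(demand), n_units))
    for t, d in enumerate(demand):
        remaining = min(d, n_units * p_max)
        for i in range(n_units):
            out[t, i] = min(p_max, remaining)  # fill units greedily
            remaining -= out[t, i]
    return out

def dispatch_aggregated(demand, n_units, p_max):
    """Aggregated unit: one variable per time step, bound scaled by n."""
    return np.minimum(demand, n_units * p_max)

demand = np.array([3.0, 7.0, 12.0])
n, p_max = 5, 2.0
agg = dispatch_aggregated(demand, n, p_max)   # 3 variables
ind = dispatch_individual(demand, n, p_max)   # 15 variables
print(np.allclose(ind.sum(axis=1), agg))  # True: same total output
```

The second-stage disaggregation in the paper plays the role of `dispatch_individual` here: it reconstructs feasible per-plant outputs that sum to the aggregated solution.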
Procedia PDF Downloads 178
6851 High Purity Lignin for Asphalt Applications: Using the Dawn Technology™ Wood Fractionation Process
Authors: Ed de Jong
Abstract:
Avantium is a leading technology development company and a frontrunner in renewable chemistry. Avantium develops disruptive technologies that enable the production of sustainable high-value products from renewable materials and actively seeks collaborations and partnerships with like-minded companies and academic institutions globally to speed up the introduction of chemical innovations in the marketplace. In addition, Avantium helps companies accelerate their catalysis R&D to improve efficiency and deliver increased sustainability, growth, and profits by providing proprietary systems and services. Many chemical building blocks and materials can be produced from biomass, nowadays mainly from first-generation carbohydrates, but the potential for competition with the human food chain leads brand owners to look for strategies to transition from first- to second-generation feedstocks. Non-edible lignocellulosic feedstock is an equally attractive source for producing chemical intermediates and an important part of the solution to these global issues (the Paris targets). Avantium's Dawn Technology™ separates the glucose, mixed sugars, and lignin available in non-food agricultural and forestry residues such as wood chips, wheat straw, bagasse, empty fruit bunches, and corn stover. The resulting very pure lignin is energy dense and can be used for energy generation; however, such a material is preferably deployed in higher-added-value applications. Bitumen, which is fossil based, is mostly used for paving applications. Traditional hot-mix asphalt emits large quantities of the greenhouse gases CO₂, CH₄, and N₂O, which is unfavorable for obvious environmental reasons. Another challenge for the bitumen industry is that the petrochemical industry is becoming more and more efficient at breaking down higher-chain hydrocarbons into lower-chain hydrocarbons with higher added value than bitumen. This reduces the availability of bitumen. 
The asphalt market, as well as governments, is looking for more sustainable alternatives in terms of GHG emissions. The use of alternative sustainable binders, which can partly replace bitumen, contributes to reducing GHG emissions and at the same time broadens the availability of binders. As lignin is a major component (around 25-30%) of lignocellulosic material, which includes terrestrial plants (e.g., trees, bushes, and grass) and agricultural residues (e.g., empty fruit bunches, corn stover, sugarcane bagasse, straw, etc.), it is highly available globally. Its chemical structure resembles that of bitumen and could therefore serve as an alternative to bitumen in applications such as roofing or asphalt. Applications such as lignin in asphalt need both fundamental research and practical proof under relevant use conditions. From a fundamental point of view, rheological aspects as well as mixing are key criteria. From a practical point of view, behavior under real road conditions is key (how easily the asphalt can be prepared, how easily it can be applied to the road, its durability, etc.). The paper discusses the fundamentals of using lignin as a bitumen replacement, as well as the status of different demonstration projects in Europe using lignin as a partial bitumen replacement in asphalt, and especially presents the results of using Dawn Technology™ lignin as a partial replacement of bitumen.
Keywords: biorefinery, wood fractionation, lignin, asphalt, bitumen, sustainability
Procedia PDF Downloads 154
6850 Robust Recognition of Locomotion Patterns via Data-Driven Machine Learning in the Cloud Environment
Authors: Shinoy Vengaramkode Bhaskaran, Kaushik Sathupadi, Sandesh Achar
Abstract:
Human locomotion recognition is important in a variety of sectors, such as robotics, security, healthcare, fitness tracking, and cloud computing. With the increasing pervasiveness of peripheral devices, particularly Inertial Measurement Unit (IMU) sensors, researchers have attempted to exploit these advancements to precisely and efficiently identify and categorize human activities. This paper introduces a state-of-the-art methodology for recognizing human locomotion patterns in a cloud environment, based on a publicly available benchmark dataset. The investigation implements a denoising and windowing strategy to deal with the unprocessed data. Next, feature extraction is adopted to abstract the main cues from the data, and the SelectKBest strategy is used to select the optimal features. Furthermore, state-of-the-art ML classifiers, including logistic regression, random forest, gradient boosting, and SVM, are investigated to accomplish precise locomotion classification. Finally, a detailed comparative analysis of the results is presented to reveal the performance of the recognition models.
Keywords: artificial intelligence, cloud computing, IoT, human locomotion, gradient boosting, random forest, neural networks, body-worn sensors
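The windowing step of such pipelines can be sketched as follows; a minimal sliding-window segmenter for one IMU channel, with illustrative parameters (the paper's actual window length and overlap are not stated here).

```python
import numpy as np

def make_windows(signal, win_len, overlap):
    """Segment a 1-D sensor stream into fixed-length windows with the
    given fractional overlap. Per-window statistics (mean, std, etc.)
    would then be extracted as features for a selector such as
    SelectKBest and downstream classifiers."""
    step = max(1, int(win_len * (1.0 - overlap)))
    starts = range(0, len(signal) - win_len + 1, step)
    return np.stack([signal[s:s + win_len] for s in starts])

stream = np.arange(10.0)  # stand-in for one IMU channel
w = make_windows(stream, win_len=4, overlap=0.5)
print(w.shape)  # (4, 4): windows starting at samples 0, 2, 4, 6
```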
Procedia PDF Downloads 11
6849 Creating Knowledge Networks: Comparative Analysis of Reference Cases
Authors: Sylvia Villarreal, Edna Bravo
Abstract:
Knowledge management focuses on coordinating technologies, people, processes, and structures to generate a competitive advantage. Considering that networks are perceived as mechanisms for knowledge creation and transfer, this research presents the stages and practices related to the creation of knowledge networks. The methodology started with a literature review adapted from the systematic literature review (SLR). The descriptive analysis includes variables such as approach (conceptual or practical), industry, knowledge management processes, and methodology (qualitative or quantitative). The content analysis includes the identification of reference cases, which were characterized by variables such as scope, creation goal, years, network approach, actors, and creation methodology. A comparative analysis determined similarities and differences among the cases documented in the knowledge-network scientific literature. It showed that, despite the need for and impact of knowledge networks in organizations, initial guidelines for their creation are not documented, so there is no guide of good practices and lessons learned. The reference cases come from the energy, education, creative, automotive, and textile industries. Their common point is a human approach: they are oriented toward interactions that facilitate the appropriation of knowledge, both explicit and tacit. The stages of every case are analyzed to propose the main elements of success.
Keywords: creation, knowledge management, network, stages
Procedia PDF Downloads 302
6848 Some Considerations about the Theory of Spatial-Motor Thinking Applied to a Traditional Fife Band in Brazil
Authors: Murilo G. Mendes
Abstract:
This text presents part of the results of a Ph.D. thesis that used John Baily's theory and method, as well as their ethnographic application, in the context of the fife flutes of the Banda Cabaçal dos Irmãos Aniceto in the state of Ceará, northeast Brazil. John Baily is a British ethnomusicologist dedicated to studying the relationships between music, musical gesture, and embodied cognition. His methodology became a useful tool to highlight historical and social aspects present in the group's instrumental music. Indigenous and without formal literacy, these musicians played and transmitted their music from generation to generation, for almost two hundred years, without any nomenclature or systematization of the fingering performed on the flute. In other words, their music, free from any theorization, is learned, felt, perceived, and processed directly through hearing and through the relationship between the instrument's motor patterns and their sonic result. For this reason, Baily's assumptions became fundamental in the analysis. As the author's methodology recommends, classes were held with the natives and provided technical musical learning and some important concepts. Then, transcriptions and analyses of musical aspects were made from patterns of movement on the instrument, incorporated through repetition and/or the intrinsic affordances of the instrument. As a result, it was discovered how the group reconciled its indigenous origins with the demands of the public authorities and the interests of the local financial elite from the mid-twentieth century. The article is structured around the cultural context of the group, where local historical and social aspects influence its social and musical practices. Then the methodological conceptions of John Baily are presented and, finally, their application to the music of the Irmãos Aniceto. 
The conclusion points to the good results of this methodology and analysis in identifying correspondences between discourse, historical-social factors, and musical text. Still, questions are raised about its application in other contexts.
Keywords: Banda Cabaçal dos Irmãos Aniceto, John Baily, pífano, spatial-motor thinking
Procedia PDF Downloads 135
6847 Development of a Technology Assessment Model by Patents and Customers' Review Data
Authors: Kisik Song, Sungjoo Lee
Abstract:
Recent years have seen an increasing number of patent disputes due to excessive competition in the global market and reduced technology life cycles; this has increased the risk of investment in technology development. While many global companies have started developing methodologies to identify promising technologies and assess them for decision-making, existing methodologies still have some limitations. In particular, post hoc assessments of new technologies are not performed to determine whether the suggested technologies turned out to be promising. For example, in existing quantitative patent analysis, a patent's citation information has served as an important metric for quality assessment, but this analysis cannot be applied to recently registered patents because such information accumulates over time. Therefore, we propose a new technology assessment model that can replace citation information and positively affect technological development, based on post hoc analysis of the patents for promising technologies. Additionally, we collect customer reviews on a target technology to extract keywords that reflect customers' needs, and we determine how many of those keywords are covered by the new technology. Finally, we construct a portfolio (based on a technology assessment from patent information) and a customer-based marketability assessment (based on review data), and we use them to visualize the characteristics of the new technologies.
Keywords: technology assessment, patents, citation information, opinion mining
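The keyword-coverage idea can be sketched as a simple set operation. All names, keywords, and the sample text below are hypothetical; the paper's actual opinion-mining pipeline would extract keywords from real review corpora.

```python
def keyword_coverage(review_keywords, tech_description):
    """Fraction of customer-need keywords that appear (as substrings)
    in the description of the new technology."""
    doc = tech_description.lower()
    covered = {kw for kw in review_keywords if kw.lower() in doc}
    return len(covered) / len(review_keywords), covered

needs = ["battery life", "fast charging", "waterproof"]
patent_text = "A waterproof casing enabling fast charging of portable devices."
ratio, hits = keyword_coverage(needs, patent_text)
print(round(ratio, 2))  # 0.67: two of three needs covered
```

A real implementation would match on stemmed or embedded terms rather than raw substrings, but the marketability score has this same numerator/denominator structure.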
Procedia PDF Downloads 466
6846 Parallel Gripper Modelling and Design Optimization Using Multi-Objective Grey Wolf Optimizer
Authors: Golak Bihari Mahanta, Bibhuti Bhusan Biswal, B. B. V. L. Deepak, Amruta Rout, Gunji Balamurali
Abstract:
Robots are widely used in the manufacturing industry for rapid production with high accuracy and precision. With the help of end-of-arm tools (EOATs), robots interact with the environment. Robotic grippers are EOATs that grasp objects in an automation system, improving its efficiency. As the robotic gripper directly influences the quality of the product, owing to the contact between the gripper surface and the object to be grasped, it is necessary to design and optimize the gripper mechanism configuration. In this study, geometric and kinematic modeling of a parallel gripper is proposed, and the grey wolf optimizer algorithm is introduced for solving the resulting multi-objective gripper optimization problem. Two objective functions developed from the geometric and kinematic modeling, along with several nonlinear constraints of the proposed gripper mechanism, are used to optimize the design variables of the system. Finally, the proposed methodology is compared with previously proposed methods such as the teaching-learning-based optimization (TLBO) algorithm, NSGA-II, and MODE, and it is seen that the proposed method is more efficient than the earlier methodologies.
Keywords: gripper optimization, metaheuristics, teaching learning based algorithm, multi-objective optimization, optimal gripper design
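For readers unfamiliar with the grey wolf optimizer, the core alpha/beta/delta update can be sketched on a single-objective test function. This is a minimal sketch, not the paper's multi-objective variant (which additionally maintains an archive and selects leaders from it); the population size, iteration count, and test function are assumptions.

```python
import numpy as np

def gwo(f, dim, bounds, n_wolves=20, iters=200, seed=0):
    """Minimal single-objective Grey Wolf Optimizer: each wolf moves
    toward positions guided by the three best wolves (alpha, beta,
    delta), with the exploration parameter `a` decaying linearly."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_wolves, dim))
    for it in range(iters):
        fit = np.apply_along_axis(f, 1, X)
        order = np.argsort(fit)
        alpha, beta, delta = X[order[0]], X[order[1]], X[order[2]]
        a = 2.0 * (1 - it / iters)  # exploration -> exploitation
        new = np.zeros_like(X)
        for leader in (alpha, beta, delta):
            r1 = rng.random((n_wolves, dim))
            r2 = rng.random((n_wolves, dim))
            A, C = 2 * a * r1 - a, 2 * r2
            new += leader - A * np.abs(C * leader - X)
        X = np.clip(new / 3.0, lo, hi)  # average of the three leader moves
    fit = np.apply_along_axis(f, 1, X)
    i = int(fit.argmin())
    return X[i], float(fit[i])

best, best_f = gwo(lambda x: (x ** 2).sum(), dim=2, bounds=(-5, 5))
print(best_f)  # typically far below 1e-2 on the sphere function
```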
Procedia PDF Downloads 188
6845 Tree-Based Inference for Regionalization: A Comparative Study of Global Topological Perturbation Methods
Authors: Orhun Aydin, Mark V. Janikas, Rodrigo Alves, Renato Assuncao
Abstract:
In this paper, a tree-based perturbation methodology for regionalization inference is presented. Regionalization is a constrained optimization problem that aims to create groups with similar attributes while satisfying spatial contiguity constraints. As in any constrained optimization problem, the spatial constraint may hinder convergence to a global minimum, resulting in spatially contiguous members of a group with dissimilar attributes. This paper presents a general methodology for rigorously perturbing spatial constraints through the use of random spanning trees. The framework can be used to quantify the effect of the spatial constraints on the overall regionalization result. We compare several types of stochastic spanning trees used in inference problems such as fuzzy regionalization and determining the number of regions. The performance of stochastic spanning trees is juxtaposed against the traditional permutation-based hypothesis testing frequently used in spatial statistics. Inference results for fuzzy regionalization and for determining the number of regions are presented on the Local Area Personal Income data for Texas counties provided by the Bureau of Economic Analysis.
Keywords: regionalization, constrained clustering, probabilistic inference, fuzzy clustering
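One simple way to draw a random spanning tree of a contiguity graph is Kruskal's algorithm on a shuffled edge list; this is an illustrative sketch, not necessarily one of the specific stochastic tree schemes compared in the paper.

```python
import random

def random_spanning_tree(nodes, edges, seed=0):
    """Random spanning tree via union-find Kruskal on shuffled edges.
    Cutting k-1 tree edges then yields k spatially contiguous regions."""
    rng = random.Random(seed)
    parent = {v: v for v in nodes}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    shuffled = edges[:]
    rng.shuffle(shuffled)
    tree = []
    for u, v in shuffled:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            tree.append((u, v))
    return tree

# 3x3 grid graph: rook-contiguity edges between adjacent cells.
nodes = [(i, j) for i in range(3) for j in range(3)]
edges = [((i, j), (i + 1, j)) for i in range(2) for j in range(3)] + \
        [((i, j), (i, j + 1)) for i in range(3) for j in range(2)]
tree = random_spanning_tree(nodes, edges)
print(len(tree))  # 8 edges = |V| - 1, so the tree spans all 9 cells
```

Repeating this with different seeds perturbs the contiguity constraint globally, which is the basis for the inference procedure described above.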
Procedia PDF Downloads 229
6844 Increasing Business Competitiveness in Georgia in Terms of Globalization
Authors: Badri Gechbaia, Levan Gvarishvili
Abstract:
Although many Georgian scientists have worked on the issue of business competitiveness, we think it is necessary to deepen the work in this sphere: to refine the methodology for estimating business competitiveness, to identify the main factors that define competitive advantage in business, to establish the interconnections between the level of business competitiveness and the quality of the state's involvement in international economic processes, and to define ways to raise business competitiveness and its role in upgrading the country's economic development. The introduction justifies the relevance of the topic and the thesis; it defines the survey subject, object, and goals with relevant objectives; the theoretical-methodological and informational-statistical base for the survey; what is new in the survey; and its value for theoretical and practical application. The study is an effort to raise public awareness of this issue. It analyzes the fundamental conditions for the efficient functioning of business in Georgia and identifies reserves for increasing its efficiency based on an assessment of the strengths and weaknesses of the business sector. Methods of system analysis, abstract logic, induction and deduction, synthesis and generalization, and positive, normative, and comparative analysis are used in the research process. Specific regularities of the impact of the globalization process on the determinants of business competitiveness are established, and the reasons for the state of business competitiveness in Georgia are identified.
Keywords: competitiveness, methodology, Georgian, economic
Procedia PDF Downloads 113
6843 Clinical Validation of C-PDR Methodology for Accurate Non-Invasive Detection of Helicobacter pylori Infection
Authors: Suman Som, Abhijit Maity, Sunil B. Daschakraborty, Sujit Chaudhuri, Manik Pradhan
Abstract:
Background: Helicobacter pylori is a common and important human pathogen and the primary cause of peptic ulcer disease and gastric cancer. Currently, H. pylori infection is detected in both invasive and non-invasive ways, but the diagnostic accuracy is not up to the mark. Aim: To set an optimal diagnostic cut-off value for the 13C-urea breath test (13C-UBT) to detect H. pylori infection, and to evaluate a novel c-PDR methodology to overcome the inconclusive grey zone. Materials and Methods: All 83 subjects first underwent upper-gastrointestinal endoscopy followed by a rapid urease test and histopathology; depending on these results, we classified 49 subjects as H. pylori positive and 34 as negative. After an overnight fast, patients took 4 g of citric acid in 200 ml of water, and 10 minutes after ingestion of this test meal, a baseline exhaled-breath sample was collected. Thereafter, an oral dose of 75 mg of 13C-urea dissolved in 50 ml of water was given, and breath samples were collected up to 90 minutes at 15-minute intervals and analysed by laser-based, high-precision cavity-enhanced spectroscopy. Results: We studied the excretion kinetics of the 13C isotope enrichment (expressed as δDOB13C ‰) of the exhaled-breath samples and found maximum enrichment around 30 minutes in H. pylori positive patients; this is due to acid-stimulated urease activity, with maximum acidification occurring within 30 minutes. No such significant isotopic enrichment was observed for H. pylori negative individuals. Using a receiver operating characteristic (ROC) curve, an optimal diagnostic cut-off value of δDOB13C ‰ = 3.14 was determined at 30 minutes, exhibiting 89.16% accuracy. To overcome the grey-zone problem, we explored the percentage dose of 13C recovered per hour, i.e., 13C-PDR (%/hr), and the cumulative percentage dose of 13C recovered, i.e., c-PDR (%), in exhaled-breath samples for the present 13C-UBT. 
We further explored the diagnostic accuracy of the 13C-UBT by constructing a ROC curve using c-PDR (%) values; an optimal cut-off value was estimated to be c-PDR = 1.47% at 60 minutes, exhibiting 100% diagnostic sensitivity, 100% specificity, and 100% accuracy of the 13C-UBT for the detection of H. pylori infection. We also elucidated the gastric emptying process of the present 13C-UBT for H. pylori positive patients: the maximal emptying rate was found at 36 minutes and the half-emptying time at 45 minutes. Conclusions: The present study demonstrates the importance of the c-PDR methodology in overcoming the grey-zone problem of the 13C-UBT for accurate determination of infection without risk of diagnostic error, making it a sufficiently robust and novel method for accurate and fast non-invasive diagnosis of H. pylori infection for large-scale screening purposes.
Keywords: 13C-Urea breath test, c-PDR methodology, grey zone, Helicobacter pylori
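Since c-PDR is the time integral of PDR, the classification step can be sketched with a trapezoidal rule. The PDR values below are purely hypothetical illustrations; only the 1.47% cut-off at 60 minutes comes from the study.

```python
def cpdr(times_min, pdr_per_hr):
    """Cumulative percentage dose of 13C recovered, c-PDR(t): trapezoidal
    integral of PDR (%/hr) over time (converted from minutes to hours)."""
    t = [m / 60.0 for m in times_min]
    return sum((t[i + 1] - t[i]) * (pdr_per_hr[i] + pdr_per_hr[i + 1]) / 2.0
               for i in range(len(t) - 1))

# Hypothetical PDR curve sampled every 15 min up to 60 min.
t = [0, 15, 30, 45, 60]
pdr = [0.0, 1.2, 2.4, 1.8, 1.0]  # %/hr, illustrative values only
c = cpdr(t, pdr)
infected = c >= 1.47             # cut-off from the study, at 60 minutes
print(round(c, 3), infected)     # 1.475 True
```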
Procedia PDF Downloads 301
6842 On the Possibility of Real Time Characterisation of Ambient Toxicity Using Multi-Wavelength Photoacoustic Instrument
Authors: Tibor Ajtai, Máté Pintér, Noémi Utry, Gergely Kiss-Albert, Andrea Palágyi, László Manczinger, Csaba Vágvölgyi, Gábor Szabó, Zoltán Bozóki
Abstract:
To the best of the authors' knowledge, we here experimentally demonstrate, for the first time, a quantified correlation between real-time measured optical features of the ambient aerosol and off-line measured toxicity data. Using these correlations, we present a novel methodology for the real-time characterisation of ambient toxicity based on multi-wavelength aerosol-phase photoacoustic measurement. Ambient carbonaceous particulate matter is one of the most intensively studied atmospheric constituents in climate science nowadays. Beyond its climatic impact, atmospheric soot also plays an important role as an air pollutant that harms human health. According to the latest scientific assessments, ambient soot is the second most important anthropogenic emission source, while from a health perspective it is one of the most harmful atmospheric constituents. Despite its importance, a generally accepted standard methodology for the quantitative determination of ambient toxicity is not yet available. Ambient toxicology measurement is dominantly based on the posterior analysis of filter-accumulated aerosol with limited time resolution. Most toxicological studies are based on operational definitions using different measurement protocols; therefore, comprehensive analysis of the existing data sets is limited in many cases. The situation is further complicated by the fact that, even during its relatively short residence time, the physicochemical features of the aerosol can be masked significantly by the actual ambient factors. Therefore, improving the time resolution of the existing methodology and developing real-time methodology for air-quality monitoring are pressing issues in air pollution research. 
Over the last decades, many experimental studies have verified a relation between the chemical composition of carbonaceous particulate matter and its absorption features, quantified by the absorption Angström exponent (AAE). Although the scientific community agrees that photoacoustic spectroscopy (PAS) is so far the only methodology that can measure light absorption by aerosol accurately and reliably, multi-wavelength PAS, which can selectively characterise the wavelength dependence of absorption, has become available only in the last decade. In this study, the first results of an intensive measurement campaign focusing on the physicochemical and toxicological characterisation of ambient particulate matter are presented. We demonstrate the complete microphysical characterisation of wintertime urban ambient aerosol, including optical absorption and scattering as well as size distribution, using our recently developed state-of-the-art multi-wavelength photoacoustic instrument (4λ-PAS), an integrating nephelometer (Aurora 3000), and a scanning mobility particle sizer with optical particle counter (SMPS+C). Beyond this on-line characterisation of the ambient aerosol, we also demonstrate the results of eco-, cyto-, and genotoxicity measurements based on the posterior analysis of filter-accumulated aerosol with 6-hour time resolution. We demonstrate a diurnal variation of toxicities and of AAE data deduced directly from the multi-wavelength absorption measurements.
Keywords: photoacoustic spectroscopy, absorption Angström exponent, toxicity, Ames-test
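The AAE mentioned above follows from the power-law assumption b(λ) ∝ λ^(−AAE) for the absorption coefficient b at two wavelengths. A minimal sketch (the wavelengths and coefficients below are illustrative, not campaign data):

```python
import math

def absorption_angstrom_exponent(b1, b2, lam1, lam2):
    """AAE from absorption coefficients b at two wavelengths,
    assuming b(lambda) ~ lambda^(-AAE)."""
    return -math.log(b1 / b2) / math.log(lam1 / lam2)

# Pure black carbon typically shows AAE close to 1 (b ~ 1/lambda).
b_466, b_880 = 1.0 / 466, 1.0 / 880  # illustrative values
print(round(absorption_angstrom_exponent(b_466, b_880, 466, 880), 2))  # 1.0
```

A multi-wavelength instrument fits the same power law across all of its wavelength pairs, and deviations of AAE from 1 indicate contributions such as brown carbon.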
Procedia PDF Downloads 302
6841 UF as Pretreatment of RO for Tertiary Treatment of Biologically Treated Distillery Spentwash
Authors: Pinki Sharma, Himanshu Joshi
Abstract:
Distillery spentwash contains high chemical oxygen demand (COD), biological oxygen demand (BOD), color, total dissolved solids (TDS), and other contaminants even after biological treatment. The effluent cannot be discharged as such into surface water bodies or onto land without further treatment. Reverse osmosis (RO) treatment plants have been installed in many distilleries at the tertiary level, but at most sites these plants do not work properly due to the high concentration of organic matter and other contaminants in biologically treated spentwash. To make membrane treatment a proven and reliable technology, proper pre-treatment is mandatory. In the present study, ultrafiltration (UF) was used as pre-treatment for RO at the tertiary stage. The operating parameters, namely initial pH (pH0: 2–10), trans-membrane pressure (TMP: 4-20 bar), and temperature (T: 15-43°C), were used in the experiments with the UF system. Experiments were optimized at different operating parameters in terms of COD, color, TDS, and TOC removal using response surface methodology (RSM) with a central composite design. The results showed removal of COD, color, and TDS of 62%, 93.5%, and 75.5%, respectively, with UF at optimized conditions, with the permeate flux increased from 17.5 l/m²/h (RO) to 38 l/m²/h (UF-RO). The performance of the RO system was greatly improved both in terms of pollutant removal and water recovery.
Keywords: bio-digested distillery spentwash, reverse osmosis, response surface methodology, ultra-filtration
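The central composite design used with RSM can be sketched in coded units: for k factors it combines 2^k factorial corners, 2k axial points at ±α, and centre points. This is a generic illustration, not the study's actual run table, and real designs usually replicate the centre point several times.

```python
from itertools import product

def central_composite(k, alpha=None):
    """Coded points of a central composite design for k factors:
    2^k factorial corners, 2k axial points at +/-alpha, one centre point."""
    alpha = alpha if alpha is not None else (2 ** k) ** 0.25  # rotatable
    corners = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s
            axial.append(pt)
    return corners + axial + [[0.0] * k]

# Three factors as in the study: pH0, TMP, temperature.
design = central_composite(3)
print(len(design))  # 15 runs: 8 corners + 6 axial + 1 centre
```

Each coded point is then mapped to real factor levels (e.g., −1 → pH 2, +1 → pH 10) and a quadratic response surface is fitted to the measured removals.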
Procedia PDF Downloads 347
6840 Carbon Footprint Assessment and Application in Urban Planning and Geography
Authors: Hyunjoo Park, Taehyun Kim, Taehyun Kim
Abstract:
Human life, activity, and culture depend on the wider environment. Cities offer economic opportunities for goods and services, but cannot exist in environments without food, energy, and water supply. Technological innovation in energy supply and transport speeds up the expansion of urban areas and their physical separation from agricultural land. As a result, the separation of urban and agricultural areas creates more energy demand for transporting food and goods between regions. As energy resources are depleted all over the world, the environmental impact crossing city boundaries is also growing. While advances in energy and other technologies can reduce the environmental impact of consumption, there is still a gap between energy supply and demand with current technology, even in technically advanced countries. Therefore, reducing energy demand is more realistic for sustainable development than relying solely on technological development. The purpose of this study is to introduce the application of carbon footprint assessment in the fields of urban planning and geography. In urban studies, carbon footprints have been assessed at different geographical scales, such as nation, city, region, household, and individual. Carbon footprint assessment for a nation or a city is possible using national or city-level statistics on energy consumption categories. By means of carbon footprint calculation, it is possible to compare the ecological capacity and deficit among nations and cities. The carbon footprint also offers great insight into the geographical distribution of carbon intensity at a regional level in the agricultural field. The study presents the background of carbon footprint applications in urban planning and geography through case studies, such as identifying sustainable land-use measures. At the micro level, a footprint quiz or survey can be adopted to measure household and individual carbon footprints.
For example, the first case study collected carbon footprint data from a survey measuring the home energy use and travel behavior of 2,064 households in eight cities in Gyeonggi-do, Korea. The second case study analyzed the effects of net and gross population densities on the carbon footprint of residents at an intra-urban scale in the capital city of Seoul, Korea. In this study, the individual carbon footprint of residents was calculated by converting respondents' home and travel fossil fuel use to metric tons of carbon dioxide (tCO₂), multiplying by the conversion factors equivalent to the carbon intensities of each energy source, such as electricity, natural gas, and gasoline. The carbon footprint is an important concept not only for mitigating climate change but also for sustainable development. As seen in the case studies, the carbon footprint may be measured and applied in various spatial units, including but not limited to countries and regions. These examples may provide new perspectives on carbon footprint application in planning and geography. In addition, further concerns about the consumption of food, goods, and services can be included in carbon footprint calculations in urban planning and geography.
Keywords: carbon footprint, case study, geography, urban planning
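The tCO₂ conversion described above amounts to multiplying each category of energy use by its emission factor and summing. A minimal sketch; the factor values and household figures below are rough placeholders, not those used in the cited case studies:

```python
# Placeholder emission factors (kgCO2 per unit of use); real studies would
# take these from national inventories, not from this illustrative table.
EMISSION_FACTORS = {
    "electricity_kwh": 0.45,  # per kWh
    "natural_gas_m3": 2.0,    # per cubic metre
    "gasoline_l": 2.3,        # per litre
}

def carbon_footprint_tco2(usage):
    """Sum each energy use times its factor; convert kg to metric tons."""
    kg = sum(EMISSION_FACTORS[k] * v for k, v in usage.items())
    return kg / 1000.0

# Hypothetical annual usage for one surveyed household.
household = {"electricity_kwh": 3500, "natural_gas_m3": 800, "gasoline_l": 1200}
footprint = carbon_footprint_tco2(household)  # tCO2 per year
```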
Procedia PDF Downloads 289
6839 Design of a Customized Freshly-Made Fruit Salad and Juices Vending Machine
Authors: María Laura Guevara Campos
Abstract:
The increasing number of vending machines makes it easy for people to find them in stores, universities, workplaces, and even hospitals. These machines usually offer products with high contents of sugar and fat, which, if consumed regularly, can result in serious health threats, such as overweight and obesity. Additionally, the energy consumption of these machines tends to be high, which has an impact on the environment as well. In order to promote the consumption of healthy food, a vending machine was designed to give the customer the opportunity to choose between a customized fruit salad and a customized fruit juice, both prepared instantly with the ingredients selected by the customer. The main parameters considered in designing the machine were: the storage of the fruits preferred in a salad and/or a juice according to a survey, the size of the machine, the use of ecological containers, and the overall energy consumption. The design followed the methodology proposed by the German Association of Engineers for mechatronic systems, which breaks the design process into several stages, from the elaboration of a list of requirements through the establishment of working principles and design concepts to the final design of the machine, which was done in 3D modelling software. With the design of this machine, the aim is to contribute to the development and implementation of healthier vending machines that offer freshly-made products, an area that receives little attention at present.
Keywords: design, design methodology, mechatronics systems, vending machines
Procedia PDF Downloads 133
6838 Genotyping of Rotaviruses in Pediatric Patients with Gastroenteritis by Using Real-Time Reverse Transcription Polymerase Chain Reaction
Authors: Recep Kesli, Cengiz Demir, Riza Durmaz, Zekiye Bakkaloglu, Aysegul Bukulmez
Abstract:
Objective: Acute diarrheal disease in children is a major cause of morbidity worldwide and a leading cause of mortality, and rotavirus is the most common agent responsible for acute gastroenteritis in developing countries. Among hospitalized children suffering from acute enteric disease, up to 50% of the analyzed specimens were positive for rotavirus. Further molecular surveillance could provide a sound basis for improving the response to epidemic gastroenteritis and could provide the data needed for the introduction of vaccination programmes in the country. The aim of this study was to investigate the prevalence of viral etiology of gastroenteritis in children aged 0-6 years with acute gastroenteritis and to determine the predominant genotypes of rotaviruses in the province of Afyonkarahisar, Turkey. Methods: An epidemiological study on rotavirus was carried out during 2016. Fecal samples were obtained from 144 rotavirus-positive children aged 0-6 years who presented to the Pediatric Diseases Outpatient clinic of ANS Research and Practice Hospital, Afyon Kocatepe University, with the complaint of diarrhea. Bacterial agents causing gastroenteritis were excluded by using bacteriological culture methods, and no growth was observed. Rotavirus antigen was examined in stool samples by both immunochromatographic (One Step Rotavirus and Adenovirus Combo Test, China) and ELISA (Premier Rotaclone, USA) methods. Rotavirus RNA was detected by using one-step real-time reverse transcription-polymerase chain reaction (RT-PCR). G and P genotypes were determined using RT-PCR with consensus primers of the VP7 and VP4 genes, followed by semi-nested type-specific multiplex PCR. Results: Of the total 144 rotavirus antigen-positive samples, 4 (2.8%) were rejected, 95 (66%) were examined, and 45 (31.2%) had not yet been examined by PCR.
Ninety-one (95.8%) of the 95 examined samples were found to be rotavirus positive with RT-PCR. The rotavirus subgenotype distributions in the G, P and G/P genotype groups were determined as: G1: 45%, G2: 27%, G3: 13%, G9: 13%, G4: 1% and G12: 1% for the G genotype; P[4]: 33%, P[8]: 66%, P[10]: 1% for the P genotype; and G1P[8]: 37%, G2P[4]: 21%, G3P[8]: 10%, G4P[8]: 1%, G9P[8]: 8%, G2P[8]: 3% for the G/P genotype. Uncommon genotype combinations accounted for 20% of the G/P genotypes. Conclusions: This study contributes to the global understanding of the molecular epidemiology of rotavirus, which will be useful in guiding the selection and application of rotavirus vaccines and effective control and intervention. Determining the diversity and rates of rotavirus genotypes will provide guidelines for developing the most suitable vaccine.
Keywords: gastroenteritis, genotyping, rotavirus, RT-PCR
Procedia PDF Downloads 241
6837 Design and Optimization for a Compliant Gripper with Force Regulation Mechanism
Authors: Nhat Linh Ho, Thanh-Phong Dao, Shyh-Chour Huang, Hieu Giang Le
Abstract:
This paper presents the design and optimization of a compliant gripper. The gripper is constructed based on the concept of a compliant mechanism with flexure hinges. A passive force regulation mechanism is presented to control the grasping force on a micro-sized object instead of using a force sensor. The force regulation mechanism is designed using planar springs. The gripper is expected to achieve a large range of displacement to handle objects of various sizes. First, the statics and dynamics of the gripper are investigated using finite element analysis in ANSYS software. Then, the design parameters of the gripper are optimized via the Taguchi method. An L9 orthogonal array is used to establish the experimental matrix. Subsequently, the signal-to-noise ratio is analyzed to find the optimal solution. Finally, response surface methodology is employed to model the relationship between the design parameters and the output displacement of the gripper. The design of experiments method is then used in a sensitivity analysis to determine the effect of each parameter on the displacement. The results showed that the compliant gripper can move with a large displacement of 213.51 mm and that the force regulation mechanism is suitable for high-precision positioning systems.
Keywords: flexure hinge, compliant mechanism, compliant gripper, force regulation mechanism, Taguchi method, response surface methodology, design of experiment
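The Taguchi step described above ranks the L9 runs by their signal-to-noise ratio; since a larger gripper displacement is desirable, the "larger-the-better" form of the ratio applies. A minimal sketch with invented run data (not the paper's measurements):

```python
import numpy as np

def sn_larger_is_better(y):
    """Larger-the-better S/N ratio: -10*log10(mean(1/y^2))."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Three repeated displacement measurements (mm) for each of 9 L9 runs,
# invented purely to illustrate the calculation.
runs = [
    [180, 182, 179], [195, 193, 196], [160, 158, 161],
    [210, 212, 209], [175, 174, 176], [188, 190, 187],
    [205, 207, 204], [150, 152, 149], [198, 199, 197],
]
sn_ratios = [sn_larger_is_better(r) for r in runs]
best_run = int(np.argmax(sn_ratios))  # run with the most robust, largest response
```

In a full Taguchi analysis the S/N ratios would then be averaged per factor level to pick the optimal setting of each design parameter.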
Procedia PDF Downloads 331
6836 Predicting Aggregation Propensity from Low-Temperature Conformational Fluctuations
Authors: Hamza Javar Magnier, Robin Curtis
Abstract:
There have been rapid advances in the upstream processing of protein therapeutics, which have shifted the bottleneck to downstream purification and formulation. Finding liquid formulations with shelf lives of up to two years is increasingly difficult for some of the newer therapeutics, which have been engineered for activity; their formulations are often viscous, can phase separate, and have a high propensity for irreversible aggregation. We explore means to develop improved predictive ability from a better understanding of how protein-protein interactions depend on formulation conditions (pH, ionic strength, buffer type, presence of excipients) and how these impact the initial steps in protein self-association and aggregation. In this work, we study the initial steps in the aggregation pathways using a minimal protein model based on square-well potentials and discontinuous molecular dynamics. The effect of model parameters, including the range of interaction, stiffness, chain length, and chain sequence, implies that protein models fold according to various pathways. By reducing the range of interactions, the folding and collapse transitions come together and follow a single-step folding pathway from the denatured to the native state. After parameterizing the model interaction parameters, we developed an understanding of low-temperature conformational properties and fluctuations and their correlation to the folding transition of proteins in isolation. The model fluctuations increase with temperature. We observe a low-temperature point below which large fluctuations are frozen out. This implies that fluctuations at low temperature can be correlated to the folding transition at the melting temperature. Because proteins "breathe" at low temperatures, defining a native state as a single structure with conserved contacts and a fixed three-dimensional structure is misleading.
Rather, we introduce a new definition of a native-state ensemble based on our understanding of core conservation, which takes into account the native fluctuations at low temperatures. This approach permits the study of the large range of length and time scales needed to link the molecular interactions to the macroscopically observed behaviour. In addition, the models studied are parameterized by fitting to experimentally observed protein-protein interactions characterized in terms of osmotic second virial coefficients.
Keywords: protein folding, native-ensemble, conformational fluctuation, aggregation
Procedia PDF Downloads 361
6835 Radionuclide Determination Study for Some Fish Species in Kuwait
Authors: Ahmad Almutairi
Abstract:
Kuwait lies to the northwest of the Arabian Gulf. The levels of radionuclides are unknown in this area. Radionuclides like ²¹⁰Po, ²²⁶Ra, and ⁹⁰Sr accumulate in certain body tissues and bones, relating primarily to dietary uptake and inhalation. A large fraction of the radiation exposure experienced by individuals comes from food-chain transfer. In this study, some types of Kuwaiti fish were studied for radionuclide determination. These fish were taken from Kuwaiti territorial waters during May. The study determines the radiation exposure from ²¹⁰Po in some fish species in Kuwait. The ²¹⁰Po concentration was found to be between 0.089 and 2.544 Bq/kg; the highest was in Zubaidy and the lowest in Hamour.
Keywords: radionuclide, radiation exposure, fish species, Zubaidy, Hamour
Procedia PDF Downloads 203
6834 Scrum Challenges and Mitigation Practices in Global Software Development of an Integrated Learning Environment: Case Study of Science, Technology, Innovation, Mathematics, Engineering for the Young
Authors: Evgeniia Surkova, Manal Assaad, Hleb Makeyeu, Juho Makio
Abstract:
The main objective of the STIMEY (Science, Technology, Innovation, Mathematics, Engineering for the Young) project is the delivery of a hybrid learning environment that combines multi-level components such as social media concepts, robotic artefacts, and radio, among others. It is based on a well-researched pedagogical framework to attract European youth to STEM (science, technology, engineering, and mathematics) education and careers. To develop and integrate these various components, STIMEY is executed in iterative research cycles leading to progressive improvements. Scrum was the development methodology of choice in the project, as studies indicated its benefits as an agile methodology in global software development, especially for e-learning and integrated learning projects. This paper describes the project partners' experience with the Scrum framework, discussing the challenges faced in its implementation and the mitigation practices employed. The authors conclude by exploring user experience tools and principles for future research, as a novel direction in supporting the Scrum development team.
Keywords: e-learning, global software development, scrum, STEM education
Procedia PDF Downloads 179
6833 Managing Sunflower Price Risk from a South African Oil Crushing Company’s Perspective
Authors: Daniel Mokatsanyane, Johnny Jansen Van Rensburg
Abstract:
The integral role that oil-crushing companies play in sunflower oil production, offering high-quality oil to refineries and end consumers, is often overlooked. Sunflower oil crushing companies in South Africa are exposed to price fluctuations arising from local and international markets. Hedging instruments enable these companies to protect themselves against unexpected price spikes and to ensure sustained profitability. A crushing company is a necessary middleman, and as such, these companies have exposure on both the purchasing and selling sides of sunflower. Sunflower oil crushing companies purchase sunflower seeds from farmers or from agricultural companies that provide storage facilities. The purchase price is determined by the supply and demand of sunflower seed, both national and international. When the price of sunflower seeds in South Africa is high but still below import parity, the crush margins realised by these companies are reduced or even negative at times. Sunflower oil crushing companies make three main products, namely oil, meal, and shells, and profits are realised from selling these three products. However, when selling sunflower oil to refineries, sunflower oil crushing companies need to hedge themselves against a reduction in vegetable oil prices. Hedging oil prices is often done via futures and is subject to specific volume commitments before a hedge position can be taken. Furthermore, South African oil-crushing companies hedge sunflower oil with international over-the-counter contracts, as South Africa is a price taker of sunflower oil and not a price maker: South Africa provides a fraction of the world's sunflower oil supply and therefore has minimal influence on price changes. The advantage of hedging using futures is that the sunflower crushing company will know the profits it will realise, but the downside is that it can no longer benefit from a price increase.
Alternative hedging instruments like options might offer a solution in which the opportunity cost is not lost and profit margins are locked in at the best possible prices for the oil crushing company. This paper aims to investigate the possibility of employing options alongside futures, simulating different scenarios to determine whether options can bridge the opportunity-cost gap.
Keywords: derivatives, hedging, price risk, sunflower, sunflower oil, South Africa
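The opportunity-cost argument above can be made concrete with a toy payoff comparison between a short futures hedge and a protective put. All prices and the option premium below are illustrative assumptions, in arbitrary currency units per ton:

```python
def short_futures_pnl(spot_at_expiry, futures_price):
    """A short futures hedge locks in futures_price wherever the spot ends up."""
    return futures_price - spot_at_expiry

def long_put_pnl(spot_at_expiry, strike, premium):
    """A put pays off when prices fall but keeps the upside, net of premium."""
    return max(strike - spot_at_expiry, 0.0) - premium

futures_price, strike, premium = 1000.0, 1000.0, 40.0

# Effective sale price of the oil under each hedge, for a fall and a rise.
price_drop, price_rise = 900.0, 1100.0
futures_drop = price_drop + short_futures_pnl(price_drop, futures_price)
futures_rise = price_rise + short_futures_pnl(price_rise, futures_price)
put_drop = price_drop + long_put_pnl(price_drop, strike, premium)
put_rise = price_rise + long_put_pnl(price_rise, strike, premium)
```

The futures hedge fixes the effective price in both scenarios, while the put gives up a little downside protection (the premium) in exchange for keeping the benefit of a price rise, which is exactly the trade-off the paper proposes to investigate.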
Procedia PDF Downloads 165
6832 Determinants of Hospital Obstetric Unit Closures in the United States 2002-2013: Loss of Hospital Obstetric Care 2002-2013
Authors: Peiyin Hung, Katy Kozhimannil, Michelle Casey, Ira Moscovice
Abstract:
Background/Objective: The loss of obstetric services has been a pressing concern in urban and rural areas nationwide. This study aims to determine the factors that contribute to the loss of obstetric care through closures of a hospital or its obstetric unit. Methods: Data from the 2002-2013 American Hospital Association annual surveys were used to identify hospitals providing obstetric services. We linked these data to the Medicare Healthcare Cost Report Information for hospital financial indicators, the US Census Bureau's American Community Survey for zip-code-level characteristics, and the Area Health Resource files for county-level clinician supply measures. A discrete-time multinomial logit model was used to determine factors contributing to obstetric unit or hospital closures. Results: Of 3,551 hospitals providing obstetric services during 2002-2013, 82% kept their units open, 12% stopped providing obstetric services, and 6% closed down completely, with state-level variations. Factors that significantly increased hospitals' probability of obstetric unit closure included an annual birth volume below 250 (adjusted marginal effect [95% confidence interval] = 34.1% [28%, 40%]), closer proximity to another hospital with obstetric services (per 10 miles: -1.5% [-2.4%, -0.5%]), being in a county with a lower family physician supply (-7.8% [-15.0%, -0.6%]), being in a zip code with a higher percentage of non-white females (per 10%: 10.2% [2.1%, 18.3%]), and lower income (per $1,000 income: -0.14% [-0.28%, -0.01%]). Conclusions: Over the past 12 years, the loss of obstetric services has disproportionately affected areas served by low-volume urban and rural hospitals, non-white and low-income communities, and counties with fewer family physicians, signaling a need to address maternity care access in these communities.
Keywords: access to care, obstetric care, service line discontinuation, hospital, obstetric unit closures
Procedia PDF Downloads 222
6831 Configuration of Water-Based Features in Islamic Heritage Complexes and Vernacular Architecture: An Analysis into Interactions of Morphology, Form, and Climatic Performance
Authors: Mustaffa Kamal Bashar Mohd Fauzi, Puteri Shireen Jahn Kassim, Nurul Syala Abdul Latip
Abstract:
It is increasingly realized that sustainability includes a response to both the climatic and cultural context of a place. To assess the cultural context, a morphological analysis of urban patterns from heritage legacies is necessary. While climatic form is derived from an analysis of meteorological data, cultural patterns and forms must be abstracted from typological and morphological study. This study aims to analyze the morphological and formal elements of water-based architectural and urban design in past Islamic vernacular complexes in hot arid regions, and how the extensive use of water was shaped and sited to act as a cooling device for an entire complex. Apart from its pleasant coolness, water can be used aesthetically, emphasizing visual axes, vividly enhancing the visual quality of the surrounding environment, and symbolically portraying the act of purity in the design. By comparing two case studies based on the analysis of how water features interact with the form, planning and morphology of two Islamic heritage complexes, Fatehpur Sikri (India) and Lahore Fort (Pakistan), with a focus on the Shish Mahal of Lahore Fort, in terms of their mass, architecture and urban planning, it is evident that water plays an integral role in their climatic amelioration via different methods of water conveyance. Both sites are known for their substantial historical value and are prominent for their sustainable vernacular buildings; for example, the courtyard of the Shish Mahal in Lahore Fort is designed to provide continuous coolness through various miniature water channels that run underneath the paved courtyard. One of the most remarkable features of this system is that all water was made dregs-free before it was inducted into these underground channels. In Fatehpur Sikri, the method of conveyance differed from that of Lahore Fort, as the need to supply water to the ridge where Fatehpur Sikri is situated became the major challenge.
Thus, supplying water to the palatial complexes was achieved by placing inhabitable water buildings within the two supply systems for raising water. The process of raising the water could be either mechanical or laborious, inside the enclosed wells and water-raising houses. The study analyzes and abstracts the water supply forms, patterns and flows in three-dimensional shapes, through the actions of evaporative cooling and wind-induced ventilation under arid climates. Through this abstraction and the analytical and descriptive relational morphology of the spatial configurations, the study can suggest an idealized spatial system for use in urban design and complexes, which can later become a methodological and abstraction tool of sustainability suited to the modern contemporary world.
Keywords: heritage site, Islamic vernacular architecture, water features, morphology, urban design
Procedia PDF Downloads 375
6830 Error Analysis in Academic Writing of EFL Learners: A Case Study for Undergraduate Students at Pathein University
Authors: Aye Pa Pa Myo
Abstract:
Writing in English is acknowledged as a complex process for English as a foreign language learners. Moreover, committing errors in writing is an inevitable part of language learners' writing. Generally, academic writing is quite difficult for most students to manage well enough to get better scores. Students commit common errors in their writing when they attempt academic writing. Error analysis deals with identifying and detecting errors and also explains the reasons for their occurrence. In this paper, the researcher attempts to examine the common errors of undergraduate students in their academic writing at Pathein University. The purpose of this research is to investigate the errors students usually commit in academic writing and to find better ways of correcting these errors in EFL classrooms. In this research, fifty third-year non-English-specialization students attending Pathein University were selected as participants. The research took one month and was conducted with a mixed methodology, using two mini-tests as research tools and collecting data quantitatively. The findings indicated that most of the students noticed their common errors after getting the necessary input and committed these errors less often after taking the mini-tests; hence, all findings will support further research related to error analysis in academic writing.
Keywords: academic writing, error analysis, EFL learners, mini-tests, mixed methodology
Procedia PDF Downloads 132
6829 Criticality of Socio-Cultural Factors in Public Policy: A Study of Reproductive Health Care in Rural West Bengal
Authors: Arindam Roy
Abstract:
Public policy is an intriguing terrain, which involves a complex interplay of administrative, social, political and economic components. There is hardly any fit-for-all formulation of public policy; Lindblom aptly categorized it as a science of muddling through. In fact, policies are both temporally and contextually determined, as one of the proponents of the policy sciences, Harold D. Lasswell, underscored in his 'contextual-configurative analysis' as early as the 1950s. Though many theoretical efforts have been made to make sense of the intricate dynamics of policy making, at the end of the day the applied area of public policy negates any such uniform, planned and systematic formulation. However, our policy makers seem to have learnt very little of that. Until recently, policy making was deemed an absolutely specialized exercise to be conducted by a cadre of professionally trained, seasoned mandarins. Attributes like homogeneity, impartiality, efficiency, and neutrality were considered the watchwords of delivering common goods. The citizen or client was conceptualized as a universal political or economic construct, to be taken care of uniformly. Moreover, policy makers usually have the proclivity to put everything into a straitjacket and to ignore the nuances therein. Hence, the least attention has been given to ground-level reality, especially the socio-cultural milieu where the policy is supposed to be applied. Consequently, a substantial amount of public money goes in vain as the intended beneficiaries remain indifferent to the delivery of public policies. The present paper, in the light of Reproductive Health Care policy in rural West Bengal, tries to underscore the criticality of socio-cultural factors in public health delivery. The Indian health sector has traversed a long way. From near non-existence at the time of independence, the Indian state has gradually built a country-wide network of health infrastructure.
Yet it has to make a major breakthrough in terms of coverage and penetration of health services in rural areas. Several factors are held responsible for this state of things, including lack of proper infrastructure, medicine, communication, ambulatory services, doctors, nursing services and trained birth attendants. Policy makers have underlined the importance of the supply side in policy formulation and implementation; the successive policy documents concerning health delivery bear testimony to it. The present paper seeks to interrogate the supply-side-oriented explanations for the failure of the delivery of health services. Instead, it looks to the demand side to find the answer. The state-led and bureaucratically engineered public health measures fail to engender demand, as these measures mostly ignore the socio-cultural nuances of health and well-being. Hence, the hiatus between the supply side and the demand side leads to a huge wastage of revenue, as health infrastructure, medicine and instruments remain unutilized in most cases. Therefore, taking proper cognizance of these factors could streamline the delivery of public health.
Keywords: context, policy, socio-cultural factor, uniformity
Procedia PDF Downloads 316
6828 Densities and Viscosities of Binary Mixture Containing Diethylamine and 2-Alkanol
Authors: Elham jassemi Zargani, Mohammad almasi
Abstract:
Densities and viscosities for binary mixtures of diethylamine + 2-alkanol (2-propanol up to 2-pentanol) were measured over the entire composition range and the temperature interval of 293.15 to 323.15 K. Excess molar volumes V_m^E and viscosity deviations Δη were calculated and correlated by a Redlich−Kister-type function to derive the coefficients and estimate the standard error. For mixtures of diethylamine with the 2-alkanols used, V_m^E and Δη are negative over the entire range of mole fraction. The observed variations of these parameters with alkanol chain length and temperature are discussed in terms of the intermolecular interactions between the unlike molecules of the binary mixtures.
Keywords: densities, viscosities, diethylamine, 2-alkanol, Redlich-Kister
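The Redlich−Kister correlation mentioned above expands the excess property as V_m^E = x1·x2·Σ_k A_k·(x1 − x2)^k and determines the coefficients A_k by least squares. A minimal sketch of that fit with synthetic data and assumed coefficients (not the measured values):

```python
import numpy as np

def rk_matrix(x1, n_terms=3):
    """Redlich-Kister basis: columns are x1*x2*(x1 - x2)**k for k = 0..n_terms-1."""
    x2 = 1.0 - x1
    return np.column_stack([x1 * x2 * (x1 - x2)**k for k in range(n_terms)])

# Mole fractions of component 1 across the composition range.
x1 = np.linspace(0.1, 0.9, 9)

# Assumed "true" coefficients used only to generate synthetic V_m^E data.
A_true = np.array([-2.0, 0.5, -0.3])
v_excess = rk_matrix(x1) @ A_true  # synthetic excess molar volumes

# Least-squares recovery of the coefficients from the data.
A_fit, *_ = np.linalg.lstsq(rk_matrix(x1), v_excess, rcond=None)
```

With real measurements the residuals of this fit give the standard error the abstract refers to.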
Procedia PDF Downloads 388
6827 Developing a Knowledge-Based Lean Six Sigma Model to Improve Healthcare Leadership Performance
Authors: Yousuf N. Al Khamisi, Eduardo M. Hernandez, Khurshid M. Khan
Abstract:
Purpose: This paper presents a Knowledge-Based (KB) model using Lean Six Sigma (L6σ) principles to enhance the performance of healthcare leadership. Design/methodology/approach: Using L6σ principles to enhance healthcare leaders' performance requires a pre-assessment of the healthcare organisation's capabilities. The model is developed using a rule-based approach to KB systems. Thus, the KB system embeds Gauging Absence of Pre-requisite (GAP) for benchmarking and the Analytical Hierarchy Process (AHP) for prioritization. A comprehensive literature review covers the main contents of the model, with typical outputs of the GAP analysis and AHP. Findings: The proposed KB system benchmarks the current position of healthcare leadership against an ideal benchmark (resulting from extensive evaluation, by the KB/GAP/AHP system, of international leadership concepts in healthcare environments). Research limitations/implications: Future work includes validating the implementation model in healthcare environments around the world. Originality/value: This paper presents a novel application of a hybrid KB methodology combining GAP and AHP. It implements L6σ principles to enhance healthcare performance. This approach assists healthcare leaders' decision making to reach performance improvement against a best-practice benchmark.
Keywords: Lean Six Sigma (L6σ), Knowledge-Based System (KBS), healthcare leadership, Gauge Absence Prerequisites (GAP), Analytical Hierarchy Process (AHP)
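The AHP prioritization embedded in the KB system derives criterion weights from a pairwise comparison matrix; one common variant uses row geometric means. A minimal sketch with a hypothetical 3-criteria matrix, not one taken from the paper:

```python
import numpy as np

# Hypothetical pairwise comparison matrix on Saaty's 1-9 scale:
# entry [i, j] says how much more important criterion i is than criterion j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Geometric-mean method: take the geometric mean of each row, then normalize.
gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()  # priority vector, sums to 1
```

A full AHP application would also check the consistency ratio of the matrix before accepting the weights.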
Procedia PDF Downloads 166
6826 Teaching Creative Thinking and Writing to Simultaneous Bilinguals: A Longitudinal Study of 6-7 Years Old English and Punjabi Language Learners
Authors: Hafiz Muhammad Fazalehaq
Abstract:
This paper documents the results of a longitudinal study of two bilingual children who speak English and Punjabi simultaneously. Their father is a native English speaker, whereas their mother speaks Punjabi; the mother can speak both languages (English and Punjabi), whereas the father speaks only English. At the age of six, these children had difficulty in creative thinking and, of course, creative writing. So, the first task for the researcher was to impress and entice the children to think creatively. Various methodologies and techniques were used to entice them to start thinking creatively, since creative thinking leads to creative writing. These children were first exposed to numerous sources, including videos, photographs, texts and audio, in order to get a taste of creative genres (stories in this case). The children were encouraged to create their own stories, sometimes with photographs and sometimes by using their favorite toys. At a second stage, they were asked to write about an event or incident. After that, they were motivated to create new stories and write them down. The length of their creative writing varied from a few sentences to two standard pages. After this six-month study, the researcher was able to develop a ten-step methodology for creating and improving creative thinking and creative writing skills in the subjects under study. This ten-step methodology entices and motivates the learner to think creatively to produce a creative piece.
Keywords: bilinguals, creative thinking, creative writing, simultaneous bilingual
Procedia PDF Downloads 352