Search results for: linear multistep methods
16218 Examining Pre-Consumer Textile Waste Recycling, Barriers to Implementation, and Participant Demographics: A Review of Literature
Authors: Madeline W. Miller
Abstract:
The global textile industry produces pollutants in the form of liquid discharge, solid waste, and emissions into the natural environment. Textile waste resulting from garment production and other manufacturing processes makes a significant contribution to the amount of waste landfilled globally. While the majority of curbside and other convenient recycling methods cater to post-consumer paper and plastics, pre-consumer textile waste is often discarded with trash and is commonly classified as ‘other’ in municipal solid waste breakdowns. On a larger scale, many clothing manufacturers and other companies utilizing textiles have not yet identified or begun using the most sustainable methods for discarding their post-industrial, pre-consumer waste. To lessen the amount of waste sent to landfills, post-industrial, pre-consumer textile waste recycling methods can be used to give textiles a new life. This process requires that textile and garment manufacturers redirect their waste to companies that use industrial machinery to shred or fiberize these materials in preparation for their second life. The goal of this literature review is to identify the recycling and reuse challenges faced by producers within the clothing and textile industry that prevent these companies from utilizing the described recycling methods, causing them to opt for landfill. The literature analyzed in this review reflects manufacturer sentiments toward waste disposal and recycling.
The results of this review indicate that the cost of logistics is the determining factor when it comes to companies recycling their pre-consumer textile waste, and that the most applicable and successful textile waste recycling methods require a company separate from the manufacturer to account for waste production, provide receptacles for waste, arrange waste transport, and identify a secondary use for the material at a price point below that of traditional waste disposal services.
Keywords: leadership demographics, post-industrial textile waste, pre-consumer textile waste, industrial shoddy
Procedia PDF Downloads 150
16217 Comparative Study Performance of the Induction Motor between SMC and NLC Modes Control
Authors: A. Oukaci, R. Toufouti, D. Dib, L. Atarsia
Abstract:
This article presents alternative techniques to vector control, namely nonlinear control and sliding mode control, and the implementation of their control laws applied to a high-performance induction motor, with the objectives of improving tracking control, ensuring stability and robustness to parameter variations, and achieving disturbance rejection. Tests are performed by numerical simulation in the Matlab/Simulink environment; the results demonstrate the efficiency and dynamic performance of the proposed strategy.
Keywords: Induction Motor (IM), Non-linear Control (NLC), Sliding Mode Control (SMC), nonlinear sliding surface
Procedia PDF Downloads 572
16216 Methods for Enhancing Ensemble Learning or Improving Classifiers of This Technique in the Analysis and Classification of Brain Signals
Authors: Seyed Mehdi Ghezi, Hesam Hasanpoor
Abstract:
This scientific article explores enhancement methods for ensemble learning with the aim of improving the performance of classifiers in the analysis and classification of brain signals. The research approach in this field consists of two main parts, each with its own strengths and weaknesses. The choice of approach depends on the specific research question and available resources. By combining these approaches and leveraging their respective strengths, researchers can enhance the accuracy and reliability of classification results, consequently advancing our understanding of the brain and its functions. The first approach focuses on utilizing machine learning methods to identify the best features among the vast array of features present in brain signals. The selection of features varies depending on the research objective, and different techniques have been employed for this purpose. For instance, the genetic algorithm has been used in some studies to identify the best features, while optimization methods have been utilized in others to identify the most influential features. Additionally, machine learning techniques have been applied to determine the influential electrodes in classification. Ensemble learning plays a crucial role in identifying the best features that contribute to learning, thereby improving the overall results. The second approach concentrates on designing and implementing methods for selecting the best classifier or utilizing meta-classifiers to enhance the final results in ensemble learning. In a different section of the research, a single classifier is used instead of multiple classifiers, employing different sets of features to improve the results. The article provides an in-depth examination of each technique, highlighting their advantages and limitations. By integrating these techniques, researchers can enhance the performance of classifiers in the analysis and classification of brain signals. 
This advancement in ensemble learning methodologies contributes to a better understanding of the brain and its functions, ultimately leading to improved accuracy and reliability in brain signal analysis and classification.
Keywords: ensemble learning, brain signals, classification, feature selection, machine learning, genetic algorithm, optimization methods, influential features, influential electrodes, meta-classifiers
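The abstract stays at a high level; as a hedged illustration of the classifier-combination idea it discusses, a minimal majority-vote combiner could look like the sketch below (the function name and the toy EEG labels are hypothetical, not taken from the study):

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier label predictions by majority vote.

    predictions: list of lists, one inner list of labels per classifier,
    all of equal length (one label per sample).
    """
    n_samples = len(predictions[0])
    combined = []
    for i in range(n_samples):
        votes = [clf_preds[i] for clf_preds in predictions]
        combined.append(Counter(votes).most_common(1)[0][0])
    return combined

# Three hypothetical classifiers labelling five EEG epochs (0 = rest, 1 = task)
clf_a = [1, 0, 1, 1, 0]
clf_b = [1, 1, 1, 0, 0]
clf_c = [0, 0, 1, 1, 1]
print(majority_vote([clf_a, clf_b, clf_c]))  # [1, 0, 1, 1, 0]
```

A meta-classifier, as mentioned in the abstract, would replace this fixed vote with a second-level model trained on the base classifiers' outputs.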
Procedia PDF Downloads 75
16215 Multidirectional Product Support System for Decision Making in Textile Industry Using Collaborative Filtering Methods
Authors: A. Senthil Kumar, V. Murali Bhaskaran
Abstract:
In the information technology field, people use various tools and software for official and personal purposes. Nowadays, buyers and sellers struggle to choose data accessing and extraction tools when trading their products, and they also worry about quality factors such as price, durability, color, size, and availability. The main purpose of this research study is to find solutions to these unsolved existing problems. The proposed algorithm, Multidirectional Rank Prediction (MDRP), is a decision-making algorithm designed to support effective strategic decisions at all levels of data extraction; it is applied to a real-time textile dataset and the results are analyzed. Finally, the results are compared with existing measurement methods such as PCC, SLCF, and VSS. The resulting accuracy is higher than that of the existing rank prediction methods.
Keywords: Knowledge Discovery in Database (KDD), Multidirectional Rank Prediction (MDRP), Pearson’s Correlation Coefficient (PCC), VSS (Vector Space Similarity)
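As a hedged sketch of the Pearson's Correlation Coefficient (PCC) baseline that MDRP is compared against, the classic user-based collaborative filtering prediction might be implemented as below; the item names and ratings are invented for illustration:

```python
import math

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length rating vectors."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb) if sa and sb else 0.0

def predict_rating(target_ratings, neighbours, item):
    """Predict the target user's rating for `item` as a PCC-weighted average
    of neighbours' deviations from their own mean rating."""
    mean_t = sum(target_ratings.values()) / len(target_ratings)
    num = den = 0.0
    for u in neighbours:
        if item not in u:
            continue
        shared = [i for i in target_ratings if i in u]
        if len(shared) < 2:
            continue  # need at least two co-rated items for a correlation
        w = pearson([target_ratings[i] for i in shared], [u[i] for i in shared])
        mean_u = sum(u.values()) / len(u)
        num += w * (u[item] - mean_u)
        den += abs(w)
    return mean_t + num / den if den else mean_t

# Hypothetical textile products rated 1-5
target = {"fabricA": 5, "fabricB": 3}
neighbours = [{"fabricA": 4, "fabricB": 2, "fabricC": 5}]
print(round(predict_rating(target, neighbours, "fabricC"), 2))  # 5.33
```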
Procedia PDF Downloads 286
16214 Comparison of Agree Method and Shortest Path Method for Determining the Flow Direction in Basin Morphometric Analysis: Case Study of Lower Tapi Basin, Western India
Authors: Jaypalsinh Parmar, Pintu Nakrani, Bhaumik Shah
Abstract:
A Digital Elevation Model (DEM) is elevation data on a virtual grid over the ground. DEMs are used in GIS applications such as hydrological modelling, flood forecasting, morphometric analysis, and surveying. For morphometric analysis, the stream flow network plays a very important role. DEMs lack accuracy and cannot match field data as closely as required for accurate morphometric results. The present study focuses on comparing the Agree method and the conventional Shortest Path method for deriving morphometric parameters in the flat region of the Lower Tapi Basin in western India. Open-source SRTM data (Shuttle Radar Topography Mission, 1 arc-second resolution) and toposheets issued by the Survey of India (SOI) were used to determine linear morphometric aspects such as stream order, number of streams, stream length, bifurcation ratio, mean stream length, mean bifurcation ratio, stream length ratio, length of overland flow, and constant of channel maintenance; areal aspects such as drainage density, stream frequency, drainage texture, form factor, circularity ratio, elongation ratio, and shape factor; and relief aspects such as relief ratio, gradient ratio, and basin relief, for 53 catchments of the Lower Tapi Basin. The stream network was digitized from the available toposheets. The Agree DEM was created using the SRTM data and the stream network from the toposheets. The results obtained were used to demonstrate a comparison between the two methods in the flat areas.
Keywords: agree method, morphometric analysis, lower Tapi basin, shortest path method
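One of the linear aspects listed above, the bifurcation ratio, has a simple closed form (Horton's R_b = N_u / N_(u+1), the ratio of stream counts in successive orders). A minimal sketch, with hypothetical stream counts rather than the study's data:

```python
def bifurcation_ratios(stream_counts):
    """Horton bifurcation ratios R_b = N_u / N_(u+1) for successive stream orders.

    stream_counts: number of streams of each order, lowest order first.
    """
    return [stream_counts[i] / stream_counts[i + 1]
            for i in range(len(stream_counts) - 1)]

def mean_bifurcation_ratio(stream_counts):
    r = bifurcation_ratios(stream_counts)
    return sum(r) / len(r)

# Hypothetical counts for a small catchment: 64 first-order, 16 second-order,
# 4 third-order, and 1 fourth-order streams
print(bifurcation_ratios([64, 16, 4, 1]))      # [4.0, 4.0, 4.0]
print(mean_bifurcation_ratio([64, 16, 4, 1]))  # 4.0
```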
Procedia PDF Downloads 239
16213 A Survey of Dynamic QoS Methods in Software Defined Networking
Authors: Vikram Kalekar
Abstract:
Modern Internet Protocol (IP) networks deploy traditional and modern Quality of Service (QoS) management methods to ensure the smooth flow of network packets during regular operations. SDN (software-defined networking) networks have also made headway into better service delivery by means of novel QoS methodologies. While many of these techniques are experimental, some have been tested extensively in controlled environments, and a few have the potential to be deployed widely in the industry. With this survey, we analyze the approaches to QoS and resource allocation in SDN and comment on possible improvements to QoS management in the context of SDN.
Keywords: QoS, policy, congestion, flow management, latency, delay, SDN
Procedia PDF Downloads 193
16212 Enhancing Rural Agricultural Value Chains through Electric Mobility Services in Ethiopia
Authors: Clemens Pizzinini, Philipp Rosner, David Ziegler, Markus Lienkamp
Abstract:
Transportation is a constitutive part of most supply and value chains in modern economies. Smallholder farmers in rural Ethiopia face severe challenges along their supply and value chains. In particular, suitable, affordable, and available transport services are in high demand. To develop a context-specific technical solution, a problem-to-solution methodology based on interaction with technology is developed. With this approach, we fill the gap between proven transportation assessment frameworks and general user-centered techniques. Central to our approach is an electric test vehicle that is implemented in rural supply and value chains for research, development, and testing. Based on our objective and the derived methodological requirements, a set of existing methods is selected. Local partners are integrated into an organizational framework that executes major parts of this research endeavour in the Arsi Zone, Oromia Region, Ethiopia.
Keywords: agricultural value chain, participatory methods, agile methods, sub-Saharan Africa, Ethiopia, electric vehicle, transport service
Procedia PDF Downloads 73
16211 An Overview of Food Waste Management Technologies; The Advantages of Using New Management Methods over the Older Methods to Reduce the Environmental Impacts of Food Waste, Conserve Resources, and Energy Recovery
Authors: Bahareh Asefi, Fereidoun Farzaneh, Ghazaleh Asefi
Abstract:
The continuously increasing amount of food waste produced on global and national scales may lead to burgeoning environmental and economic problems. At the same time, the use efficiencies of natural resources such as land, water, and energy are decreasing. On the other hand, food waste has a high energy content, which seems ideal for achieving dual benefits in terms of energy recovery and improved resource use efficiency. Therefore, to decrease the environmental impacts of food waste and conserve resources, researchers have focused on traditional methods of using food waste as a resource through approaches such as anaerobic digestion, composting, incineration, and landfill. The adverse environmental effects of growing food waste make it difficult for traditional food waste treatment and management methods to balance social, economic, and environmental benefits. The old technologies need no further development, but several new technologies, such as microbial fuel cells, food waste disposers, and bio-converting food waste technology, still need to be established or appropriately considered. Some of these new technologies can deliver multiple benefits. Since information about food waste and its management methods is critical for executable policy, this study reviews the latest information regarding the sources of food waste and its management technologies in selected countries.
Keywords: food waste, management technology, innovative method, bio converting food waste, microbial fuel cell
Procedia PDF Downloads 116
16210 An Efficient Propensity Score Method for Causal Analysis With Application to Case-Control Study in Breast Cancer Research
Authors: Ms Azam Najafkouchak, David Todem, Dorothy Pathak, Pramod Pathak, Joseph Gardiner
Abstract:
Propensity score (PS) methods have recently become a standard tool for causal inference in observational studies where exposure is not randomly assigned and confounding can therefore bias the estimation of the treatment effect on the outcome. For a binary outcome, the effect of treatment can be estimated by odds ratios, relative risks, and risk differences. However, different PS methods may give different estimates of the treatment effect. The main PS analysis methods include matching, inverse probability weighting, stratification, and covariate adjustment on the PS. Because of the dangers of discretizing continuous variables (exposure, covariates), the focus of this paper is on how variation in cut-points or boundaries affects the average treatment effect (ATE) estimated by the stratification-on-PS method. We therefore avoid choosing arbitrary cut-points; instead, we continuously discretize the PS and accumulate information across all cut-points for inference. We use Monte Carlo simulation to evaluate the ATE, focusing on two PS methods: stratification and covariate adjustment on the PS. We then illustrate the approach with data from a case-control study of breast cancer, the Polish Women’s Health Study.
Keywords: average treatment effect, propensity score, stratification, covariate adjustment, Monte Carlo estimation, breast cancer, case-control study
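To make the baseline concrete, the conventional stratification-on-PS estimator can be sketched as follows. This is a generic quantile-stratification version with a fixed number of strata, i.e. exactly the arbitrary-cut-point approach the paper seeks to move beyond, and the data in the example are synthetic:

```python
def stratified_ate(ps, treated, outcome, n_strata=5):
    """ATE by stratifying on the propensity score: sort units by PS, split
    into equal-size strata, take the treated-minus-control difference in
    mean outcomes within each stratum, and average the differences
    weighted by stratum size."""
    order = sorted(range(len(ps)), key=lambda i: ps[i])
    size = len(ps) // n_strata
    total, used = 0.0, 0
    for s in range(n_strata):
        idx = order[s * size:(s + 1) * size] if s < n_strata - 1 else order[s * size:]
        t = [outcome[i] for i in idx if treated[i] == 1]
        c = [outcome[i] for i in idx if treated[i] == 0]
        if t and c:  # stratum must contain both groups to contribute
            total += (sum(t) / len(t) - sum(c) / len(c)) * len(idx)
            used += len(idx)
    return total / used if used else float("nan")

# Synthetic example: baseline outcome 10*PS, true treatment effect exactly 1
ps      = [0.1, 0.1, 0.2, 0.2, 0.3, 0.3, 0.4, 0.4, 0.5, 0.5]
treated = [1, 0] * 5
outcome = [10 * p + t for p, t in zip(ps, treated)]
print(stratified_ate(ps, treated, outcome))  # 1.0
```

Changing `n_strata` changes the cut-points and hence, in less tidy data, the estimate, which is the sensitivity the paper addresses.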
Procedia PDF Downloads 105
16209 Potential Energy Expectation Value for Lithium Excited State (1s2s3s)
Authors: Khalil H. Al-Bayati, Nasma G. Hussein, Ban H. Adel
Abstract:
The purpose of the present work is to calculate the expectation value of the potential energy.
Keywords: lithium excited state, potential energy, 1s2s3s, mathematical physics
Procedia PDF Downloads 489
16208 Multi-Criteria Decision Making Network Optimization for Green Supply Chains
Authors: Bandar A. Alkhayyal
Abstract:
Modern supply chains are typically linear, transforming virgin raw materials into products for end consumers, who then discard them after use to landfills or incinerators. Nowadays, there are major efforts underway to create a circular economy to reduce non-renewable resource use and waste. One important aspect of these efforts is the development of Green Supply Chain (GSC) systems, which enable a reverse flow of used products from consumers back to manufacturers, where they can be refurbished or remanufactured, to both economic and environmental benefit. This paper develops novel multi-objective optimization models to inform GSC system design at multiple levels: (1) strategic planning of facility location and transportation logistics; (2) tactical planning of optimal pricing; and (3) policy planning to account for potential valuation of GSC emissions. First, physical linear programming was applied to evaluate GSC facility placement by determining the quantities of end-of-life products for transport from candidate collection centers to remanufacturing facilities while satisfying cost and capacity criteria. Second, disassembly and remanufacturing processes have received little attention in the industrial engineering and process cost modeling literature, yet the increasing scale of remanufacturing operations, worth nearly $50 billion annually in the United States alone, has made GSC pricing an important subject of research. A non-linear physical programming model for optimizing the pricing policy of remanufactured products, maximizing total profit and minimizing product recovery costs, was examined and solved. Finally, a deterministic equilibrium model was used to determine the effects of internalizing a cost of GSC greenhouse gas (GHG) emissions into the optimization models.
Changes in optimal facility use, transportation logistics, and pricing/profit margins were all investigated against a variable cost of carbon, using a case study system created from actual data from sites in the Boston area. As carbon costs increase, the optimal GSC system undergoes several distinct shifts in topology as it seeks new cost-minimal configurations. A comprehensive quantitative evaluation of model performance was conducted using orthogonal arrays. Results were compared to top-down estimates from economic input-output life cycle assessment (EIO-LCA) models to contrast remanufacturing GHG emission quantities with those from original equipment manufacturing operations. Introducing a carbon cost of $40/t CO2e increases modeled remanufacturing costs by 2.7% but also increases original equipment costs by 2.3%. The assembled work advances the theoretical modeling of optimal GSC systems and presents a rare case study of remanufactured appliances.
Keywords: circular economy, extended producer responsibility, greenhouse gas emissions, industrial ecology, low carbon logistics, green supply chains
Procedia PDF Downloads 160
16207 An Approach to Capture, Evaluate and Handle Complexity of Engineering Change Occurrences in New Product Development
Authors: Mohammad Rostami Mehr, Seyed Arya Mir Rashed, Arndt Lueder, Magdalena Missler-Behr
Abstract:
This paper presents the conception that complex problems do not necessarily need a similarly complex solution in order to cope with the complexity; rather, a simple solution based on established methods can provide a sufficient way to deal with it. To verify this conception, the paper focuses on the field of change management as part of the new product development process in the automotive sector. In this field, dealing with increasing complexity is essential, while only inflexible, rigid processes that are not designed to handle complexity are available. The basic methodology of this paper can be divided into four main sections: 1) analyzing the complexity of change management, 2) reviewing the literature to identify potential solutions and methods, 3) capturing and implementing the expertise of experts from the change management field of an automobile manufacturing company, and 4) systematically comparing the methods identified in the literature and connecting them with the defined complexity requirements of change management in order to develop a solution. As a practical outcome, this paper provides a method to capture the complexity of engineering changes (EC) and include it within the EC evaluation process, following case-related process guidance to cope with the complexity. Furthermore, this approach supports the conception that dealing with complexity is possible while utilizing rather simple and established methods by combining them into a powerful tool.
Keywords: complexity management, new product development, engineering change management, flexibility
Procedia PDF Downloads 197
16206 Correlation of Unsuited and Suited 5ᵗʰ Female Hybrid III Anthropometric Test Device Model under Multi-Axial Simulated Orion Abort and Landing Conditions
Authors: Christian J. Kennett, Mark A. Baldwin
Abstract:
As several companies are working towards returning American astronauts back to space on US-made spacecraft, NASA developed a human flight certification-by-test and analysis approach due to the cost-prohibitive nature of extensive testing. This process relies heavily on the quality of analytical models to accurately predict crew injury potential specific to each spacecraft and under dynamic environments not tested. As the prime contractor on the Orion spacecraft, Lockheed Martin was tasked with quantifying the correlation of analytical anthropometric test devices (ATDs), also known as crash test dummies, against test measurements under representative impact conditions. Multiple dynamic impact sled tests were conducted to characterize Hybrid III 5th ATD lumbar, head, and neck responses with and without a modified shuttle-era advanced crew escape suit (ACES) under simulated Orion landing and abort conditions. Each ATD was restrained via a 5-point harness in a mockup Orion seat fixed to a dynamic impact sled at the Wright Patterson Air Force Base (WPAFB) Biodynamics Laboratory in the horizontal impact accelerator (HIA). ATDs were subject to multiple impact magnitudes, half-sine pulse rise times, and XZ - ‘eyeballs out/down’ or Z-axis ‘eyeballs down’ orientations for landing or an X-axis ‘eyeballs in’ orientation for abort. Several helmet constraint devices were evaluated during suited testing. Unique finite element models (FEMs) were developed of the unsuited and suited sled test configurations using an analytical 5th ATD model developed by LSTC (Livermore, CA) and deformable representations of the seat, suit, helmet constraint countermeasures, and body restraints. Explicit FE analyses were conducted using the non-linear solver LS-DYNA. 
Head linear and rotational acceleration, head rotational velocity, upper neck force and moment, and lumbar force time histories were compared between test and analysis using the enhanced error assessment of response time histories (EEARTH) composite score index. The EEARTH rating, paired with the correlation and analysis (CORA) corridor rating, provided a composite ISO score that was used to assess model correlation accuracy. NASA occupant protection subject matter experts established an ISO score of 0.5 or greater as the minimum expectation for correlating analytical and experimental ATD responses. Unsuited 5th ATD head X, Z, and resultant linear accelerations, head Y rotational accelerations and velocities, neck X and Z forces, and lumbar Z forces all showed consistent ISO scores above 0.5 in the XZ impact orientation, regardless of peak g-level or rise time. Upper neck Y moments were near or above the 0.5 score for most of the XZ cases. Similar trends were found in the XZ and Z-axis suited tests despite the addition of several different countermeasures for restraining the helmet. For the X-axis ‘eyeballs in’ loading direction, only resultant head linear acceleration and lumbar Z-axis force produced ISO scores above 0.5, whether unsuited or suited. The analytical LSTC 5th ATD model showed good correlation across multiple head, neck, and lumbar responses in both the unsuited and suited configurations when loaded in the XZ ‘eyeballs out/down’ direction. Upper neck moments were consistently the most difficult to predict, regardless of impact direction or test configuration.
Keywords: impact biomechanics, manned spaceflight, model correlation, multi-axial loading
Procedia PDF Downloads 114
16205 Earthquake Classification in Molluca Collision Zone Using Conventional Statistical Methods
Authors: H. J. Wattimanela, U. S. Passaribu, A. N. T. Puspito, S. W. Indratno
Abstract:
The Molluca Collision Zone is located at the junction of the Eurasian, Australian, Pacific, and Philippine plates. Between the Sangihe arc, west of the collision zone, and the Halmahera arc to the east, collision is active and convex toward the Molluca Sea. This research analyzes the behavior of earthquake occurrence in the Molluca Collision Zone: the distribution of earthquakes in each partition region, the type of distribution of earthquake occurrence per partition region, the mean occurrence of earthquakes in each partition region, and the correlation between the partition regions. We calculate the number of earthquakes using a partition method and analyze their behavior using conventional statistical methods. The data used are shallow earthquakes with magnitudes ≥ 4 SR for the period 1964-2013 in the Molluca Collision Zone. From the results, we can classify the partitioned regions based on correlation into two classes: strong and very strong. This classification can be used for early warning systems in disaster management.
Keywords: molluca collision zone, partition regions, conventional statistical methods, earthquakes, classifications, disaster management
Procedia PDF Downloads 498
16204 The Effect of Simultaneous Application of Laser Beam and Magnet in Treatment of Intervertebral Disc Herniation
Authors: Alireza Moghtaderi, Negin Khakpour
Abstract:
Disc herniation is a common complication in society and one of the main reasons for referral to physical medicine and rehabilitation clinics. Despite the various methods proposed for treating this disease, there is still disagreement on the success of these methods, especially the non-surgical ones; thus the current study aims at determining the effect of laser beam and magnet therapy on the treatment of intervertebral disc herniation. In a clinical trial study, 80 patients with intervertebral disc herniation underwent a combined package of treatment including magnet therapy, laser beam, PRP, and prolotherapy over 6 months. The average age of patients was 51.25 ± 10.7 years, with a range of 25-71 years. 30 men (37.5%) and 50 women (62.5%) took part in the study. The average weight of patients was 64.3 ± 7.2 kg, with a range of 49-79 kg. The most frequent level of disc herniation was L5-S1, with 17 cases (21.3%). Disc herniation was severe in 30 cases before treatment, but this was reduced to 3 cases after treatment. This study indicates that combined treatment using non-invasive laser beam and magnet therapy is highly effective for discogenic diseases and mechanical pains of the spine.
Keywords: hallux, valgus, botulinum toxin a, pain
Procedia PDF Downloads 92
16203 Observed Changes in Constructed Precipitation at High Resolution in Southern Vietnam
Authors: Nguyen Tien Thanh, Günter Meon
Abstract:
Precipitation plays a key role in the water cycle, in defining local climatic conditions, and in ecosystems. It is also an important input parameter for water resources management and hydrologic models. With spatially continuous data, the certainty of discharge predictions or of other environmental factors is unquestionably better than without. Such data are, however, not always readily available for a small basin, especially in the coastal region of Vietnam, due to a sparse network of meteorological stations (30 stations) along a coastline 3260 km long. Furthermore, available gridded precipitation datasets are not fine enough for application to hydrologic models. Under conditions of global warming, the application of spatial interpolation methods is crucial for climate change impact studies to obtain spatially continuous data. Although in recent research projects some methods perform better than others, no single method yields the best results in all cases. The objective of this paper, therefore, is to investigate different spatial interpolation methods for daily precipitation over a small basin (approximately 400 km2) located in the coastal region of Southern Vietnam and to find the most efficient interpolation method for this catchment. Five interpolation methods, namely Cressman, ordinary kriging, regression kriging, dual kriging, and inverse distance weighting, were applied to identify the best method for the study area on a spatio-temporal scale (daily, 10 km x 10 km). A 30-year precipitation database was created and merged with available gridded datasets. Finally, observed changes in the constructed precipitation were analyzed. The results demonstrate that ordinary kriging is an effective approach for interpolating daily precipitation. Mixed trends of increasing and decreasing monthly, seasonal, and annual precipitation were documented at significant levels.
Keywords: interpolation, precipitation, trend, Vietnam
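Of the five methods compared, inverse distance weighting is the simplest to state: the value at an ungauged point is the average of gauge observations weighted by inverse distance raised to a power. A minimal sketch, with hypothetical gauge coordinates and rainfall values:

```python
import math

def idw(known_points, target, power=2):
    """Inverse distance weighting: estimate precipitation at `target`
    as a distance-weighted average of gauge observations.

    known_points: list of (x, y, value) gauge records; target: (x, y).
    """
    num = den = 0.0
    for x, y, v in known_points:
        d = math.hypot(x - target[0], y - target[1])
        if d == 0:
            return v  # target coincides with a gauge
        w = 1.0 / d ** power
        num += w * v
        den += w
    return num / den

# Hypothetical daily rainfall (mm) at three gauges on a 10 km grid
gauges = [(0.0, 0.0, 12.0), (10.0, 0.0, 20.0), (0.0, 10.0, 16.0)]
print(idw(gauges, (5.0, 5.0)))  # 16.0 (all three gauges are equidistant from the target)
```

Kriging replaces these purely geometric weights with weights derived from a fitted variogram, which is why it can outperform IDW when spatial correlation structure matters.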
Procedia PDF Downloads 275
16202 Eco-Design of Construction Industrial Park in China with Selection of Candidate Tenants
Authors: Yang Zhou, Kaijian Li, Guiwen Liu
Abstract:
Offsite construction is an innovative alternative to conventional site-based construction, with wide-ranging benefits. It requires that building components, elements, or modules be prefabricated and pre-assembled before being installed in their final locations. To improve efficiency and achieve synergies, construction companies in China have in recent years been clustered into construction industrial parks (CIPs). A CIP is a community of construction manufacturing and service businesses located together on a common property. Companies involved in industrial clusters can obtain environmental and economic benefits by sharing resources and information in a given region. Therefore, the concept of industrial symbiosis (IS) can be applied to the traditional CIP to achieve sustainable industrial development or redevelopment through the implementation of eco-industrial parks (EIPs). However, before designing a symbiosis network between companies in a CIP, candidate support tenants need to be selected to complement the existing construction companies. In this study, an access indicator system and a linear programming model are established to select candidate tenants in a CIP while satisfying the degree of connectivity among the enterprises in the CIP, minimizing the environmental impact, and maximizing the annualized profit of the CIP. The access indicator system, proposed from the perspective of the park level, comprises three primary indicators and fifteen secondary indicators. The fifteen indicators are classified under three primary indicators, namely industrial symbiosis, environmental performance, and economic benefit, according to the three dimensions of sustainability (environmental, economic, and social) and the three R's of the environment (reduce, reuse, and recycle). The linear programming model is a method to assess the satisfaction of all the indicators and to make an optimal multi-objective selection among candidate tenants.
This method provides a practical tool for planners of a CIP in evaluating which of the candidate tenants would best complement the existing anchor construction tenants. The reasonability and validity of the indicator system and the method are worth further study in the future.
Keywords: construction industrial park, China, industrial symbiosis, offsite construction, selection of support tenants
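The selection model described above can be made concrete with a toy example. Since the paper's actual indicators and coefficients are not given here, the sketch below brute-forces a 0/1 tenant selection over invented data (all tenant names and numbers are hypothetical) rather than solving a true LP; a real formulation would pass the same objective and constraints to an LP/MIP solver:

```python
from itertools import combinations

# Hypothetical candidate support tenants for a construction industrial park:
# (name, annualized profit, environmental impact score, symbiosis links to anchors)
candidates = [
    ("concrete recycler", 8.0, 3.0, 4),
    ("steel fabricator",  6.0, 4.0, 2),
    ("timber treater",    4.0, 2.0, 3),
    ("logistics hub",     5.0, 1.0, 5),
]

def select_tenants(cands, max_impact, min_links):
    """Pick the subset maximizing total profit while keeping total
    environmental impact under a cap and total symbiosis links above a
    floor - a brute-force stand-in for the paper's LP formulation."""
    best, best_profit = (), float("-inf")
    for r in range(len(cands) + 1):
        for subset in combinations(cands, r):
            profit = sum(c[1] for c in subset)
            impact = sum(c[2] for c in subset)
            links = sum(c[3] for c in subset)
            if impact <= max_impact and links >= min_links and profit > best_profit:
                best, best_profit = subset, profit
    return [c[0] for c in best], best_profit

names, profit = select_tenants(candidates, max_impact=6.0, min_links=8)
print(names, profit)  # ['concrete recycler', 'timber treater', 'logistics hub'] 17.0
```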
Procedia PDF Downloads 274
16201 Students’ and Clinical Supervisors’ Experiences of Occupational Therapy Practice Education: A Structured Critical Review
Authors: Hamad Alhamad, Catriona Khamisha, Emma Green, Yvonne Robb
Abstract:
Introduction: Practice education is a key component of occupational therapy education. This critical review aimed to explore students’ and clinical supervisors’ experiences of practice education and to make recommendations for research. Method: The literature was systematically searched using five databases. Qualitative, quantitative, and mixed methods studies were included. The Critical Appraisal Skills Programme checklist for qualitative studies and the Mixed Methods Assessment Tool for quantitative and mixed methods studies were used to assess study quality. Findings: Twenty-two studies with high quality scores were included: 16 qualitative, 3 quantitative, and 3 mixed methods. Studies were conducted in Australia, Canada, the USA, and the UK. During practice education, students learned professional, practical, clinical, and problem-solving skills and improved their confidence and creativity. Supervisors had an opportunity to reflect on their practice and gain experience of supervising students. However, clear objectives and expectations for students, and sufficient theoretical knowledge, preparation, and resources for supervisors, were required. Conclusion: Practice education provides different skills and experiences necessary to become competent professionals, but some areas of practice education need to improve. Studies in non-western countries are needed to explore the perspectives of students and clinical supervisors in different cultures, to ensure the practice education models adopted are relevant.
Keywords: occupational therapy, practice education, fieldwork, students, clinical supervisors
Procedia PDF Downloads 203
16200 Statistical and Land Planning Study of Tourist Arrivals in Greece during 2005-2016
Authors: Dimitra Alexiou
Abstract:
During the last 10 years, in spite of the economic crisis, the number of tourists arriving in Greece has increased, particularly during the tourist season from April to October. In this paper, the number of annual tourist arrivals is studied to explore tourists’ preferences with regard to the month of travel, the selected destinations, as well as the amount of money spent. The collected data are processed with statistical methods, yielding numerical and graphical results. From the computation of statistical parameters and from forecasting with exponential smoothing, useful conclusions are drawn that can be used by the Greek tourism authorities, as well as by tourist organizations, for planning purposes in the coming years. The results of this paper and the computed forecast can also be used for decision making by private tourist enterprises that are investing in Greece. With regard to the statistical methods, simple exponential smoothing of time series data is employed. The search for the best forecast for 2017 and 2018 determines the value of the smoothing coefficient. All statistical computations and graphics are done in Microsoft Excel.
Keywords: tourism, statistical methods, exponential smoothing, land spatial planning, economy
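Simple exponential smoothing, as used in this paper, updates the smoothed level as s_t = αx_t + (1 − α)s_{t−1}, with the last smoothed value serving as the one-step-ahead forecast. A minimal sketch of the recursion (the arrival figures below are hypothetical, not the paper's data):

```python
def exponential_smoothing(series, alpha):
    """Return the smoothed series; the last value is the one-step forecast."""
    smoothed = [series[0]]  # initialise with the first observation
    for x in series[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

# Hypothetical annual arrivals (millions); alpha chosen for illustration only
arrivals = [14.9, 15.5, 16.2, 17.0, 18.5, 20.1]
forecast = exponential_smoothing(arrivals, alpha=0.4)[-1]
```

In practice the smoothing coefficient α is chosen by minimising the squared one-step forecast errors over the historical series, which is the search for the best forecast that the abstract refers to.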
Procedia PDF Downloads 265
16199 Life Stories: High Quality of Life until the End with the Narrative Medicine and the Storytelling
Authors: Danila Zuffetti, Lorenzo Chiesa
Abstract:
Background: A hospice narrative interview aims to put the patient at the core of disease and treatment, allowing them to explore their most intimate facets. The aim of this work is to favor authentic narration, leading towards awareness and acceptance of terminality, and to help patients face death with serenity. Narration in palliative care helps reduce the chaos generated by the disease and supports the elaboration of interpretations of the course of reality; moreover, the narration delivered to the doctor is fundamental and communicates the meaning given to symptoms. Methods: The narrative interview has been a regular activity in the Castellini Foundation since 2017. Patients take part every week, over several days, in one-hour sessions, in a welcoming and empathic setting, and the interaction with the operator leads to a gradual awareness of their terminality. Patients are asked open-ended questions with the purpose of facilitating and stimulating self-narration. Narration has not always been linear, but patients are left free to shift in time to revisit their disease process by making use of different tools, such as digital storytelling. Results: The answers provided by the patients show the extent to which the narrative interview is an instrument allowing the analysis of the stories, and it gives the possibility to better understand and deepen the different implications of the patient’s and caregiver’s background. Conclusion: The narration work in the hospice demonstrates that narrative medicine is an added value. This instrument has proven useful not only in the support of patients but also for the palliative doctor in identifying patients’ wishes, accompanying them to the end with dignity and serenity. The narrative interview favors the construction of an authentic therapeutic relationship. Patients are taken wholly into care, and they are guaranteed a high quality of life until their very last moment.
Keywords: construction of an authentic therapy relationship, gradual awareness of their terminality, narrative interview, reduce the chaos generated by the disease
Procedia PDF Downloads 175
16198 Two Dimensional Steady State Modeling of Temperature Profile and Heat Transfer of Electrohydrodynamically Enhanced Micro Heat Pipe
Authors: H. Shokouhmand, M. Tajerian
Abstract:
A numerical investigation of laminar forced convection flow through a square cross-section micro heat pipe under an applied electrohydrodynamic (EHD) field has been carried out. In the present study, pentane is selected as the working fluid. Temperature and velocity profiles and heat transfer enhancement in the micro heat pipe under the EHD field have been numerically calculated for two-dimensional, single-phase fluid flow in the steady state regime. In this model, only the Coulomb force is considered. The study has been carried out for Reynolds numbers from 10 to 100 and EHD force fields up to 8 kV. The coupled, non-linear equations governing the model (continuity, momentum, and energy equations) have been solved simultaneously by CFD numerical methods. The steady state behavior of the affected parameters, e.g. friction factor, average temperature, Nusselt number and heat transfer enhancement criteria, has been evaluated. It has been observed that by increasing the Reynolds number, the effect of the EHD force becomes more significant, and for smaller Reynolds numbers the rate of heat transfer enhancement is higher. By obtaining and plotting the mentioned parameters, it has been shown that the EHD field enhances the heat transfer process. The numerical results show that by increasing the EHD force field, the absolute values of the Nusselt number and friction factor increase and the average temperature of the fluid flow decreases. But the rate of increase of the Nusselt number is greater than that of the friction factor, which makes applying an EHD force field for heat transfer enhancement in micro heat pipes acceptable and applicable. The numerical results of the model are in good agreement with the experimental results available in the literature.
Keywords: micro heat pipe, electrohydrodynamic force, Nusselt number, average temperature, friction factor
Procedia PDF Downloads 271
16197 Estimating Estimators: An Empirical Comparison of Non-Invasive Analysis Methods
Authors: Yan Torres, Fernanda Simoes, Francisco Petrucci-Fonseca, Freddie-Jeanne Richard
Abstract:
Non-invasive sampling is an alternative to collecting genetic samples directly. Non-invasive samples are collected without manipulating the animal (e.g., scats, feathers and hairs). Nevertheless, the use of non-invasive samples has some limitations. The main issue is degraded DNA, leading to poorer extraction efficiency and genotyping. Those errors delayed the widespread use of non-invasive genetic information for some years. Genotyping errors can be limited by using analysis methods that accommodate the errors and singularities of non-invasive samples. Genotype matching and population estimation algorithms can be highlighted as important analysis tools that have been adapted to deal with those errors. Despite this recent development of analysis methods, there is still a lack of empirical performance comparisons between them. A comparison of methods on datasets differing in size and structure can be useful for future studies, since non-invasive samples are a powerful tool for obtaining information, especially for endangered and rare populations. To compare the analysis methods, four different datasets obtained from the Dryad digital repository were used. Three different matching algorithms (Cervus, Colony and Error Tolerant Likelihood Matching - ETLM) are used for matching genotypes and two different ones for population estimation (Capwire and BayesN). The three matching algorithms showed different patterns of results. ETLM produced fewer unique individuals and recaptures. A similarity in the matched genotypes between Colony and Cervus was observed. That is not surprising, given the similarity between those methods' pairwise-likelihood and clustering algorithms. The matching of ETLM showed almost no similarity with the genotypes matched by the other methods.
The different clustering algorithm and error model of ETLM seem to lead to a more stringent selection, although the processing time and interface friendliness of ETLM were the worst among the compared methods. The population estimators performed differently depending on the dataset. There was a consensus between the different estimators for only one dataset. BayesN showed both higher and lower estimations when compared with Capwire. BayesN does not consider the total number of recaptures like Capwire, only the recapture events, which makes the estimator sensitive to data heterogeneity. Heterogeneity here means different capture rates between individuals. In these examples, tolerance for homogeneity seems to be crucial for BayesN to work properly. Both methods are user-friendly and have reasonable processing times. An expanded analysis with simulated genotype data could clarify the sensitivity of the algorithms. The present comparison of the matching methods indicates that Colony seems to be the most appropriate for general use, considering a time/interface/robustness balance. The heterogeneity of the recaptures strongly affected the BayesN estimations, leading to over- and underestimation of population numbers. Capwire is thus advisable for general use since it performs better in a wide range of situations.
Keywords: algorithms, genetics, matching, population
Procedia PDF Downloads 143
16196 Bacteriological Culture Methods and its Uses in Clinical Pathology
Authors: Prachi Choudhary, Jai Gopal Sharma
Abstract:
Microbial cultures determine the type of organism, its abundance in the tested sample, or both. Culturing is one of the primary diagnostic methods of microbiology and is used to determine the cause of infectious disease by letting the agent multiply in a predetermined medium. Different bacterial species produce colonies that may be very distinct from one another. To culture any pathogen or microorganism, we should first know the types of media used in microbiology for culturing. Subculturing is also sometimes done when mixed growth is seen in a culture. There are three types of culture media based on consistency - solid, semi-solid, and liquid (broth) media - which are further explained in the report. The Five I's approach is a method for locating, growing, observing, and characterizing microorganisms; it comprises inoculation, incubation, isolation, inspection, and identification. To identify bacteria, a sample such as urine, sputum, or blood is cultured on suitable media; there are different methods of culturing bacteria, such as the pour plate method, the streak plate method, swabbing by needle, pipetting, inoculation by loop, and spreading by spreader. After 24 hours of incubation, bacterial growth is examined and, according to the growth, an antibiotic susceptibility test is conducted; this determines which antibiotics the bacteria are sensitive or resistant to and also helps identify the bacteria. Antibiotic susceptibility tests are performed by various methods, such as the dilution method, the disk diffusion method, and the E-test. After that, medicines are prescribed to patients according to antibiotic sensitivity and resistance.
Keywords: inoculation, incubation, isolation, antibiotic susceptibility test, characterizing
Procedia PDF Downloads 82
16195 Analysis of Non-Coding Genome in Streptococcus pneumoniae for Molecular Epidemiology Typing
Authors: Martynova Alina, Lyubov Buzoleva
Abstract:
Streptococcus pneumoniae is the causative agent of pneumonia and meningitis throughout the world. Having high genetic diversity, this microorganism can cause different clinical forms of pneumococcal infection, and microbiologically it is difficult to diagnose by routine methods. Epidemiological surveillance also requires more developed methods of molecular typing, because the current method of serotyping does not properly distinguish invasive from non-invasive isolates. The non-coding genome of bacteria seems to be an interesting source of highly discriminating markers for distinguishing subspecies of such a variable bacterium as Streptococcus pneumoniae. Technically, we proposed a scheme for discriminating S. pneumoniae strains by amplification of a non-coding region (SP_1932) with subsequent restriction by two enzymes, Alu1 and Mn1. Aim: This research aimed to compare different typing methods and their application for molecular epidemiology purposes. Methods: We analyzed a population of 100 strains of S. pneumoniae isolated from different patients using different molecular epidemiology methods, namely pulsed-field gel electrophoresis (PFGE), restriction fragment length polymorphism analysis (RFLP) and multilocus sequence typing (MLST), and all of them were compared with the classic typing method of serotyping. The discriminative power was estimated with the Simpson index (SI). Results: We found that the most discriminative typing method is RFLP (SI=0.97; 42 genotypes were distinguished). PFGE was slightly less discriminative (SI=0.95; 35 genotypes were identified). MLST is still the best reference method (SI=1.0). The classic method of serotyping showed rather weak discriminative power (SI=0.93, 24 genotypes). In addition, the sensitivity of RFLP was 100%, and its specificity was 97.09%.
Conclusion: The most appropriate method for routine epidemiology surveillance is RFLP with the non-coding region of Streptococcus pneumoniae, followed by PFGE, though in some cases these results should be confirmed by MLST.
Keywords: molecular epidemiology typing, non-coding genome, Streptococcus pneumoniae, MLST
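The Simpson index (SI) used above to compare discriminative power is commonly computed in the Hunter-Gaston form, D = 1 − [Σ nⱼ(nⱼ−1)] / [N(N−1)], where nⱼ is the number of isolates of genotype j and N is the total number of isolates. A short illustrative sketch (the genotype counts below are hypothetical, not the study's data):

```python
def simpson_discriminatory_index(type_counts):
    """Hunter-Gaston form of Simpson's index: probability that two isolates
    drawn at random without replacement belong to different types."""
    n = sum(type_counts)
    return 1.0 - sum(c * (c - 1) for c in type_counts) / (n * (n - 1))

# Hypothetical distribution of 100 isolates over 14 genotypes
counts = [20, 15, 10] + [5] * 11
si = simpson_discriminatory_index(counts)
```

An index of 0 means a single genotype (no discrimination); 1 means every isolate gets its own genotype, matching the SI=1.0 reported for MLST.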
Procedia PDF Downloads 399
16194 Multilabel Classification with Neural Network Ensemble Method
Authors: Sezin Ekşioğlu
Abstract:
Multilabel classification has huge importance for several applications, and it is also a challenging research topic. It is a kind of supervised learning with binary targets. The difference between multilabel and binary classification is that in multilabel problems an instance can belong to more than one class: features can belong to one class or many classes. There exists a wide range of applications for multilabel prediction, such as image labeling, text categorization, and gene functionality. Even though instances are classified into many classes, they may not always be properly classified. There are many ensemble methods for classification. However, most researchers have been concerned with better multilabel methods in general; few focus on both the efficiency of classifiers and pairwise label relationships at the same time in order to implement better multilabel classification. In this paper, we worked on modified ensemble methods that benefit from k-Nearest Neighbors and a neural network structure to address these issues and to get better results from multilabel classification. Publicly available datasets (yeast, emotion, scene and birds) are used to demonstrate the developed algorithm's efficiency, and the technique is measured by accuracy, F1 score and hamming loss metrics. Our algorithm improves on benchmarks for each dataset with different metrics.
Keywords: multilabel, classification, neural network, KNN
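Hamming loss, one of the evaluation metrics mentioned, is the fraction of label slots predicted incorrectly, averaged over all samples and labels; lower is better. A minimal sketch with made-up binary label matrices (rows are samples, columns are labels):

```python
def hamming_loss(y_true, y_pred):
    """Fraction of individual label predictions that are wrong."""
    n_samples = len(y_true)
    n_labels = len(y_true[0])
    wrong = sum(t != p
                for row_t, row_p in zip(y_true, y_pred)
                for t, p in zip(row_t, row_p))
    return wrong / (n_samples * n_labels)

# Two samples, three labels each; one of the six label slots is wrong
y_true = [[1, 0, 1], [0, 1, 0]]
y_pred = [[1, 0, 0], [0, 1, 0]]
loss = hamming_loss(y_true, y_pred)
```

Unlike accuracy on whole label sets, hamming loss gives partial credit when most, but not all, labels of an instance are predicted correctly.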
Procedia PDF Downloads 155
16193 Flicker Detection with Motion Tolerance for Embedded Camera
Authors: Jianrong Wu, Xuan Fu, Akihiro Higashi, Zhiming Tan
Abstract:
CMOS image sensors with a rolling shutter are used broadly in the digital cameras embedded in mobile devices. The rolling shutter suffers from flicker artifacts caused by fluorescent lamps, which can be observed easily. In this paper, the characteristics of illumination flicker in the motion case were analyzed, and two efficient detection methods based on matching fragment selection were proposed. According to the experimental results, our methods achieve as high as 100% accuracy in static scenes, and at least 97% in motion scenes.
Keywords: illumination flicker, embedded camera, rolling shutter, detection
Procedia PDF Downloads 420
16192 Impact Position Method Based on Distributed Structure Multi-Agent Coordination with JADE
Authors: YU Kaijun, Liang Dong, Zhang Yarong, Jin Zhenzhou, Yang Zhaobao
Abstract:
For impact monitoring of distributed structures, the traditional positioning methods are based on time difference of arrival; they include the four-point arc positioning method and the triangulation positioning method. In actual operation, however, both methods have errors. In this paper, the Multi-Agent Blackboard Coordination Principle is used to combine the two methods. Fusion steps: (1) The four-point arc locating agent calculates the initial point and records it to the blackboard module. (2) The triangulation agent gets its initial parameters by accessing the initial point. (3) The triangulation agent constantly accesses the blackboard module to update its initial parameters, and it also logs its calculated point into the blackboard. (4) When the subsequent calculated point and the initial calculated point are within the allowable error, the whole coordination fusion process is finished. This paper presents a Multi-Agent collaboration method whose agent framework is JADE. The JADE platform consists of several agent containers, with an agent running in each container. Because of JADE's management and debugging tools, it is very convenient to deal with complex data in a large structure. Finally, based on the data in JADE, the results show that the impact location method based on Multi-Agent coordination fusion can reduce the error of the two methods.
Keywords: impact monitoring, structural health monitoring (SHM), multi-agent system (MAS), blackboard coordination, JADE
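The paper's agents run on JADE in Java; purely as an illustration of the blackboard-style fusion loop in steps (1)-(4), here is a hypothetical, simplified sketch in which a `refine` callable stands in for the triangulation agent's update and the loop stops when successive estimates agree within the allowable error:

```python
class Blackboard:
    """Shared store that agents post position estimates to."""
    def __init__(self):
        self.points = []
    def post(self, point):
        self.points.append(point)
    def latest(self):
        return self.points[-1]

def fuse(initial_point, refine, tolerance, max_iter=100):
    """Post the arc-method initial point, then let the refinement agent
    iterate until two successive estimates are within `tolerance`."""
    board = Blackboard()
    board.post(initial_point)  # step (1): four-point arc agent's estimate
    for _ in range(max_iter):
        prev = board.latest()
        new = refine(prev)     # steps (2)-(3): triangulation agent's update
        board.post(new)
        step = ((new[0] - prev[0]) ** 2 + (new[1] - prev[1]) ** 2) ** 0.5
        if step < tolerance:   # step (4): estimates have converged
            break
    return board.latest()

# Toy refinement that halves the distance to a "true" impact point (1, 1)
refine = lambda prev: ((prev[0] + 1.0) / 2.0, (prev[1] + 1.0) / 2.0)
estimate = fuse((0.0, 0.0), refine, tolerance=1e-3)
```

The class and function names here are invented for the sketch; the real system implements the two agents as JADE behaviours communicating through a shared blackboard module.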
Procedia PDF Downloads 178
16191 A Quantitative Study Investigating Whether the Internalisation of Adolescent Femininity Ideologies Predicts Depression and Anxiety in Female Adolescents
Authors: Tondani Mudau, Sherine B. Van Wyk, Zuhayr Kafaar, Janan Dietrich
Abstract:
Female adolescents residing in a patriarchal society such as South Africa are more inclined to embrace feminine ideologies. Internalising these ideologies may expose female adolescents to mental health challenges such as depression and anxiety. This study explored whether the internalisation of adolescent femininity ideologies, namely an objectified relationship with one's own body (ORB) and an inauthentic self in relationships (ISR), predicted anxiety and depression in late female adolescents at Stellenbosch University. The sample consisted of 1451 female undergraduate and postgraduate students aged 18-24 enrolled at Stellenbosch University. The mean age of the participants was 20 (SD=1.46), and most participants (39.7%) were first-year students. The study employed a cross-sectional quantitative research design. Data were collected through an online self-completion survey consisting of three sections: the first section asked biographical questions regarding age, gender, race and family background; the second section measured the internalisation of feminine ideologies using the Adolescent Femininity Ideology Scale (AFIS), which has two subscales, namely inauthentic self in relationship with others (ISR) and objectified relationship with one's own body (ORB); the third section measured mental health (depression and anxiety) using the Hopkins Symptoms 25-checklist. The ISR scale had a Cronbach's alpha of 0.76, the ORB scale had a Cronbach's alpha of 0.83, and the Hopkins Symptoms 25-checklist had a Cronbach's alpha of 0.93. Data were analysed through multiple linear regression in IBM SPSS (Statistical Package for the Social Sciences, Version 24). The overall results of the multiple linear regression showed that the AFIS combination accounted for 14% of the variance in anxiety as measured by the Hopkins Symptoms Checklist, R² = .142, F(2, 682) = 56.431, p < .001. The combination also accounted for 24% of the variance in depression as measured by the Hopkins Symptoms Checklist, R² = .239, F(2, 682) = 106.971, p < .0.
The findings in this study affirm the contentions of objectification theory and feminist theory that internalising femininity ideologies (ISR and ORB) predicts negative mental health in female adolescents.
Keywords: adolescents, anxiety, depression, feminine ideologies, inauthentic self, mental health, self-objectification, South Africa
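The reported R² values express the share of outcome variance accounted for by the two AFIS subscales. R² is computed from observed scores and the regression's fitted values as 1 − SS_res/SS_tot; a minimal sketch with made-up numbers (not the study's data):

```python
def r_squared(y, y_hat):
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    mean_y = sum(y) / len(y)
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Hypothetical observed scores and regression fitted values
observed = [1.0, 2.0, 3.0, 4.0]
fitted = [1.1, 1.9, 3.2, 3.8]
r2 = r_squared(observed, fitted)
```

An R² of .142 thus means the regression's fitted values reproduce 14.2% of the variability in the anxiety scores, which is the interpretation given in the abstract.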
Procedia PDF Downloads 151
16190 Fuglede-Putnam Theorem for ∗-Class A Operators
Authors: Mohammed Husein Mohammad Rashid
Abstract:
For a bounded linear operator T acting on a complex infinite-dimensional Hilbert space ℋ, we say that T is a ∗-class A operator (abbreviated T ∈ A*) if |T²| ≥ |T*|². In this article, we prove the following assertions: (i) we establish some conditions which imply the normality of ∗-class A operators; (ii) we consider a ∗-class A operator T ∈ ℬ(ℋ) with reducing kernel such that TX = XS for some X ∈ ℬ(K, ℋ) and prove a Fuglede-Putnam type theorem when the adjoint of S ∈ ℬ(K) is a dominant operator; (iii) furthermore, we extend the asymmetric Putnam-Fuglede theorem to the class of ∗-class A operators.
Keywords: Fuglede-Putnam theorem, normal operators, ∗-class A operators, dominant operators
Procedia PDF Downloads 88
16189 Modification of Underwood's Equation to Calculate Minimum Reflux Ratio for Column with One Side Stream Upper Than Feed
Authors: S. Mousavian, A. Abedianpour, A. Khanmohammadi, S. Hematian, Gh. Eidi Veisi
Abstract:
Distillation is one of the most important and widely utilized separation methods in industrial practice. There are different ways to design a distillation column; one of them is the shortcut method, in which material balances and equilibrium relations are employed to calculate the number of trays in the column. Several methods are classified as shortcut methods; one of them is the Fenske-Underwood-Gilliland method, in which the minimum reflux ratio is calculated by the Underwood equation. Underwood proposed an equation that is useful for a simple distillation column with one feed and one top and one bottom product. In this study, the Underwood method is extended to predict the minimum reflux ratio for a column with one side stream above the feed. The results of this model were compared with the McCabe-Thiele method. The results show that the proposed method is able to calculate the minimum reflux ratio with very small error.
Keywords: minimum reflux ratio, side stream, distillation, Underwood’s method
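The abstract does not give the modified side-stream equations, but the classic two-equation Underwood procedure for a simple column (which the proposal extends) works as follows: first find the root θ of the feed equation Σᵢ αᵢz_{F,i}/(αᵢ − θ) = 1 − q lying between the key relative volatilities, then evaluate R_min + 1 = Σᵢ αᵢx_{D,i}/(αᵢ − θ). A hedged sketch for a binary system (bisection for the root; the feed and distillate compositions below are a textbook-style example, not the paper's case study):

```python
def underwood_rmin(alphas, z_feed, x_dist, q):
    """Classic Underwood method for a simple column (single feed, no side
    streams): solve the feed equation for theta, then evaluate Rmin."""
    def feed_eq(theta):
        return sum(a * z / (a - theta) for a, z in zip(alphas, z_feed)) - (1.0 - q)
    # For a binary (or adjacent-key) system the relevant root lies between
    # the two largest relative volatilities.
    srt = sorted(alphas)
    lo, hi = srt[-2] + 1e-6, srt[-1] - 1e-6
    for _ in range(200):  # bisection on the feed equation
        mid = 0.5 * (lo + hi)
        if feed_eq(lo) * feed_eq(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    theta = 0.5 * (lo + hi)
    rmin = sum(a * x / (a - theta) for a, x in zip(alphas, x_dist)) - 1.0
    return theta, rmin

# Equimolar saturated-liquid feed (q=1), alpha=2 light key, 95% LK distillate
theta, rmin = underwood_rmin([2.0, 1.0], [0.5, 0.5], [0.95, 0.05], q=1.0)
```

For this binary test case the sketch reproduces the standard result R_min = 1.7; the paper's contribution is the additional term needed when a side stream is withdrawn above the feed, which is not reproduced here.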
Procedia PDF Downloads 406