Search results for: order-to-delivery lead time (ODLT)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20831

19961 Structured Cross System Planning and Control in Modular Production Systems by Using Agent-Based Control Loops

Authors: Simon Komesker, Achim Wagner, Martin Ruskowski

Abstract:

In times of volatile markets with fluctuating demand and the uncertainty of global supply chains, flexible production systems are the key to an efficient implementation of a desired production program. In this publication, the authors present a holistic information concept taking into account various influencing factors for operating towards the global optimum. Therefore, a strategy for the implementation of multi-level planning for a flexible, reconfigurable production system with an alternative production concept in the automotive industry is developed. The main contribution of this work is a system structure mixing centralized and decentralized planning and control, evaluated in a simulation framework. The information system structure in current production systems in the automotive industry is rigidly hierarchically organized in monolithic systems. The production program is created rule-based with the premise of achieving a uniform cycle time. This program then provides the information basis for execution in subsystems at the station and process execution level. In today's era of mixed-(car-)model factories, complex conditions and conflicts arise in achieving logistics, quality, and production goals. There is no provision for feedback loops that return results from the process execution level (resources) and the process-supporting (quality and logistics) systems for reconsideration in the planning systems. To enable a robust production flow, the complexity of production system control is artificially reduced by the line structure, which results, for example, in material-intensive processes (buffers and safety stocks, e.g. the two-container principle even for different variants). The limited degrees of freedom of line production have produced the principle of progress figure control, which results in one-time sequencing, sequential order release, and relatively inflexible capacity control. As a result, modularly structured production systems, such as modular production according to known approaches with more degrees of freedom, are currently difficult to represent in terms of information technology. The remedy is an information concept that supports cross-system and cross-level information processing for centralized and decentralized decision-making. Through an architecture of hierarchically organized but decoupled subsystems, the paradigm of hybrid control is used, and a holonic manufacturing system is offered, which enables flexible information provisioning and processing support. In this way, the influences from quality, logistics, and production processes can be linked holistically with the advantages of mixed centralized and decentralized planning and control. Modular production systems also require modularly networked information systems with semi-autonomous optimization for a robust production flow. Dynamic prioritization of different key figures between subsystems should lead the production system to an overall optimum. The tasks and goals of the quality, logistics, process, resource, and product areas in a cyber-physical production system are designed as an interconnected multi-agent system. The result is an alternative system structure that executes centralized process planning and decentralized processing. Agent-based manufacturing control is used to enable different flexibility and reconfigurability states and manufacturing strategies in order to find optimal partial solutions of subsystems that lead to a near-global optimum for hybrid planning. This allows robust, near-to-plan execution with integrated quality control and intralogistics.

Keywords: holonic manufacturing system, modular production system, planning and control, system structure

Procedia PDF Downloads 159
19960 Continuous Blood Pressure Measurement from Pulse Transit Time Techniques

Authors: Chien-Lin Wang, Cha-Ling Ko, Tainsong Chen

Abstract:

Blood pressure (BP) is one of the vital signs and an index that helps determine the stability of life. In this respect, some spinal cord injury patients need to take the tilt table test. While doing the test, the posture changes abruptly, which may cause a patient's BP to change abnormally. This may cause patients to feel discomfort and even feel as though their life is threatened. Therefore, if a continuous non-invasive BP assessment system were built, it could help to alert health care professionals during rehabilitation when the BP value is out of range. In our research, a BP assessment method based on the pulse transit time (PTT) technique was developed. In the system, we use a self-made photoplethysmograph (PPG) sensor and filter circuit to detect two PPG signals and to calculate the time difference between them. The BP can immediately be assessed from the trend line. According to the results of this study, the relationship between systolic BP and PTT shows a highly negative linear correlation (R² = 0.8). Further, we used the trend line to estimate the BP value and compared it to a commercial sphygmomanometer (Omron MX3); the error rate of the system was found to be in the range of ±10%, which is within the permissible error range of a commercial sphygmomanometer. Continuous blood pressure measurement based on the pulse transit time technique may therefore have the potential to become a convenient method for clinical rehabilitation.
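
For illustration only (the study's own calibration procedure is not reproduced here), the following minimal Python sketch shows how a negative linear PTT-to-systolic-BP trend line can be fitted and then used to estimate BP; all numbers are assumed, not the study's data.

```python
import numpy as np

# Illustrative paired measurements (not the study's data): pulse transit time in ms
# and systolic blood pressure in mmHg from a reference sphygmomanometer.
ptt_ms = np.array([180, 190, 200, 210, 220, 230, 240])
sbp_mmhg = np.array([138, 133, 127, 122, 118, 112, 108])

# Fit the linear trend line SBP = a*PTT + b (a negative slope is expected).
a, b = np.polyfit(ptt_ms, sbp_mmhg, 1)

# Coefficient of determination R^2 of the fit.
pred = a * ptt_ms + b
r2 = 1 - np.sum((sbp_mmhg - pred) ** 2) / np.sum((sbp_mmhg - sbp_mmhg.mean()) ** 2)

# Estimate BP for a new PTT reading and report the relative error against a cuff reading.
new_ptt = 205
est_sbp = a * new_ptt + b
cuff_sbp = 125
error_pct = 100 * (est_sbp - cuff_sbp) / cuff_sbp
print(f"slope={a:.3f} mmHg/ms, R^2={r2:.2f}, estimated SBP={est_sbp:.1f} mmHg, error={error_pct:+.1f}%")
```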

Keywords: continuous blood pressure measurement, PPG, pulse transit time, transit velocity

Procedia PDF Downloads 332
19959 Activation of Google Classroom Features to Engage Introvert Students in Comprehensible Output

Authors: Raghad Dwaik

Abstract:

It is well known in language acquisition literature that a mere understanding of a reading text is not enough to help students build proficiency in comprehension. Students should rather follow understanding by attempting to express what has been understood by pushing their competence to the limit. Learners' attempt to push their competence was given the term "comprehensible output" by Swain (1985). Teachers in large classes, however, find it sometimes difficult to give all students a chance to communicate their views or to share their ideas during the short class time. In most cases, students who are outgoing dominate class discussion and get more opportunities for practice which leads to ignoring the shy students totally while helping the good ones become better. This paper presents the idea of using Google Classroom features of posting and commenting to allow students who hesitate to participate in class discussions about a reading text to write their views on the wall of a Google Classroom and share them later after they have received feedback and comments from classmates. Such attempts lead to developing their proficiency through additional practice in comprehensible output and to enhancing their confidence in themselves and their views. It was found that virtual classroom interaction would help students maintain vocabulary, use more complex structures and focus on meaning besides form.

Keywords: learning groups, reading, TESOL, Google Classroom, comprehensible output

Procedia PDF Downloads 57
19958 Virtualization and Visualization Based Driver Configuration in Operating System

Authors: Pavan Shah

Abstract:

In embedded systems, virtualization and visualization technology can provide an effective and measurable way of working in a software development environment. Virtualization can also be used to provide efficient resource sharing between real-time hardware applications and their host environment. A central concern, however, is minimizing the I/O overhead when virtualization and visualization technology is used either in a software development environment (SDE) or in the runtime environment of real-time embedded systems (RTMES) and real-time operating systems (RTOS). In this paper, we particularly focus on the virtualization and visualization overheads of the network, which generates the I/O, and on the implementation of standardized I/O (i.e., Virtio), which can work as the front-end network driver in a real-time operating system (RTOS) hardware module. Although several studies are available on virtualized operating system environments, they concern Virtio on a general-purpose OS; the implementation here is based on the open-source Virtio for a real-time operating system (RTOS). The measurement results show that this implementation can improve the bandwidth and latency of the network and memory management of the real-time operating system environment (RTMES) and thus the accuracy of the evaluated results.

Keywords: virtualization, visualization, network driver, operating system

Procedia PDF Downloads 116
19957 Hybrid GNN Based Machine Learning Forecasting Model For Industrial IoT Applications

Authors: Atish Bagchi, Siva Chandrasekaran

Abstract:

Background: According to World Bank national accounts data, the estimated global manufacturing value-added output in 2020 was 13.74 trillion USD. These manufacturing processes are monitored, modelled, and controlled by advanced, real-time, computer-based systems, e.g., Industrial IoT, PLC, SCADA, etc. These systems measure and manipulate a set of physical variables, e.g., temperature, pressure, etc. Despite the use of IoT, SCADA etc., in manufacturing, studies suggest that unplanned downtime leads to economic losses of approximately 864 billion USD each year. Therefore, real-time, accurate detection, classification and prediction of machine behaviour are needed to minimise financial losses. Although vast literature exists on time-series data processing using machine learning, the challenges faced by the industries that lead to unplanned downtimes are: The current algorithms do not efficiently handle the high-volume streaming data from industrial IoT sensors and were tested on static and simulated datasets. While the existing algorithms can detect significant 'point' outliers, most do not handle contextual outliers (e.g., values within normal range but happening at an unexpected time of day) or subtle changes in machine behaviour. Machines are revamped periodically as part of planned maintenance programmes, which change the assumptions on which original AI models were created and trained. Aim: This research study aims to deliver a Graph Neural Network (GNN) based hybrid forecasting model that interfaces with the real-time machine control system and can detect and predict machine behaviour and behavioural changes (anomalies) in real-time. This research will help manufacturing industries and utilities, e.g., water, electricity etc., reduce unplanned downtimes and consequential financial losses. Method: The data stored within a process control system, e.g., Industrial-IoT, Data Historian, is generally sampled during data acquisition from the sensor (source) and when persisting in the Data Historian to optimise storage and query performance. The sampling may inadvertently discard values that might contain subtle aspects of behavioural changes in machines. This research proposes a hybrid forecasting and classification model which combines the expressive and extrapolation capability of GNN, enhanced with estimates of entropy and spectral changes in the sampled data and additional temporal contexts, to reconstruct the likely temporal trajectory of machine behavioural changes. The proposed real-time model belongs to the Deep Learning category of machine learning and interfaces with the sensors directly or through 'Process Data Historian', SCADA etc., to perform forecasting and classification tasks. Results: The model was interfaced with a Data Historian holding time-series data from 4 flow sensors within a water treatment plant for 45 days. The recorded sampling interval for a sensor varied from 10 sec to 30 min. Approximately 65% of the available data was used for training the model, 20% for validation, and the rest for testing. The model identified the anomalies within the water treatment plant and predicted the plant's performance. These results were compared with the data reported by the plant SCADA-Historian system and the official data reported by the plant authorities. The model's accuracy was much higher (20%) than that reported by the SCADA-Historian system and matched the validated results declared by the plant auditors.
Conclusions: The research demonstrates that a hybrid GNN-based approach enhanced with entropy calculation and spectral information can effectively detect and predict a machine's behavioural changes. The model can interface with a plant's 'process control system' in real-time to perform forecasting and classification tasks, aiding asset management engineers to operate their machines more efficiently and reduce unplanned downtimes. A series of trials is planned for this model in the future in other manufacturing industries.
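
The GNN model itself is not published in this abstract; the following minimal Python sketch only illustrates how the windowed entropy and spectral-change features mentioned above might be computed from a uniformly resampled sensor signal (the window length, bin count and sampling rate are assumptions).

```python
import numpy as np

def window_features(x, fs, bins=16):
    """Shannon entropy and spectral features for one window of evenly resampled sensor values."""
    # Amplitude-distribution (Shannon) entropy from a normalized histogram.
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log2(p))

    # One-sided power spectrum; the dominant frequency and spectral centroid are tracked
    # so that shifts in machine behaviour show up as "spectral change".
    spec = np.abs(np.fft.rfft(x - x.mean())) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    dominant = freqs[np.argmax(spec)]
    centroid = np.sum(freqs * spec) / np.sum(spec) if spec.sum() > 0 else 0.0
    return entropy, dominant, centroid

# Example: a flow signal resampled to a uniform 0.1 Hz grid (10 s interval), 1-hour window.
fs = 0.1
rng = np.random.default_rng(0)
t = np.arange(0, 3600, 1.0 / fs)
flow = 50 + 5 * np.sin(2 * np.pi * 0.002 * t) + rng.normal(0, 0.5, t.size)
print(window_features(flow, fs))
```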

Keywords: GNN, entropy, anomaly detection, industrial time-series, AI, IoT, Industry 4.0, machine learning

Procedia PDF Downloads 130
19956 A Non-Destructive Estimation Method for Internal Time in Perilla Leaf Using Hyperspectral Data

Authors: Shogo Nagano, Yusuke Tanigaki, Hirokazu Fukuda

Abstract:

Vegetables harvested early in the morning or late in the afternoon are valued in plant production, and so the time of harvest is important. The biological functions known as circadian clocks have a significant effect on this harvest timing. The purpose of this study was to non-destructively estimate the circadian clock and so construct a method for determining a suitable harvest time. We took eight samples of green busil (Perilla frutescens var. crispa) every 4 hours, six times for 1 day and analyzed all samples at the same time. A hyperspectral camera was used to collect spectrum intensities at 141 different wavelengths (350–1050 nm). Calculation of correlations between spectrum intensity of each wavelength and harvest time suggested the suitability of the hyperspectral camera for non-destructive estimation. However, even the highest correlated wavelength had a weak correlation, so we used machine learning to raise the accuracy of estimation and constructed a machine learning model to estimate the internal time of the circadian clock. Artificial neural networks (ANN) were used for machine learning because this is an effective analysis method for large amounts of data. Using the estimation model resulted in an error between estimated and real times of 3 min. The estimations were made in less than 2 hours. Thus, we successfully demonstrated this method of non-destructively estimating internal time.
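
As a hedged illustration of the modelling step only (not the authors' network or data), the sketch below regresses circadian internal time from 141-band spectra with a small scikit-learn neural network; encoding the cyclic time as sine/cosine components is our assumption, and the data are synthetic.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic stand-in for the dataset: 48 leaf samples x 141 spectral bands,
# each labelled with the harvest (internal) time in hours.
X = rng.normal(size=(48, 141))
hours = rng.uniform(0, 24, size=48)

# Circadian time is periodic, so regress on (sin, cos) of the phase instead of raw hours.
y = np.column_stack([np.sin(2 * np.pi * hours / 24), np.cos(2 * np.pi * hours / 24)])

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=5000, random_state=0))
model.fit(X, y)

# Recover the estimated internal time from the predicted phase.
pred = model.predict(X[:3])
est_hours = (np.arctan2(pred[:, 0], pred[:, 1]) % (2 * np.pi)) * 24 / (2 * np.pi)
print(np.round(est_hours, 2))
```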

Keywords: artificial neural network (ANN), circadian clock, green busil, hyperspectral camera, non-destructive evaluation

Procedia PDF Downloads 280
19955 Development of Ketorolac Tromethamine Encapsulated Stealth Liposomes: Pharmacokinetics and Bio Distribution

Authors: Yasmin Begum Mohammed

Abstract:

Ketorolac tromethamine (KTM) is a non-steroidal anti-inflammatory drug with potent analgesic and anti-inflammatory activity due to the prostaglandin-related inhibitory effect of the drug. It is a non-selective cyclo-oxygenase inhibitor. The drug is currently used orally and intramuscularly in multiple divided doses, clinically for the management of arthritis, cancer pain, post-surgical pain, and in the treatment of migraine pain. KTM has a short biological half-life of 4 to 6 hours, which necessitates frequent dosing to retain the action. The frequent occurrence of gastrointestinal bleeding, perforation, peptic ulceration, and renal failure has led to the development of other drug delivery strategies for the appropriate delivery of KTM. The ideal solution would be to target the drug only to the cells or tissues affected by the disease. Drug targeting could be achieved effectively by liposomes, which are biocompatible and biodegradable. The aim of the study was to develop a parenteral liposome formulation of KTM with improved efficacy and reduced side effects by targeting the inflammation due to arthritis. PEG-anchored (stealth) and non-PEG-anchored liposomes were prepared by the thin film hydration technique followed by an extrusion cycle and characterized in vitro and in vivo. Stealth liposomes (SLs) exhibited an increase in encapsulation efficiency (94%) and 52% drug retention during release studies over 24 h, with good stability for a period of 1 month at -20°C and 4°C. SLs showed a maximum of about 55% edema inhibition with a significant analgesic effect. SLs produced marked differences over non-SL formulations, with an increase in the area under the plasma concentration-time curve, t₁/₂, and mean residence time, and reduced clearance. 0.3% of the drug was detected in the arthritis-induced paw, with significantly reduced drug localization in the liver, spleen, and kidney for SLs when compared to other conventional liposomes. Thus, SLs help to increase the therapeutic efficacy of KTM by increasing the targeting potential at the inflammatory region.
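
As a minimal illustration of how the reported pharmacokinetic metrics (area under the curve, mean residence time, half-life) can be derived from a plasma concentration-time profile by non-compartmental analysis, a Python sketch with purely illustrative numbers (not the study's data):

```python
import numpy as np

# Illustrative plasma concentration-time profile (h, µg/mL); not the study's data.
t = np.array([0.25, 0.5, 1, 2, 4, 8, 12, 24], dtype=float)
c = np.array([9.8, 8.9, 7.6, 5.9, 3.8, 1.7, 0.8, 0.2], dtype=float)

# Non-compartmental estimates by the linear trapezoidal rule.
auc = np.trapz(c, t)        # area under the concentration-time curve
aumc = np.trapz(c * t, t)   # area under the first-moment curve
mrt = aumc / auc            # mean residence time

# Terminal half-life from the slope of log-concentration in the elimination phase.
k_el = -np.polyfit(t[-4:], np.log(c[-4:]), 1)[0]
t_half = np.log(2) / k_el

print(f"AUC={auc:.1f} µg·h/mL, MRT={mrt:.1f} h, t1/2={t_half:.1f} h")
```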

Keywords: biodistribution, ketorolac tromethamine, stealth liposomes, thin film hydration technique

Procedia PDF Downloads 278
19954 Timing Equation for Capturing Satellite Thermal Images

Authors: Toufic Abd El-Latif Sadek

Abstract:

The asphalt object represents asphalted areas, such as roads. The best original thermal-image data occur at specific times during the days of the year, obtained by avoiding the time gaps in which different objects give closely similar brightness; seven sample objects were used: asphalt, concrete, metal, rock, dry soil, vegetation, and water. This study derives a general timing equation for capturing satellite thermal images at different locations, which depends on the fixed times of sunrise and sunset: Capture Time = Tcap = (TM × TSR) ± TS.

Keywords: asphalt, satellite, thermal images, timing equation

Procedia PDF Downloads 331
19953 Digital Retinal Images: Background and Damaged Areas Segmentation

Authors: Eman A. Gani, Loay E. George, Faisel G. Mohammed, Kamal H. Sager

Abstract:

Digital retinal images are well suited to automatic diabetic retinopathy screening systems. Unfortunately, a significant percentage of these images are of poor quality, which hinders further analysis, due to many factors (such as patient movement, inadequate or non-uniform illumination, acquisition angle and retinal pigmentation). Retinal images of poor quality need to be enhanced before the extraction of features and abnormalities. Segmentation of the retinal image is therefore essential for this purpose: it is employed to smooth and strengthen the image by separating the background and damaged areas from the overall image, resulting in retinal image enhancement and less processing time. In this paper, methods for segmenting colored retinal images are proposed to improve the quality of retinal image diagnosis. The methods generate two segmentation masks, i.e., a background segmentation mask for extracting the background area and a poor-quality mask for removing the noisy areas from the retinal image. The standard retinal image databases DIARETDB0, DIARETDB1, STARE, DRIVE and some images obtained from ophthalmologists have been used to validate the proposed segmentation technique. Experimental results indicate that the introduced methods are effective and can lead to high segmentation accuracy.
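
The paper's exact mask-generation rules are not reproduced here; the sketch below only shows a common baseline for producing a background mask and a poor-quality mask from a colour fundus image by thresholding and morphological cleaning (the threshold values and file name are assumptions).

```python
import numpy as np
import cv2

def background_mask(fundus_bgr, thresh_scale=0.15):
    """Rough background mask for a colour fundus image: the dark region outside
    the circular field of view. Returns a uint8 mask (255 = background)."""
    red = fundus_bgr[:, :, 2].astype(np.float32)
    # Pixels far darker than the red-channel mean are treated as background.
    mask = (red < thresh_scale * red.mean()).astype(np.uint8) * 255
    # Clean small holes and speckles with morphological opening and closing.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    return mask

def poor_quality_mask(fundus_bgr, low=30, high=240):
    """Flag badly illuminated or saturated (noisy) areas inside the field of view."""
    gray = cv2.cvtColor(fundus_bgr, cv2.COLOR_BGR2GRAY)
    return ((gray < low) | (gray > high)).astype(np.uint8) * 255

# Example usage (hypothetical file name from a public database):
# img = cv2.imread("drive_image.png"); bg = background_mask(img); pq = poor_quality_mask(img)
```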

Keywords: retinal images, fundus images, diabetic retinopathy, background segmentation, damaged areas segmentation

Procedia PDF Downloads 384
19952 Simulation Analysis of Wavelength/Time/Space Codes Using CSRZ and DPSK-RZ Formats for Fiber-Optic CDMA Systems

Authors: Jaswinder Singh

Abstract:

In this paper, a comparative analysis is carried out to study the performance of wavelength/time/space optical CDMA codes using two well-known formats, namely CSRZ and DPSK-RZ, in RSoft's OptSIM. The analysis is carried out under a realistic scenario considering the presence of various non-linear effects such as XPM, SPM, SRS, SBS and FWM. Fiber dispersion and multiple access interference are also considered. The codes used in this analysis are 3-D wavelength/time/space codes. These are converted into 2-D wavelength-time codes so that their requirement for space couplers and fiber ribbons is eliminated. Under the conditions simulated, it is found that CSRZ performs better than DPSK-RZ for fiber-optic CDMA applications.

Keywords: optical CDMA, multiple access interference (MAI), CSRZ, DPSK-RZ

Procedia PDF Downloads 628
19951 Real-Time Control of Grid-Connected Inverter Based on LabVIEW

Authors: L. Benbaouche, H. E. , F. Krim

Abstract:

In this paper, we propose flexible and efficient real-time control of a grid-connected single-phase inverter. The first step is devoted to the study and design of the controller through simulation, conducted with the LabVIEW software on the host computer. The second step is running the application from the PXI target. LabVIEW software, combined with NI-DAQmx, gives the tools to easily build applications using the digital-to-analog converter to generate the PWM control signals. Experimental results show the effectiveness of LabVIEW software applied to power electronics.

Keywords: real-time control, LabVIEW, inverter, PWM

Procedia PDF Downloads 486
19950 Identifying and Quantifying Factors Affecting Traffic Crash Severity under Heterogeneous Traffic Flow

Authors: Praveen Vayalamkuzhi, Veeraragavan Amirthalingam

Abstract:

Studies on highway safety are becoming the need of the hour, as over 400 lives are lost every day in India due to road crashes. In order to evaluate the factors that lead to different levels of crash severity, it is necessary to investigate the level of safety of highways and their relation to crashes. In the present study, an attempt is made to identify the factors that contribute to road crashes and to quantify their effect on the severity of road crashes. The study was carried out on a four-lane divided rural highway in India. The variables considered in the analysis include components of the horizontal alignment of the highway, viz. straight or curved section; time of day; driveway density; presence of median; median opening; gradient; operating speed; and annual average daily traffic. These variables were considered after a preliminary analysis. The major complexities in the study are the heterogeneous traffic and the speed variation between different classes of vehicles along the highway. To quantify the impact of each of these factors, statistical analyses were carried out using a logit model and negative binomial regression. The output from the statistical models showed that the variables, viz. horizontal alignment components of the highway, driveway density, time of day and operating speed, as well as annual average daily traffic, have a significant relation with the severity of crashes, viz. fatal as well as injury crashes. Further, annual average daily traffic has a significant effect on severity compared to the other variables. The contribution of highway horizontal components to crash severity is also significant. Logit models can predict crashes better than negative binomial regression models. The results of the study will help transport planners to look into these aspects at the planning stage itself in the case of highways operated under heterogeneous traffic flow conditions.
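
As an illustration of the binary logit formulation used for severity modelling (not the study's data or its final model), a Python sketch with statsmodels on synthetic crash records carrying the same kinds of covariates:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 400

# Synthetic crash records with the kinds of covariates listed in the study.
df = pd.DataFrame({
    "aadt_thousands": rng.uniform(5, 60, n),      # annual average daily traffic (thousands)
    "operating_speed": rng.uniform(40, 110, n),   # km/h
    "driveway_density": rng.uniform(0, 12, n),    # accesses per km
    "curve_section": rng.integers(0, 2, n),       # 1 = horizontal curve, 0 = tangent
    "night": rng.integers(0, 2, n),               # time-of-day indicator
})

# Synthetic severity outcome (1 = fatal, 0 = injury), for demonstration only.
lin = (-6 + 0.04 * df.aadt_thousands + 0.03 * df.operating_speed
       + 0.08 * df.driveway_density + 0.5 * df.curve_section + 0.4 * df.night)
df["fatal"] = rng.binomial(1, 1 / (1 + np.exp(-lin)))

X = sm.add_constant(df.drop(columns="fatal"))
model = sm.Logit(df["fatal"], X).fit(disp=False)
print(model.summary())
print("Odds ratios:\n", np.exp(model.params))
```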

Keywords: geometric design, heterogeneous traffic, road crash, statistical analysis, level of safety

Procedia PDF Downloads 277
19949 Bivariate Time-to-Event Analysis with Copula-Based Cox Regression

Authors: Duhania O. Mahara, Santi W. Purnami, Aulia N. Fitria, Merissa N. Z. Wirontono, Revina Musfiroh, Shofi Andari, Sagiran Sagiran, Estiana Khoirunnisa, Wahyudi Widada

Abstract:

For assessing interventions in numerous disease areas, the use of multiple time-to-event outcomes is common. An individual might experience two different events, giving rise to bivariate time-to-event data; the events may be correlated because they come from the same subject and are also influenced by individual characteristics. The bivariate time-to-event case can be handled with a copula-based bivariate Cox survival model, using the Clayton and Frank copulas to analyze the dependence structure between the events as well as the covariate effects. Applying this method to model the recurrent infection events of hemodialysis insertion in chronic kidney disease (CKD) patients, the AIC and BIC values show that the Clayton copula model is the best model, with a Kendall's tau of τ = 0.02.
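
As a minimal illustration of the Clayton copula used above (not the fitted bivariate Cox model itself), the Python sketch below shows the tau-theta relation τ = θ/(θ + 2) and conditional-inversion sampling of dependent uniforms; the margins and all numbers are assumptions.

```python
import numpy as np

def clayton_theta_from_tau(tau):
    """Clayton copula parameter implied by Kendall's tau: tau = theta / (theta + 2)."""
    return 2 * tau / (1 - tau)

def clayton_copula(u, v, theta):
    """Clayton copula C(u, v) for theta > 0."""
    return (u ** (-theta) + v ** (-theta) - 1) ** (-1 / theta)

def sample_clayton(n, theta, seed=None):
    """Draw dependent uniform pairs (U, V) via the conditional-inversion method."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    v = ((w ** (-theta / (1 + theta)) - 1) * u ** (-theta) + 1) ** (-1 / theta)
    return u, v

theta = clayton_theta_from_tau(0.02)   # tau reported in the abstract (near independence)
u, v = sample_clayton(5000, theta, seed=42)
# Dependent uniforms can be mapped to event times through each margin's survival function,
# e.g. exponential margins: t = -log(1 - u) / hazard_rate.
print(f"theta={theta:.3f}, C(0.5, 0.5)={clayton_copula(0.5, 0.5, theta):.3f}")
```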

Keywords: bivariate Cox, bivariate event, copula function, survival copula

Procedia PDF Downloads 60
19948 Application of a SubIval Numerical Solver for Fractional Circuits

Authors: Marcin Sowa

Abstract:

The paper discusses the subinterval-based numerical method for fractional derivative computations. It is now referred to by its acronym, SubIval. The basis of the method is briefly recalled. The ability of the method to be applied in time-stepping solvers is discussed. The possibility of implementing a time step size adaptive solver is also mentioned. The solver is tested on a transient circuit example. In order to demonstrate the accuracy of the solver, the results have been compared with those obtained by means of a semi-analytical method called gcdAlpha. The time step size adaptive solver applying SubIval has proven to be very accurate, as the results are very close to the reference solution. The solver is currently able to solve FDEs (fractional differential equations) with various derivative orders for each equation and any type of source time function.
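
SubIval itself is not reproduced here; as a baseline illustration of numerical fractional differentiation of order α, the sketch below uses the classical Grünwald-Letnikov approximation and checks it against the known half-order derivative of f(t) = t.

```python
import numpy as np

def gl_fractional_derivative(y, alpha, h):
    """Grunwald-Letnikov approximation of the order-alpha derivative of samples y
    taken with a fixed step h (a baseline method, not SubIval itself)."""
    n = len(y)
    # Recursive computation of the binomial-type weights w_k = (-1)^k * C(alpha, k).
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (1 - (alpha + 1) / k)
    d = np.empty(n)
    for i in range(n):
        d[i] = np.dot(w[: i + 1], y[i::-1]) / h ** alpha
    return d

# Check against a known result: the 0.5-order derivative of f(t) = t is 2*sqrt(t/pi).
h = 0.001
t = np.arange(0, 1 + h, h)
num = gl_fractional_derivative(t, 0.5, h)
ref = 2 * np.sqrt(t / np.pi)
print(f"max abs error on [0, 1]: {np.max(np.abs(num - ref)):.4f}")
```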

Keywords: numerical method, SubIval, fractional calculus, numerical solver, circuit analysis

Procedia PDF Downloads 188
19947 Old Swimmers Tire Quickly: The Effect of Time on Quality of Thawed versus Washed Sperm

Authors: Emily Hamilton, Adiel Kahana, Ron Hauser, Shimi Barda

Abstract:

BACKGROUND: In the male fertility and sperm bank unit of the Tel Aviv Sourasky Medical Center, women are treated with intrauterine insemination (IUI) using washed sperm from their partner or thawed sperm from a selected donor. In most cases, the women perform the IUI treatment at Sourasky, but sometimes they ask to undergo the insemination procedure in another clinic with their own fertility doctor. In these cases, the sperm sample is prepared at the Sourasky lab and the patient is inseminated after arriving at her doctor's clinic. Our laboratory has previously found that time negatively affects several parameters of thawed sperm, and we estimate that it has a more severe and significant effect than on washed sperm. AIM: To examine the effect of time on the quality of washed sperm versus thawed sperm. METHODS: Sperm samples were collected from men referred for semen analysis. Each ejaculate was allowed to liquefy for at least 20 min at 37°C and analyzed for sperm motility and vitality percentage and DNA fragmentation index (Time 0). Subsequently, 1 ml of the sample was divided into two parts: the 1st part was washed only, and the 2nd part was washed, frozen and thawed. Time 1 analysis occurred immediately after sperm washing or thawing. Time 2 analysis occurred 75 minutes after Time 1. Statistical analysis was performed using Student's t-test. P values < 0.05 were considered significant. RESULTS: Preliminary data showed that time had a greater impact on the average percentages of sperm motility and vitality in thawed compared to washed sperm samples (26%±10% vs. 21%±10% and 21%±9% vs. 9%±10%, respectively). An additional trend towards an increased average DNA fragmentation percentage in thawed samples compared to washed samples was observed (46%±18% vs. 25%±24%). CONCLUSION: Time negatively affects sperm quality. The effect is greater in thawed samples compared to fresh samples.

Keywords: ART, male fertility, sperm cryopreservation, sperm quality

Procedia PDF Downloads 177
19946 Feasibility Studies on the Removal of Fluoride from Aqueous Solution by Adsorption Using Agro-Based Waste Materials

Authors: G. Anusha, J. Raja Murugadoss

Abstract:

In recent years, the problem of water contamination has been drastically increasing due to the disposal of industrial wastewater containing iron, fluoride, mercury, lead, cadmium, phosphorus, silver, etc., into water bodies. The non-biodegradable heavy metals can accumulate in the human system through the food chain and cause various dreadful diseases and permanent disabilities, and in the worst cases lead to casualties. Further, the presence of excess quantities of such heavy metals, viz. lead, cadmium, chromium, nickel, zinc, copper, iron, etc., seriously affects the natural quality of potable water and necessitates treatment processes for their removal. Though there are dozens of standard procedures available for the removal of heavy metals, their cost keeps industrialists away from adopting such technologies. In the present work, an attempt has been made to remove such contaminants, particularly fluoride, and to study the efficiency of fluoride removal by adsorption using new agro-based materials, namely Limonia acidissima and Emblica officinalis, commonly referred to as wood apple and gooseberry, respectively. Accordingly, a set of experiments has been conducted using batch and column processes, with the help of activated carbon prepared from the shell of the wood apple and the seeds of gooseberries. Experiments reveal that the adsorption capacity of the wood apple shell is significant enough to yield promising results.

Keywords: adsorption, fluoride, agro-based waste materials, Limonia acidissima, Emblica officinalis

Procedia PDF Downloads 414
19945 Strategies for Synchronizing Chocolate Conching Data Using Dynamic Time Warping

Authors: Fernanda A. P. Peres, Thiago N. Peres, Flavio S. Fogliatto, Michel J. Anzanello

Abstract:

Batch processes are widely used in food industry and have an important role in the production of high added value products, such as chocolate. Process performance is usually described by variables that are monitored as the batch progresses. Data arising from these processes are likely to display a strong correlation-autocorrelation structure, and are usually monitored using control charts based on multiway principal components analysis (MPCA). Process control of a new batch is carried out comparing the trajectories of its relevant process variables with those in a reference set of batches that yielded products within specifications; it is clear that proper determination of the reference set is key for the success of a correct signalization of non-conforming batches in such quality control schemes. In chocolate manufacturing, misclassifications of non-conforming batches in the conching phase may lead to significant financial losses. In such context, the accuracy of process control grows in relevance. In addition to that, the main assumption in MPCA-based monitoring strategies is that all batches are synchronized in duration, both the new batch being monitored and those in the reference set. Such assumption is often not satisfied in chocolate manufacturing process. As a consequence, traditional techniques as MPCA-based charts are not suitable for process control and monitoring. To address that issue, the objective of this work is to compare the performance of three dynamic time warping (DTW) methods in the alignment and synchronization of chocolate conching process variables’ trajectories, aimed at properly determining the reference distribution for multivariate statistical process control. The power of classification of batches in two categories (conforming and non-conforming) was evaluated using the k-nearest neighbor (KNN) algorithm. Real data from a milk chocolate conching process was collected and the following variables were monitored over time: frequency of soybean lecithin dosage, rotation speed of the shovels, current of the main motor of the conche, and chocolate temperature. A set of 62 batches with durations between 495 and 1,170 minutes was considered; 53% of the batches were known to be conforming based on lab test results and experts’ evaluations. Results showed that all three DTW methods tested were able to align and synchronize the conching dataset. However, synchronized datasets obtained from these methods performed differently when inputted in the KNN classification algorithm. Kassidas, MacGregor and Taylor’s (named KMT) method was deemed the best DTW method for aligning and synchronizing a milk chocolate conching dataset, presenting 93.7% accuracy, 97.2% sensitivity and 90.3% specificity in batch classification, being considered the best option to determine the reference set for the milk chocolate dataset. Such method was recommended due to the lowest number of iterations required to achieve convergence and highest average accuracy in the testing portion using the KNN classification technique.
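
The KMT variant evaluated in the study is not reproduced here; the sketch below is plain classical dynamic time warping, the core operation behind all three compared methods, aligning one batch trajectory to a reference trajectory of different duration (values are illustrative).

```python
import numpy as np

def dtw(query, reference):
    """Classical dynamic time warping between two 1-D trajectories.
    Returns the cumulative alignment cost and the warping path as (i, j) pairs."""
    n, m = len(query), len(reference)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(query[i - 1] - reference[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Backtrack the optimal warping path from the end of both series.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return D[n, m], path[::-1]

# Two conche-temperature trajectories of different duration (illustrative values, deg C).
batch = np.array([45, 47, 50, 55, 60, 62, 61, 60])
reference = np.array([45, 48, 52, 58, 61, 60])
cost, path = dtw(batch, reference)
print(f"alignment cost = {cost:.1f}, path length = {len(path)}")
```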

Keywords: batch process monitoring, chocolate conching, dynamic time warping, reference set distribution, variable duration

Procedia PDF Downloads 152
19944 Approximation of the Time Series by Fractal Brownian Motion

Authors: Valeria Bondarenko

Abstract:

In this paper, we consider two problems related to fractional Brownian motion (fBm). The first problem is the simultaneous estimation of the two parameters, the Hurst exponent and the volatility, that describe this random process. Numerical tests on simulated fBm showed the method to be efficient. The second problem is the approximation of the increments of the observed time series by a power function of increments of fractional Brownian motion. Approximation and estimation are illustrated on an example of real data, daily deposit interest rates.
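
As a hedged illustration of the first problem (joint estimation of the Hurst exponent and volatility), a Python sketch based on the variance scaling of fBm increments, Var[x(t+k) - x(t)] = sigma^2 * k^(2H); the lag range is an assumption and the data are simulated.

```python
import numpy as np

def estimate_fbm_parameters(x, lags=range(1, 21)):
    """Estimate the Hurst exponent H and volatility sigma of a series assumed to follow
    fractional Brownian motion, using Var[x(t+k) - x(t)] = sigma^2 * k^(2H)."""
    lags = np.asarray(list(lags))
    variances = np.array([np.var(x[k:] - x[:-k]) for k in lags])
    # Linear fit in log-log coordinates: log Var = 2H log k + 2 log sigma.
    slope, intercept = np.polyfit(np.log(lags), np.log(variances), 1)
    hurst = slope / 2
    sigma = np.exp(intercept / 2)
    return hurst, sigma

# Sanity check on ordinary Brownian motion (H should come out near 0.5, sigma near 1).
rng = np.random.default_rng(0)
bm = np.cumsum(rng.normal(0, 1, 10_000))
print(estimate_fbm_parameters(bm))
```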

Keywords: fractional Brownian motion, Gaussian processes, approximation, time series, estimation of properties of the model

Procedia PDF Downloads 354
19943 Upon One Smoothing Problem in Project Management

Authors: Dimitri Golenko-Ginzburg

Abstract:

A CPM network project with deterministic activity durations, in which activities require homogeneous resources with fixed capacities, is considered. The problem is to determine the optimal schedule of starting times for all network activities within their maximal allowable limits (in order not to exceed the network's critical time) so as to minimize the maximum resources required by the project at any point in time. In the case when a non-critical activity may start only at discrete moments separated by a pregiven time span, the problem becomes NP-complete and an optimal solution may be obtained via a look-over algorithm. For the case when a look-over requires too much computational time, an approximate algorithm is suggested. The algorithm's performance ratio, i.e., the relative accuracy error, is determined. Experimentation has been undertaken to verify the suggested algorithm.
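
Neither the look-over nor the approximate algorithm is reproduced here; the sketch below only illustrates the smoothing idea itself, greedily shifting non-critical activities within their float on a discrete time grid to reduce the peak resource requirement (the activity data are invented).

```python
import numpy as np

# Each activity: (duration, resource_rate, earliest_start, latest_start) on a discrete grid.
# Earliest/latest starts would normally come from a standard CPM forward/backward pass.
activities = {
    "A": (3, 4, 0, 0),   # critical: no float
    "B": (2, 3, 0, 4),
    "C": (4, 2, 3, 3),   # critical
    "D": (2, 5, 2, 6),
}
horizon = 10

def profile(starts):
    """Resource histogram over the planning horizon for the given start times."""
    usage = np.zeros(horizon)
    for name, (dur, rate, *_) in activities.items():
        s = starts[name]
        usage[s:s + dur] += rate
    return usage

# Start from earliest starts, then greedily move each floating activity to the
# start time (within its float) that minimizes the current peak requirement.
starts = {name: es for name, (_, _, es, _) in activities.items()}
for name, (dur, rate, es, ls) in activities.items():
    best_start, best_peak = starts[name], profile(starts).max()
    for s in range(es, ls + 1):
        trial = dict(starts, **{name: s})
        peak = profile(trial).max()
        if peak < best_peak:
            best_start, best_peak = s, peak
    starts[name] = best_start

print("start times:", starts)
print("peak requirement:", profile(starts).max())
```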

Keywords: resource smoothing problem, CPM network, lookover algorithm, lexicographical order, approximate algorithm, accuracy estimate

Procedia PDF Downloads 286
19942 Toward a Characteristic Optimal Power Flow Model for Temporal Constraints

Authors: Zongjie Wang, Zhizhong Guo

Abstract:

While the regular optimal power flow model focuses on a single time scan, the optimization of power systems is typically intended for a time duration with respect to a desired objective function. In this paper, a temporal optimal power flow model for a time period is proposed. To reduce the computation burden needed for calculating temporal optimal power flow, a characteristic optimal power flow model is proposed, which employs different characteristic load patterns to represent the objective function and security constraints. A numerical method based on the interior point method is also proposed for solving the characteristic optimal power flow model. Both the temporal optimal power flow model and characteristic optimal power flow model can improve the systems’ desired objective function for the entire time period. Numerical studies are conducted on the IEEE 14 and 118-bus test systems to demonstrate the effectiveness of the proposed characteristic optimal power flow model.

Keywords: optimal power flow, time period, security, economy

Procedia PDF Downloads 431
19941 Application of Model Free Adaptive Control in Main Steam Temperature System of Thermal Power Plant

Authors: Khaing Yadana Swe, Lillie Dewan

Abstract:

At present, cascade PID control is widely used to control the superheating temperature (main steam temperature). As the main steam temperature is characterized by large inertia, large time delay, and time-varying behaviour, a conventional PID control strategy cannot achieve good control performance. In order to overcome the poor performance and deficiencies of the main steam temperature control system, a Model-Free Adaptive Control (MFAC) P cascade control system is proposed in this paper. By substituting MFAC for the PID in the main control loop of the main steam temperature control, the system can overcome time delays, non-linearity, disturbances, and time variation.
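
The paper's MFAC-P cascade is not reproduced here; the sketch below is the common compact-form dynamic-linearization MFAC law applied to an illustrative first-order plant standing in for the superheater, with all gains and plant coefficients assumed.

```python
import numpy as np

eta, mu = 1.0, 1.0             # pseudo-partial-derivative (PPD) estimator gains
rho, lam = 0.6, 2.0            # control-law step size and weighting factor
steps, setpoint = 300, 540.0   # e.g. a main steam temperature setpoint in deg C

y = np.full(steps, 500.0)      # measured output, starting at 500 deg C
u = np.full(steps, 500.0)      # control input (e.g. scaled spray-water valve command)
phi = np.full(steps, 0.5)      # PPD estimate

for k in range(1, steps - 1):
    du = u[k - 1] - u[k - 2] if k >= 2 else 0.0
    dy = y[k] - y[k - 1]
    # Projection-type update of the PPD from measured I/O increments only.
    phi[k] = phi[k - 1] + eta * du / (mu + du ** 2) * (dy - phi[k - 1] * du)
    if abs(phi[k]) < 1e-5 or abs(du) < 1e-5:
        phi[k] = phi[0]        # standard reset condition
    # Model-free control law driven by the tracking error.
    u[k] = u[k - 1] + rho * phi[k] / (lam + phi[k] ** 2) * (setpoint - y[k])
    # Illustrative first-order plant standing in for the superheater dynamics.
    y[k + 1] = 0.8 * y[k] + 0.2 * u[k]

print(f"final temperature: {y[-1]:.1f} deg C (setpoint {setpoint:.0f})")
```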

Keywords: model-free adaptive control, cascade control, adaptive control, PID

Procedia PDF Downloads 591
19940 Low Frequency Ultrasonic Degassing to Reduce Void Formation in Epoxy Resin and Its Effect on the Thermo-Mechanical Properties of the Cured Polymer

Authors: A. J. Cobley, L. Krishnan

Abstract:

The demand for multi-functional lightweight materials in sectors such as automotive, aerospace, electronics is growing, and for this reason fibre-reinforced, epoxy polymer composites are being widely utilized. The fibre reinforcing material is mainly responsible for the strength and stiffness of the composites whilst the main role of the epoxy polymer matrix is to enhance the load distribution applied on the fibres as well as to protect the fibres from the effect of harmful environmental conditions. The superior properties of the fibre-reinforced composites are achieved by the best properties of both of the constituents. Although factors such as the chemical nature of the epoxy and how it is cured will have a strong influence on the properties of the epoxy matrix, the method of mixing and degassing of the resin can also have a significant impact. The production of a fibre-reinforced epoxy polymer composite will usually begin with the mixing of the epoxy pre-polymer with a hardener and accelerator. Mechanical methods of mixing are often employed for this stage but such processes naturally introduce air into the mixture, which, if it becomes entrapped, will lead to voids in the subsequent cured polymer. Therefore, degassing is normally utilised after mixing and this is often achieved by placing the epoxy resin mixture in a vacuum chamber. Although this is reasonably effective, it is another process stage and if a method of mixing could be found that, at the same time, degassed the resin mixture this would lead to shorter production times, more effective degassing and less voids in the final polymer. In this study the effect of four different methods for mixing and degassing of the pre-polymer with hardener and accelerator were investigated. The first two methods were manual stirring and magnetic stirring which were both followed by vacuum degassing. The other two techniques were ultrasonic mixing/degassing using a 40 kHz ultrasonic bath and a 20 kHz ultrasonic probe. The cured cast resin samples were examined under scanning electron microscope (SEM), optical microscope, and Image J analysis software to study morphological changes, void content and void distribution. Three point bending test and differential scanning calorimetry (DSC) were also performed to determine the thermal and mechanical properties of the cured resin. It was found that the use of the 20 kHz ultrasonic probe for mixing/degassing gave the lowest percentage voids of all the mixing methods in the study. In addition, the percentage voids found when employing a 40 kHz ultrasonic bath to mix/degas the epoxy polymer mixture was only slightly higher than when magnetic stirrer mixing followed by vacuum degassing was utilized. The effect of ultrasonic mixing/degassing on the thermal and mechanical properties of the cured resin will also be reported. The results suggest that low frequency ultrasound is an effective means of mixing/degassing a pre-polymer mixture and could enable a significant reduction in production times.

Keywords: degassing, low frequency ultrasound, polymer composites, voids

Procedia PDF Downloads 282
19939 Comparative Study on Inhibiting Factors of Cost and Time Control in Nigerian Construction Practice

Authors: S. Abdulkadir, I. Y. Moh’d, S. U. Kunya, U. Nuruddeen

Abstract:

The basis of any contract formation between the client and contractor is the budgeted cost and the estimated duration of the project. These variables are of paramount importance to a project's sponsor in construction projects and in assessing the success or viability of construction projects. Despite the availability of various techniques of cost and time control, many projects fail to achieve their initial estimated cost and time. The paper evaluates the inhibiting factors of cost and time control in Nigerian construction practice and compares the results with United Kingdom practice as identified by an earlier researcher. The population of the study consists of construction professionals within Bauchi and Gombe states, Nigeria; judgmental sampling was employed in determining the number of respondents. Descriptive statistics were used to analyze the data in SPSS. Design change, project fraud and corruption, and financing and payment of completed work were found to be common among the top five inhibiting factors of cost and time control in the study area. Furthermore, the results show some correspondence, with slight contrasts, with United Kingdom practice. The study recommends the adaptation of the mitigation measures developed in the UK prior to assessing their effectiveness, and also the development of mitigation measures for the other top factors not covered by those developed in United Kingdom practice. Also, it recommends a wider comparative assessment of the modified inhibiting factors of cost and time control, as revealed by the study, to cover almost all parts of Nigeria.

Keywords: comparison, cost, inhibiting factor, United Kingdom, time

Procedia PDF Downloads 422
19938 Effect of Compost Application on Uptake and Allocation of Heavy Metals and Plant Nutrients and Quality of Oriental Tobacco Krumovgrad 90

Authors: Violina R. Angelova, Venelina T. Popova, Radka V. Ivanova, Givko T. Ivanov, Krasimir I. Ivanov

Abstract:

A comparative study on the impact of compost on the uptake and allocation of nutrients and heavy metals and on the quality of Oriental tobacco Krumovgrad 90 has been carried out. The experiment was performed on an agricultural field contaminated by the lead-zinc smelter near the town of Kardzali, Bulgaria, after the closure of lead production. The compost treatments had significant effects on the uptake and allocation of plant nutrients and heavy metals. The incorporation of compost leads to a decrease in the amount of heavy metals present in the tobacco leaves, with decreases of 36%, 12% and 6% for Cd, Pb and Zn, respectively. Application of the compost leads to an increased content of potassium, calcium and magnesium in the leaves of tobacco and, therefore, may favorably affect the burning properties of the tobacco. The incorporation of compost in the soil has a negative impact on the quality and typicality of the Oriental tobacco variety Krumovgrad 90. The incorporation of compost leads to an increase in the size of the tobacco plant leaves; the leaves become darker in colour, less fleshy and undergo a change in form, becoming (much) broader in the second, third and fourth stalk positions. This is accompanied by a decrease in the quality of the tobacco. The incorporation of compost also results in an increase in the mineral substances (pure ash), total nicotine and nitrogen, and a reduction in the amount of reducing sugars, which causes the quality of the tobacco leaves to deteriorate (particularly in the third and fourth harvests).

Keywords: chemical composition, compost, heavy metals, oriental tobacco, quality

Procedia PDF Downloads 250
19937 Humanising the Employment Environment for Emergency Medical Personnel: A Case Study of Capricorn District in Limpopo Province: South Africa

Authors: Manganyi Patricia Siphiwe

Abstract:

Work environments are characterised by performance pressure and mechanisation, which lead to job stress and the dehumanisation of work spaces. The gap between personnel's competence to accomplish job responsibilities and high job demands leads to a substantial health load. Therefore, providing employees with conducive working environments is essential. In order to attain this, the employer should ensure that responsive and institutionally safe systems are in place. The employer's responses to employees' needs are of significance to a healthy and developmental work environment. To deny employees a developmental and flourishing workplace is to deprive the workplace of being humane. Stressors coming from various aspects of the workplace can yield undue pressure and undesired responses from the workforce. Against this background, this paper examines the causes and consequences of workplace stress within the emergency medical sector. The paper utilised a qualitative methodology and in-depth interviews for data collection with purposively sampled emergency medical personnel. The findings showed that workplace stress is associated with high demands and a lack of support, which have an adverse effect on the biopsychosocial wellbeing of employees. This paper, therefore, recommends the engaged involvement of social workers through work organisational initiatives, such as Employee Assistance Programmes (EAP) and related labour relations policy activities, to promote positive and developmental working environments.

Keywords: stress, employee, workplace, wellbeing

Procedia PDF Downloads 78
19936 Maximizing Customer Service through Logistics Service Support in the Automobile Industry in Ghana

Authors: John M. Frimpong, Matilda K. Owusu-Bio, Caleb Annan

Abstract:

Business today is highly competitive, and the automobile industry is no exception. Therefore, it is necessary to determine the customer value and service quality measures that lead to customer satisfaction, which in turn leads to customer loyalty. However, in the automobile industry, the role of logistics service support in these relationships cannot be undermined. It could be inferred that logistics service support and its management have a direct correlation with customer service and/or service quality. But this is not always the same for all industries. Therefore, this study set out to investigate how automobile companies implement the concept of customer service through logistics service supports. In order to ascertain this, two automobile companies in Ghana were selected, and these are Toyota Ghana Limited and Mechanical Lloyd Company Ltd. The study developed a conceptual model to depict the study's objectives, from which questionnaires were developed for data collection. Respondents were made up of customers and staff of the two companies. The findings of the study revealed that the automobile industry partly attributes its customer satisfaction to customer value and service quality. The findings show a positive relationship between logistics service supports and service quality and customer value. However, the results indicate that customer satisfaction is not predicted by logistics services. This implies that in the automobile industry, it is not always the case that when customer service is implemented through logistics service supports, it leads to customer satisfaction. Therefore, there is a need for all players and stakeholders in the automobile industry to investigate other factors which help to increase customer satisfaction in addition to logistics service supports. It is recommended that logistics service supports should be geared towards meeting customer expectations and not just based on the organization's standards and procedures. It is necessary to listen to the voice of the customer to tailor the service package to suit the needs and expectations of the customer.

Keywords: customer loyalty, customer satisfaction, customer service, customer value, logistics service supports

Procedia PDF Downloads 473
19935 Influence of Roofing Material on Indoor Thermal Comfort of Bamboo House

Authors: Thet Su Hlaing, Shoichi Kojima

Abstract:

The growing desire for better indoor thermal performance with moderate energy consumption is becoming a challenging issue for today's built environment. Studies related to effective ways of enhancing indoor thermal comfort have approached the problem in numerous ways. Few studies have focused on the correlation between building material and the indoor thermal comfort of vernacular houses. This paper analyzes the thermal comfort conditions of the bamboo house, mostly located in hot and humid regions. How the indoor environment varies depending on the roofing material is observed by monitoring indoor and outdoor comfort measurements of the bamboo house as well as the occupants' preferred comfort conditions. The results revealed that the indigenous roofing material strongly influences the indoor thermal environment by limiting the effect of the outdoor temperature. It can keep the room cool with moderate thermal comfort, especially in the early morning and at night, in the summertime without the assistance of mechanical devices. After analyzing the performance of the roofing material and its effect on indoor thermal comfort over 24 hours, the operating time of mechanical cooling devices can be managed efficiently so that they run only during the necessary periods of the day, which will lead to a partial reduction in energy consumption.

Keywords: bamboo house, hot and humid climate, indoor thermal comfort, local indigenous roofing material

Procedia PDF Downloads 154
19934 Effect of Precursors Aging Time on the Photocatalytic Activity of ZnO Thin Films

Authors: N. Kaneva, A. Bojinova, K. Papazova

Abstract:

Thin ZnO films are deposited on glass substrates via the sol-gel method and dip-coating. The films are prepared from zinc acetate dihydrate as a starting reagent. After that, the as-prepared ZnO sol is aged for different periods (0, 1, 3, 5, 10, 15, and 30 days). Nanocrystalline thin films are deposited from the various sols. The effect of ZnO sol aging time on the structural and photocatalytic properties of the films is studied. The film surface is studied by scanning electron microscopy. The effect of the aging time of the starting solution is studied with respect to the photocatalytic degradation of Reactive Black 5 (RB5), followed by UV-vis spectroscopy. The experiments are conducted under UV-light illumination and in complete darkness. The variation of the absorption spectra shows the degradation of RB5 dissolved in water as a result of the reaction occurring on the surface of the films and promoted by UV irradiation. The initial dye concentrations (5, 10 and 20 ppm) and the aging time are varied during the experiments. The results show that increasing the aging time of the starting solution generally promotes the photocatalytic activity of ZnO. The thin films obtained from the ZnO sol aged for 30 days show the best photocatalytic degradation of the dye (97.22%) in comparison with the freshly prepared ones (65.92%). The samples and photocatalytic experimental results are reproducible. Nevertheless, all films exhibit substantial activity in both UV light and darkness, which is promising for the development of new ZnO photocatalysts by the sol-gel method.

Keywords: ZnO thin films, sol-gel, photocatalysis, aging time

Procedia PDF Downloads 364
19933 A Deep-Learning Based Prediction of Pancreatic Adenocarcinoma with Electronic Health Records from the State of Maine

Authors: Xiaodong Li, Peng Gao, Chao-Jung Huang, Shiying Hao, Xuefeng B. Ling, Yongxia Han, Yaqi Zhang, Le Zheng, Chengyin Ye, Modi Liu, Minjie Xia, Changlin Fu, Bo Jin, Karl G. Sylvester, Eric Widen

Abstract:

Predicting the risk of pancreatic adenocarcinoma (PA) in advance can benefit the quality of care and potentially reduce population mortality and morbidity. The aim of this study was to develop and prospectively validate a risk prediction model to identify patients at risk of new incident PA as early as 3 months before the onset of PA in a statewide, general population in Maine. The PA prediction model was developed using Deep Neural Networks, a deep learning algorithm, with a 2-year electronic-health-record (EHR) cohort. Prospective results showed that our model identified 54.35% of all inpatient episodes of PA, and 91.20% of all PA that required subsequent chemoradiotherapy, with a lead time of up to 3 months and a true alert rate of 67.62%. The risk assessment tool has attained an improved discriminative ability. It can be immediately deployed to the health system to provide automatic early warnings to adults at risk of PA. It has the potential to identify personalized risk factors to facilitate customized PA interventions.
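
The production model and EHR features are not public; as a hedged illustration of the general workflow only (train a feed-forward classifier on tabular EHR-style features, then alert above a score threshold), a scikit-learn sketch on synthetic data:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)

# Synthetic stand-in for an EHR feature matrix (diagnoses, labs, utilisation counts
# aggregated over a 2-year window) and a label for PA onset within the next 3 months.
n_patients, n_features = 10_000, 120
X = rng.normal(size=(n_patients, n_features))
risk = 1 / (1 + np.exp(-(X[:, :5].sum(axis=1) - 4)))   # uncommon positive class
y = rng.binomial(1, risk)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(128, 64), alpha=1e-4, max_iter=300, random_state=0)
clf.fit(X_tr, y_tr)

scores = clf.predict_proba(X_te)[:, 1]
print("AUROC:", round(roc_auc_score(y_te, scores), 3))

# Flag patients whose score exceeds an alert threshold chosen for the desired alert volume.
threshold = np.quantile(scores, 0.99)
flagged = scores >= threshold
print("patients flagged for early warning:", int(flagged.sum()))
```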

Keywords: cancer prediction, deep learning, electronic health records, pancreatic adenocarcinoma

Procedia PDF Downloads 134
19932 Spectroscopic Characterization Approach to Study Ablation Time on Zinc Oxide Nanoparticles Synthesis by Laser Ablation Technique

Authors: Suha I. Al-Nassar, K. M. Adel, F. Zainab

Abstract:

This work was devoted to producing ZnO nanoparticles by pulsed laser ablation (PLA) of a Zn metal plate in an aqueous environment of cetyl trimethyl ammonium bromide (CTAB), using a Q-switched Nd:YAG pulsed laser with a wavelength of 1064 nm, a repetition rate of 10 Hz, a pulse duration of 6 ns, and a laser energy of 50 mJ. The nanoparticle solution is found to be stable in colloidal form for a long time. The effect of ablation time on the optical properties and structure of ZnO was studied and characterized by UV-visible absorption. The UV-visible absorption spectra have four peaks at 256, 259, 265 and 322 nm for ablation times of 5, 10, 15, and 20 sec, respectively; our results show that the UV-vis spectra exhibit a blue shift in the presence of CTAB as the ablation time decreases, and the blue shift indicates a smaller nanoparticle size. The blue shift in the absorption edge indicates the quantum confinement property of the nanoparticles. Also, FTIR transmittance spectra of the ZnO nanoparticles prepared under these conditions show a characteristic ZnO absorption at 435–445 cm⁻¹.

Keywords: zinc oxide nanoparticles, CTAB solution, pulsed laser ablation technique, spectroscopic characterization

Procedia PDF Downloads 361