Search results for: conventional statistical methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19963

19843 Confidence Intervals for Process Capability Indices for Autocorrelated Data

Authors: Jane A. Luke

Abstract:

Persistent pressure from escalating consumer expectations and ever-growing global competitiveness has produced a rapidly increasing interest in the development of various manufacturing strategy models. Academic and industrial circles are taking a keen interest in the field of manufacturing strategy. Many manufacturing strategies are currently centered on the traditional concepts of focused manufacturing capabilities such as quality, cost, dependability, and innovation. Process capability analysis is usually conducted assuming that the process under study is in statistical control and that independent observations are generated over time. In practice, however, it is very common to come across processes which, due to their inherent nature, generate autocorrelated observations. The degree of autocorrelation affects the behavior of patterns on control charts. Even small levels of autocorrelation between successive observations can have considerable effects on the statistical properties of conventional control charts. When observations are autocorrelated, the classical control charts exhibit nonrandom patterns and a lack of control. Many authors have considered the effect of autocorrelation on the performance of statistical process control charts. In this paper, the effect of autocorrelation on confidence intervals for different process capability indices (PCIs) is examined. Stationary Gaussian processes are explained, and the effect of autocorrelation on PCIs is described in detail. Confidence intervals for Cp and Cpk are constructed and computed for both independent and autocorrelated data. Approximate lower confidence limits for Cpk are computed assuming an AR(1) model for the data. Simulation studies and industrial examples are presented to demonstrate the results.
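
To make the quantities concrete, the sketch below computes Cp and Cpk, the classical chi-square confidence interval for Cp, and a Bissell-type approximate lower confidence limit for Cpk. The specification limits, the AR(1) coefficient, and the effective-sample-size adjustment for autocorrelation are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
from scipy import stats

def capability_indices(x, lsl, usl):
    """Point estimates of Cp and Cpk from a sample."""
    mu, s = np.mean(x), np.std(x, ddof=1)
    cp = (usl - lsl) / (6 * s)
    cpk = min(usl - mu, mu - lsl) / (3 * s)
    return cp, cpk

def cp_confidence_interval(cp_hat, n, alpha=0.05):
    """Classical chi-square CI for Cp (independent, normal data)."""
    lo = cp_hat * np.sqrt(stats.chi2.ppf(alpha / 2, n - 1) / (n - 1))
    hi = cp_hat * np.sqrt(stats.chi2.ppf(1 - alpha / 2, n - 1) / (n - 1))
    return lo, hi

def cpk_lower_limit_bissell(cpk_hat, n, alpha=0.05):
    """Bissell-type approximate lower confidence limit for Cpk."""
    se = np.sqrt(1.0 / (9 * n) + cpk_hat**2 / (2 * (n - 1)))
    return cpk_hat - stats.norm.ppf(1 - alpha) * se

# Illustrative AR(1) data; phi, LSL and USL are assumed values.
rng = np.random.default_rng(1)
phi, n = 0.5, 200
e = rng.normal(0, 1, n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]
x = 10 + x  # shift to a nominal process mean

cp, cpk = capability_indices(x, lsl=6.0, usl=14.0)
# One common heuristic: replace n by an effective sample size under AR(1).
n_eff = int(n * (1 - phi) / (1 + phi))
print("Cp CI (iid assumption):   ", cp_confidence_interval(cp, n))
print("Cp CI (effective n):      ", cp_confidence_interval(cp, n_eff))
print("Cpk lower limit (Bissell):", cpk_lower_limit_bissell(cpk, n_eff))
```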

Keywords: autocorrelation, AR(1) model, Bissell’s approximation, confidence intervals, statistical process control, specification limits, stationary Gaussian processes

Procedia PDF Downloads 364
19842 The Effect of Bihemispheric Transcranial Direct Current Stimulation Therapy on Upper Extremity Motor Functions in Stroke Patients

Authors: Dilek Cetin Alisar, Oya Umit Yemisci, Selin Ozen, Seyhan Sozay

Abstract:

New approaches and treatment modalities are being developed to make patients more functional and independent in stroke rehabilitation. One of these approaches is transcranial direct current stimulation therapy (tDCS), which aims to improve hemiplegic upper limb function in stroke patients. tDCS therapy is not part of the routine rehabilitation program; however, studies of tDCS therapy in stroke rehabilitation have increased in recent years. Our study aimed to evaluate the effect of tDCS treatment on upper extremity motor function in patients with subacute stroke. 32 stroke patients (16 in the tDCS group, 16 in the sham group) who were hospitalized for rehabilitation in the Başkent University Physical Medicine and Rehabilitation Clinic between 01.08.2016 and 20.01.2018 were included in the study. The conventional upper limb rehabilitation program was used for both the tDCS and control group patients for 3 weeks, 5 days a week, for 60-120 minutes a day. In addition to the conventional stroke rehabilitation program, the tDCS group received bihemispheric tDCS for 30 minutes daily. Patients were evaluated before treatment and after 1 week of treatment. The Functional Independence Measure (FIM) self-care score, Brunnstrom Recovery Stage (BRS), and Fugl-Meyer (FM) upper extremity motor function scale were used. There was no difference in demographic characteristics between the groups. There were no significant differences in BRS and FM scores between the two groups, but there was a significant difference in FIM score (p=0.05). FIM, BRS, and FM scores improved significantly in the tDCS group between baseline and 1 week of therapy (p < 0.001), whereas no such difference was found in the sham group. When BRS and FM scores were compared, there were statistically significant differences in the tDCS group (p < 0.001). In conclusion, this randomized double-blind study showed that bihemispheric tDCS in addition to conventional rehabilitation was superior to conventional rehabilitation alone for upper extremity motor and functional improvement in subacute stroke patients. For tDCS therapy to be used routinely in stroke rehabilitation, more comprehensive, long-term, randomized controlled clinical trials are needed to answer questions such as the optimal duration and intensity of treatment.

Keywords: cortical stimulation, motor function, rehabilitation, stroke

Procedia PDF Downloads 111
19841 Dicotyledon Weed Quantification Algorithm for Selective Herbicide Application in Maize Crops: Statistical Evaluation of the Potential Herbicide Savings

Authors: Morten Stigaard Laursen, Rasmus Nyholm Jørgensen, Henrik Skov Midtiby, Anders Krogh Mortensen, Sanmohan Baby

Abstract:

This work contributes a statistical model and simulation framework yielding the best possible estimate of the potential herbicide reduction when using the MoDiCoVi algorithm, while requiring an efficacy comparable to conventional spraying. In June 2013, a maize field located in Denmark was seeded. The field was divided into parcels which were assigned to one of two main groups: 1) control, consisting of subgroups of no spray and full-dose spray; 2) the MoDiCoVi algorithm, subdivided into five different leaf cover thresholds for spray activation. In addition, approximately 25% of the parcels were seeded with additional weeds perpendicular to the maize rows. In total, 299 parcels were randomly assigned to the 28 different treatment combinations. In the statistical analysis, bootstrapping was used for balancing the number of replicates, as sketched below. The achieved potential herbicide savings were found to be 70% to 95%, depending on the initial weed coverage. However, additional field trials covering more seasons and locations are needed to verify the generalisation of these results. There is potential for further herbicide savings, as the time interval between the first and second spraying session was not long enough for the weeds to turn yellow; instead, they only stagnated in growth.
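
As an illustration of the bootstrap step, the sketch below resamples hypothetical parcel-level dose fractions to put an interval around an estimated herbicide saving. The dose values and group sizes are invented for the example and are not the trial data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical parcel-level herbicide doses (fraction of full dose);
# the numbers are illustrative, not the trial data.
full_dose_parcels = np.ones(40)                       # conventional full dose
modicovi_parcels = rng.uniform(0.05, 0.30, size=25)   # algorithm-triggered dose

def bootstrap_saving(full, reduced, n_boot=10_000):
    """Bootstrap the mean relative saving, resampling each group independently."""
    savings = np.empty(n_boot)
    for b in range(n_boot):
        f = rng.choice(full, size=full.size, replace=True)
        r = rng.choice(reduced, size=reduced.size, replace=True)
        savings[b] = 1.0 - r.mean() / f.mean()
    return np.percentile(savings, [2.5, 50, 97.5])

lo, med, hi = bootstrap_saving(full_dose_parcels, modicovi_parcels)
print(f"estimated saving: {med:.0%} (95% bootstrap interval {lo:.0%}-{hi:.0%})")
```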

Keywords: herbicide reduction, macrosprayer, weed crop discrimination, site-specific, sprayer boom

Procedia PDF Downloads 278
19840 The State of Herb Medicine in Oriental Morocco: Cases of Debdou, Taourirt and Guerssif Districts

Authors: Himer Khalid, Alami Ilyass, Kharchoufa Loubna, Elachouri Mostafa

Abstract:

It has been estimated by the World Health Organization that 80% of the world's population relies on traditional medicine to meet their daily health requirements. In Morocco, reliance on such medicine is partly owing to the high cost of conventional medicine and the inaccessibility of modern health care facilities. There is high agreement on the use of plants as medicine in oriental Morocco. Our objective is to evaluate the local population's knowledge of medicinal plants and to document the uses of medicinal plants by this community for the treatment of different illnesses. Using an ethnopharmacological approach, we collected information concerning traditional medicinal knowledge and the medicinal plants used by successfully interviewing 458 informants living in oriental Morocco (from the Debdou, Taourirt, Guersif, and Laayoune districts). The data were analyzed by statistical methods (component analysis, CA; factorial analysis, FA) and by indices such as the Informant Consensus Factor (ICF) and Use Value (UV). Our results indicate that more than 60% of the population in these regions relies on medicinal plants for the treatment of different ailments, with a predominance of women consumers. A total of 135 plant species belonging to 61 families were documented. These plants were used by the population for the treatment of a group of illnesses (about 14 principal ailments). We conclude that, in oriental Morocco, the population still holds traditional knowledge that is commonly used as a medical tradition. This wealthy heritage needs conservation and evaluation.
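
The two quantitative indices cited, ICF and UV, are commonly defined as ICF = (Nur − Nt)/(Nur − 1) and UV = ΣU/N. The minimal sketch below applies those usual definitions to a toy table of use reports; the species, ailment categories, and counts are hypothetical.

```python
from collections import defaultdict

# Hypothetical use reports: (informant id, plant species, ailment category)
reports = [
    (1, "Thymus vulgaris", "digestive"),
    (1, "Mentha pulegium", "digestive"),
    (2, "Thymus vulgaris", "digestive"),
    (2, "Artemisia herba-alba", "diabetes"),
    (3, "Thymus vulgaris", "respiratory"),
    (3, "Artemisia herba-alba", "diabetes"),
]
n_informants = 3

# Informant Consensus Factor per ailment: ICF = (Nur - Nt) / (Nur - 1)
by_category = defaultdict(list)
for _, species, category in reports:
    by_category[category].append(species)
for category, species_list in by_category.items():
    nur, nt = len(species_list), len(set(species_list))
    icf = (nur - nt) / (nur - 1) if nur > 1 else 0.0
    print(f"ICF[{category}] = {icf:.2f}")

# Use Value per species: UV = (number of use reports) / (number of informants)
by_species = defaultdict(int)
for _, species, _ in reports:
    by_species[species] += 1
for species, u in by_species.items():
    print(f"UV[{species}] = {u / n_informants:.2f}")
```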

Keywords: Morocco, medicinal plants, traditional knowledge, wealthy heritage

Procedia PDF Downloads 259
19839 Drivers and Barriers of Asphalt Rubber in Sweden

Authors: Raheb Mirzanamadi, João Patrício

Abstract:

Asphalt rubber (AR) was initially developed in Sweden in the 1960s by using crumb rubber (CR) to replace part of the aggregate in asphalt pavement. The AR produced by this method had better mechanical properties than conventional asphalt pavement but was very expensive. Since then, different technologies and methods have been developed to use CR in asphalt pavements, including blending CR with bitumen at a high temperature in the mixture, called the wet method, and blending CR with bitumen in the refinery, called the terminal blending method. In 2006, the wet method was imported from the USA to Sweden to evaluate the potential of using AR on Swedish roads, and 154 km of AR roads were constructed by the wet method. The evaluation showed that the AR had, in most cases, better mechanical performance than conventional asphalt pavements. However, the severe smoke and smell led the Swedish Transport Administration (STA) to stop using AR in Sweden. Today, there is little focus on AR, despite its good mechanical properties and environmental benefits. Hence, there is a need to study the drivers and barriers of using the AR mixture in Sweden. The aims of this paper are: (i) to study the drivers and barriers of using AR pavements in Sweden and (ii) to identify knowledge gaps for further research in this area. The study was carried out as a literature review complemented by interviews with experts, including three researchers from the Swedish National Road and Transport Research Institute (VTI) and two experts from the STA. The results showed that AR can be an alternative not only to conventional asphalt pavement but also to polymer modified asphalt (PMA), owing to similar mechanical properties at a lower production cost. New technologies such as terminal blending and warm mix asphalt (WMA) methods can reduce the energy use and temperature of the production process. This study found that there is not enough experience and knowledge about AR in Sweden, and more research is needed on the lifespan of AR, the mechanical properties of AR produced with new technologies, and the spreading and leaching of substances from AR into nature. More studies could lead to standardization of AR use in Sweden, a potential solution for end-of-life tyres with better mechanical properties and lower costs in comparison with conventional asphalt pavements and PMA.

Keywords: asphalt rubber, crumb rubber, terminal blending method, wet method

Procedia PDF Downloads 59
19838 Opto-Mechanical Characterization of Aspheric Lenses from the Hybrid Method

Authors: Aliouane Toufik, Hamdi Amine, Bouzid Djamel

Abstract:

Aspheric optical components are an alternative to conventional lenses in the implementation of imaging systems for the visible range. Spherical lenses produce aberrations and are therefore not able to focus all the light into a single point. Aspheric lenses, by contrast, correct these aberrations and provide better resolution, even in compact designs incorporating a small number of elements. Metrology of these components is very difficult, especially as resolution requirements increase, and the insufficiency or complexity of conventional tools requires the development of specific approaches to characterization. This work addresses that problem: its objectives are the study and comparison of different methods used to measure the surface radii of aspheric lenses by the hybrid method.

Keywords: manufacture of lenses, aspherical surface, precision molding, radius of curvature, roughness

Procedia PDF Downloads 449
19837 An Advanced Approach to Detect and Enumerate Soil-Transmitted Helminth Ova from Wastewater

Authors: Vivek B. Ravindran, Aravind Surapaneni, Rebecca Traub, Sarvesh K. Soni, Andrew S. Ball

Abstract:

Parasitic diseases have a devastating, long-term impact on human health and welfare. More than two billion people are infected with soil-transmitted helminths (STHs), including the roundworms (Ascaris), hookworms (Necator and Ancylostoma), and whipworm (Trichuris), with the majority occurring in the tropical and subtropical regions of the world. Despite their low prevalence in developed countries, the removal of STHs from wastewater remains crucial to allow the safe use of sludge or recycled water in agriculture. Conventional methods such as incubation and optical microscopy are cumbersome; consequently, the results vary drastically from person to person when observing the ova (eggs) under the microscope. Although PCR-based methods are an alternative to conventional techniques, they lack the ability to distinguish between viable and non-viable helminth ova. As a result, wastewater treatment industries are in major need of radically new and innovative tools to detect and quantify STH eggs with precision and accuracy while remaining cost-effective. In our study, we focus on the following novel and innovative techniques:
- Recombinase polymerase amplification combined with surface-enhanced Raman spectroscopy (RPA-SERS) for detection of helminth ova.
- Use of metal nanoparticles and their relative nanozyme activity.
- Colorimetric detection, differentiation, and enumeration of genera of helminth ova using hydrolytic enzymes (chitinase and lipase).
- Propidium monoazide (PMA)-qPCR to detect viable helminth ova.
- A modified assay to recover and enumerate helminth eggs from fresh raw sewage.
- Transcriptome analysis of Ascaris ova in fresh raw sewage.
The aforementioned techniques have the potential to replace current conventional and molecular methods, thereby producing a standard protocol for the determination and enumeration of helminth ova in sewage sludge.

Keywords: colorimetry, helminth, PMA-QPCR, nanoparticles, RPA, viable

Procedia PDF Downloads 281
19836 A Brief Study about Nonparametric Adherence Tests

Authors: Vinicius R. Domingues, Luan C. S. M. Ozelim

Abstract:

Statistical study has become indispensable in various fields of knowledge. Geotechnics is no different: the study of probabilistic and statistical methods has gained ground there, given its use in characterizing the uncertainties inherent in soil properties. One situation engineers constantly face is the definition of a probability distribution that adequately represents the sampled data. To be able to discard bad distributions, goodness-of-fit tests are necessary. In this paper, three non-parametric goodness-of-fit tests are applied to a computationally generated data set to test its fit to a series of known distributions. It is shown that the use of the normal distribution does not always provide satisfactory results regarding the physical and behavioral representation of the modeled parameters.
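
The three tests named in the keywords are available in SciPy; a minimal sketch of how such a comparison might be run against a candidate normal distribution is shown below. The synthetic sample and the fitted-parameter shortcut are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic "soil property" sample; in practice this would be measured data.
sample = rng.lognormal(mean=0.5, sigma=0.4, size=200)

# Candidate distribution: normal with parameters fitted from the sample.
# (Estimating parameters from the same data makes the nominal p-values
# optimistic; dedicated tables or a parametric bootstrap are stricter.)
mu, sigma = sample.mean(), sample.std(ddof=1)

ks = stats.kstest(sample, "norm", args=(mu, sigma))
cvm = stats.cramervonmises(sample, "norm", args=(mu, sigma))
ad = stats.anderson(sample, dist="norm")

print(f"Kolmogorov-Smirnov: D = {ks.statistic:.3f}, p = {ks.pvalue:.3f}")
print(f"Cramer-von Mises:   W = {cvm.statistic:.3f}, p = {cvm.pvalue:.3f}")
print(f"Anderson-Darling:   A2 = {ad.statistic:.3f}, "
      f"5% critical value = {ad.critical_values[2]:.3f}")
```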

Keywords: Kolmogorov-Smirnov test, Anderson-Darling test, Cramer-Von-Mises test, nonparametric adherence tests

Procedia PDF Downloads 422
19835 Machine Learning Techniques in Seismic Risk Assessment of Structures

Authors: Farid Khosravikia, Patricia Clayton

Abstract:

The main objective of this work is to evaluate the advantages and disadvantages of various machine learning techniques in two key steps of seismic hazard and risk assessment of different types of structures. The first step is the development of ground-motion models, which are used for forecasting ground-motion intensity measures (IMs) given source characteristics, source-to-site distance, and local site conditions for future events. IMs such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques as the statistical method in ground motion prediction, such as artificial neural networks, random forests, and support vector machines. The results indicate that these algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates compared to the conventional linear regression-based method, and, in particular, random forest outperforms the other algorithms. However, the conventional method is a better tool when limited data are available. Second, it is investigated how machine learning techniques could be beneficial for developing probabilistic seismic demand models (PSDMs), which provide the relationship between the structural demand responses (e.g., component deformations, accelerations, internal forces, etc.) and the ground motion IMs. In the risk framework, such models are used to develop fragility curves estimating the probability of exceeding pre-defined damage limit states, and therefore they control the reliability of the predictions in the risk assessment. In this study, machine learning algorithms such as artificial neural networks, random forests, and support vector machines are adopted and trained on the demand parameters to derive PSDMs. It is observed that such models can provide more accurate predictions in a relatively shorter amount of time compared to conventional methods. Moreover, they can be used for sensitivity analysis of fragility curves with respect to many modeling parameters without necessarily requiring more intensive numerical response-history analysis.
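
A minimal sketch of the kind of comparison described for the ground-motion step is given below: a conventional linear model and a random forest are cross-validated on synthetic magnitude-distance-site data. The functional form, coefficients, and noise level used to generate the data are assumptions for illustration, not a published ground-motion model.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

# Synthetic "records": magnitude M, distance R (km), site Vs30 (m/s).
n = 2000
M = rng.uniform(4.0, 7.5, n)
R = rng.uniform(1.0, 200.0, n)
vs30 = rng.uniform(180.0, 760.0, n)
X = np.column_stack([M, np.log(R), np.log(vs30)])

# Toy functional form for ln(PGA): magnitude scaling, geometric attenuation
# and a site term, plus aleatory scatter -- illustrative only.
ln_pga = (-1.0 + 1.2 * M - 1.4 * np.log(R + 10.0)
          - 0.4 * np.log(vs30 / 760.0) + rng.normal(0, 0.5, n))

for name, model in [("linear regression", LinearRegression()),
                    ("random forest", RandomForestRegressor(n_estimators=200,
                                                            random_state=0))]:
    r2 = cross_val_score(model, X, ln_pga, cv=5, scoring="r2")
    print(f"{name:18s} mean CV R^2 = {r2.mean():.3f}")
```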

Keywords: artificial neural network, machine learning, random forest, seismic risk analysis, seismic hazard analysis, support vector machine

Procedia PDF Downloads 84
19834 'Low Electronic Noise' Detector Technology in Computed Tomography

Authors: A. Ikhlef

Abstract:

Image noise in computed tomography is mainly caused by statistical noise, system noise, and the reconstruction algorithm filters. In recent years, low-dose x-ray imaging has become more and more desired and is regarded as a technically differentiating technology among CT manufacturers. In order to achieve this goal, several technologies and techniques are being investigated, including both hardware (integrated electronics and photon counting) and software (artificial intelligence and machine learning) based solutions. From a hardware point of view, electronic noise could indeed be a potential driver for low and ultra-low dose imaging. We demonstrated that the reduction or elimination of this term could lead to a reduction of dose without affecting image quality. We also show that we can achieve this goal using conventional electronics (a low-cost and affordable technology), designed carefully and optimized for maximum detective quantum efficiency. We conducted the tests using large imaging objects such as 30 cm water and 43 cm polyethylene phantoms. We compared the image quality with conventional imaging protocols at radiation levels as low as 10 mAs (<< 1 mGy). Clinical validation of these results has been performed as well.

Keywords: computed tomography, electronic noise, scintillation detector, x-ray detector

Procedia PDF Downloads 102
19833 Effectiveness of Shock Wave Therapy Versus Intermittent Mechanical Traction on Mechanical Low Back Pain and Disabilities

Authors: Ahmed Assem Abd El Rahim

Abstract:

Background: Mechanical low back pain is a serious physical and social health problem. Purpose: To examine the impact of shock wave therapy versus intermittent mechanical traction on mechanical LBP and disabilities. Subjects: Sixty male patients with mechanical LBP, aged 20-35 years, recruited from the Sohag University orthopedic hospital outpatient clinic, were randomly assigned to 3 groups. Methods: Study group A: 20 patients underwent shock wave therapy plus conventional physical therapy. Study group B: 20 patients underwent intermittent mechanical traction plus conventional physical therapy. Control group C: 20 patients underwent conventional physical therapy alone. Three sessions were applied weekly for four weeks. Pain was quantified using the McGill Pain Questionnaire, the Roland-Morris Disability Questionnaire was used for measuring disability, and range of motion (ROM) was evaluated with a back range of motion (BROM) device pre- and post-therapy. Results: All groups (A, B, and C) showed a reduction in pain and disability and a rise in flexion and extension ROM at the end of the 4-week program. Mean values of the pain scale after therapy were 15.3, 9.47, and 23.07 in groups A, B, and C; mean values of the disability scale after therapy were 8.44, 4.87, and 11.8 in groups A, B, and C; mean values of flexion ROM were 25.53, 29.06, and 23.9 in groups A, B, and C; and mean values of extension ROM were 11.73, 15.53, and 9.85 in groups A, B, and C. Patients who received intermittent mechanical traction and conventional physical therapy (group B) showed a reduction in pain and disability and an improvement in flexion and extension ROM (p < 0.001) after the therapy program. Conclusion: Shock wave therapy and intermittent mechanical traction, as well as conventional physical treatment, can be beneficial in patients with mechanical LBP.

Keywords: mechanical low back pain, shock wave, mechanical, low back pain

Procedia PDF Downloads 37
19832 Information Visualization Methods Applied to Nanostructured Biosensors

Authors: Osvaldo N. Oliveira Jr.

Abstract:

The control of molecular architecture inherent in some experimental methods to produce nanostructured films has had a great impact on devices of various types, including sensors and biosensors. The self-assembled monolayer (SAM) and electrostatic layer-by-layer (LbL) techniques, for example, are now routinely used to produce tailored architectures for biosensing where biomolecules are immobilized with long-lasting preserved activity. Enzymes, antigens, antibodies, peptides, and many other molecules serve as the molecular recognition elements for detecting an equally wide variety of analytes. The principles of detection are also varied, including electrochemical methods, fluorescence spectroscopy, and impedance spectroscopy. In this presentation, an overview will be provided of biosensors made with nanostructured films to detect antibodies associated with tropical diseases and HIV, in addition to the detection of analytes of medical interest such as cholesterol and triglycerides. Because large amounts of data are generated in the biosensing experiments, use has been made of computational and statistical methods to optimize performance. Multidimensional projection techniques such as Sammon's mapping have been shown to be more efficient than traditional multivariate statistical analysis in identifying small concentrations of anti-HIV antibodies and for distinguishing between blood serum samples of animals infected with two tropical diseases, namely Chagas disease and leishmaniasis. Optimization of biosensing may include a combination of another information visualization method, the parallel coordinates technique, with artificial intelligence methods in order to identify the most suitable frequencies for reaching higher sensitivity using impedance spectroscopy. Also discussed will be the possible convergence of technologies, through which machine learning and other computational methods may be used to treat data from biosensors within an expert system for clinical diagnosis.
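
As a small illustration of the parallel coordinates idea mentioned above, the sketch below plots hypothetical impedance readings at a few frequencies for two classes of samples. The frequencies, class labels, and values are invented for the example and are not biosensor data from this work.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

rng = np.random.default_rng(3)

# Hypothetical normalized impedance readings at four frequencies for two
# sample classes; synthetic stand-ins for real biosensor measurements.
def make_class(label, shift, n=20):
    data = rng.normal(loc=[1.0, 0.8, 0.6, 0.4], scale=0.05, size=(n, 4)) + shift
    df = pd.DataFrame(data, columns=["1 Hz", "10 Hz", "100 Hz", "1 kHz"])
    df["class"] = label
    return df

df = pd.concat([make_class("control", 0.0), make_class("infected", 0.15)])
parallel_coordinates(df, "class")   # one line per sample, one axis per frequency
plt.ylabel("normalized impedance")
plt.tight_layout()
plt.savefig("parallel_coordinates.png")
```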

Keywords: clinical diagnosis, information visualization, nanostructured films, layer-by-layer technique

Procedia PDF Downloads 308
19831 About the Effect of Temperature and Heating Rate on the Pyrolysis of Lignocellulosic Biomass Waste

Authors: María del Carmen Recio-Ruiz, Ramiro Ruiz-Rosas, Juana María Rosas, José Rodríguez-Mirasol, Tomás Cordero

Abstract:

At the present time, conventional fossil fuels show environmental and sustainability disadvantages with regard to renewable energies. Producing energy and chemicals from biomass is an interesting alternative for substituting conventional fossil sources with a renewable feedstock while enabling zero net greenhouse gas emissions. Pyrolysis is a well-known process to produce fuels and chemicals from biomass. In this work, conventional and fast pyrolysis of different agro-industrial residues (almond shells, hemp hurds, olive stones, and Kraft lignin) was studied. Both processes were carried out in a fixed bed reactor under nitrogen flow and using different operating conditions to analyze the influence of temperature (400-800 ºC) and heating rate (10 and 20 ºC/min for conventional pyrolysis and 50 ºC/s for fast pyrolysis) on the yields, product distribution, and composition of the different fractions. The results showed that, for both conventional and fast pyrolysis, the solid fraction yield decreased with temperature, while the liquid and gas fractions increased. In the case of fast pyrolysis, a higher liquid fraction content than that obtained in conventional pyrolysis could be observed because cracking reactions occur to a lesser extent. With respect to the composition of the non-condensable fraction, the main gases obtained were CO, CO₂ (mainly at low temperatures), CH₄, and H₂ (mainly at high temperatures).

Keywords: bio-oil, biomass, conventional pyrolysis, fast pyrolysis

Procedia PDF Downloads 163
19830 Production and Evaluation of Mango Pulp by Using Ohmic Heating Process

Authors: Sobhy M. Mohsen, Mohamed M. El-Nikeety, Tarek G. Mohamed, Michael Murkovic

Abstract:

The present work aimed to study the use of ohmic heating in the processing of mango pulp compared to the conventional method. Mango pulp was processed using ohmic heating under the studied suitable conditions. The physical, chemical, and microbiological properties of the mango pulp were studied. The results showed that processing mango pulp by either ohmic heating or the conventional method caused a decrease in the contents of TSS, total carbohydrates, total acidity, and total sugars (reducing and non-reducing), and an increase in phenol content, ascorbic acid, and carotenoids compared to the conventional process. The increase in electric conductivity of the mango pulp during ohmic heating was due to the addition of some electrolytes (salts) to increase the ions and enhance the process. The results also indicate that mango pulp processed by ohmic heating contained more phenols, carbohydrates, and vitamin C and less HMF compared to that produced by the conventional method. Total pectin and its fractions were slightly reduced by ohmic heating compared to the conventional method. Enzymatic activity measurements showed a reduction in polyphenol oxidase (PPO) and polygalacturonase (PG) activity in mango pulp processed by the conventional method, whereas ohmic heating completely inhibited PPO and PG activities.

Keywords: ohmic heating, mango pulp, phenolic, carotenoids

Procedia PDF Downloads 436
19829 Assessment of Backfill Using Run-of-Mine Tailings and Portland Cement

Authors: Javad Someehneshin, Weizhou Quan, Abdelsalam Abugharara, Stephen Butt

Abstract:

Narrow vein mining (NVM) exploits very thin but valuable ore bodies that are uneconomical to extract by conventional mining methods. NVM applies the technique of Sustainable Mining by Drilling (SMD). The SMD method is used to mine stranded, steeply dipping ore veins that are too small or isolated to mine economically using conventional methods, since dilution is minimized. This novel mining technique uses drilling rigs to surgically extract the ore through directional drilling. This paper focuses on utilizing run-of-mine tailings and Portland cement as a backfill material to support the hanging wall and provide safe mine operation. Cemented paste backfill (CPB) is designed by mixing waste tailings, water, and cement in precise percentages for optimal outcomes. It is a non-homogeneous material that contains 70-85% solids. Usually, a hydraulic binder is added to the mixture to increase the strength of the CPB; the binder fraction mostly accounts for 2–10% of the total weight. In the mining industry, CPB has been gradually improved and expanded because it provides safety and support for the mines. Furthermore, CPB helps manage waste tailings in an economical way and plays a significant role in environmental protection.

Keywords: backfilling, cement backfill, tailings, Portland cement

Procedia PDF Downloads 114
19828 Influence of Recombination of Free and Trapped Charge Carriers on the Efficiency of Conventional and Inverted Organic Solar Cells

Authors: Hooman Mehdizadeh Rad, Jai Singh

Abstract:

Organic solar cells (OSCs) have been actively investigated in the last two decades due to their several merits, such as a simple fabrication process, low-cost manufacturing, and light weight. In this paper, recombination processes in inverted and conventional bulk heterojunction (BHJ) OSCs are studied using the optical transfer matrix method (OTMM) and by solving the drift-diffusion equations. Two types of recombination processes are investigated: 1) recombination of free charge carriers, using the Langevin theory, and 2) recombination of trapped charge carriers in tail states with an exponential energy distribution. These recombination processes are incorporated in simulating the current-voltage characteristics of both conventional and inverted BHJ OSCs. The simulation yields a higher power conversion efficiency for the inverted structure than for the conventional structure, which agrees well with experimental results.
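
For the free-carrier channel, the Langevin picture gives a bimolecular rate R = γ(np − n_i²) with γ = q(μ_n + μ_p)/(ε₀ε_r); a minimal sketch evaluating it is shown below. The mobilities, permittivity, and carrier densities are assumed illustrative values, and the trap-assisted (exponential tail state) channel is not modeled here.

```python
# Physical constants
q = 1.602e-19        # elementary charge, C
eps0 = 8.854e-12     # vacuum permittivity, F/m

def langevin_rate(n, p, mu_n, mu_p, eps_r, n_i=0.0):
    """Bimolecular (Langevin) recombination rate R = gamma * (n*p - n_i^2)."""
    gamma = q * (mu_n + mu_p) / (eps0 * eps_r)   # recombination coefficient, m^3/s
    return gamma * (n * p - n_i**2)

# Illustrative values for a disordered organic blend (assumed, not measured):
mu_n = 1e-7          # electron mobility, m^2/(V s)
mu_p = 1e-8          # hole mobility, m^2/(V s)
eps_r = 3.5          # relative permittivity
n = p = 1e22         # carrier densities, m^-3

print(f"Langevin rate: {langevin_rate(n, p, mu_n, mu_p, eps_r):.2e} m^-3 s^-1")
```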

Keywords: conventional organic solar cells, exponential tail state recombination, inverted organic solar cells, Langevin recombination

Procedia PDF Downloads 164
19827 R Data Science for Technology Management

Authors: Sunghae Jun

Abstract:

Technology management (TM) is an important issue for a company seeking to improve its competitiveness. Among the many activities of TM, technology analysis (TA) is an important factor, because most decisions in the management of technology are based on the results of TA. TA analyzes the developed results of a target technology using statistics or the Delphi method. TA based on Delphi depends on experts' domain knowledge; in comparison, TA by statistics and machine learning algorithms uses objective data such as patents or papers instead of expert knowledge. Many quantitative TA methods based on statistics and machine learning have been studied, and these have been used for technology forecasting, technological innovation, and the management of technology. They apply diverse computing tools and many analytical methods case by case, so it is not easy to select suitable software and statistical methods for a given TA task. In this paper, we therefore propose a methodology for quantitative TA using the statistical computing software R and data science to construct a general framework for TA. From the results of a case study, we also show how our methodology is applied in a real-world setting. This research contributes to R&D planning and technology valuation in TM areas.

Keywords: technology management, R system, R data science, statistics, machine learning

Procedia PDF Downloads 437
19826 Modification of ZnMgO NPs for Improving Device Performance of Quantum Dot Light-emitting Diodes

Authors: Juyon Lee, Myoungjin Park, Jonghoon Kim, Jaekook Ha, Chanhee Lee

Abstract:

We demonstrate a new positive aging method for QLED devices that can be applied to large-size inkjet-printed displays. The conventional positive aging method using a photo-curable resin has an unclear mechanism, and there are many limitations to applying it to large-size panels in a commercial process. Through a photo acid generator (PAG) in the ETL ink, we achieved 90% of the efficiency of the conventional method and up to 1000 h lifetime stability (T80). This technique could be applied to the next generation of QLED panels and can also help clarify the working mechanism of positive aging in QLEDs related to the modification of ZnMgO NPs.

Keywords: quantum dots, QLED, printing, positive aging, ZnMgO NPs

Procedia PDF Downloads 124
19825 Enhancing Secondary School Mathematics Retention with Blended Learning: Integrating Concepts for Improved Understanding

Authors: Felix Oromena Egara, Moeketsi Mosia

Abstract:

The study aimed to evaluate the impact of blended learning on mathematics retention among secondary school students. Conducted in the Isoko North Local Government Area of Delta State, Nigeria, the research involved 1,235 senior class one (SS 1) students. Employing a non-equivalent control group pre-test-post-test quasi-experimental design, a sample of 70 students was selected from two secondary schools with ICT facilities through purposive sampling. Random allocation of students into experimental and control groups was achieved through balloting within each selected school. The investigation included three assessment points: pre-Mathematics Achievement Test (MAT), post-MAT, and post-post-MAT (retention), administered systematically by the researchers. Data collection utilized the established MAT instrument, which demonstrated a high reliability score of 0.86. Statistical analysis was conducted using the Statistical Package for Social Sciences (SPSS) version 28, with mean and standard deviation addressing study questions and analysis of covariance scrutinizing hypotheses at a significance level of .05. Results revealed significantly greater improvements in mathematics retention scores among students exposed to blended learning compared to those instructed through conventional methods. Moreover, noticeable differences in mean retention scores were observed, with male students in the blended learning group exhibiting notably higher performance. Based on these findings, recommendations were made, advocating for mathematics educators to integrate blended learning, particularly in geometry teaching, to enhance students’ retention of mathematical concepts.

Keywords: blended learning, flipped classroom model, secondary school students, station rotation model

Procedia PDF Downloads 15
19824 Anomaly Detection in Financial Markets Using Tucker Decomposition

Authors: Salma Krafessi

Abstract:

The financial markets have a multifaceted, intricate environment, and enormous volumes of data are produced every day. To find investment possibilities, possible fraudulent activity, and market oddities, accurate anomaly identification in this data is essential. Conventional methods for detecting anomalies frequently fail to capture the complex organization of financial data. In order to improve the identification of abnormalities in financial time series data, this study presents Tucker Decomposition as a reliable multi-way analysis approach. We start by gathering closing prices for the S&P 500 index across a number of decades. The information is converted to a three-dimensional tensor format, which contains internal characteristics and temporal sequences in a sliding window structure. The tensor is then broken down using Tucker Decomposition into a core tensor and matching factor matrices, allowing latent patterns and relationships in the data to be captured. A possible sign of abnormalities is the reconstruction error from Tucker's Decomposition. We are able to identify large deviations that indicate unusual behavior by setting a statistical threshold. A thorough examination that contrasts the Tucker-based method with traditional anomaly detection approaches validates our methodology. The outcomes demonstrate the superiority of Tucker's Decomposition in identifying intricate and subtle abnormalities that are otherwise missed. This work opens the door for more research into multi-way data analysis approaches across a range of disciplines and emphasizes the value of tensor-based methods in financial analysis.
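
A minimal sketch of the pipeline described (sliding-window tensor, Tucker decomposition, reconstruction-error threshold) is given below using the TensorLy library. The synthetic price series, window length, and multilinear ranks are illustrative assumptions rather than the study's settings or the S&P 500 data.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

rng = np.random.default_rng(0)

# Synthetic daily closing prices with an injected jump (stand-in for real data).
prices = np.cumsum(rng.normal(0, 1, 1500)) + 1000.0
prices[900:910] += 60.0  # artificial anomaly

# Build a 3-way tensor: (window index) x (day within window) x (feature),
# with price and simple return as the two features.
window = 30
returns = np.diff(prices, prepend=prices[0]) / prices
windows = np.array([
    np.column_stack([prices[i:i + window], returns[i:i + window]])
    for i in range(0, len(prices) - window, window)
])
windows = (windows - windows.mean(axis=(0, 1))) / windows.std(axis=(0, 1))
tensor = tl.tensor(windows)

# Low multilinear rank Tucker model; the ranks are illustrative choices.
core, factors = tucker(tensor, rank=[10, 5, 2])
reconstruction = tl.tucker_to_tensor((core, factors))

# Reconstruction error per window; flag windows with unusually large error.
residual = tl.to_numpy(tensor - reconstruction)
err = np.sqrt((residual ** 2).sum(axis=(1, 2)))
threshold = err.mean() + 3 * err.std()
print("flagged window indices:", np.where(err > threshold)[0])
```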

Keywords: tucker decomposition, financial markets, financial engineering, artificial intelligence, decomposition models

Procedia PDF Downloads 33
19823 Economic Design of a Quality Control Chart for the Proportion of Defective Items

Authors: Encarnación Álvarez-Verdejo, Raúl Amor-Pulido, Pablo J. Moya-Fernández, Juan F. Muñoz-Rosas, Francisco J. Blanco-Encomienda

Abstract:

Many companies use the statistical tool known as statistical quality control, which can have a high cost for the companies interested in these statistical tools. The evaluation of the quality of products and services is an important topic, but reducing the cost of implementing statistical quality control also has important benefits for companies. For this reason, it is important to adopt an economic design for the various steps included in statistical quality control. In this paper, we describe some relevant aspects related to the economic design of a quality control chart for the proportion of defective items. These aspects are important because the suggested measures can reduce the cost of implementing such a chart. Note that the main purpose of this chart is to evaluate and control the proportion of defective items in a production process.
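
The chart in question is the standard p-chart; a minimal sketch of its center line and 3-sigma limits is shown below. An economic design would additionally optimize the sample size, sampling interval, and limit width against cost, which is not modeled here, and the inspection data are simulated for illustration.

```python
import numpy as np

def p_chart_limits(defectives, sample_sizes):
    """Center line and 3-sigma control limits for a p-chart."""
    defectives = np.asarray(defectives, dtype=float)
    sample_sizes = np.asarray(sample_sizes, dtype=float)
    p_bar = defectives.sum() / sample_sizes.sum()
    sigma = np.sqrt(p_bar * (1 - p_bar) / sample_sizes)
    ucl = p_bar + 3 * sigma
    lcl = np.clip(p_bar - 3 * sigma, 0.0, None)
    return p_bar, lcl, ucl

# Hypothetical inspection data: 20 subgroups of 200 items each.
rng = np.random.default_rng(5)
n = np.full(20, 200)
d = rng.binomial(n, 0.04)
d[12] += 18  # simulate a shift in the defect rate

p_bar, lcl, ucl = p_chart_limits(d, n)
p_hat = d / n
out_of_control = np.where((p_hat > ucl) | (p_hat < lcl))[0]
print(f"center line p-bar = {p_bar:.3f}")
print("out-of-control subgroups:", out_of_control)
```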

Keywords: proportion, type I error, economic plan, distribution function

Procedia PDF Downloads 414
19822 Quantum Statistical Mechanical Formulations of Three-Body Problems via Non-Local Potentials

Authors: A. Maghari, V. M. Maleki

Abstract:

In this paper, we present a quantum statistical mechanical formulation based on our recently derived analytical expressions for the partial-wave transition matrix of a three-particle system. We report the quantum reactive cross sections for the three-body scattering process 1 + (2,3) -> 1 + (2,3) as well as the recombination 1 + (2,3) -> 2 + (3,1) between one atom and a weakly bound dimer. The analytical expressions for the three-particle transition matrices and their corresponding cross sections were obtained from the three-dimensional Faddeev equations subject to rank-two non-local separable potentials of the generalized Yamaguchi form. The equilibrium quantum statistical mechanical properties, such as the partition function and equation of state, as well as non-equilibrium quantum statistical properties, such as transport cross sections and their corresponding transport collision integrals, were formulated analytically. This leads to the transport properties, such as the viscosity and diffusion coefficient, of a moderately dense gas.

Keywords: statistical mechanics, nonlocal separable potential, three-body interaction, faddeev equations

Procedia PDF Downloads 383
19821 Resonant Fluorescence in a Two-Level Atom and the Terahertz Gap

Authors: Nikolai N. Bogolubov, Andrey V. Soldatov

Abstract:

Terahertz radiation occupies a range of frequencies from roughly 100 GHz to approximately 10 THz, just between microwaves and infrared waves. This range of frequencies holds promise for many useful applications in experimental applied physics and technology. At the same time, reliable, simple techniques for the generation, amplification, and modulation of electromagnetic radiation in this range are far from being developed enough to meet the requirements of its practical usage, especially in comparison to the level of technological ability already achieved in other domains of the electromagnetic spectrum. This situation of relative underdevelopment of a potentially very important range of the electromagnetic spectrum is known as the 'terahertz gap.' Among other things, technological progress in the terahertz area has been impeded by the lack of compact, low-energy-consumption, easily controlled, and continuously radiating terahertz sources. Therefore, the development of new techniques serving this purpose, as well as various devices based on them, is an obvious necessity. No doubt, it would be highly advantageous to employ the simplest of suitable physical systems as major critical components in these techniques and devices. The purpose of the present research was to show, by means of conventional methods of non-equilibrium statistical mechanics and the theory of open quantum systems, that a thoroughly studied two-level quantum system, also known as a one-electron two-level 'atom', driven by an external classical monochromatic high-frequency (e.g., laser) field, can radiate continuously at a much lower (e.g., terahertz) frequency in the fluorescent regime if the transition dipole moment operator of this 'atom' possesses permanent, non-equal diagonal matrix elements. This contradicts the conventional assumption routinely made in quantum optics that only the non-diagonal matrix elements persist. The conventional assumption is pertinent to natural atoms and molecules and stems from the spatial inversion symmetry of their eigenstates. At the same time, such an assumption is no longer justified for artificially manufactured quantum systems of reduced dimensionality, such as quantum dots, which are often nicknamed 'artificial atoms' due to the striking similarity of their optical properties to those of real atoms. Possible routes to experimental observation and practical implementation of the predicted effect are discussed as well.

Keywords: terahertz gap, two-level atom, resonant fluorescence, quantum dot

Procedia PDF Downloads 245
19820 On Mathematical Modelling and Optimization of Emerging Trends Processes in Advanced Manufacturing

Authors: Agarana Michael C., Akinlabi Esther T., Pule Kholopane

Abstract:

Innovation in manufacturing process technologies and associated product design affects the prospects for manufacturing today and in the near future. In this study, some theoretical methods useful as tools in advanced manufacturing are considered. In particular, some basic mathematical, operational research, heuristic, and statistical techniques are discussed. These techniques and methods are very handy in many areas of advanced manufacturing processes, including process planning optimization, modelling, and analysis. Generally, determining the production rate requires the application of mathematical methods. Emerging trends in advanced manufacturing processes can be enhanced by using mathematical modelling and optimization techniques.

Keywords: mathematical modelling, optimization, emerging trends, advanced manufacturing

Procedia PDF Downloads 272
19819 Comparison of Microwave-Assisted and Conventional Leaching for Extraction of Copper from Chalcopyrite Concentrate

Authors: Ayfer Kilicarslan, Kubra Onol, Sercan Basit, Muhlis Nezihi Saridede

Abstract:

Chalcopyrite (CuFeS2) is the most common primary mineral used for the commercial production of copper. The low dissolution efficiency of chalcopyrite in sulfate media has prevented efficient industrial leaching of this mineral in such media. Ferric ions, bacteria, oxygen, and other oxidants have been used as oxidizing agents in the leaching of chalcopyrite in sulfate and chloride media under atmospheric or pressure leaching conditions. Two leaching methods were studied to evaluate chalcopyrite (CuFeS2) dissolution in acid media. First, conventional oxidative acid leaching was carried out using sulfuric acid (H2SO4) and potassium dichromate (K2Cr2O7) as the oxidant at atmospheric pressure. Second, microwave-assisted acid leaching was performed using the microwave accelerated reaction system (MARS) for the same reaction media. Parameters affecting copper extraction, such as leaching time, leaching temperature, concentration of H2SO4, and concentration of K2Cr2O7, were investigated. The results of the conventional acid leaching experiments were compared to those of the microwave leaching method. It was found that the copper extraction obtained under high temperature and high oxidant concentrations with microwave leaching is higher than that obtained conventionally. 81% copper extraction was obtained by the conventional oxidative acid leaching method in 180 min, with a concentration of 0.3 mol/L K2Cr2O7 in 0.5 M H2SO4 at 50 ºC, while 93.5% copper extraction was obtained in 60 min with the microwave leaching method under the same conditions.

Keywords: extraction, copper, microwave-assisted leaching, chalcopyrite, potassium dichromate

Procedia PDF Downloads 341
19818 Statistical Investigation Projects: A Way for Pre-Service Mathematics Teachers to Actively Solve a Campus Problem

Authors: Muhammet Şahal, Oğuz Köklü

Abstract:

As statistical thinking and problem-solving processes have become increasingly important, teachers need to be more rigorously prepared with statistical knowledge to teach their students effectively. This study examined preservice mathematics teachers' development of statistical investigation projects using data and exploratory data analysis tools, following a design-based research perspective and statistical investigation cycle. A total of 26 pre-service senior mathematics teachers from a public university in Turkiye participated in the study. They formed groups of 3-4 members voluntarily and worked on their statistical investigation projects for six weeks. The data sources were audio recordings of pre-service teachers' group discussions while working on their projects in class, whole-class video recordings, and each group’s weekly and final reports. As part of the study, we reviewed weekly reports, provided timely feedback specific to each group, and revised the following week's class work based on the groups’ needs and development in their project. We used content analysis to analyze groups’ audio and classroom video recordings. The participants encountered several difficulties, which included formulating a meaningful statistical question in the early phase of the investigation, securing the most suitable data collection strategy, and deciding on the data analysis method appropriate for their statistical questions. The data collection and organization processes were challenging for some groups and revealed the importance of comprehensive planning. Overall, preservice senior mathematics teachers were able to work on a statistical project that contained the formulation of a statistical question, planning, data collection, analysis, and reaching a conclusion holistically, even though they faced challenges because of their lack of experience. The study suggests that preservice senior mathematics teachers have the potential to apply statistical knowledge and techniques in a real-world context, and they could proceed with the project with the support of the researchers. We provided implications for the statistical education of teachers and future research.

Keywords: design-based study, pre-service mathematics teachers, statistical investigation projects, statistical model

Procedia PDF Downloads 53
19817 Rapid Processing Techniques Applied to Sintered Nickel Battery Technologies for Utility Scale Applications

Authors: J. D. Marinaccio, I. Mabbett, C. Glover, D. Worsley

Abstract:

Through the use of novel rapid processing techniques such as screen printing and near-infrared (NIR) radiative curing, the processing time for the sintering of nickel plaques, applicable to alkaline nickel battery chemistries, has been drastically reduced from in excess of 200 minutes with conventional convection methods to below 2 minutes using NIR curing. Steps have also been taken to remove the need for forming gas as a reducing agent by implementing carbon as an in-situ reducing agent within the ink formulation.

Keywords: batteries, energy, iron, nickel, storage

Procedia PDF Downloads 416
19816 The Development of Statistical Analysis in Agriculture Experimental Design Using R

Authors: Somruay Apichatibutarapong, Chookiat Pudprommart

Abstract:

The purpose of this study was to develop statistical analysis using R programming via the internet, applied to agricultural experimental design. Data were collected from 65 items covering completely randomized, randomized block, Latin square, split-plot, factorial, and nested designs. A quantitative approach was used to investigate the quality of the learning media on statistical analysis using R programming via the internet, as assessed by six experts and by the opinions of 100 students interested in experimental design and applied statistics. It was revealed that the experts' opinions were good for all content except the usage of the web board, and the students' opinions were good overall and for all items.

Keywords: experimental design, r programming, applied statistics, statistical analysis

Procedia PDF Downloads 340
19815 Discarding or Correcting Outlier Scores vs. Excluding Outlier Jurors to Reduce Manipulation in Classical Music Competitions

Authors: Krzysztof Kontek, Kevin Kenner

Abstract:

This paper, written by an economist and pianist, aims to compare and analyze different methods of reducing manipulation in classical music competitions by focusing on outlier scores and outlier jurors. We first examine existing methods in competition practice and statistical literature for discarding or correcting jurors' scores that deviate significantly from the mean or median of all scores. We then introduce a method that involves eliminating all scores of outlier jurors, i.e., those jurors whose ratings significantly differ from those of other jurors. The properties of these standard and proposed methods are discussed in hypothetical voting scenarios, where one or more jurors assign scores that deviate considerably from the scores awarded by other jurors. Finally, we present examples of applying various methods to real-world data from piano competitions, demonstrating the potential effectiveness and implications of each approach in reducing manipulation within these events.
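
To make the two families of rules concrete, the sketch below applies a score-level trim (dropping marks far from each contestant's median) and a juror-level exclusion (dropping a whole panel member whose marks diverge strongly) to a hypothetical score matrix. The deviation measures and thresholds are illustrative choices, not the paper's proposed rule.

```python
import numpy as np

scores = np.array([
    # rows = jurors, columns = contestants (hypothetical marks out of 25)
    [21, 18, 23, 19, 20],
    [22, 17, 24, 18, 21],
    [20, 19, 22, 19, 20],
    [25,  9, 25, 10, 25],   # juror whose marks diverge strongly
    [21, 18, 23, 20, 21],
])

def trim_outlier_scores(scores, k=2.0):
    """Discard individual scores more than k MADs from each contestant's median."""
    med = np.median(scores, axis=0)
    mad = np.median(np.abs(scores - med), axis=0)
    keep = np.abs(scores - med) <= k * np.maximum(mad, 1e-9)
    return np.where(keep, scores, np.nan)

def drop_outlier_jurors(scores, k=2.0):
    """Remove whole jurors whose mean deviation from the panel medians is large."""
    med = np.median(scores, axis=0)
    juror_dev = np.abs(scores - med).mean(axis=1)
    keep = juror_dev <= k * np.median(juror_dev)
    return scores[keep]

print("raw means:            ", scores.mean(axis=0))
print("trimmed-score means:  ", np.nanmean(trim_outlier_scores(scores), axis=0))
print("jurors-excluded means:", drop_outlier_jurors(scores).mean(axis=0))
```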

Keywords: voting systems, manipulation, outlier scores, outlier jurors

Procedia PDF Downloads 55
19814 Coordinated Voltage Control in a Radial Distribution System

Authors: Shivarudraswamy, Anubhav Shrivastava, Lakshya Bhat

Abstract:

Distributed generation has become a major area of interest in recent years. Distributed generation can serve a large number of loads on a power line and hence offers better efficiency than conventional methods. However, there are certain drawbacks associated with it, a rise in voltage being the major one. This paper addresses voltage control at the buses of an IEEE 30-bus system by regulating reactive power. For carrying out the analysis, suitable locations for placing distributed generators (DGs) are identified through load flow analysis, observing where the voltage profile dips. MATLAB programming is used to regulate the voltage at all buses to within +/-5% of the base value even after the introduction of DGs. Three methods for voltage regulation are discussed. A sensitivity-based analysis is later carried out to determine the priority among the various methods listed in the paper.
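
As a back-of-the-envelope illustration of why reactive power regulates bus voltage on a feeder, the sketch below uses the approximate drop relation ΔV ≈ (PR + QX)/V for a single segment. The feeder impedance, load, and reactive injection levels are assumed values, not the IEEE 30-bus case data or the paper's MATLAB load-flow results.

```python
# Approximate voltage drop along one feeder segment:
#   dV ~= (P * R + Q * X) / V_nominal
# Injecting reactive power at the remote bus reduces the net Q flowing
# through the segment and therefore raises the bus voltage.

V_nom = 11e3          # nominal voltage, V (assumed)
R, X = 1.2, 2.4       # feeder resistance and reactance, ohm (assumed)
P_load = 2.0e6        # downstream active power, W (assumed)
Q_load = 0.8e6        # downstream reactive power, var (assumed)

def bus_voltage(q_injection):
    """Receiving-end voltage with a DG injecting q_injection var at the bus."""
    dv = (P_load * R + (Q_load - q_injection) * X) / V_nom
    return V_nom - dv

for q in (0.0, 0.4e6, 0.8e6):
    v = bus_voltage(q)
    print(f"Q injection {q/1e6:.1f} Mvar -> V = {v/1e3:.2f} kV "
          f"({100 * v / V_nom:.1f}% of nominal)")
```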

Keywords: distributed generators, distributed system, reactive power, voltage control

Procedia PDF Downloads 474