Search results for: statistical modeling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7660

6640 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System

Authors: Karima Qayumi, Alex Norta

Abstract:

The rapid generation of high-volume and highly varied data from the application of new technologies poses challenges for generating business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods to develop their business. The resulting decentralized data-management environment therefore relies on a distributed computing paradigm. While data are stored in highly distributed systems, implementing distributed data-mining techniques is a challenge: the aim is to gather knowledge from every domain and from all the datasets stemming from distributed resources. As agent technologies offer significant contributions for managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business-intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.

Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)

Procedia PDF Downloads 432
6639 Statistical Modeling of Local Area Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes

Authors: Jihad Daba, Jean-Pierre Dubois

Abstract:

Multipath fading noise degrades the performance of cellular communication, most notably in femto- and pico-cells in 3G and 4G systems. When the wireless channel consists of a small number of scattering paths, the statistics of the fading noise are not analytically tractable, which poses a serious challenge to developing closed canonical forms that can be analyzed and used in the design of efficient and optimal receivers. In this context, the noise is multiplicative and is referred to as stochastically local fading. Many analytical investigations of multiplicative noise invoke exponential or Gamma statistics. More recent advances by the author of this paper have utilized Poisson-modulated, weighted generalized Laguerre polynomials with controlling parameters under uncorrelated-noise assumptions. In this paper, we investigate the statistics of a multi-diversity, stochastically local area fading channel in which the channel consists of randomly distributed Rayleigh and Rician scattering centers with a coherent, specular, Nakagami-distributed line-of-sight component and an underlying doubly stochastic Poisson process driven by a lognormal intensity. These combined statistics form a unifying triply stochastic filtered marked Poisson point process model.
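The layered randomness described here (a Poisson number of scatterers whose intensity is itself lognormally modulated, plus a coherent specular term) can be illustrated numerically. The following is a Monte-Carlo simplification, not the paper's model: all parameter values are hypothetical, the diffuse marks are complex Gaussian (Rayleigh), and the Nakagami-distributed line-of-sight term is reduced to a constant amplitude.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_envelope(mean_paths=8.0, sigma_ln=0.5, los_amplitude=2.0, n_trials=5000):
    """Monte-Carlo sketch of a doubly stochastic (Cox) fading channel:
    scatterer counts are Poisson with a lognormal random intensity, each
    scatterer contributes a complex Gaussian mark, and a fixed specular
    line-of-sight amplitude is added before taking the envelope."""
    # Lognormal-modulated Poisson intensity (the "doubly stochastic" layer)
    lam = mean_paths * rng.lognormal(mean=0.0, sigma=sigma_ln, size=n_trials)
    counts = rng.poisson(lam)
    env = np.empty(n_trials)
    for i, n in enumerate(counts):
        # Diffuse component: sum of complex Gaussian scatterer marks
        diffuse = (rng.normal(size=n) + 1j * rng.normal(size=n)).sum()
        env[i] = abs(los_amplitude + diffuse / np.sqrt(max(n, 1)))
    return env

envelope = simulate_envelope()
```

Histogramming `envelope` shows the heavy, non-Rayleigh tails that make closed-form receiver design difficult when the number of paths is small.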

Keywords: cellular communication, femto and pico-cells, stochastically local area fading channel, triply stochastic filtered marked Poisson point process

Procedia PDF Downloads 448
6638 An Integrated Approach to the Carbonate Reservoir Modeling: Case Study of the Eastern Siberia Field

Authors: Yana Snegireva

Abstract:

Carbonate reservoirs are known for their heterogeneity, resulting from various geological processes such as diagenesis and fracturing. These complexities may pose great challenges in understanding fluid flow behavior and predicting the production performance of naturally fractured reservoirs. The investigation of carbonate reservoirs is crucial, as many petroleum reservoirs are naturally fractured, and the complexity of their fracture networks leads to geological uncertainties that are important for global petroleum reserves. The key challenges in carbonate reservoir modeling include the accurate representation of fractures and their connectivity, as well as capturing the impact of fractures on fluid flow and production. Traditional reservoir modeling techniques often oversimplify fracture networks, leading to inaccurate predictions. Therefore, there is a need for a modern approach that can capture the complexities of carbonate reservoirs and provide reliable predictions for effective reservoir management and production optimization. The modern approach to carbonate reservoir modeling involves a hybrid fracture modeling approach, combining the discrete fracture network (DFN) method and an implicit fracture network, which offers enhanced accuracy and reliability in characterizing complex fracture systems within these reservoirs. This study focuses on the application of the hybrid method in the Nepsko-Botuobinskaya anticline of the Eastern Siberia field, aiming to prove the appropriateness of this method in these geological conditions. The DFN method is adopted to model the fracture network within the carbonate reservoir. This method considers fractures as discrete entities, capturing their geometry, orientation, and connectivity. However, the method has a significant disadvantage: the number of fractures in a field can be very high.
Due to limitations in the amount of main memory, it is very difficult to represent these fractures explicitly. By integrating data from image logs (formation micro-imager), core data, and fracture density logs, a discrete fracture network (DFN) model can be constructed to represent fracture characteristics for hydraulically relevant fractures. The results obtained from the DFN modeling approach provide valuable insights into the behavior of the East Siberia field's carbonate reservoir. The DFN model accurately captures the fracture system, allowing for a better understanding of fluid flow pathways, connectivity, and potential production zones. The analysis of simulation results enables the identification of zones of increased fracturing and optimization opportunities for reservoir development, with the potential application of enhanced oil recovery techniques, which were explored in further simulations using dual-porosity, dual-permeability models. This approach considers fractures as separate, interconnected flow paths within the reservoir matrix, allowing for the characterization of dual-porosity media. The case study of the East Siberia field demonstrates the effectiveness of the hybrid modeling method in accurately representing fracture systems and predicting reservoir behavior. The findings from this study contribute to improved reservoir management and production optimization in carbonate reservoirs with the use of enhanced and improved oil recovery methods.

Keywords: carbonate reservoir, discrete fracture network, fracture modeling, dual porosity, enhanced oil recovery, implicit fracture model, hybrid fracture model

Procedia PDF Downloads 75
6637 Material Chemistry Level Deformation and Failure in Cementitious Materials

Authors: Ram V. Mohan, John Rivas-Murillo, Ahmed Mohamed, Wayne D. Hodo

Abstract:

Cementitious materials are an excellent example of highly complex, heterogeneous material systems. These cement-based systems, which include cement paste, mortar, and concrete, are heavily used in civil infrastructure, yet despite their ubiquity they are among the most complex of materials in terms of morphology and structure, far more so than, for example, crystalline metals. Processes and features occurring at nanometer-sized morphological structures affect the performance and deformation/failure behavior at larger length scales. In addition, cementitious materials undergo chemical and morphological changes, gaining strength during the transient hydration process. Hydration in cement is a very complex process, creating intricate microstructures and associated molecular structures that vary with hydration. A fundamental understanding of the behavior and properties of cementitious materials can be gained through multi-scale modeling, starting from the material-chemistry (atomistic) scale, to explore their role and the manifested effects at larger length and engineering scales. Such predictive modeling enables understanding and studying the influence of material-chemistry changes and nanomaterial additives on the resultant material characteristics and deformation behavior. Atomistic molecular-dynamics modeling is required to couple material science to engineering mechanics: starting at the molecular level, a comprehensive description of the material’s chemistry is required to understand the fundamental properties that govern behavior across each relevant length scale. In our work, material-chemistry-level models and molecular dynamics simulations are employed to describe the molecular-level chemistry features of calcium-silicate-hydrate (CSH), one of the key hydrated constituents of cement paste, and its associated deformation and failure.
The molecular-level atomic structure of CSH can be represented by the Jennite mineral structure, which has been widely accepted by researchers and is typically used to represent the molecular structure of the CSH gel formed during the hydration of cement clinkers. This paper focuses on our recent work on the shear and compressive deformation and failure behavior of CSH represented by the Jennite structure. We discuss the deformation and failure behavior of traditionally hydrated CSH under shear and compression loading; the effect of material-chemistry changes on the predicted stress-strain behavior; the transition from linear to non-linear behavior; and the identification of the onset of failure based on the material-chemistry structure of CSH Jennite and changes to that structure.

Keywords: cementitious materials, deformation, failure, material chemistry modeling

Procedia PDF Downloads 286
6636 A Methodology to Integrate Data in the Company Based on the Semantic Standard in the Context of Industry 4.0

Authors: Chang Qin, Daham Mustafa, Abderrahmane Khiat, Pierre Bienert, Paulo Zanini

Abstract:

Nowadays, companies are facing many challenges in the process of digital transformation, which can be a complex and costly undertaking. Digital transformation involves the collection and analysis of large amounts of data, which can create challenges around data management and governance. Integrating data from multiple systems and technologies is a further challenge. Despite these pains, companies still pursue digitalization because, by embracing advanced technologies, they can improve efficiency, quality, decision-making, and customer experience while also creating new business models and revenue streams. This paper focuses on the issue that data are stored in silos with different schemas and structures. Conventional approaches to addressing this issue involve data warehousing, data integration tools, data standardization, and business intelligence tools. However, these approaches primarily focus on the grammar and structure of the data and neglect the importance of semantic modeling and semantic standardization, which are essential for achieving data interoperability. Here, the challenge of data silos in Industry 4.0 is addressed by developing a semantic modeling approach compliant with Asset Administration Shell (AAS) models, an efficient standard for communication in Industry 4.0. The paper highlights how our approach can facilitate the data-mapping process and semantic lifting according to existing industry standards such as ECLASS and other industrial dictionaries. It also incorporates Asset Administration Shell technology to model and map the company’s data and utilizes a knowledge graph for data storage and exploration.

Keywords: data interoperability in industry 4.0, digital integration, industrial dictionary, semantic modeling

Procedia PDF Downloads 94
6635 Empirical Study of Running Correlations in Exam Marks: Same Statistical Pattern as Chance

Authors: Weisi Guo

Abstract:

It is well established that there may be running correlations in sequential exam marks due to students sitting in the order of course registration patterns. As such, random, non-sequential sampling of exam marks is a standard recommended practice. Here, the paper examines a large volume of exam data stretching several years across different modules to see the degree to which this is true. Using the real mark distribution as a generative process, it was found that randomly simulated data had no more sequential randomness than the real data. That is to say, the running correlations that one often observes are statistically identical to chance. Digging deeper, it was found that some high running correlations involve students who indeed share a common course history and make similar mistakes. However, at the statistical scale of a module question, the combined effect is statistically similar to a random shuffling of papers. As such, there may not be a need to take random samples of marks, but it remains good practice to mark papers in a random sequence to reduce repetitive marking bias and errors.
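The shuffle comparison described above (observed running correlation versus a null distribution built by permuting the same marks) can be sketched in a few lines; the mark vector below is synthetic:

```python
import numpy as np

rng = np.random.default_rng(42)

def running_corr(marks):
    """Lag-1 serial correlation of a sequence of marks."""
    return np.corrcoef(marks[:-1], marks[1:])[0, 1]

# Hypothetical marks drawn from an empirical-like distribution
marks = rng.normal(60, 12, size=200).clip(0, 100)

observed = running_corr(marks)
# Null distribution: shuffle the same marks many times and recompute
null = np.array([running_corr(rng.permutation(marks)) for _ in range(2000)])
p_value = (np.abs(null) >= abs(observed)).mean()
```

A large `p_value` indicates the observed running correlation is indistinguishable from chance, which is the paper's central finding.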

Keywords: data analysis, empirical study, exams, marking

Procedia PDF Downloads 181
6634 Post-Earthquake Damage Detection Using System Identification with a Pair of Seismic Recordings

Authors: Lotfi O. Gargab, Ruichong R. Zhang

Abstract:

A wave-based framework is presented for modeling seismic motion in multistory buildings and using measured responses for system identification, which can be utilized to extract important information regarding structural integrity. With one pair of building responses at two locations, a generalized model response is formulated based on wave propagation features and expressed as frequency- and time-domain response functions denoted, respectively, as GFRF and GIRF. In particular, the GIRF is fundamental in tracking arrival times of impulsive wave motion initiated at the response level, which depend on local model properties. Matching model and measured structural responses can help in identifying model parameters and inferring building properties. To show the effectiveness of this approach, the Millikan Library in Pasadena, California is identified with recordings of the Yorba Linda earthquake of September 3, 2002.
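As a rough illustration of the GFRF/GIRF idea (not the authors' formulation), an impulse response between two recording levels can be estimated by regularized spectral division; the recordings below are synthetic, with the "roof" motion a delayed, scaled copy of the "basement" motion:

```python
import numpy as np

rng = np.random.default_rng(1)

def girf(response_top, response_base, dt):
    """Impulse response between two recording levels: the inverse FFT of
    the spectral ratio (a frequency response function) of the two motions,
    with water-level regularisation to stabilise the division."""
    top_f = np.fft.rfft(response_top)
    base_f = np.fft.rfft(response_base)
    eps = 1e-3 * np.abs(base_f).max()
    gfrf = top_f * np.conj(base_f) / (np.abs(base_f) ** 2 + eps ** 2)
    return np.fft.irfft(gfrf, n=len(response_top))

# Synthetic pair of recordings at 100 Hz sampling
dt, delay_samples = 0.01, 12
base = rng.normal(size=1024)
top = 0.8 * np.roll(base, delay_samples)

impulse = girf(top, base, dt)
arrival = int(np.argmax(np.abs(impulse)))  # wave travel time in samples
travel_time = arrival * dt                 # seconds between the two levels
```

The peak of `impulse` recovers the 12-sample travel delay, which is the kind of arrival-time information the GIRF is used to track.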

Keywords: system identification, continuous-discrete mass modeling, damage detection, post-earthquake

Procedia PDF Downloads 369
6633 Relationship between Blow Count Number (N) and Shear Wave Velocity (Vs30) from the Specified Embankment Material: A Case Study on Three Selected Earthen Dams

Authors: Tanapon Suklim, Prachaya Intaphrom, Noppadol Poomvises, Anchalee Kongsuk

Abstract:

The relationship between shear wave velocity (Vs30) and blow count number from Standard Penetration Tests (NSPT) was investigated on specified embankment dams to find a relation that can be used to estimate the value of N. Shear wave velocity (Vs30) and blow count number (NSPT) were measured at three dam sites. At each site, Vs30 was recorded using the seismic MASW technique, and NSPT was measured by field Standard Penetration Tests. Regression analysis was used to derive a statistical relation, which provides a practical means of estimating N-values for other earthen dams. Dam engineers can use the relation to convert field Vs30 measurements into estimated N-values instead of obtaining absolute N-values from field Standard Penetration Tests. It should be noted that the formula applies only to earthen dams of the specified material.
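A minimal sketch of deriving such a conversion by regression, using hypothetical paired Vs30/NSPT values and an assumed power-law form (the abstract does not state the functional form actually fitted):

```python
import numpy as np

# Hypothetical paired measurements from one embankment dam site
vs30 = np.array([180., 210., 250., 300., 340., 390., 430.])  # m/s (MASW)
n_spt = np.array([8., 12., 18., 25., 31., 40., 47.])         # blows

# Assumed power-law relation N = a * Vs30^b, linearised in log-log space
b, log_a = np.polyfit(np.log(vs30), np.log(n_spt), 1)
a = np.exp(log_a)

def estimate_n(vs30_value):
    """Estimate N from a field Vs30 measurement using the fitted relation."""
    return a * vs30_value ** b
```

In practice, the fitted coefficients are valid only for the embankment material they were derived from, as the abstract notes.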

Keywords: blow count number, earthen dam, embankment, shear wave velocity

Procedia PDF Downloads 236
6632 A Statistical Analysis on Relationship between Temperature Variations with Latitude and Altitude regarding Total Amount of Atmospheric Carbon Dioxide in Iran

Authors: Masoumeh Moghbel

Abstract:

Nowadays, the carbon dioxide produced by human activities is considered the main factor in the occurrence of global warming. Given the role of CO2 and its ability to trap heat, the main objective of this research is to study the effect of atmospheric CO2 (as recorded at Mauna Loa) on variations of temperature parameters (daily mean, minimum, and maximum temperature) at 5 meteorological stations in Iran, selected according to latitude and altitude, over a 40-year statistical period. Firstly, the trend of the temperature parameters was studied by regression and the non-graphical Mann-Kendall method. Then, the relation between temperature variations and CO2 was studied by correlation analysis. Also, the impact of the CO2 amount on temperature at different atmospheric levels (850 and 500 hPa) was analyzed. The results illustrated that the correlation coefficient between temperature variations and CO2 is more significant at low latitudes and high altitudes than in other regions. It is important to note that altitude, as one of the main geographic factors, has a limited effect on temperature variations, so that the correlation coefficient between these two parameters at 850 hPa (r = 0.86) is more significant than at 500 hPa (r = 0.62).
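The non-graphical Mann-Kendall trend test mentioned above reduces to counting concordant versus discordant pairs; a minimal sketch with a hypothetical temperature series:

```python
import numpy as np

def mann_kendall_s(series):
    """Mann-Kendall S statistic: concordant minus discordant pairs.
    S > 0 suggests an increasing trend, S < 0 a decreasing one."""
    series = np.asarray(series, dtype=float)
    s = 0
    for i in range(len(series) - 1):
        s += np.sign(series[i + 1:] - series[i]).sum()
    return int(s)

# Hypothetical 10-year mean-temperature series with a warming trend (deg C)
temps = [14.1, 14.0, 14.3, 14.2, 14.5, 14.4, 14.6, 14.8, 14.7, 15.0]
s_stat = mann_kendall_s(temps)
```

In the full test, S is normalised by its variance to yield a Z score, which is what determines trend significance.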

Keywords: altitude, atmospheric carbon dioxide, latitude, temperature variations

Procedia PDF Downloads 408
6631 Frequency Distribution and Assertive Object Theory: An Exploration of the Late Bronze Age Italian Ceramic Landscape

Authors: Sara Fioretti

Abstract:

In the 2nd millennium BCE, maritime networks became essential to the Mediterranean lifestyle, creating an interconnected world. This phenomenon of interconnected cultures has often been misinterpreted as an “effect” of Mycenaean “influence”, without considering the complexity and role of regional and cross-cultural exchanges. This paper explores the socio-economic relationships, in both cross-cultural and potentially inter-regional settings, present within the archaeological repertoire of the southern Italian Late Bronze Age (LBA, 1600–1140 BCE). The emergence of economic relations within the connectivity of the regional settlements is explored through ceramic contexts found in the case studies of Punta di Zambrone, Broglio di Trebisacce, and Nuraghe Antigori. This work-in-progress research is situated in the shifting theoretical views of the last ten years, which discuss the Late Bronze Age’s connectivity through social networks, entanglement, and assertive objects, combined with a comparative statistical study of ceramic frequency distribution. Applying these theoretical frameworks with a quantitative approach demonstrates the specific regional economic relationships that shaped the cultural interactions of the Late Bronze Age. Through this intersection of theory and statistical analysis, the case studies establish a small percentage of the pottery as imported, whilst assertive productions have a relatively higher quantity; overall, the majority still adheres to regional Italian traditions. We can therefore dissect the rhizomatic relationships cultivated between the Italian coasts and the Mycenaeans, and their roles within their networks. This research offers a new perspective on the connectivity of Late Bronze Age relational structures.
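The frequency-distribution comparison underlying this study amounts to tabulating sherd counts by production class; a sketch with invented counts that mirror the reported pattern (imports a small share, assertive wares somewhat larger, regional wares the majority):

```python
from collections import Counter

# Hypothetical sherd counts by production class at one site
sherds = (["regional Italian"] * 412
          + ["assertive (local imitation)"] * 55
          + ["Aegean import"] * 18)

counts = Counter(sherds)
total = sum(counts.values())
# Percentage share of each production class, rounded to one decimal
shares = {cls: round(100 * n / total, 1) for cls, n in counts.items()}
```

Comparing such shares across sites is what allows the paper to distinguish imported, assertive, and regional production quantitatively.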

Keywords: late bronze age, mediterranean archaeology, exchanges and trade, frequency distribution of ceramic assemblages

Procedia PDF Downloads 41
6630 Optimizing Performance of Tablet's Direct Compression Process Using Fuzzy Goal Programming

Authors: Abbas Al-Refaie

Abstract:

This paper aims to improve the performance of the tableting process using statistical quality control and fuzzy goal programming. The tableting process was studied, and statistical control tools were used to characterize the existing process for three critical responses: the averages of a tablet’s weight, hardness, and thickness. At the initial process factor settings, the estimated process capability index values for the tablet’s average weight, hardness, and thickness were 0.58, 3.36, and 0.88, respectively. The L9 orthogonal array was utilized to provide the experimental design. Fuzzy goal programming was then employed to find the combination of optimal factor settings. Optimization results showed that the process capability index values for the tablet’s average weight, hardness, and thickness were improved to 1.03, 4.42, and 1.42, respectively. Such improvements resulted in significant savings in quality and production costs.
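The process capability indices quoted here follow the standard Cp/Cpk definitions; a minimal sketch with hypothetical tablet weights and specification limits:

```python
import statistics

def process_capability(samples, lsl, usl):
    """Cp (spec width over six sigma) and Cpk (centring-adjusted)
    from the sample mean and standard deviation."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical tablet weights (mg) against specification limits
weights = [248, 251, 250, 249, 252, 250, 247, 251, 250, 249]
cp, cpk = process_capability(weights, lsl=240, usl=260)
```

A Cpk below about 1 (as at the paper's initial settings for weight and thickness) flags an incapable process; values above 1 indicate the optimized state.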

Keywords: fuzzy goal programming, control charts, process capability, tablet optimization

Procedia PDF Downloads 270
6629 Creative Element Analysis of Machinery Creativity Contest Works

Authors: Chin-Pin Chen, Shi-Chi Shiao, Ting-Hao Lin

Abstract:

Industry today faces the rapid global development of new technology and fierce changes in the economic environment, so the industry development trend is gradually shifting away from labor and toward leading industry and academia with innovation and creativity. The machinery industry shows the same trend. Based on the aims of the Creativity White Paper, the Ministry of Education in Taiwan promotes and develops various creativity contests to cope with this industry trend. Domestic students and enterprises have performed well in domestic and international creativity contests in recent years, and winning works must contain important creative elements to stand out among so many entries. A literature review and in-depth interviews with five instructors of award-winning creativity-contest teams were first conducted to conclude 15 machinery creative elements, which were then compared with the creative elements of award-winning machinery works from the past five years to understand the relationship between awarded works and creative elements. The statistical analysis results show that IDEA (Industrial Design Excellence Award) contains the most creative elements among the four major international creativity contests; that is, its creativity review covers creative elements that are comparatively stricter. Concerning the groups participating in creativity contests, enterprises consider more creative elements in their works than the other two groups. In these contest works, the creative elements of “replacement or improvement”, “convenience”, and “modeling” show higher significance. It is expected that these findings can provide domestic colleges and universities with a reference for participating in creativity-related contests in the future.

Keywords: machinery, creative elements, creativity contest, creativity works

Procedia PDF Downloads 442
6628 Spirometric Reference Values in 236,606 Healthy, Non-Smoking Chinese Aged 4–90 Years

Authors: Jiashu Shen

Abstract:

Objectives: Spirometry is a basic reference for health evaluation that is widely used in clinical practice. Previous spirometry references are no longer applicable because of drastic changes in the social and natural circumstances in China, so new reference values for the spirometry of the Chinese population are urgently needed. Method: Spirometric reference values were established using the statistical modeling method Generalized Additive Models for Location, Scale and Shape (GAMLSS) for forced expiratory volume in 1 s (FEV1), forced vital capacity (FVC), FEV1/FVC, and maximal mid-expiratory flow (MMEF). Results: Data from 236,606 healthy non-smokers aged 4–90 years were collected from the MJ Health Check database. Spirometry equations for FEV1, FVC, MMEF, and FEV1/FVC were established, including the predicted values and lower limits of normal (LLNs) by sex. The predictive equations developed for the spirometric results elaborated the relationship between spirometry and age, and they eliminated the effects of height as a variable. Most previous predictive equations for Chinese spirometry significantly overestimated (with mean differences of 22.21% in FEV1 and 31.39% in FVC for males, along with differences of 26.93% in FEV1 and 35.76% in FVC for females) or underestimated (with mean differences of -5.81% in MMEF and -14.56% in FEV1/FVC for males, along with a difference of -14.54% in FEV1/FVC for females) the lung function measurements found in this study. Through cross-validation, our equations were established as having good fit, and the means of the measured and estimated values were compared, with good results. Conclusions: Our study updates the spirometric reference equations for Chinese people of all ages and provides comprehensive values for both physical examination and clinical diagnosis.
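GAMLSS-based spirometry references typically report LMS-type parameters (median, coefficient of variation, skewness) from which the LLN is the 5th percentile of the fitted distribution; a sketch with hypothetical parameter values (the paper's fitted coefficients are not reproduced here):

```python
import math

def lln_from_lms(mu, sigma, nu, z=-1.645):
    """Lower limit of normal from LMS (Box-Cox) parameters: the 5th
    percentile (z = -1.645) of the fitted distribution at a given
    age/height/sex, as used by GAMLSS-type spirometry references."""
    if abs(nu) > 1e-9:
        return mu * (1 + nu * sigma * z) ** (1 / nu)
    return mu * math.exp(sigma * z)  # limiting case as nu -> 0

# Hypothetical fitted parameters for FEV1 (litres) at one age/height/sex
fev1_lln = lln_from_lms(mu=3.2, sigma=0.12, nu=0.9)
```

A measured FEV1 below this LLN would be flagged as abnormal for that reference population.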

Keywords: Chinese, GAMLSS model, reference values, spirometry

Procedia PDF Downloads 136
6627 Numerical Study of the Influence of the Primary Stream Pressure on the Performance of the Ejector Refrigeration System Based on Heat Exchanger Modeling

Authors: Elhameh Narimani, Mikhail Sorin, Philippe Micheau, Hakim Nesreddine

Abstract:

Numerical models of the heat exchangers in an ejector refrigeration system (ERS) were developed and validated against experimental data. The models were based on the switched heat-exchanger model using the moving boundary method and were capable of estimating the zones’ lengths, the outlet temperatures of both sides, and the heat loads at various experimental points. The developed models were utilized to investigate the influence of the primary flow pressure on the performance of an R245fa ERS based on its coefficient of performance (COP) and exergy efficiency. It was illustrated numerically and proved experimentally that increasing the primary flow pressure slightly reduces the COP, while the exergy efficiency goes through a maximum before decreasing.

Keywords: coefficient of performance (COP), ejector refrigeration system (ERS), exergy efficiency, heat exchangers modeling, moving boundary method

Procedia PDF Downloads 202
6626 PWM-Based Control of D-STATCOM for Voltage Sag and Swell Mitigation in Distribution Systems

Authors: A. Assif

Abstract:

This paper presents the modeling of a prototype distribution static compensator (D-STATCOM) for voltage sag and swell mitigation in an unbalanced distribution system. The concept that an inverter can be used as a generalized impedance converter to realize either inductive or capacitive reactance is used here to mitigate power quality issues in distribution networks. The D-STATCOM is intended to replace the widely used Static Var Compensator (SVC). The scheme is based on the Voltage Source Converter (VSC) principle. In this model, a PWM-based control scheme has been implemented to control the electronic valves of the VSC, with a phase-shift control algorithm used for converter control. The D-STATCOM injects a current into the system to mitigate voltage sags. The modeling of the D-STATCOM has been carried out in MATLAB/Simulink. Accordingly, simulations are first carried out to illustrate the use of the D-STATCOM in mitigating voltage sag in a distribution system. Simulation results prove that the D-STATCOM is capable of mitigating voltage sag as well as improving the power quality of a system.

Keywords: D-STATCOM, voltage sag, voltage source converter (VSC), phase shift control

Procedia PDF Downloads 343
6625 The Statistical Significance of Adsorbents for Effective Zn(II) Ions Removal

Authors: Kiurski S. Jelena, Oros B. Ivana, Kecić S. Vesna, Kovačević M. Ilija, Aksentijević M. Snežana

Abstract:

The adsorption efficiency of various adsorbents for the removal of Zn(II) ions from waste printing developer was studied in laboratory batch mode. The maximum adsorption efficiency of 94.1% was achieved with unfired clay pellets (d ≈ 15 mm). The obtained adsorption efficiencies were subjected to the independent-samples t-test in order to investigate statistically significant differences among the investigated adsorbents for the effective removal of Zn(II) ions from the waste printing developer. The most statistically significant differences in adsorption efficiency for Zn(II) ion removal were obtained between unfired clay pellets (d ≈ 15 mm) and activated carbon (|t| = 6.909), natural zeolite (|t| = 10.380), a mixture of activated carbon and natural zeolite (|t| = 9.865), bentonite (|t| = 6.159), fired clay (|t| = 6.641), fired clay pellets of size d ≈ 5 mm (|t| = 6.678), and fired clay pellets of size d ≈ 8 mm (|t| = 3.422), respectively.
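The independent-samples t statistic used for these comparisons can be computed directly; the replicate efficiencies below are hypothetical, chosen only to illustrate the calculation:

```python
import math
import statistics

def independent_t(sample_a, sample_b):
    """Student's independent-samples t statistic (pooled variance)."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = statistics.variance(sample_a), statistics.variance(sample_b)
    pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    diff = statistics.mean(sample_a) - statistics.mean(sample_b)
    return diff / math.sqrt(pooled * (1 / na + 1 / nb))

# Hypothetical replicate removal efficiencies (%) for two adsorbents
unfired_clay = [93.8, 94.3, 94.1, 94.2, 94.0]
zeolite = [88.9, 89.5, 89.1, 89.3, 89.0]
t_stat = independent_t(unfired_clay, zeolite)
```

Large |t| values like those reported in the abstract indicate that the efficiency difference between adsorbents is far beyond sampling noise.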

Keywords: adsorption efficiency, adsorbent, statistical analysis, zinc ion

Procedia PDF Downloads 389
6624 Instant Fire Risk Assessment Using Artificial Neural Networks

Authors: Tolga Barisik, Ali Fuat Guneri, K. Dastan

Abstract:

Major industrial facilities have a high potential for fire risk. In particular, the indices used for the detection of hidden fires are very effective in preventing a fire from becoming dangerous in its initial stage. These indices provide the opportunity to prevent or intervene early by determining the stage of the fire, the potential for hazard, and the type of combustion agent from the percentage values of the ambient air components. In this work, an artificial neural network of the multi-layer perceptron (supervised learning) type will be modeled with the determined input data and trained using the Levenberg-Marquardt algorithm, following modeling methods in the literature. The actual values produced by the indices will be compared with the outputs produced by the network. Using the neural network and the curves created from the resulting values, the feasibility of performance determination will be investigated.

Keywords: artificial neural networks, fire, Graham index, Levenberg-Marquardt algorithm, oxygen decrease percentage index, risk assessment, Trickett index

Procedia PDF Downloads 137
6623 Non-Linear Assessment of Chromatographic Lipophilicity of Selected Steroid Derivatives

Authors: Milica Karadžić, Lidija Jevrić, Sanja Podunavac-Kuzmanović, Strahinja Kovačević, Anamarija Mandić, Aleksandar Oklješa, Andrea Nikolić, Marija Sakač, Katarina Penov Gaši

Abstract:

Using a chemometric approach, the relationships between chromatographic lipophilicity and in silico molecular descriptors for twenty-nine selected steroid derivatives were studied. The chromatographic lipophilicity was predicted using the artificial neural networks (ANNs) method. The most important in silico molecular descriptors were selected by applying stepwise selection (SS) paired with the partial least squares (PLS) method. Molecular descriptors with satisfactory variable importance in projection (VIP) values were selected for ANN modeling. The usefulness of the generated models was confirmed by detailed statistical validation. High agreement between experimental and predicted values indicated that the obtained models have good quality and high predictive ability. Global sensitivity analysis (GSA) confirmed the importance of each molecular descriptor used as an input variable. The high-quality networks indicate a strong non-linear relationship between chromatographic lipophilicity and the in silico molecular descriptors used. By applying the selected molecular descriptors and generated ANNs, a good prediction of the chromatographic lipophilicity of the studied steroid derivatives can be obtained. This article is based upon work from COST Actions CM1306 and CA15222, supported by COST (European Cooperation in Science and Technology).

Keywords: artificial neural networks, chemometrics, global sensitivity analysis, liquid chromatography, steroids

Procedia PDF Downloads 345
6622 Relationship Between Reading Comprehension and Achievement in Science Among Grade Eleven Bilingual Students in a Secondary School, Thailand

Authors: Simon Mauma Efange

Abstract:

The main aims of this research were, first, to describe in correlational terms the relationship, if any, between reading comprehension and academic achievement in science studied at the secondary level and, second, to find out possible trends in gender differences, such as whether boys would perform better than girls or vice versa. This research employed a quantitative design. Two kinds of instruments were employed: the Oxford Online Placement Test, which assesses students' English level quickly and easily, and the Local Assessment System Test. The results of these tests were subjected to statistical analysis using the statistical software SPSS. Statistical tools such as the mean, standard deviation, percentages, frequencies, t-tests, and Pearson’s coefficient of correlation were used for the analysis of the results. Results of the t-test showed that the means are significantly different, with the results statistically significant at p < .05. The value of r (Pearson correlation coefficient) was 0.2868. Although there is technically a positive correlation, the relationship between the variables is weak (the closer the value is to zero, the weaker the relationship). In conclusion, the t-test calculations in SPSS confirmed a relationship between the two variables: high scores in reading are associated with slightly higher scores in science. The research also revealed that a high score in reading comprehension does not necessarily mean a high score in science, or vice versa. Female subjects performed much better than male subjects on both tests, which is in line with the literature reviewed for this research.
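The significance of a Pearson r can be checked with the standard t transformation; r is taken from the abstract, while the sample size below is a hypothetical placeholder (the abstract does not report n):

```python
import math

def t_from_r(r, n):
    """t statistic for testing whether a Pearson correlation
    differs from zero, with n - 2 degrees of freedom."""
    return r * math.sqrt((n - 2) / (1 - r ** 2))

# r as reported in the abstract; n is an assumed sample size
r, n = 0.2868, 120
t_stat = t_from_r(r, n)
```

Even a weak correlation like r = 0.2868 can be statistically significant at p < .05 when the sample is reasonably large, which is consistent with the abstract's conclusion.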

Keywords: achievement in science, achievement in English, bilingual students, relationship

Procedia PDF Downloads 48
6621 Energy Consumption Modeling for Strawberry Greenhouse Crop by Adaptive Neuro-Fuzzy Inference System Technique: A Case Study in Iran

Authors: Azar Khodabakhshi, Elham Bolandnazar

Abstract:

Agriculture, the most important food-producing sector, is not only an energy consumer but also an energy supplier. Energy use is a helpful parameter for analyzing and evaluating agricultural sustainability. In this study, the pattern of energy consumption of strawberry greenhouses in Jiroft, Kerman province, Iran, was surveyed. The total input energy required for strawberry production was calculated as 113314.71 MJ/ha. Electricity, with a 38.34% share of the total energy, was the largest energy consumer in strawberry production. Neuro-fuzzy networks were used to model strawberry yield. Results showed that the best model for predicting strawberry yield had a correlation coefficient, root mean square error (RMSE), and mean absolute percentage error (MAPE) of 0.9849, 0.0154 kg/ha, and 0.11%, respectively. Given these results, the neuro-fuzzy method can predict and model strawberry crop yield well.
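The three goodness-of-fit measures reported for the ANFIS model (r, RMSE, MAPE) can be sketched as follows; the observed/predicted yield values are illustrative placeholders, not data from the Jiroft survey.

```python
import math

def fit_metrics(actual, predicted):
    """Return (r, RMSE, MAPE in %) for paired observed/predicted values:
    the three goodness-of-fit measures quoted in the abstract."""
    n = len(actual)
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    mape = 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / n
    ma, mp = sum(actual) / n, sum(predicted) / n
    num = sum((a - ma) * (p - mp) for a, p in zip(actual, predicted))
    den = math.sqrt(sum((a - ma) ** 2 for a in actual)
                    * sum((p - mp) ** 2 for p in predicted))
    return num / den, rmse, mape

# Hypothetical yield values (kg/ha), for illustration only
obs = [30.0, 28.5, 31.2, 29.8]
pred = [29.6, 28.9, 31.0, 30.1]
r, rmse, mape = fit_metrics(obs, pred)
```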

Keywords: crop yield, energy, neuro-fuzzy method, strawberry

Procedia PDF Downloads 381
6620 Modeling of Oxygen Supply Profiles in Stirred-Tank Aggregated Stem Cells Cultivation Process

Authors: Vytautas Galvanauskas, Vykantas Grincas, Rimvydas Simutis

Abstract:

This paper investigates a possible practical solution for reasonable oxygen supply during the pluripotent stem cells expansion processes, where the stem cells propagate as aggregates in stirred-suspension bioreactors. Low glucose and low oxygen concentrations are preferred for efficient proliferation of pluripotent stem cells. However, strong oxygen limitation, especially inside of cell aggregates, can lead to cell starvation and death. In this research, the oxygen concentration profile inside of stem cell aggregates in a stem cell expansion process was predicted using a modified oxygen diffusion model. This profile can be realized during the stem cells cultivation process by manipulating the oxygen concentration in inlet gas or inlet gas flow. The proposed approach is relatively simple and may be attractive for installation in a real pluripotent stem cell expansion processes.
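The authors' modified diffusion model is not given in the abstract, but the classical baseline it builds on is steady-state diffusion with zeroth-order oxygen consumption in a sphere, C(r) = C_s - Q(R^2 - r^2)/(6D). A minimal sketch of that textbook profile, with purely illustrative parameter values:

```python
import numpy as np

def oxygen_profile(C_s, Q, D, R, n=50):
    """Steady-state O2 concentration across a spherical aggregate, assuming
    zeroth-order consumption rate Q (mol m^-3 s^-1), diffusivity D (m^2 s^-1),
    radius R (m) and surface concentration C_s (mol m^-3):
        C(r) = C_s - Q * (R^2 - r^2) / (6 D)
    Values are clipped at zero where consumption outstrips supply."""
    r = np.linspace(0.0, R, n)
    C = C_s - Q * (R ** 2 - r ** 2) / (6.0 * D)
    return r, np.clip(C, 0.0, None)

# Illustrative (not measured) parameters for a ~400 um diameter aggregate
r, C = oxygen_profile(C_s=0.2, Q=1e-2, D=3e-9, R=200e-6)
```

Lowering C_s (i.e., the inlet-gas oxygen fraction) shifts the whole profile down, which is exactly the manipulation the abstract proposes.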

Keywords: aggregated stem cells, dissolved oxygen profiles, modeling, stirred-tank, 3D expansion

Procedia PDF Downloads 305
6619 Solid-Liquid-Solid Interface of Yakam Matrix: Mathematical Modeling of the Contact Between an Aircraft Landing Gear and a Wet Pavement

Authors: Trudon Kabangu Mpinga, Ruth Mutala, Shaloom Mbambu, Yvette Kalubi Kashama, Kabeya Mukeba Yakasham

Abstract:

A mathematical model is developed to describe the contact dynamics between the landing gear wheels of an aircraft and a wet pavement during landing. The model is based on nonlinear partial differential equations, using the Yakam Matrix to account for the interaction between solid, liquid, and solid phases. This framework incorporates the influence of environmental factors, particularly water or rain on the runway, on braking performance and aircraft stability. Given the absence of exact analytical solutions, our approach enhances the understanding of key physical phenomena, including Coulomb friction forces, hydrodynamic effects, and the deformation of the pavement under the aircraft's load. Additionally, the dynamics of aquaplaning are simulated numerically to estimate the braking performance limits on wet surfaces, thereby contributing to strategies aimed at minimizing risk during landing on wet runways.

Keywords: aircraft, modeling, simulation, yakam matrix, contact, wet runway

Procedia PDF Downloads 8
6618 Space Telemetry Anomaly Detection Based On Statistical PCA Algorithm

Authors: Bassem Nassar, Wessam Hussein, Medhat Mokhtar

Abstract:

The crucial concern of satellite operations is to ensure the health and safety of satellites. The worst case in this perspective is probably the loss of a mission, but the more common interruption of satellite functionality can still compromise mission objectives. All data acquired from the spacecraft are known as telemetry (TM), which contains a wealth of information related to the health of all its subsystems. Each item of information is contained in a telemetry parameter, which represents a time-variant property (i.e., a status or a measurement) to be checked. As a consequence, TM monitoring systems are continuously improved in order to reduce the time required to respond to changes in a satellite's state of health; a fast assessment of the current state of the satellite is very important for responding to occurring failures. Statistical multivariate latent techniques are among the vital learning tools used to tackle this problem coherently, but information extraction from such rich data sources using advanced statistical methodologies is challenging due to the massive volume of data. To address this, we present an unsupervised learning algorithm based on the Principal Component Analysis (PCA) technique, applied to an actual remote sensing spacecraft. Data from the Attitude Determination and Control System (ADCS) were acquired under two operating conditions: normal and faulty states. Models were built and tested under these conditions, and the results show that the algorithm successfully differentiates between them. Furthermore, the algorithm provides useful predictive information and adds insight and physical interpretation to ADCS operation.
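A common way to cast PCA as an anomaly detector, which may correspond to what the authors do, is to fit the components on normal-state telemetry and score new samples with Hotelling's T^2 in the retained subspace. A minimal numpy sketch on synthetic stand-in data (the ADCS telemetry itself is not available here):

```python
import numpy as np

def pca_t2(train, test, k=2):
    """Fit PCA on normal-state telemetry (rows = samples, columns = TM
    parameters) and return Hotelling's T^2 of each test sample in the
    k-component subspace; large values flag anomalous states."""
    mu = train.mean(axis=0)
    U, s, Vt = np.linalg.svd(train - mu, full_matrices=False)
    P = Vt[:k].T                             # principal loadings
    var = (s[:k] ** 2) / (len(train) - 1)    # variance of each score
    T = (test - mu) @ P                      # project test samples
    return np.sum(T ** 2 / var, axis=1)

# Synthetic stand-in for ADCS telemetry: parameter 0 dominates the variance
rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(200, 6))
normal[:, 0] *= 3.0
faulty = normal[:5] + np.array([9.0, 0, 0, 0, 0, 0])  # bias fault on one sensor
t2_ok, t2_bad = pca_t2(normal, normal[:5]), pca_t2(normal, faulty)
```

A threshold on T^2 (e.g., an F-distribution quantile) would then separate the normal and faulty operating conditions.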

Keywords: space telemetry monitoring, multivariate analysis, PCA algorithm, space operations

Procedia PDF Downloads 415
6617 Physical Modeling of Woodwind Ancient Greek Musical Instruments: The Case of Plagiaulos

Authors: Dimitra Marini, Konstantinos Bakogiannis, Spyros Polychronopoulos, Georgios Kouroupetroglou

Abstract:

Archaeomusicology cannot depend entirely on the study of excavated ancient musical instruments, as their condition is most often not ideal (i.e., missing or eroded parts) and because of the risk of damaging the originals during experiments. To overcome these obstacles, researchers build replicas. This technique is still the most popular one, although it is rather expensive and time-consuming. Throughout the last decades, the development of physical modeling techniques has provided tools that enable the study of musical instruments through digitally simulated models. This is not only more cost- and time-efficient but also more flexible, as the user can easily modify parameters such as geometrical features and materials. This paper thoroughly describes the steps to create a physical model of a woodwind ancient Greek instrument, the Plagiaulos. This instrument can be considered an ancestor of the modern flute due to their common geometry and air-jet excitation mechanism. The Plagiaulos comprises a single resonator with an open end and a number of tone holes; combinations of closed and open tone holes produce the pitch variations. In this work, the effects of all the instrument's components are described by means of physics and then simulated based on digital waveguides. The synthesized sound of the proposed model complies with the theory, highlighting its validity. Further, the synthesized sound of the model simulating the Plagiaulos of Koile (2nd century BCE) was compared with its replica, built in our laboratory following the scientific methodologies of archaeomusicology. These results verify that robust dynamic digital tools can be introduced in the field of computational, experimental archaeomusicology.
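The authors' full waveguide model of the Plagiaulos is far richer than can be shown here, but the core idea of digital waveguide synthesis, a delay line whose length sets the pitch and a lossy reflection filter at its end, can be sketched in a few lines (a Karplus-Strong-style simplification, not the paper's model):

```python
import numpy as np

def waveguide_tube(freq, sr=44100, dur=0.5, loss=0.995, seed=0):
    """Minimal single-delay-line waveguide: a noise burst (air-jet excitation)
    circulates in a delay line whose length equals one period of the target
    pitch; a two-point average acts as a lossy low-pass reflection filter."""
    N = int(sr / freq)                          # delay-line length = one period
    rng = np.random.default_rng(seed)
    line = rng.uniform(-1.0, 1.0, N)            # initial excitation
    out = np.empty(int(sr * dur))
    for n in range(len(out)):
        out[n] = line[0]
        new = loss * 0.5 * (line[0] + line[1])  # reflection + loss
        line = np.roll(line, -1)
        line[-1] = new
    return out

tone = waveguide_tube(440.0)   # 0.5 s of a decaying 440 Hz tone
```

Opening a tone hole in a real flute model shortens the effective delay line, which is how pitch variations would be produced.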

Keywords: archaeomusicology, digital waveguides, musical acoustics, physical modeling

Procedia PDF Downloads 113
6616 Focus-Latent Dirichlet Allocation for Aspect-Level Opinion Mining

Authors: Mohsen Farhadloo, Majid Farhadloo

Abstract:

Aspect-level opinion mining, which aims at discovering aspects (aspect identification) and their corresponding ratings (sentiment identification) from customer reviews, has increasingly attracted the attention of researchers and practitioners, as it provides valuable insights about products and services from the customer's point of view. Instead of addressing aspect identification and sentiment identification in two separate steps, it is possible to identify both simultaneously, and in recent years many graphical models based on Latent Dirichlet Allocation (LDA) have been proposed to solve both identifications in a single step. Although LDA models have been effective tools for the statistical analysis of document collections, they have shortcomings in addressing some unique characteristics of opinion mining. Our goal in this paper is to address one of the limitations of topic models to date: they fail to directly model the associations among topics, even though in many text corpora it is natural to expect that subsets of the latent topics have higher probabilities. We propose a probabilistic graphical model called focus-LDA to better capture the associations among topics when applied to aspect-level opinion mining. Our experiments on real-life data sets demonstrate the improved effectiveness of the focus-LDA model in terms of the accuracy of the predictive distributions over held-out documents. Furthermore, we demonstrate qualitatively that the focus-LDA topic model provides a natural way of visualizing and exploring unstructured collections of textual data.
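The focus-LDA model itself is not specified in the abstract, but the baseline it extends, plain LDA, is standard and can be fit with a collapsed Gibbs sampler. A compact sketch (token ids stand in for review words; hyperparameters are illustrative):

```python
import numpy as np

def lda_gibbs(docs, n_topics, n_vocab, n_iter=100, alpha=0.1, beta=0.01, seed=0):
    """Collapsed Gibbs sampler for plain LDA, the baseline focus-LDA extends.
    docs is a list of token-id lists; returns doc-topic and topic-word counts."""
    rng = np.random.default_rng(seed)
    z = [rng.integers(n_topics, size=len(doc)) for doc in docs]
    ndk = np.zeros((len(docs), n_topics))   # doc-topic counts
    nkw = np.zeros((n_topics, n_vocab))     # topic-word counts
    nk = np.zeros(n_topics)                 # topic totals
    for d, doc in enumerate(docs):          # initialise counts
        for i, w in enumerate(doc):
            k = z[d][i]
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    for _ in range(n_iter):                 # resample each token's topic
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + n_vocab * beta)
                k = rng.choice(n_topics, p=p / p.sum())
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return ndk, nkw
```

Focus-LDA would modify the topic prior so that associations among topics are modeled directly rather than assumed independent.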

Keywords: aspect-level opinion mining, document modeling, Latent Dirichlet Allocation, LDA, sentiment analysis

Procedia PDF Downloads 94
6615 Assessing the Effectiveness of Machine Learning Algorithms for Cyber Threat Intelligence Discovery from the Darknet

Authors: Azene Zenebe

Abstract:

Deep learning is a subset of machine learning that constructs artificial neural networks and has been found useful for modeling complex problems with large datasets, but it requires very high computational power and long training times. By aggregating computing power, high-performance computing (HPC) has emerged as an approach to solving advanced problems and performing data-driven research activities. Cyber threat intelligence (CTI) is actionable information or insight that an organization or individual uses to understand the threats that have targeted, will target, or are currently targeting the organization. Results of a literature review will be presented along with results of an experimental study that compares the performance of tree-based and function-based machine learning, including deep learning algorithms, using a secondary dataset collected from the darknet.
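The tree-based vs. function-based distinction the study draws can be illustrated with two minimal learners on synthetic two-class data; this is a toy sketch with made-up features, not the darknet dataset or the authors' models.

```python
import numpy as np

def logistic_fit(X, y, lr=0.1, iters=500):
    """Function-based learner: logistic regression trained by gradient descent."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # add bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return lambda Xn: (np.hstack([Xn, np.ones((len(Xn), 1))]) @ w > 0).astype(int)

def stump_fit(X, y):
    """Tree-based learner: a one-split decision stump chosen by training accuracy."""
    best_acc, best = -1.0, None
    for j in range(X.shape[1]):                 # candidate feature
        for t in X[:, j]:                       # candidate threshold
            for pol in (0, 1):                  # candidate polarity
                acc = np.mean((((X[:, j] > t).astype(int) ^ pol) == y))
                if acc > best_acc:
                    best_acc, best = acc, (j, t, pol)
    j, t, pol = best
    return lambda Xn: ((Xn[:, j] > t).astype(int) ^ pol)

# Synthetic two-class data standing in for darknet-derived features
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
y = np.r_[np.zeros(100), np.ones(100)].astype(int)
acc_lr = np.mean(logistic_fit(X, y)(X) == y)
acc_tree = np.mean(stump_fit(X, y)(X) == y)
```

Real comparisons would of course use full trees/forests and deep networks with held-out evaluation, as the study describes.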

Keywords: deep-learning, cyber security, cyber threat modeling, tree-based machine learning, function-based machine learning, data science

Procedia PDF Downloads 154
6614 Application of Causal Inference and Discovery in Curriculum Evaluation and Continuous Improvement

Authors: Lunliang Zhong, Bin Duan

Abstract:

The undergraduate graduation project is a vital part of the higher education curriculum, crucial for engineering accreditation. Current evaluations often summarize data without identifying underlying issues. This study applies the Peter-Clark algorithm to analyze causal relationships within the graduation project data of an Electronics and Information Engineering program, creating a causal model. Structural equation modeling confirmed the model's validity. The analysis reveals key teaching stages affecting project success, uncovering problems in the process. Introducing causal discovery and inference into project evaluation helps identify issues and propose targeted improvement measures. The effectiveness of these measures is validated by comparing the learning outcomes of two student cohorts, stratified by confounding factors, leading to improved teaching quality.
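The Peter-Clark algorithm prunes edges by testing conditional independence; for continuous variables this is commonly done with partial correlation. A minimal sketch of that test on a toy causal chain (synthetic data, not the program's graduation-project records):

```python
import numpy as np

def partial_corr(x, y, Z):
    """Partial correlation of x and y given the columns of Z: the
    conditional-independence measure at the heart of the PC (Peter-Clark)
    skeleton search. Z may have zero columns (plain correlation)."""
    def resid(v):
        A = np.column_stack([Z, np.ones(len(v))])
        beta, *_ = np.linalg.lstsq(A, v, rcond=None)
        return v - A @ beta
    rx, ry = resid(x), resid(y)
    return float(rx @ ry / np.sqrt((rx @ rx) * (ry @ ry)))

# Toy chain X -> M -> Y: X and Y correlate, but not once M is conditioned on
rng = np.random.default_rng(0)
x = rng.normal(size=500)
m = x + 0.1 * rng.normal(size=500)
y = m + 0.1 * rng.normal(size=500)
r_xy = partial_corr(x, y, np.empty((500, 0)))   # strong marginal dependence
r_xy_m = partial_corr(x, y, m.reshape(-1, 1))   # near zero: PC removes edge X-Y
```

Iterating such tests over all variable pairs and conditioning sets yields the causal skeleton that the study then validates with structural equation modeling.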

Keywords: causal discovery, causal inference, continuous improvement, Peter-Clark algorithm, structural equation modeling

Procedia PDF Downloads 18
6613 Application of Seasonal Autoregressive Integrated Moving Average Model for Forecasting Monthly Flows in Waterval River, South Africa

Authors: Kassahun Birhanu Tadesse, Megersa Olumana Dinka

Abstract:

Reliable future river flow information is basic for the planning and management of any river system. For a data-scarce river system with only river flow records, like the Waterval River, univariate time series models are appropriate for river flow forecasting. In this study, a univariate Seasonal Autoregressive Integrated Moving Average (SARIMA) model was applied to forecasting Waterval River flow using the GRETL statistical software. Mean monthly river flows from 1960 to 2016 were used for modeling. Different unit root tests and Mann-Kendall trend analysis were performed to test the stationarity of the observed flow time series, and the series was seasonally differenced to remove the seasonality. Using the correlogram of the seasonally differenced series, different SARIMA models were identified, their parameters were estimated, and diagnostic checking of the models was performed using white noise and heteroscedasticity tests. Finally, based on the minimum Akaike Information Criterion (AIC) and Hannan-Quinn Criterion (HQC), SARIMA (3, 0, 2) x (3, 1, 3)12 was selected as the best model for Waterval River flow forecasting. This model can therefore be used to generate future river flow information for water resources development and management in the Waterval River system. The SARIMA approach can also be used for forecasting other similar univariate time series with seasonal characteristics.
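Two steps of the workflow, seasonal (lag-12) differencing and AIC-based comparison of candidate orders, can be sketched in numpy; the study itself used GRETL, and a full SARIMA fit would use a dedicated package (e.g., statsmodels' SARIMAX). The AR fit below is a simplified least-squares stand-in for illustration.

```python
import numpy as np

def seasonal_difference(y, s=12):
    """Lag-s differencing (the D=1 step of a SARIMA model) removes a
    repeating seasonal pattern of period s."""
    return y[s:] - y[:-s]

def fit_ar_aic(y, p):
    """Least-squares AR(p) fit; returns (coefficients, AIC) so candidate
    orders can be compared, as in SARIMA model selection."""
    X = np.column_stack([y[p - i - 1:len(y) - i - 1] for i in range(p)])
    t = y[p:]
    coef, *_ = np.linalg.lstsq(X, t, rcond=None)
    rss = np.sum((t - X @ coef) ** 2)
    n = len(t)
    aic = n * np.log(rss / n) + 2 * (p + 1)
    return coef, aic

# Demo: lag-12 differencing removes a pure annual cycle exactly
monthly = np.tile(np.arange(12.0), 5)
d = seasonal_difference(monthly)
```

Competing (p, d, q)(P, D, Q)12 candidates are then ranked by AIC/HQC, and the minimum-criterion model is retained.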

Keywords: heteroscedasticity, stationarity test, trend analysis, validation, white noise

Procedia PDF Downloads 205
6612 Social Networks and Social Complexity: The Southern Italian Drive for Trade Exchange during the Late Bronze Age

Authors: Sara Fioretti

Abstract:

During the Middle Bronze Age, southern Italy underwent a reorganisation of social structures in which local cultures, such as the sub-Apennine and Nuragic, flourished and participated in maritime trade. This paper explores the socio-economic relationships, in both cross-cultural and potentially inter-regional settings, present within the archaeological repertoire of the southern Italian Late Bronze Age (LBA, 1600-1050 BCE). The emergence of economic relations within the connectivity of the regional settlements is explored through ceramic contexts found at the case-study sites of Punta di Zambrone, Broglio di Trebisacce, and Nuraghe Antigori. The paper discusses the findings of an ongoing statistical and theoretical study in relation to the characterisation of the Late Bronze Age Mediterranean as dominated by Mycenaean influence. It engages with a theoretical bricolage of Social Network Theory, Entanglement, and Assertive Objects Theory to address the selective and assertive dynamics evident in cross-cultural trade exchanges, as well as to consider inter-regional dynamics. Through this intersection of theory and statistical analysis, the case studies establish that only a small percentage of the pottery was imported, while assertive productions occur in relatively higher quantities and the majority still adheres to regional Italian traditions. We can therefore dissect the rhizomatic relationships cultivated between the Italian coasts and the Mycenaeans, and their roles within their networks. This research offers a new perspective on the complex nature of Late Bronze Age relational structures.

Keywords: late bronze age, mediterranean archaeology, exchanges and trade, frequency distribution of ceramic assemblages, social network theory, rhizomatic exchanges

Procedia PDF Downloads 47
6611 Time Series Simulation by Conditional Generative Adversarial Net

Authors: Rao Fu, Jie Chen, Shutian Zeng, Yiping Zhuang, Agus Sudjianto

Abstract:

Generative Adversarial Net (GAN) has proved to be a powerful machine learning tool in image data analysis and generation. In this paper, we propose to use Conditional Generative Adversarial Net (CGAN) to learn and simulate time series data. The conditions include both categorical and continuous variables with different auxiliary information. Our simulation studies show that CGAN has the capability to learn different types of normal and heavy-tailed distributions, as well as dependent structures of different time series. It also has the capability to generate conditional predictive distributions consistent with training data distributions. We also provide an in-depth discussion on the rationale behind GAN and the neural networks as hierarchical splines to establish a clear connection with existing statistical methods of distribution generation. In practice, CGAN has a wide range of applications in market risk and counterparty risk analysis: it can be applied to learn historical data and generate scenarios for the calculation of Value-at-Risk (VaR) and Expected Shortfall (ES), and it can also predict the movement of the market risk factors. We present a real data analysis including a backtesting to demonstrate that CGAN can outperform Historical Simulation (HS), a popular method in market risk analysis to calculate VaR. CGAN can also be applied in economic time series modeling and forecasting. In this regard, we have included an example of hypothetical shock analysis for economic models and the generation of potential CCAR scenarios by CGAN at the end of the paper.
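A full CGAN is beyond a short sketch, but the downstream risk calculation the abstract describes, computing VaR and Expected Shortfall from a vector of scenarios (whether drawn from a trained CGAN or from Historical Simulation), is compact. The P&L values below are a toy ladder, not market data.

```python
import numpy as np

def var_es(pnl, level=0.99):
    """Value-at-Risk and Expected Shortfall at the given confidence level
    from profit-and-loss scenarios; losses are reported as positive numbers."""
    losses = -np.asarray(pnl, dtype=float)
    var = np.quantile(losses, level)         # loss quantile = VaR
    es = losses[losses >= var].mean()        # mean tail loss = ES
    return var, es

# Scenarios as produced by Historical Simulation or sampled from a CGAN
pnl = -np.arange(1.0, 101.0)   # toy loss ladder: losses 1..100
var95, es95 = var_es(pnl, level=0.95)
```

Backtesting then compares how often realized losses exceed the VaR produced by each scenario generator, which is the comparison the paper reports between CGAN and Historical Simulation.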

Keywords: conditional generative adversarial net, market and credit risk management, neural network, time series

Procedia PDF Downloads 143