Search results for: multiple linear regression model
22722 Relation between Sensory Processing Patterns and Working Memory in Autistic Children
Authors: Abbas Nesayan
Abstract:
Background: In recent years, autism has received growing attention in both public and research arenas. Autistic children show dysfunction in communication and socialization, as well as repetitive and stereotyped behaviors. In addition, they clinically suffer from difficulty with attention, challenges with familiar behaviors, and sensory processing problems. Several variables are linked to sensory processing problems in autism; one of these variables is working memory. Working memory is the part of executive function that provides the ability needed to complete multi-stage tasks. Method: This study used a correlational design. After entry criteria were determined, 50 children were selected through purposive sampling. Dunn’s Sensory Profile School Companion was used to assess sensory processing patterns; the Behavior Rating Inventory of Executive Function (BRIEF) was used to assess working memory. The Pearson correlation coefficient and linear regression were used for data analysis. Results: The results showed significant relationships between sensory processing patterns (low registration, sensory seeking, sensory sensitivity, and sensory avoiding) and working memory in autistic children. Conclusion: According to the findings, there is a significant relationship between sensory processing patterns and working memory. Thus, interventions based on sensory processing could be used to improve working memory.
Keywords: sensory processing patterns, working memory, autism, autistic children
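As an illustrative sketch of the analysis described above, the snippet below computes a Pearson correlation coefficient and fits a simple linear regression with NumPy. All scores are fabricated for the example and are not the study's data.

```python
import numpy as np

# Hypothetical illustration only: fabricated scores, not the study's data.
# Higher BRIEF working-memory scores indicate greater impairment.
sensory_sensitivity = np.array([12, 18, 25, 31, 36, 42, 47, 55], dtype=float)
working_memory = np.array([20, 24, 31, 35, 41, 44, 52, 58], dtype=float)

# Pearson correlation coefficient between the two measures
r = np.corrcoef(sensory_sensitivity, working_memory)[0, 1]

# Simple linear regression: working_memory ~ sensory_sensitivity
slope, intercept = np.polyfit(sensory_sensitivity, working_memory, deg=1)
predicted = slope * sensory_sensitivity + intercept

print(f"Pearson r = {r:.3f}")
print(f"working_memory ~ {slope:.2f} * sensory_sensitivity + {intercept:.2f}")
```

In a real analysis the significance of r would also be tested (e.g., with a t-test on the correlation), which this sketch omits.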
Procedia PDF Downloads 223
22721 Resistance and Sub-Resistances of RC Beams Subjected to Multiple Failure Modes
Authors: F. Sangiorgio, J. Silfwerbrand, G. Mancini
Abstract:
Geometric and mechanical properties all influence the resistance of RC structures and may, in certain combinations of property values, increase the risk of a brittle failure of the whole system. This paper presents a statistical and probabilistic investigation of the resistance of RC beams designed according to Eurocodes 2 and 8 and subjected to multiple failure modes, under both the natural variation of material properties and the uncertainty associated with cross-section and transverse reinforcement geometry. A full probabilistic model based on the JCSS Probabilistic Model Code is derived. Different beams are studied through material nonlinear analysis via Monte Carlo simulations. The resistance model is consistent with Eurocode 2. Both a multivariate statistical evaluation and a data clustering analysis of the outcomes are then performed. Results show that the ultimate load behaviour of RC beams subjected to flexural and shear failure modes seems to be mainly influenced by the combination of the mechanical properties of both the longitudinal reinforcement and the stirrups, and the tensile strength of concrete, of which the latter appears to affect the overall response of the system in a nonlinear way. The model uncertainty of the resistance model used in the analysis undoubtedly plays an important role in interpreting the results.
Keywords: modelling, Monte Carlo simulations, probabilistic models, data clustering, reinforced concrete members, structural design
Procedia PDF Downloads 472
22720 Developing an Advanced Algorithm Capable of Classifying News, Articles and Other Textual Documents Using Text Mining Techniques
Authors: R. B. Knudsen, O. T. Rasmussen, R. A. Alphinas
Abstract:
The reason for conducting this research is to develop an algorithm that is capable of classifying news articles from the automobile industry, according to the competitive actions that they entail, with the use of Text Mining (TM) methods. The research also tests how to properly preprocess the data by preparing pipelines that best fit each algorithm. The pipelines are tested along with nine different classification algorithms in the realms of regression, support vector machines, and neural networks. Preliminary testing to identify the optimal pipelines and algorithms resulted in the selection of two algorithms with two different pipelines. The two algorithms are Logistic Regression (LR) and Artificial Neural Network (ANN). These algorithms are optimized further, where several parameters of each algorithm are tested. The best result is achieved with the ANN. The final model yields an accuracy of 0.79, a precision of 0.80, a recall of 0.78, and an F1 score of 0.76. By removing three of the classes that created noise, the final algorithm is capable of reaching an accuracy of 94%.
Keywords: artificial neural network, competitive dynamics, logistic regression, text classification, text mining
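The classification approach described above can be sketched in miniature as a bag-of-words pipeline feeding a logistic regression classifier trained by gradient descent. The headlines, the "competitive action" labels (1 = pricing action, 0 = other), and all training settings below are invented for illustration; the study's actual pipelines and data differ.

```python
import math
from collections import Counter

# Invented toy corpus: headlines labeled by whether they describe a pricing action.
docs = [
    ("automaker cuts prices on new sedan", 1),
    ("brand slashes cost of electric model", 1),
    ("discount announced for flagship suv", 1),
    ("company opens new assembly plant", 0),
    ("manufacturer hires thousands of workers", 0),
    ("firm unveils concept car at show", 0),
]

# Bag-of-words vectorization over the corpus vocabulary
vocab = sorted({w for text, _ in docs for w in text.split()})

def vectorize(text):
    counts = Counter(text.split())
    return [float(counts[w]) for w in vocab]

X = [vectorize(text) for text, _ in docs]
y = [label for _, label in docs]

# Logistic regression trained by per-sample gradient descent
w = [0.0] * len(vocab)
b = 0.0
lr = 0.5
for _ in range(300):
    for xi, yi in zip(X, y):
        z = sum(wj * xj for wj, xj in zip(w, xi)) + b
        p = 1.0 / (1.0 + math.exp(-z))
        err = p - yi
        w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
        b -= lr * err

def predict(text):
    xi = vectorize(text)
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 if z > 0 else 0

print([predict(text) for text, _ in docs])  # should recover the training labels
```

A production pipeline would add held-out evaluation, TF-IDF weighting, and regularization, which this sketch omits.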
Procedia PDF Downloads 121
22719 Machine Learning Techniques in Seismic Risk Assessment of Structures
Authors: Farid Khosravikia, Patricia Clayton
Abstract:
The main objective of this work is to evaluate the advantages and disadvantages of various machine learning techniques in two key steps of seismic hazard and risk assessment of different types of structures. The first step is the development of ground-motion models, which are used for forecasting ground-motion intensity measures (IMs) given source characteristics, source-to-site distance, and local site conditions for future events. IMs such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques as statistical methods in ground motion prediction, such as Artificial Neural Networks, Random Forests, and Support Vector Machines. The results indicate the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data is available, all the alternative algorithms tend to provide more accurate estimates compared to the conventional linear regression-based method; in particular, Random Forest outperforms the other algorithms. However, the conventional method is a better tool when limited data is available. Second, it is investigated how machine learning techniques could be beneficial for developing probabilistic seismic demand models (PSDMs), which provide the relationship between the structural demand responses (e.g., component deformations, accelerations, internal forces, etc.) and the ground motion IMs. In the risk framework, such models are used to develop fragility curves estimating the probability of exceeding pre-defined damage limit states, and they therefore control the reliability of the predictions in the risk assessment. In this study, machine learning algorithms such as artificial neural networks, random forests, and support vector machines are adopted and trained on the demand parameters to derive PSDMs. It is observed that such models can provide more accurate predictions in a relatively shorter amount of time compared to conventional methods. Moreover, they can be used for sensitivity analysis of fragility curves with respect to many modeling parameters without necessarily requiring more intensive numerical response-history analyses.
Keywords: artificial neural network, machine learning, random forest, seismic risk analysis, seismic hazard analysis, support vector machine
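The point about linear regression failing to capture nonlinear behavior can be illustrated with a toy comparison: a linear-in-distance fit versus a simple nonparametric (k-nearest-neighbour) regression on a synthetic attenuation curve. The curve and noise level below are invented, not a calibrated ground-motion model, and the comparison is in-sample, purely for illustration.

```python
import numpy as np

# Synthetic sketch: ln(PGA) decays nonlinearly with source-to-site distance.
rng = np.random.default_rng(0)
distance = np.linspace(1.0, 200.0, 400)
ln_pga = -1.5 * np.log(distance) - 0.004 * distance + rng.normal(0, 0.05, distance.size)

# Pre-defined linear-in-distance model: ln(PGA) = a * distance + b
a, b = np.polyfit(distance, ln_pga, deg=1)
linear_pred = a * distance + b

# Simple nonparametric alternative: 5-nearest-neighbour regression
def knn_predict(x0, k=5):
    idx = np.argsort(np.abs(distance - x0))[:k]
    return ln_pga[idx].mean()

knn_pred = np.array([knn_predict(x0) for x0 in distance])

mse_linear = np.mean((ln_pga - linear_pred) ** 2)
mse_knn = np.mean((ln_pga - knn_pred) ** 2)
print(f"linear MSE = {mse_linear:.4f}, 5-NN MSE = {mse_knn:.4f}")
```

The flexible model tracks the curvature the straight line cannot, mirroring the abstract's observation that nonlinear learners outperform fixed linear forms when data are plentiful.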
Procedia PDF Downloads 106
22718 Investigating the Effect of Study Plan and Homework on Student's Performance by Using Web Based Learning MyMathLab
Authors: Mohamed Chabi, Mahmoud I. Syam, Sarah Aw
Abstract:
In Summer 2012, the Foundation Program Unit of Qatar University started implementing new ways of teaching math by introducing MyMathLab (MML) as an innovative interactive tool to support standard teaching. In this paper, we focus on the effect of proper use of the Study Plan component of MML on students’ performance. The authors investigated the results of students in a pre-calculus course during Fall 2013 in the Foundation Program at Qatar University. The results showed that there is a strong correlation between study plan results and final exam results, as well as a strong relation between homework results and final exam results. In addition, average attendance affected students’ results in general. A multiple regression was estimated with passing rate as the dependent variable and study plan and homework results as independent variables.
Keywords: MyMathLab, study plan, assessment, homework, attendance, correlation, regression
Procedia PDF Downloads 419
22717 Taylor’s Law and Relationship between Life Expectancy at Birth and Variance in Age at Death in Period Life Table
Authors: David A. Swanson, Lucky M. Tedrow
Abstract:
Taylor’s Law is a widely observed empirical pattern that relates variances to means in sets of non-negative measurements via an approximate power function, and it has found application to human mortality. This study adds to this research by showing that Taylor’s Law leads to a model that reasonably describes the relationship between life expectancy at birth (e0, which is also equal to mean age at death in a life table) and variance in age at death in seven World Bank regional life tables measured at two points in time, 1970 and 2000. Using as a benchmark a non-random sample of four Japanese female life tables covering the period from 1950 to 2004, the study finds that the simple linear model provides reasonably accurate estimates of variance in age at death in a life table from e0, where the latter ranges from 60.9 to 85.59 years. Employing 2017 life tables from the Human Mortality Database, the simple linear model is used to provide estimates of variance in age at death for six countries, three of which have high e0 values and three of which have lower e0 values. The paper provides a substantive interpretation of Taylor’s Law relative to e0 and concludes by arguing that reasonably accurate estimates of variance in age at death in a period life table can be calculated using this approach, which can also be used where e0 itself is estimated rather than generated through the construction of a life table, a useful feature of the model.
Keywords: empirical pattern, mean age at death in a life table, mean age of a stationary population, stationary population
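The power-function form of Taylor's Law can be fitted as a straight line in log-log space. The sketch below uses invented means and coefficients (a = 2, b = 1.5), not the study's life-table values, purely to show the fitting mechanics.

```python
import numpy as np

# Toy Taylor's Law fit: variance = a * mean^b, estimated by linear regression
# in log-log space. Data are generated from an exact power law, so the fit
# should recover the invented coefficients a = 2, b = 1.5.
mean_age = np.array([61.0, 65.0, 70.0, 74.0, 78.0, 82.0, 85.5])
variance = 2.0 * mean_age ** 1.5

log_m, log_v = np.log(mean_age), np.log(variance)
b_hat, log_a_hat = np.polyfit(log_m, log_v, deg=1)  # slope = b, intercept = ln(a)
a_hat = np.exp(log_a_hat)

print(f"fitted: variance ~ {a_hat:.2f} * mean^{b_hat:.2f}")
```

With real life-table data the points scatter around the power law rather than lying on it exactly, and the exponent for variance in age at death against e0 would be estimated from that scatter.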
Procedia PDF Downloads 330
22716 A Study of Industry 4.0 and Digital Transformation
Authors: Ibrahim Bashir, Yahaya Y. Yusuf
Abstract:
The ongoing shift towards Industry 4.0 represents a critical growth factor in the industrial enterprise, where the digital transformation of industries is increasingly seen as a crucial element for competitiveness. This transformation holds substantial potential, yet its full benefits have yet to be realized due to the fragmented approach to introducing Industry 4.0 technologies. Therefore, this pilot study aims to explore the individual and collective impact of Industry 4.0 technologies and digital transformation on organizational performance. Data were collected through a questionnaire-based survey across 51 companies in the manufacturing industry in the United Kingdom. Correlation and multiple linear regression analyses were conducted to assess the relationships and impacts between the variables in the study. The results show that Industry 4.0 and digital transformation positively influence organizational performance and that Industry 4.0 technologies positively influence digital transformation. The results of this pilot study indicate that the implementation of Industry 4.0 technologies is vital for increasing organizational performance; however, their roles differ largely. The differences are manifest in how the types of Industry 4.0 technologies correlate with how organizations integrate digital technologies into their operations. Hence, there is a clear indication of a strong correlation between Industry 4.0 technology, digital transformation, and organizational performance. Consequently, our study presents numerous pertinent implications that propel the theory of Industry 4.0 (I4.0), digital business transformation (DBT), and organizational performance forward, as well as guide managers in the manufacturing sector.
Keywords: industry 4.0 technologies, digital transformation, digital integration, organizational performance
Procedia PDF Downloads 141
22715 Development of a Turbulent Boundary Layer Wall-pressure Fluctuations Power Spectrum Model Using a Stepwise Regression Algorithm
Authors: Zachary Huffman, Joana Rocha
Abstract:
Wall-pressure fluctuations induced by the turbulent boundary layer (TBL) developed over aircraft are a significant source of aircraft cabin noise. Since the power spectral density (PSD) of these pressure fluctuations is directly correlated with the amount of sound radiated into the cabin, the development of accurate empirical models that predict the PSD has been an important ongoing research topic. The sound emitted can be represented by the pressure-fluctuation term in the Reynolds-averaged Navier-Stokes (RANS) equations. Therefore, early TBL empirical models (including those from Lowson, Robertson, Chase, and Howe) were primarily derived by simplifying and solving the RANS equations for pressure fluctuations and adding appropriate scales. Most subsequent models (including the Goody, Efimtsov, Laganelli, Smol’yakov, and Rackl and Weston models) were derived by modifying these early models or from physical principles. Overall, these models have had varying levels of accuracy but, in general, are most accurate under the specific Reynolds and Mach numbers they were developed for, while being less accurate under other flow conditions. Despite this, recent research into the possibility of using alternative methods for deriving the models has been rather limited. More recent studies have demonstrated that an artificial neural network model was more accurate than traditional models and could be applied more generally, but the accuracy of other machine learning techniques has not been explored. In the current study, an original model is derived using a stepwise regression algorithm in the statistical programming language R, together with TBL wall-pressure fluctuation PSD data gathered at the Carleton University wind tunnel. The theoretical advantage of a stepwise regression approach is that it automatically filters out redundant or uncorrelated input variables (through the process of feature selection), and it is computationally faster than machine learning. The main disadvantage is the potential risk of overfitting. The accuracy of the developed model is assessed by comparing it to independently sourced datasets.
Keywords: aircraft noise, machine learning, power spectral density models, regression models, turbulent boundary layer wall-pressure fluctuations
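A forward stepwise selection loop of the kind the abstract describes can be sketched as follows (the study used R; this is a minimal NumPy re-sketch on invented data): at each step, add the candidate predictor that most reduces the residual sum of squares of an ordinary least-squares fit, and stop when the improvement becomes negligible.

```python
import numpy as np

# Invented data, not the wind-tunnel PSD measurements: five candidate inputs,
# of which only x0 and x2 actually drive the response.
rng = np.random.default_rng(1)
n = 200
X = rng.normal(size=(n, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + rng.normal(0, 0.1, n)

def rss(cols):
    """Residual sum of squares of an OLS fit on the given columns plus intercept."""
    A = np.column_stack([X[:, list(cols)], np.ones(n)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return float(resid @ resid)

selected = []
remaining = set(range(X.shape[1]))
best = rss(selected)
while remaining:
    scores = {j: rss(selected + [j]) for j in remaining}
    j_best = min(scores, key=scores.get)
    if best - scores[j_best] < 1.0:   # stop when the RSS improvement is negligible
        break
    selected.append(j_best)
    remaining.remove(j_best)
    best = scores[j_best]

print("selected predictors:", sorted(selected))
```

The stopping threshold here is an arbitrary placeholder; real stepwise procedures use an F-test, AIC, or BIC criterion, and guarding against the overfitting risk the abstract mentions requires validating on held-out data.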
Procedia PDF Downloads 135
22714 Satellite LiDAR-Based Digital Terrain Model Correction using Gaussian Process Regression
Authors: Keisuke Takahata, Hiroshi Suetsugu
Abstract:
Forest height is an important parameter for forest biomass estimation, and precise elevation data are essential for accurate forest height estimation. There are several globally or nationally available digital elevation models (DEMs), such as SRTM and ASTER. However, their accuracy is reported to be low, particularly in mountainous areas with closed canopies or steep slopes. Recently, space-borne LiDAR missions, such as the Global Ecosystem Dynamics Investigation (GEDI), have started to provide sparse but accurate ground elevation and canopy height estimates. Several studies have reported a high degree of accuracy in their elevation products on their exact footprints, while it is not clear how this sparse information can be used over wider areas. In this study, we developed a digital terrain model correction algorithm that spatially interpolates the difference between existing DEMs and GEDI elevation products using a Gaussian Process (GP) regression model. The results show that our GP-based methodology can reduce the mean bias of the elevation data from 3.7 m to 0.3 m when we use airborne LiDAR-derived elevation information as ground truth. Our algorithm is also capable of quantifying the elevation data uncertainty, which is a critical requirement for biomass inventory. Upcoming satellite LiDAR missions, such as MOLI (Multi-footprint Observation Lidar and Imager), are expected to contribute to more accurate digital terrain model generation.
Keywords: digital terrain model, satellite LiDAR, Gaussian processes, uncertainty quantification
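A bare-bones version of the interpolation idea can be written directly with NumPy, assuming a squared-exponential kernel; the footprint positions, bias values, and kernel hyperparameters below are invented for the sketch, not the study's data.

```python
import numpy as np

# GP regression sketch: interpolate an elevation *bias* (DEM minus LiDAR ground
# elevation) between sparse footprints, with a predictive uncertainty.
footprint_x = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])   # along-track position, km
bias_m = np.array([3.1, 4.0, 2.2, 0.9, 1.8, 3.5])          # DEM - LiDAR elevation, m

def kernel(a, b, length=2.0, sigma=2.0):
    """Squared-exponential covariance between two sets of 1-D positions."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return sigma ** 2 * np.exp(-0.5 * d2 / length ** 2)

noise = 1e-4  # small observation-noise / jitter variance
K = kernel(footprint_x, footprint_x) + noise * np.eye(footprint_x.size)

query_x = np.linspace(0.0, 10.0, 101)
K_star = kernel(query_x, footprint_x)

alpha = np.linalg.solve(K, bias_m)
mean = K_star @ alpha                                      # predictive mean of the bias
cov = kernel(query_x, query_x) - K_star @ np.linalg.solve(K, K_star.T)
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))            # predictive uncertainty

print(f"bias at x = 5 km: {mean[50]:.2f} m (std {std[50]:.2f} m)")
```

The predictive mean passes through the footprint observations, and the standard deviation grows between footprints, which is how the method quantifies elevation uncertainty away from the sparse LiDAR samples.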
Procedia PDF Downloads 183
22713 Non-Linear Free Vibration Analysis of Laminated Composite Beams Resting on Non-Linear Pasternak Elastic Foundation: A Homogenization Procedure
Authors: Merrimi El Bekkaye, El Bikri Khalid, Benamar Rhali
Abstract:
In the present paper, the problem of geometrically non-linear free vibration of symmetrically and asymmetrically laminated composite beams (LCB) resting on a nonlinear Pasternak elastic foundation with immovable ends is studied. A homogenization procedure has been performed to reduce the problem under consideration to that of isotropic homogeneous beams with effective bending stiffness and axial stiffness parameters. This simple formulation is developed using the governing axial equation of the beam, in which the axial inertia and damping are ignored. The theoretical model is based on Hamilton’s principle and spectral analysis. Iterative-form solutions are presented to calculate the fundamental nonlinear frequency parameters, which are found to be in good agreement with the published results. On the other hand, the influence of the foundation parameters on the ratio of the nonlinear frequency to the linear frequency of the LCB has been studied. The non-dimensional curvatures associated with the fundamental mode are also given in the case of clamped-clamped symmetrically and asymmetrically laminated composite beams.
Keywords: large vibration amplitudes, laminated composite beam, Pasternak foundation, composite beams
Procedia PDF Downloads 530
22712 Estimating Bridge Deterioration for Small Data Sets Using Regression and Markov Models
Authors: Yina F. Muñoz, Alexander Paz, Hanns De La Fuente-Mella, Joaquin V. Fariña, Guilherme M. Sales
Abstract:
The primary approach for estimating bridge deterioration uses Markov-chain models and regression analysis. Traditional Markov models have problems estimating the required transition probabilities when a small sample size is used. Often, reliable bridge data have not been collected over long periods, so large data sets may not be available. This study presents an important change to the traditional approach by using the Small Data Method to estimate transition probabilities. The results illustrate that the Small Data Method and the traditional approach both provide similar estimates; however, the former method provides results that are more conservative. That is, the Small Data Method provided slightly lower than expected bridge condition ratings compared with the traditional approach. Considering that bridges are critical infrastructure, the Small Data Method, which uses more information and provides more conservative estimates, may be more appropriate when the available sample size is small. In addition, regression analysis was used to calculate bridge deterioration. Condition ratings were determined for bridge groups, and the best regression model was selected for each group. The results obtained were very similar to those obtained when using Markov chains; however, it is desirable to use more data for better results.
Keywords: concrete bridges, deterioration, Markov chains, probability matrix
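The Markov-chain mechanics referred to above can be sketched as follows: estimate transition probabilities by normalizing observed transition counts, then propagate a condition-rating distribution forward. The counts below are invented, not the study's inspection data.

```python
import numpy as np

# Toy deterioration chain: condition ratings 9 (new) down to 6.
# Each row of `counts` tallies observed one-inspection-cycle transitions.
counts = np.array([
    [80, 15,  4,  1],    # from rating 9
    [ 0, 70, 25,  5],    # from rating 8
    [ 0,  0, 60, 40],    # from rating 7
    [ 0,  0,  0, 100],   # from rating 6 (absorbing in this sketch)
], dtype=float)

# Maximum-likelihood transition probabilities: normalize each row by its total.
P = counts / counts.sum(axis=1, keepdims=True)

# Forecast: distribution over ratings after 3 cycles for a bridge starting at 9.
state = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(3):
    state = state @ P

expected_rating = state @ np.array([9.0, 8.0, 7.0, 6.0])
print("distribution after 3 cycles:", np.round(state, 3))
print(f"expected rating: {expected_rating:.2f}")
```

The small-sample problem the abstract addresses is visible here: with few observed transitions per row, these normalized frequencies become unstable, which is what the Small Data Method is meant to mitigate.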
Procedia PDF Downloads 336
22711 The Effects of Aging on the Cost of Operating and Support: An Empirical Study Applied to Weapon Systems
Authors: Byungchae Kim, Jiwoo Nam
Abstract:
Aging of weapon systems can cause the failure and degeneration of components, which results in increased operating and support costs. However, it is questionable whether this aging effect is strong enough to substantially influence national defense spending through rapidly increasing operating and support (O&S) costs. To examine this, we conduct a literature review analyzing the aging effect on US weapon systems. We also conduct empirical research using a maintenance database of Korean weapon systems, the Defense Logistics Integrated Information System (DAIIS). We regress various types of O&S cost on weapon system age to investigate the statistical significance of the aging effect and use a generalized linear model to find relations between the failure of differently priced components and age. Our major finding is that although the aging effect exists, its impact on weapon system cost seems not to be too large, considering several characteristics of O&S cost elements that do not depend on age.
Keywords: O&S cost, aging effect, weapon system, GLM
Procedia PDF Downloads 142
22710 Finite Element Modeling of Integral Abutment Bridge for Lateral Displacement
Authors: M. Naji, A. R. Khalim, M. Naji
Abstract:
Integral Abutment Bridges (IABs) are defined as simple or multiple span bridges in which the bridge deck is cast monolithically with the abutment walls. These bridges are becoming very popular due to different aspects such as good response under seismic loading, low initial costs, elimination of bearings, and less maintenance. However, the main issue in the analysis of this type of structure is dealing with the soil-structure interaction of the abutment walls and the supporting piles. A two-dimensional, non-linear finite element (FE) model of an integral abutment bridge has been developed to study the effect of lateral time-history displacement loading on the soil system.
Keywords: integral abutment bridge, soil structure interaction, finite element modeling, soil-pile interaction
Procedia PDF Downloads 289
22709 The Intention to Use E-Money Transaction: The Moderating Effect of Security in a Conceptual Framework
Authors: Husnil Khatimah, Fairol Halim
Abstract:
This research examines the moderating impact of security on the intention to use e-money, adapting variables from the TAM (Technology Acceptance Model) and the TPB (Theory of Planned Behavior). The study uses security as a moderating variable and examines how this relationship shapes customers’ intention to use e-money as a payment tool. The conceptual framework of e-money transactions was reviewed to understand the behavioral intention of consumers in terms of perceived usefulness, perceived ease of use, perceived behavioral control, and security. Quantitative methods will be utilized for data collection. A total of one thousand respondents will be selected using the quota sampling method in Medan, Indonesia. Descriptive analysis and multiple regression analysis will be conducted to analyze the data. The article ends with suggestions for future studies.
Keywords: e-money transaction, TAM & TPB, moderating variable, behavioral intention, conceptual paper
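Moderation of the kind proposed here is commonly tested by adding a product (interaction) term to a multiple regression: a significant interaction coefficient indicates that the effect of one predictor depends on the moderator. The sketch below simulates responses with a built-in interaction effect; all variables and coefficients are invented for illustration, not taken from the planned survey.

```python
import numpy as np

# Simulated moderation: intention depends on usefulness, security, and their
# interaction (the moderating effect). Coefficients 0.5, 0.3, 0.4 are invented.
rng = np.random.default_rng(7)
n = 500
usefulness = rng.normal(0, 1, n)
security = rng.normal(0, 1, n)
intention = (0.5 * usefulness + 0.3 * security
             + 0.4 * usefulness * security        # the moderation effect
             + rng.normal(0, 0.2, n))

# OLS with an interaction term: intention ~ usefulness + security + usefulness*security
X = np.column_stack([usefulness, security, usefulness * security, np.ones(n)])
beta, *_ = np.linalg.lstsq(X, intention, rcond=None)
b_use, b_sec, b_inter, b0 = beta

print(f"usefulness: {b_use:.2f}, security: {b_sec:.2f}, interaction: {b_inter:.2f}")
```

In practice the predictors are usually mean-centered before forming the product term, and the interaction coefficient is accompanied by a significance test.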
Procedia PDF Downloads 454
22708 Feature Selection for Production Schedule Optimization in Transition Mines
Authors: Angelina Anani, Ignacio Ortiz Flores, Haitao Li
Abstract:
The use of underground mining methods has increased significantly over the past decades. This increase has also been spurred on by several mines transitioning from surface to underground mining. However, determining the transition depth can be a challenging task, especially when coupled with production schedule optimization. Several researchers have simplified the problem by excluding operational features relevant to production schedule optimization. Our research objective is to investigate the extent to which accounting for the operational features of transition mines affects the optimal production schedule. We also provide a framework for the factors to consider in production schedule optimization for transition mines. An integrated mixed-integer linear programming (MILP) model is developed that maximizes the NPV as a function of production schedule and transition depth. A case study is performed to validate the model, with a comparative sensitivity analysis to obtain operational insights.
Keywords: underground mining, transition mines, mixed-integer linear programming, production schedule
Procedia PDF Downloads 169
22707 Inventory Management System of Seasonal Raw Materials of Feeds at San Jose Batangas through Integer Linear Programming and VBA
Authors: Glenda Marie D. Balitaan
Abstract:
The branch of business management that deals with inventory planning and control is known as inventory management. It comprises keeping track of supply levels and forecasting demand, as well as scheduling when and how much to order. Keeping excess inventory results in a loss of money, takes up physical space, and raises the risk of damage, spoilage, and loss. On the other hand, too little inventory frequently causes operations to be disrupted and raises the possibility of low customer satisfaction, both of which can be detrimental to a company's reputation. The United Victorious Feed Mill Corporation's present inventory management practices were assessed in terms of inventory level, warehouse allocation, ordering frequency, shelf life, and production requirements. To help the company achieve its optimal level of inventory, a mathematical model was created using integer linear programming. Because these raw materials are seasonal, the objective function was to minimize the cost of purchasing US soya and yellow corn. Warehouse space, annual production requirements, and shelf life were all considered. To ensure that users only need one application to record all relevant information, such as production output and deliveries, the researcher built a Visual Basic system. Additionally, the technology allows management to change the model's parameters.
Keywords: inventory management, integer linear programming, inventory management system, feed mill
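A toy version of the purchasing decision can illustrate the integer-programming idea. Brute-force enumeration stands in for an MILP solver here, and all prices, protein contents, requirements, and capacities are invented rather than taken from the study.

```python
# Toy integer program: choose whole tons of soya (s) and corn (c) so that
#   - total tonnage covers the production requirement but fits the warehouse,
#   - the blend meets a minimum total protein content,
# while minimizing purchase cost. All figures below are invented.
PRICE_SOYA, PRICE_CORN = 650, 300          # cost per ton
PROTEIN_SOYA, PROTEIN_CORN = 0.48, 0.09    # protein fraction per ton
MIN_TONS, CAPACITY = 100, 110              # production requirement, warehouse cap
MIN_PROTEIN_TONS = 18.0                    # protein required across the purchase

best = None
for s in range(CAPACITY + 1):
    for c in range(CAPACITY + 1 - s):     # enforce s + c <= CAPACITY
        total = s + c
        if total < MIN_TONS:
            continue
        if PROTEIN_SOYA * s + PROTEIN_CORN * c < MIN_PROTEIN_TONS:
            continue
        cost = PRICE_SOYA * s + PRICE_CORN * c
        if best is None or cost < best[0]:
            best = (cost, s, c)

cost, s, c = best
print(f"buy {s} t soya and {c} t corn for {cost}")
```

Enumeration is only viable for tiny instances like this; a model with many periods, materials, and shelf-life constraints, as in the study, would be handed to a dedicated MILP solver.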
Procedia PDF Downloads 83
22706 How Social Support, Interaction with Clients and Work-Family Conflict Contribute to Mental Well-Being for Employees in the Human Service System
Authors: Uwe C. Fischer
Abstract:
Mental health and well-being for employees working in the human service system are becoming more and more important given the increasing rate of absenteeism at work. Besides individual capacities, social and community factors seem to be important in the work setting. Starting from a demand-resource framework including the classical demand-control aspects, social support systems, the specific demands and resources of client work, and work-family conflict were considered in the present study. We hypothesize that these factors have a meaningful association with the mental quality of life of employees working in the social, educational, and health sectors. 1,140 employees working in human service organizations (education, youth care, nursing, etc.) were asked about strains and resources at work (selected scales from the Salutogenetic Subjective Work Assessment, SALSA, and our own new scales for client work), work-family conflict, and mental quality of life from the German Short Form Health Survey. Considering the complex influences of the variables, we conducted a multiple hierarchical regression analysis. One third of the whole variance of mental quality of life can be explained by the different variables of the model. When the variables concerning social influences were included in the hierarchical regression, the influence of the work-related control resource decreased. Excessive workload, work-family conflict, social support by supervisors, co-workers, and other persons outside work, as well as strains and resources associated with client work, had significant regression coefficients. Conclusions: Social support systems are crucial in the social, educational, and health-related service sector with regard to their influence on mental well-being. The work-family conflict in particular highlights the importance of work-life balance. The specific strains and resources of client work, measured with newly constructed scales, also showed great impact on mental health. Therefore, occupational health promotion should focus more on the social factors within and outside the workplace.
Keywords: client interaction, human service system, mental health, social support, work-family conflict
Procedia PDF Downloads 439
22705 Improved Imaging and Tracking Algorithm for Maneuvering Extended UAVs Using High-Resolution ISAR Radar System
Authors: Mohamed Barbary, Mohamed H. Abd El-Azeem
Abstract:
Maneuvering extended object tracking (M-EOT) using high-resolution inverse synthetic aperture radar (ISAR) observations has been gaining momentum recently. This work presents a new robust implementation of the multiple-model (MM) multi-Bernoulli (MB) filter for M-EOT, where the M-EOT’s ISAR observations are characterized using a skewed (SK), non-symmetrically normal distribution. To cope with possible abrupt changes in the kinematic state, extension, and observation distribution of an extended object when the target maneuvers, a multiple-model technique is presented based on an MB track-before-detect (TBD) filter supported by an SK sub-random-matrix model (RMM), or sub-ellipses framework. Simulation results demonstrate the remarkable impact of this approach.
Keywords: maneuvering extended objects, ISAR, skewed normal distribution, sub-RMM, MM-MB-TBD filter
Procedia PDF Downloads 76
22704 Food Security in Nigeria: An Examination of Food Availability and Accessibility in Nigeria
Authors: Okolo Chimaobi Valentine, Obidigbo Chizoba
Abstract:
As food is a basic physiological need, a threat to sufficient food production is a threat to human survival. Food security has been an issue that has gained global concern. This paper looks at food security in Nigeria by assessing the availability of food and the accessibility of the available food. The paper employed a multiple linear regression technique and graphical trend analysis of the growth rates of relevant variables to show the food security situation in Nigeria. Results of the tests revealed that the population growth rate was higher than the growth rate of food availability in Nigeria for the earlier period of the study. Commercial bank credit to the agricultural sector, foreign exchange utilization for food, and the Agricultural Credit Guarantee Scheme Fund (ACGSF) contributed significantly to food availability in Nigeria. Food prices grew at a faster rate than the average income level, making it difficult to access sufficient food. This implies that prior to 2012, there was insufficient food to feed the Nigerian populace. However, continued credit to the food and agricultural sector will ensure sustained and sufficient production of food in Nigeria. Microfinance banks should make sufficient credit available to smallholder farmers. The government should further control and subsidize the rising price of food to make it more accessible to the people.
Keywords: food, accessibility, availability, security
Procedia PDF Downloads 376
22703 Modern Information Security Management and Digital Technologies: A Comprehensive Approach to Data Protection
Authors: Mahshid Arabi
Abstract:
With the rapid expansion of digital technologies and the internet, information security has become a critical priority for organizations and individuals. The widespread use of digital tools such as smartphones and internet networks facilitates the storage of vast amounts of data, but simultaneously, vulnerabilities and security threats have significantly increased. The aim of this study is to examine and analyze modern methods of information security management and to develop a comprehensive model to counteract threats and information misuse. This study employs a mixed-methods approach, including both qualitative and quantitative analyses. Initially, a systematic review of previous articles and research in the field of information security was conducted. Then, using the Delphi method, interviews with 30 information security experts were conducted to gather their insights on security challenges and solutions. Based on the results of these interviews, a comprehensive model for information security management was developed. The proposed model includes advanced encryption techniques, machine learning-based intrusion detection systems, and network security protocols. AES and RSA encryption algorithms were used for data protection, and machine learning models such as Random Forest and Neural Networks were utilized for intrusion detection. Statistical analyses were performed using SPSS software. To evaluate the effectiveness of the proposed model, T-Test and ANOVA statistical tests were employed, and results were measured using accuracy, sensitivity, and specificity indicators of the models. Additionally, multiple regression analysis was conducted to examine the impact of various variables on information security. The findings of this study indicate that the comprehensive proposed model reduced cyber-attacks by an average of 85%. 
Statistical analysis showed that the combined use of encryption techniques and intrusion detection systems significantly improves information security. Based on these results, it is recommended that organizations continuously update their information security systems and use a combination of multiple security methods to protect their data. Additionally, educating employees and raising public awareness about information security can serve as effective tools for reducing security risks. This research demonstrates that effective and up-to-date information security management requires a comprehensive and coordinated approach, including the development and implementation of advanced techniques and the continuous training of human resources. Keywords: data protection, digital technologies, information security, modern management
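The abstract evaluates its detection models with accuracy, sensitivity, and specificity. A minimal sketch of how those three indicators are computed from a confusion matrix (the counts below are invented for illustration, not results from the study):

```python
def classification_metrics(tp, fp, tn, fn):
    """Compute accuracy, sensitivity (recall) and specificity
    from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # true-positive rate: attacks caught
    specificity = tn / (tn + fp)   # true-negative rate: normal traffic passed
    return accuracy, sensitivity, specificity

# Hypothetical intrusion-detection results: 90 attacks caught, 10 missed,
# 880 normal connections passed, 20 false alarms.
acc, sens, spec = classification_metrics(tp=90, fp=20, tn=880, fn=10)
print(acc, sens, spec)
```

Sensitivity and specificity trade off against each other as the detection threshold moves, which is why the study reports all three together.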
Procedia PDF Downloads 32
22702 Risks for Cyanobacteria Harmful Algal Blooms in Georgia Piedmont Waterbodies Due to Land Management and Climate Interactions
Authors: Sam Weber, Deepak Mishra, Susan Wilde, Elizabeth Kramer
Abstract:
The frequency and severity of cyanobacteria harmful algal blooms (CyanoHABs) have been increasing over time, with point and non-point source eutrophication and shifting climate paradigms being blamed as the primary culprits. Excessive nutrients, warm temperatures, quiescent water, and heavy and less regular rainfall create more conducive environments for CyanoHABs. CyanoHABs have the potential to produce a spectrum of toxins that cause gastrointestinal stress, organ failure, and even death in humans and animals. To promote enhanced, proactive CyanoHAB management, risk modeling using geospatial tools can act as a predictive mechanism to supplement current CyanoHAB monitoring, management and mitigation efforts. The risk maps would empower water managers to focus their efforts on high-risk waterbodies in an attempt to prevent CyanoHABs before they occur, and/or more diligently observe those waterbodies. For this research, exploratory spatial data analysis techniques were used to identify the strongest predictors for CyanoHABs based on remote sensing-derived cyanobacteria cell density values for 771 waterbodies in the Georgia Piedmont and landscape characteristics of their watersheds. In-situ datasets for cyanobacteria cell density, nutrients, temperature, and rainfall patterns are not widely available, so free gridded geospatial datasets were used as proxy variables for assessing CyanoHAB risk. For example, the percent of a watershed that is agriculture was used as a proxy for nutrient loading, and the summer precipitation within a watershed was used as a proxy for water quiescence. Cyanobacteria cell density values were calculated using atmospherically corrected images from the European Space Agency’s Sentinel-2A satellite and multispectral instrument sensor at a 10-meter ground resolution.
Seventeen explanatory variables were calculated for each watershed utilizing the multi-petabyte geospatial catalogs available within the Google Earth Engine cloud computing interface. The seventeen variables were then used in a multiple linear regression model, and the strongest predictors of cyanobacteria cell density were selected for the final regression model. The seventeen explanatory variables included land cover composition, winter and summer temperature and precipitation data, topographic derivatives, vegetation index anomalies, and soil characteristics. Watershed maximum summer temperature, percent agriculture, percent forest, percent impervious, and waterbody area emerged as the strongest predictors of cyanobacteria cell density with an adjusted R-squared value of 0.31 and a p-value ~ 0. The final regression equation was used to make a normalized cyanobacteria cell density index, and a Jenks natural breaks classification was used to assign waterbodies designations of low, medium, or high risk. Of the 771 waterbodies, 24.38% were low risk, 37.35% were medium risk, and 38.26% were high risk. This study showed that there are significant relationships between free geospatial datasets representing summer maximum temperatures, nutrient loading associated with land use and land cover, and the area of a waterbody with cyanobacteria cell density. This data analytics approach to CyanoHAB risk assessment corroborates the literature-established environmental triggers for CyanoHABs and presents a novel approach for CyanoHAB risk mapping in waterbodies across the greater southeastern United States. Keywords: cyanobacteria, land use/land cover, remote sensing, risk mapping
Procedia PDF Downloads 211
22701 Mapping Man-Induced Soil Degradation in Armenia's High Mountain Pastures through Remote Sensing Methods: A Case Study
Authors: A. Saghatelyan, Sh. Asmaryan, G. Tepanosyan, V. Muradyan
Abstract:
A major concern for Armenia has been soil degradation resulting from the unsustainable management and use of grasslands, which in turn heavily impacts the environment, agriculture and, ultimately, human health. Hence, assessment of soil degradation is an essential and urgent objective, set out to measure its possible consequences and develop a potential management strategy. In recent years, remote sensing (RS) technologies have become an essential tool for assessing pasture degradation. This research was done with the intention of assessing the accuracy of linear spectral unmixing (LSU) and NDVI-SMA methods in estimating soil surface components related to degradation (fractional vegetation cover (FVC), bare soil fractions, surface rock cover) and determining the appropriateness of these methods for mapping man-induced soil degradation in high mountain pastures. Taking into consideration the spatially complex and heterogeneous biogeophysical structure of the studied site, we used high-resolution multispectral QuickBird imagery of a pasture site in one of Armenia’s rural communities, Nerkin Sasoonashen. The accuracy assessment was done by comparing the land cover abundance data derived through RS methods with the ground truth land cover abundance data. A significant regression was established between the ground truth FVC estimate and both the NDVI-LSU and LSU-produced vegetation abundance data (R2=0.636, R2=0.625, respectively). For bare soil fractions, linear regression produced a general coefficient of determination R2=0.708. Because of the poor spectral resolution of the QuickBird imagery, LSU failed with the assessment of surface rock abundance (R2=0.015). It has been well documented by this particular research that reduction in vegetation cover runs in parallel with an increase in man-induced soil degradation, whereas in the absence of man-induced soil degradation a bare soil fraction does not exceed a certain level.
The outcomes show that the proposed method of man-induced soil degradation assessment through FVC, bare soil fractions and field data adequately reflects the current status of soil degradation throughout the studied pasture site and may be employed as an alternative to more complicated models for soil degradation assessment. Keywords: Armenia, linear spectral unmixing, remote sensing, soil degradation
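The core of linear spectral unmixing is solving for per-pixel endmember fractions under a sum-to-one constraint. A minimal two-endmember sketch (the reflectance spectra below are invented, not the QuickBird endmembers from the study):

```python
def unmix_two_endmembers(pixel, veg, soil):
    """Least-squares fraction f of 'veg' in a pixel modelled as
    pixel = f*veg + (1-f)*soil, summed over all bands."""
    num = sum((p - s) * (v - s) for p, v, s in zip(pixel, veg, soil))
    den = sum((v - s) ** 2 for v, s in zip(veg, soil))
    f = num / den
    return min(1.0, max(0.0, f))  # clamp to a physically meaningful fraction

# Hypothetical endmember spectra over three bands.
veg = [0.10, 0.20, 0.60]    # vegetation endmember
soil = [0.30, 0.35, 0.40]   # bare-soil endmember
pixel = [0.7 * v + 0.3 * s for v, s in zip(veg, soil)]  # 70% vegetation mix

fvc = unmix_two_endmembers(pixel, veg, soil)
print(round(fvc, 3))  # 0.7
```

With more than two endmembers (e.g. adding surface rock), the same idea becomes a constrained least-squares system per pixel; the poor spectral resolution noted in the abstract makes that system ill-conditioned for spectrally similar endmembers.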
Procedia PDF Downloads 328
22700 Impact of the Electricity Market Prices during the COVID-19 Pandemic on Energy Storage Operation
Authors: Marin Mandić, Elis Sutlović, Tonći Modrić, Luka Stanić
Abstract:
With the restructuring and deregulation of the power system, storage owners, generation companies or private producers can offer their multiple services on various power markets and earn income in different types of markets, such as the day-ahead, real-time and ancillary services markets. During the COVID-19 pandemic, electricity prices, as well as ancillary services prices, increased significantly. The optimization of the energy storage operation was performed using a suitable model for simulating the operation of a pumped storage hydropower plant under market conditions. The objective function maximizes the income earned through energy arbitrage, regulation-up, regulation-down and spinning reserve services. The optimization technique used for solving the objective function is mixed integer linear programming (MILP). In numerical examples, the pumped storage hydropower plant operation was optimized using realized hourly electricity market prices from Nord Pool for the pre-pandemic (2019) and pandemic (2020 and 2021) years. The impact of the electricity market prices during the COVID-19 pandemic on energy storage operation is shown through the analysis of income, operating hours, reserved capacity and consumed energy for each service. The results indicate the role of energy storage during a significant fluctuation in electricity and services prices. Keywords: electrical market prices, electricity market, energy storage optimization, mixed integer linear programming (MILP) optimization
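The paper's MILP maximizes income from arbitrage and reserve services. As a toy illustration of the arbitrage part only, brute-force enumeration over hourly pump/idle/generate decisions stands in for the MILP solver, and the prices and plant parameters are invented:

```python
from itertools import product

prices = [20.0, 15.0, 60.0, 80.0]  # hypothetical hourly prices (EUR/MWh)
ETA = 0.75        # round-trip efficiency: 1 MWh pumped stores 0.75 MWh
CAPACITY = 1.5    # reservoir capacity (MWh)

def profit(schedule):
    """schedule: -1 = pump 1 MWh, 0 = idle, +1 = generate 1 MWh.
    Infeasible actions (full or empty reservoir) fall back to idle."""
    storage, income = 0.0, 0.0
    for price, action in zip(prices, schedule):
        if action == -1 and storage + ETA <= CAPACITY:
            storage += ETA
            income -= price          # buy energy to pump
        elif action == 1 and storage >= 1.0:
            storage -= 1.0
            income += price          # sell generated energy
    return income

best = max(product((-1, 0, 1), repeat=len(prices)), key=profit)
print(best, profit(best))  # best plan pumps the cheap hours, generates at 80
```

A real MILP expresses the same storage balance and capacity limits as linear constraints with binary mode variables, which scales to year-long horizons where enumeration cannot.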
Procedia PDF Downloads 175
22699 Modelling Sudden Deaths from Myocardial Infarction and Stroke
Authors: Y. S. Yusoff, G. Streftaris, H. R Waters
Abstract:
Death within 30 days is an important factor to examine, as there is a significant risk of death immediately following, or soon after, a myocardial infarction (MI) or stroke. In this paper, we model deaths within 30 days following an MI or stroke in the UK. We examine how the probabilities of sudden death from MI or stroke changed over the period 1981-2000. We model the sudden deaths using a generalized linear model (GLM), fitted using the R statistical package, under a binomial distribution for the number of sudden deaths. We parameterize our model using the extensive and detailed data from the Framingham Heart Study, adjusted to match UK rates. The results show a reduction over time in sudden deaths following an MI, but no significant improvement for sudden deaths following a stroke. Keywords: sudden deaths, myocardial infarction, stroke, ischemic heart disease
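A binomial GLM with a logit link models the 30-day death probability as p = 1/(1 + e^-(β₀ + β₁·year)). A minimal sketch with invented coefficients (not the Framingham-calibrated values) showing how a negative year coefficient produces the declining MI trend reported above:

```python
from math import exp

def sudden_death_prob(year, beta0, beta1, base_year=1981):
    """Binomial GLM with logit link: the linear predictor lives on the
    log-odds scale and is mapped to a probability by the logistic function."""
    eta = beta0 + beta1 * (year - base_year)
    return 1.0 / (1.0 + exp(-eta))

# Hypothetical coefficients: log-odds of sudden death after MI,
# improving over time (beta1 < 0) across 1981-2000.
B0, B1 = -1.0, -0.03
probs = {y: sudden_death_prob(y, B0, B1) for y in (1981, 1990, 2000)}
print(probs)
```

For the stroke series in the abstract, the fitted β₁ would be statistically indistinguishable from zero, i.e. a flat probability over the period.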
Procedia PDF Downloads 287
22698 On Improving Breast Cancer Prediction Using GRNN-CP
Authors: Kefaya Qaddoum
Abstract:
The aim of this study is to predict breast cancer and to construct a supportive model that will stimulate a more reliable prediction, a factor fundamental to public health. In this study, we utilize general regression neural networks (GRNN) to replace single point predictions with prediction intervals in order to achieve a reasonable level of confidence. The mechanism employed here utilizes a machine learning framework called conformal prediction (CP) to assign consistent confidence measures to predictions, which are combined with GRNN. We apply the resulting algorithm to the problem of breast cancer diagnosis. The results show that the prediction constructed by this method is reasonable and could be useful in practice. Keywords: neural network, conformal prediction, cancer classification, regression
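A GRNN is a kernel-weighted average of training targets, and conformal prediction widens each point estimate into an interval using calibration residuals. A compact sketch under invented data (σ and the data are arbitrary, and the conformal step is the simplest split variant rather than the authors' exact scheme):

```python
from math import exp

def grnn_predict(x, train_x, train_y, sigma=0.5):
    """General regression neural network: Gaussian-kernel weighted mean."""
    weights = [exp(-((x - xi) ** 2) / (2 * sigma ** 2)) for xi in train_x]
    return sum(w * y for w, y in zip(weights, train_y)) / sum(weights)

# Toy training and calibration data (hypothetical feature -> target).
train_x, train_y = [1.0, 2.0, 3.0, 4.0], [2.1, 3.9, 6.2, 8.0]
calib_x, calib_y = [1.5, 2.5, 3.5], [3.0, 5.1, 7.0]

# Split conformal: nonconformity = |residual| on the calibration set;
# here the largest residual gives a conservative interval half-width.
residuals = sorted(abs(y - grnn_predict(x, train_x, train_y))
                   for x, y in zip(calib_x, calib_y))
half_width = residuals[-1]

x_new = 2.0
center = grnn_predict(x_new, train_x, train_y)
interval = (center - half_width, center + half_width)
print(interval)
```

In practice the half-width is taken from a quantile of the residuals chosen to match a target confidence level (e.g. 95%), rather than the maximum.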
Procedia PDF Downloads 291
22697 Winter Wheat Yield Forecasting Using Sentinel-2 Imagery at the Early Stages
Authors: Chunhua Liao, Jinfei Wang, Bo Shan, Yang Song, Yongjun He, Taifeng Dong
Abstract:
Winter wheat is one of the main crops in Canada. Forecasting of within-field variability of yield in winter wheat at the early stages is essential for precision farming. However, crop yield modelling based on high spatial resolution satellite data is generally affected by the lack of continuous satellite observations, which reduces the generalization ability of the models and makes crop yield forecasting at the early stages more difficult. In this study, the correlations between Sentinel-2 data (vegetation indices and reflectance) and yield data collected by a combine harvester were investigated, and a generalized multivariate linear regression (MLR) model was built and tested with data acquired in different years. It was found that the four-band reflectance (blue, green, red, near-infrared) performed better than the corresponding vegetation indices (NDVI, EVI, WDRVI and OSAVI) in wheat yield prediction. The optimum phenological stage for wheat yield prediction with the highest accuracy was from the end of flowering to the beginning of the grain-filling stage. The best MLR model was therefore built to predict wheat yield before harvest using Sentinel-2 data acquired at the end of the flowering stage. Further, to improve the ability of yield prediction at the early stages, three simple unsupervised domain adaptation (DA) methods were adopted to transform the reflectance data at the early stages to the optimum phenological stage. Winter wheat yield prediction using multiple vegetation indices showed higher accuracy than using a single vegetation index. The optimum stage for winter wheat yield forecasting varied between fields when using vegetation indices, while it was consistent when using multispectral reflectance, and the optimum stage for winter wheat yield prediction was at the end of the flowering stage. The average testing RMSE of the MLR model at the end of the flowering stage was 604.48 kg/ha.
Near the booting stage, applying the mean matching domain adaptation approach to transform the data to the target domain (the end of flowering) reduced the average testing RMSE of the best MLR model to 799.18 kg/ha, compared with 1140.64 kg/ha when models were developed directly on the original booting-stage data (“MLR at the early stage”). This study demonstrated that simple mean matching (MM) performed better than the other DA methods, and that “DA then MLR at the optimum stage” outperformed “MLR directly at the early stages” for winter wheat yield forecasting at the early stages. These results indicate that simple domain adaptation methods have great potential for near real-time crop yield forecasting at the early stages using remote sensing data. Keywords: wheat yield prediction, domain adaptation, Sentinel-2, within-field scale
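The mean matching (MM) step that performed best above is simply a per-band shift of the early-stage reflectance so that its mean matches the optimum-stage (target-domain) mean. A minimal sketch with invented reflectance values:

```python
from statistics import fmean

def mean_matching(source, target):
    """Shift source-domain values so their mean equals the target-domain mean."""
    shift = fmean(target) - fmean(source)
    return [x + shift for x in source]

# Hypothetical single-band reflectance: booting stage (source) vs.
# end-of-flowering stage (the target domain the MLR model was trained on).
booting = [0.21, 0.24, 0.19, 0.26]
flowering = [0.31, 0.35, 0.29, 0.33]

adapted = mean_matching(booting, flowering)
print(adapted)
```

The adapted booting-stage values can then be fed to the model trained at the end of flowering; in the study this sidesteps retraining a separate, weaker model at each early stage.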
Procedia PDF Downloads 64
22696 Development of Computational Approach for Calculation of Hydrogen Solubility in Hydrocarbons for Treatment of Petroleum
Authors: Abdulrahman Sumayli, Saad M. AlShahrani
Abstract:
For the hydrogenation process, knowing the solubility of hydrogen (H2) in hydrocarbons is critical to improving the efficiency of the process. We investigated the computation of H2 solubility in four heavy crude oil feedstocks using machine learning techniques. Temperature, pressure, and feedstock type were considered as the inputs to the models, while the hydrogen solubility was the sole response. Specifically, we employed three different models: support vector regression (SVR), Gaussian process regression (GPR), and Bayesian ridge regression (BRR). To achieve the best performance, the hyper-parameters of these models were optimized using the whale optimization algorithm (WOA). We evaluated the models using a dataset of solubility measurements in various feedstocks, and we compared their performance based on several metrics. Our results show that the SVR model tuned with WOA (WOA-SVR) achieves the best performance overall, with an RMSE of 1.38 × 10⁻² and an R-squared of 0.991. These findings suggest that machine learning techniques can provide accurate predictions of hydrogen solubility in different feedstocks, which could be useful in the development of hydrogen-related technologies. In addition, the solubility of hydrogen in the four heavy oil fractions is estimated over temperature and pressure ranges of 150 °C–350 °C and 1.2 MPa–10.8 MPa, respectively. Keywords: temperature, pressure variations, machine learning, oil treatment
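The whale optimization algorithm's core exploitation move is a "shrinking encircling" of the best solution found so far. A heavily simplified one-dimensional sketch (the spiral and exploration phases of the full WOA are omitted, and the objective is a stand-in with a known minimum, not the SVR hyper-parameter loss):

```python
import random

def simplified_woa(objective, bounds, n_agents=30, n_iter=300, seed=42):
    """Shrinking-encircling core of WOA: agents move toward the best
    position with a coefficient |A| that decays from 2 toward 0."""
    rng = random.Random(seed)
    lo, hi = bounds
    agents = [rng.uniform(lo, hi) for _ in range(n_agents)]
    best = min(agents, key=objective)
    for t in range(n_iter):
        a = 2.0 * (1 - t / n_iter)          # decreases linearly from 2 to 0
        for i, x in enumerate(agents):
            A = 2.0 * a * rng.random() - a  # in [-a, a]
            C = 2.0 * rng.random()
            new_x = best - A * abs(C * best - x)
            agents[i] = min(hi, max(lo, new_x))
        best = min(best, min(agents, key=objective), key=objective)
    return best

# Stand-in objective with its minimum at x = 3.
best = simplified_woa(lambda x: (x - 3.0) ** 2, bounds=(-10.0, 10.0))
print(best)
```

In the study's setting, the agent position would be a vector of SVR hyper-parameters (e.g. C, gamma, epsilon) and the objective a cross-validated RMSE.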
Procedia PDF Downloads 69
22695 Using Arellano-Bover/Blundell-Bond Estimator in Dynamic Panel Data Analysis – Case of Finnish Housing Price Dynamics
Authors: Janne Engblom, Elias Oikarinen
Abstract:
A panel dataset is one that follows a given sample of individuals over time, and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed and random effects models which form a wide range of linear models. A special case of panel data models is the dynamic panel data model. A complication regarding a dynamic panel data model that includes the lagged dependent variable is the endogeneity bias of estimates. Several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Arellano-Bover/Blundell-Bond generalized method of moments (GMM) estimator, which is an extension of the Arellano-Bond model in which past values, and different transformations of past values, of the potentially problematic independent variable are used as instruments together with other instrumental variables. The Arellano–Bover/Blundell–Bond estimator augments Arellano–Bond by making the additional assumption that first differences of the instrument variables are uncorrelated with the fixed effects. This allows the introduction of more instruments and can dramatically improve efficiency. It builds a system of two equations—the original equation and the transformed one—and is also known as system GMM. In this study, Finnish housing price dynamics were examined empirically using the Arellano–Bover/Blundell–Bond estimation technique together with ordinary least squares (OLS). The aim of the analysis was to provide a comparison between conventional fixed-effects panel data models and dynamic panel data models.
The Arellano–Bover/Blundell–Bond estimator is suitable for this analysis for a number of reasons: it is a general estimator designed for situations with 1) a linear functional relationship; 2) one left-hand-side variable that is dynamic, depending on its own past realizations; 3) independent variables that are not strictly exogenous, meaning they are correlated with past and possibly current realizations of the error; 4) fixed individual effects; and 5) heteroskedasticity and autocorrelation within individuals but not across them. Based on data from 14 Finnish cities over 1988-2012, estimates of short-run housing price dynamics differed considerably depending on the models and instrumenting used. In particular, the use of different instrumental variables caused the model estimates, together with their statistical significance, to vary. This was particularly clear when comparing estimates of OLS with those of different dynamic panel data models. Estimates provided by dynamic panel data models were more in line with the theory of housing price dynamics. Keywords: dynamic model, fixed effects, panel data, price dynamics
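The moment conditions behind Arellano-Bond-type GMM reduce, in the simplest just-identified case, to the instrumental-variables (IV) slope estimator β = Σ(z−z̄)(y−ȳ) / Σ(z−z̄)(x−x̄). A deterministic toy sketch of that building block (the data are invented, and the full system-GMM machinery of differencing, instrument stacking and weighting is omitted):

```python
def iv_slope(z, x, y):
    """Just-identified IV estimator: sample cov(z, y) / cov(z, x)."""
    zbar = sum(z) / len(z)
    xbar = sum(x) / len(x)
    ybar = sum(y) / len(y)
    num = sum((zi - zbar) * (yi - ybar) for zi, yi in zip(z, y))
    den = sum((zi - zbar) * (xi - xbar) for zi, xi in zip(z, x))
    return num / den

# Toy data: instrument z, endogenous regressor x, outcome y = 2x + 1.
z = [1.0, 2.0, 3.0, 4.0]
x = [2.0, 4.0, 6.0, 8.0]
y = [5.0, 9.0, 13.0, 17.0]
print(iv_slope(z, x, y))  # 2.0
```

In the dynamic panel setting, z would be a lagged level (or lagged difference) of the dependent variable, x the first-differenced lag being instrumented, and system GMM stacks many such moment conditions and weights them optimally.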
Procedia PDF Downloads 1508
22694 Nonlinear Passive Shunt for Electroacoustic Absorbers Using Nonlinear Energy Sink
Authors: Diala Bitar, Emmanuel Gourdon, Claude H. Lamarque, Manuel Collet
Abstract:
Acoustic absorber devices play an important role in reducing noise along the propagation and reception paths. An electroacoustic absorber consists of a loudspeaker coupled to an electric shunt circuit, where the membrane plays the role of an absorber/reflector of sound. Although the use of linear shunt resistors at the transducer terminals has been shown to improve the performance of dynamical absorbers, it is only efficient in a narrow frequency band. Therefore, and since nonlinear phenomena are promising for their ability to absorb vibrations and sound over a larger frequency range, we propose coupling a nonlinear electric shunt circuit to the loudspeaker terminals. The equivalent model can then be described by a 2-degree-of-freedom system, consisting of a primary linear oscillator describing the dynamics of the loudspeaker membrane, linearly coupled to a cubic nonlinear energy sink (NES). The system is treated analytically for the case of 1:1 resonance, using an invariant manifold approach at different time scales. The proposed methodology enables us to detect the equilibrium points and fold singularities at the first slow time scale, providing a predictive tool for designing the nonlinear shunt circuit during the energy exchange process. The preliminary results are promising; a significant improvement in acoustic absorption performance is obtained. Keywords: electroacoustic absorber, multiple-time-scale with small finite parameter, nonlinear energy sink, nonlinear passive shunt
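The 2-DOF model described above, a damped linear oscillator (the membrane) with cubic coupling to a light NES, can be sketched numerically. A semi-implicit Euler integration under invented parameter values (not those of the paper), showing the energy decay the NES is designed to produce:

```python
# Primary oscillator (membrane):
#   m1*x1'' + c1*x1' + k1*x1 + c2*(x1'-x2') + knl*(x1-x2)**3 = 0
# Cubic NES:
#   m2*x2'' + c2*(x2'-x1') + knl*(x2-x1)**3 = 0
# Parameter values are illustrative only.
m1, k1, c1 = 1.0, 1.0, 0.01
m2, c2, knl = 0.05, 0.02, 1.0

def energy(x1, v1, x2, v2):
    """Total mechanical energy: kinetic + linear + cubic-coupling potential."""
    return (0.5 * m1 * v1**2 + 0.5 * m2 * v2**2
            + 0.5 * k1 * x1**2 + 0.25 * knl * (x1 - x2)**4)

x1, v1, x2, v2 = 1.0, 0.0, 0.0, 0.0   # membrane released from rest
dt, steps = 0.001, 50_000
e0 = energy(x1, v1, x2, v2)

for _ in range(steps):
    coupling = knl * (x1 - x2)**3
    a1 = (-c1 * v1 - k1 * x1 - c2 * (v1 - v2) - coupling) / m1
    a2 = (-c2 * (v2 - v1) + coupling) / m2
    v1 += a1 * dt; v2 += a2 * dt      # semi-implicit (symplectic) Euler
    x1 += v1 * dt; x2 += v2 * dt

e_end = energy(x1, v1, x2, v2)
print(e0, e_end)  # dampers dissipate energy over t = 50 s
```

The cubic (essentially nonlinear) coupling is what lets the NES engage in 1:1 resonance with the membrane over a broad frequency range, unlike a linear tuned absorber.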
Procedia PDF Downloads 222
22693 Performance Prediction of a SANDIA 17-m Vertical Axis Wind Turbine Using Improved Double Multiple Streamtube
Authors: Abolfazl Hosseinkhani, Sepehr Sanaye
Abstract:
Different approaches have been used to predict the performance of vertical axis wind turbines (VAWT), such as experimental, computational fluid dynamics (CFD), and analytical methods. Analytical methods, such as momentum models that use streamtubes, have low computational cost and sufficient accuracy. The double multiple streamtube (DMST) is one of the most commonly used momentum models; it divides the rotor plane of the VAWT into upwind and downwind halves. In fact, results from the DMST method have shown some discrepancy compared with experimental results; this is because the Darrieus turbine is a complex and aerodynamically unsteady configuration. In this study, analytical-experimental-based corrections, including dynamic stall, streamtube expansion, and finite blade length corrections, are used to improve the DMST method. Results indicated that using these corrections for a SANDIA 17-m VAWT leads to improved DMST results. Keywords: vertical axis wind turbine, analytical, double multiple streamtube, streamtube expansion model, dynamic stall model, finite blade length correction
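The heart of a streamtube momentum model is the fixed-point iteration equating blade-element thrust with momentum-theory thrust, C_T = 4a(1−a), for the induction factor a in each streamtube. A minimal sketch of that single building block (a full DMST sweeps this over azimuthal positions in the upwind and downwind halves, which is omitted here):

```python
def induction_factor(ct, a0=0.0, tol=1e-12, max_iter=200):
    """Solve 4a(1-a) = ct for the smaller root by fixed-point iteration
    a <- ct / (4(1-a)); converges for lightly loaded streamtubes (ct < 1)."""
    a = a0
    for _ in range(max_iter):
        a_new = ct / (4.0 * (1.0 - a))
        if abs(a_new - a) < tol:
            return a_new
        a = a_new
    return a

a = induction_factor(0.6)   # hypothetical streamtube thrust coefficient
u_ratio = 1.0 - a           # velocity ratio through the upwind half-disc
print(a, u_ratio)
```

In the double-multiple formulation, the slowed flow (1 − 2a) leaving the upwind half becomes the free-stream input for the downwind half, so this iteration runs twice per streamtube.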
Procedia PDF Downloads 135