Search results for: modeling framework
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8570

7340 Statistically Accurate Synthetic Data Generation for Enhanced Traffic Predictive Modeling Using Generative Adversarial Networks and Long Short-Term Memory

Authors: Srinivas Peri, Siva Abhishek Sirivella, Tejaswini Kallakuri, Uzair Ahmad

Abstract:

Effective traffic management and infrastructure planning are crucial for the development of smart cities and intelligent transportation systems. This study addresses the challenge of data scarcity by generating realistic synthetic traffic data using the PeMS-Bay dataset, improving the accuracy and reliability of predictive modeling. Advanced synthetic data generation techniques, including TimeGAN, GaussianCopula, and PAR Synthesizer, are employed to produce synthetic data that replicates the statistical and structural characteristics of real-world traffic. Future integration of Spatial-Temporal Generative Adversarial Networks (ST-GAN) is planned to capture both spatial and temporal correlations, further improving data quality and realism. The performance of each synthetic data generation model is evaluated against real-world data to identify the best models for accurately replicating traffic patterns. Long Short-Term Memory (LSTM) networks are utilized to model and predict complex temporal dependencies within traffic patterns. This comprehensive approach aims to pinpoint areas with low vehicle counts, uncover underlying traffic issues, and inform targeted infrastructure interventions. By combining GAN-based synthetic data generation with LSTM-based traffic modeling, this study supports data-driven decision-making that enhances urban mobility, safety, and the overall efficiency of city planning initiatives.
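The evaluation step described in the abstract, checking how closely a synthetic traffic series replicates the statistics of real data, can be illustrated with a minimal sketch. The choice of statistics (mean, standard deviation, lag-1 autocorrelation) and the function names are illustrative assumptions, not the study's actual metrics:

```python
import statistics

def lag1_autocorr(series):
    """Lag-1 autocorrelation of a numeric sequence."""
    n = len(series)
    mean = statistics.fmean(series)
    num = sum((series[i] - mean) * (series[i + 1] - mean) for i in range(n - 1))
    den = sum((x - mean) ** 2 for x in series)
    return num / den if den else 0.0

def fidelity_report(real, synthetic):
    """Absolute gaps in basic statistics between a real and a synthetic
    traffic series; smaller gaps indicate a more faithful generator."""
    return {
        "mean_gap": abs(statistics.fmean(real) - statistics.fmean(synthetic)),
        "std_gap": abs(statistics.stdev(real) - statistics.stdev(synthetic)),
        "autocorr_gap": abs(lag1_autocorr(real) - lag1_autocorr(synthetic)),
    }
```

A generator whose output scores near zero on all three gaps preserves both the marginal distribution and the short-range temporal structure that an LSTM would later learn from.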

Keywords: GAN, long short-term memory, synthetic data generation, traffic management

Procedia PDF Downloads 5
7339 Ecology in Politics: A Multimodal Eco-Critical Analysis of Environmental Discourse

Authors: Amany ElShazly, Lubna A. Sherif

Abstract:

The entanglement of humans with the environment has always been inevitable and often causes destruction. In this respect, ecolinguistics helps humans understand the link between language and the environment. Stibbe (2014a) has indicated that linguistics, particularly Critical Discourse Studies (CDS), provides an interpretation of how language shapes world views, while the 'eco' side concerns the life-sustaining interactions between humans and the physical environment. This paper considers two key ecological instances: the Grand Ethiopian Renaissance Dam (GERD) as a focal point of political dispute, and THE LINE project together with Etthadar lel Akhdar (Go Green Initiative) as two examples of combating ecological degradation. 'Ecosophy', as explained by Naess (1996), underpins the ecolinguistic framework, in which discourse is assessed through a linguistic lens focused on metaphor and through a 'Positive Discourse' framework that resonates with respect and care for the natural world.

Keywords: ecosophy, critical discourse studies, metaphor, positive discourse, social semiotics, ecolinguistics

Procedia PDF Downloads 88
7338 Assessing the Resilience to Economic Shocks of the Households in Bistekville 2, Quezon City, Philippines

Authors: Maria Elisa B. Manuel

Abstract:

The Philippine housing sector is facing challenges from a massive housing backlog and the persistent cycle of relocation, resettlement, and return of informal settler families to the cities, owing to the inaccessibility of necessities and opportunities in past off-city housing projects. Bistekville 2 has been established as a model socialized housing project, utilizing government partnerships with private developers and individuals in the first in-city and on-site resettlement effort in the country. The study examined the residents' resilience to idiosyncratic economic shocks by analyzing their vulnerabilities, assets, and coping strategies. It formulated an economic resilience framework to identify how these factors interact to build a household's capacity to adapt positively to sudden expenses. The framework is supplemented with a scale that indicates a household's proximity to resilience, identifying through its indicators whether households are at the subsistence, coping, adaptive, or transformative level. Survey interviews were conducted with 91 households from Bistekville 2 on the components identified by the framework, and the data were processed with qualitative and quantitative methods. The study found that the households are highly vulnerable due to their family composition and other conditions, such as unhealthy loans and inconsistent amortization payments. Along with their high vulnerability, the households have inadequate strategies to anticipate shocks and primarily react to them. This leads to the conclusion that the households are not resilient to idiosyncratic economic shocks and remain at the coping level.

Keywords: idiosyncratic economic shocks, socialized housing, economic resilience, economic vulnerability, adaptive capacity

Procedia PDF Downloads 142
7337 Prevention of Road Accidents by Computerized Drowsiness Detection System

Authors: Ujjal Chattaraj, P. C. Dasbebartta, S. Bhuyan

Abstract:

This paper proposes a method to monitor the state of the driver's eyes using the concept of face detection. Three major contributing methods can rapidly process the facial image and produce results that can in turn trigger pre-programmed vehicle reactions for traffic safety. This paper compares and analyses the methods on the basis of their reaction time and their ability to deal with fluctuating images of the driver. The program used in this study is simple and efficient, built using the AdaBoost learning algorithm. Through this program, the system is able to discard background regions and focus on face-like regions. The results are obtained on an ordinary computer, which makes the approach feasible for end users. The application domain of this experiment is quite wide, including detection of drowsiness, detection of alcohol influence on drivers, and identification.
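The decision logic that follows eye detection can be sketched in a few lines. The abstract does not specify how per-frame eye states become an alarm, so the frame threshold and function name below are illustrative assumptions:

```python
def detect_drowsiness(eye_states, closed_threshold=15):
    """Flag drowsiness when the eyes stay closed for `closed_threshold`
    consecutive video frames. `eye_states` is a sequence of booleans
    (True = eyes open), as an AdaBoost-based face/eye detector might
    emit per frame. Returns the frame index at which the alarm fires,
    or None if no drowsiness episode is found."""
    consecutive_closed = 0
    for frame, is_open in enumerate(eye_states):
        if is_open:
            consecutive_closed = 0
        else:
            consecutive_closed += 1
            if consecutive_closed >= closed_threshold:
                return frame
    return None
```

Normal blinking (short closed runs) never trips the counter, while a sustained closure does, which is the property the paper's reaction-time comparison would exercise.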

Keywords: AdaBoost learning algorithm, face detection, framework, traffic safety

Procedia PDF Downloads 154
7336 Integration of Agile Philosophy and Scrum Framework to Missile System Design Processes

Authors: Misra Ayse Adsiz, Selim Selvi

Abstract:

In today's world, technology is racing against time. To keep up with the world's companies and adapt quickly to change, it is necessary to speed up processes and keep pace with the rate of technological change. Missile system design processes handled with classical methods fall behind in this race, because customer requirements are unclear and demands change repeatedly during design. Therefore, a methodology suited to missile system design dynamics was investigated, and the processes used to keep up with the era were examined. An analysis of commonly used design processes showed that none of them is dynamic enough for today's conditions, so a hybrid design process was established. After a detailed review of the existing processes, it was decided to focus on the Scrum framework and the agile philosophy. Scrum is a process framework focused on developing software and handling change management with rapid methods, while the agile philosophy is intended to enable quick responses to change. This study aims to integrate the Scrum framework and agile philosophy, the most appropriate approaches for rapid production and adaptation to change, into the missile system design process. With this approach, the design team involved in the system design process stays in communication with the customer and takes an iterative approach to change management. These methods, currently used in the software industry, have been integrated with the product design process. A team is created for the system design process, and the Scrum roles are realized with the customer included: a Scrum team consists of the product owner, the development team, and the Scrum master. Scrum events, which are short, purposeful, and time-boxed, are organized to serve coordination rather than long meetings.
Instead of the classic system design methods used in product development studies, a missile design is carried out with this blended method. With this design approach, it becomes easier to anticipate changing customer demands, produce quick solutions, and combat uncertainties in the product development process. With feedback from the customer included in the process, the work moves towards marketing, design, and financial optimization.

Keywords: agile, design, missile, scrum

Procedia PDF Downloads 164
7335 Dynamic vs. Static Bankruptcy Prediction Models: A Dynamic Performance Evaluation Framework

Authors: Mohammad Mahdi Mousavi

Abstract:

Bankruptcy prediction models have been implemented for the continuous evaluation and monitoring of firms. Given the huge number of bankruptcy models, an extensive number of studies have focused on answering the question of which of these models is superior in performance. In practice, one drawback of existing comparative studies is that the relative assessment of alternative bankruptcy models remains a mono-criterion exercise. Further, only a very restricted set of criteria and measures has been applied to compare the performance of competing bankruptcy prediction models. In this research, we overcome these methodological gaps by implementing an extensive range of criteria and measures for comparing dynamic and static bankruptcy models, and by proposing a multi-criteria framework to compare the relative performance of bankruptcy models in forecasting distress for UK firms.

Keywords: bankruptcy prediction, data envelopment analysis, performance criteria, performance measures

Procedia PDF Downloads 243
7334 Developing English L2 Critical Reading and Thinking Skills through the PISA Reading Literacy Assessment Framework: A Case Study of EFL Learners in a Thai University

Authors: Surasak Khamkhong

Abstract:

This study aimed to investigate the use of the PISA reading literacy assessment framework (PRF) to improve EFL learners' critical reading and thinking skills. The sample group, selected by purposive sampling, comprised 36 EFL learners from a university in Northeastern Thailand. The instruments consisted of 8 PRF-based reading lessons, a 27-item PRF-based reading test used as a pre-test and post-test, and an attitude questionnaire on the designed lessons. The statistics used for data analysis were percentage, mean, standard deviation, and the Wilcoxon signed-rank test. The results revealed that before the intervention, the students' English reading proficiency was low, as is evident from their low pre-test scores (M=14.00). They did fairly well on the access-and-retrieve questions (M=6.11), but poorly on the integrate-and-interpret questions (M=4.89) and the reflect-and-evaluate questions (M=3.00). This means that the students could comprehend the texts but could hardly interpret or evaluate them. After the intervention, however, they did better, as their post-test scores were higher (M=18.01): they could comprehend (M=6.78), interpret (M=6.00), and evaluate (M=5.25) well, showing that their critical reading skills had improved. In terms of their attitude towards the designed lessons and instruction, most students were satisfied. It may thus be concluded that the designed lessons can help improve students' English critical reading proficiency and may be used as a teaching model for improving EFL learners' critical reading skills.
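The Wilcoxon signed-rank test used to compare pre- and post-test scores can be computed directly. This is a generic sketch of the statistic (zero differences dropped, tied absolute differences given average ranks), not the study's analysis code:

```python
def wilcoxon_signed_rank(pre, post):
    """Wilcoxon signed-rank statistic W = min(W+, W-) for paired scores."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]  # drop zero differences
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        # group indices whose |difference| ties, then assign the average rank
        j = i
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(w_plus, w_minus)
```

A small W relative to the number of pairs indicates that the differences are overwhelmingly in one direction, which is what a significant pre/post gain looks like under this test.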

Keywords: second language reading, critical reading and thinking skills, PISA reading literacy framework, English L2 reading development

Procedia PDF Downloads 184
7333 Unsupervised Text Mining Approach to Early Warning System

Authors: Ichihan Tai, Bill Olson, Paul Blessner

Abstract:

Traditional early warning systems that warn of crises are generally based on structured or numerical data; a system that can make predictions from unstructured textual data, an uncorrelated data source, is therefore a great complement to traditional early warning systems. The Chicago Board Options Exchange (CBOE) Volatility Index (VIX), commonly referred to as the fear index, measures the cost of insurance against a market crash and spikes in the event of crisis. In this study, news data are used to predict whether there will be a market-wide crisis, by predicting the movement of the fear index, and historical references to similar events are surfaced in an unsupervised manner. Topic modeling-based prediction and representation are performed on daily news data from The Wall Street Journal between 1990 and 2015, against VIX index data from the CBOE.
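The unsupervised retrieval of historical references to similar events can be sketched with a simple bag-of-words similarity. The study uses topic modeling, so this cosine-similarity example is a simplified stand-in, with hypothetical function names and data:

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Cosine similarity between bag-of-words term-frequency vectors."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def nearest_historical_day(today_news, archive):
    """Return the archive date whose news text is most similar to today's:
    an unsupervised way to surface historical references to similar events."""
    return max(archive, key=lambda date: cosine_similarity(today_news, archive[date]))
```

Replacing raw term frequencies with topic proportions from a fitted topic model gives the denser, lower-dimensional comparison the abstract describes.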

Keywords: early warning system, knowledge management, market prediction, topic modeling

Procedia PDF Downloads 333
7332 Identifying Enablers and Barriers of Healthcare Knowledge Transfer: A Systematic Review

Authors: Yousuf Nasser Al Khamisi

Abstract:

Purpose: This paper presents a knowledge transfer (KT) framework for healthcare sectors, applying a systematic literature review process to the healthcare organization domain to identify enablers and barriers of KT in healthcare. Methods: The paper conducted a systematic search of peer-reviewed papers describing key elements of KT, using four databases (Medline, Cinahl, Scopus, and ProQuest) over a 10-year period (1/1/2008–16/10/2017). The results of the literature review were used to build a conceptual framework of KT in healthcare organizations, following the systematic review procedure described by Barbara Kitchenham in Procedures for Performing Systematic Reviews. Findings: The paper highlighted the impact of applying the knowledge management (KM) concept in a healthcare organization on controlling infectious diseases in hospitals, improving family medicine performance, and enhancing quality improvement practices. Moreover, it found that good coding performance is analytically linked with a knowledge-sharing network structure rich in brokerage and hierarchy rather than in density. Unavailable or ignored evidence on more cost-effective or more efficient delivery approaches increases healthcare costs and may lead to unintended results. Originality: The search procedure produced 12,093 results, of which 3,523 were general articles about KM and KT. The titles and abstracts of these articles were screened to separate relevant from irrelevant material; 94 articles were identified for full-text assessment, and after removing unrelated articles, 22 eligible articles remained.

Keywords: healthcare organisation, knowledge management, knowledge transfer, KT framework

Procedia PDF Downloads 134
7331 Analysis of Ecological Footprint of Residents for Urban Spatial Restructuring

Authors: Taehyun Kim, Hyunjoo Park, Taehyun Kim

Abstract:

After a period of rapid economic development, Korea has recently entered a period of low growth due to population decline and aging. Owing to urbanization around the metropolitan area and the hollowing-out of local cities, the ecological capacity of cities is decreasing while ecological footprints are increasing, requiring compact spatial planning to maintain urban functions. The purpose of this study is to analyze the relationship between urban spatial structure and residents' ecological footprints for sustainable spatial planning. To do this, we analyze the relationship between intra-urban spatial structure, such as net/gross density and service accessibility, and residents' ecological footprints for food, housing, transportation, goods, and services through a survey and structural equation modeling. The results of the study will be useful in establishing an implementation plan for the sustainable development goals (SDGs), especially sustainable cities and communities (SDG 11) and responsible consumption and production (SDG 12).

Keywords: ecological footprint, structural equation modeling, survey, sustainability, urban spatial structure

Procedia PDF Downloads 261
7330 Two-Level Graph Causality to Detect and Predict Random Cyber-Attacks

Authors: Van Trieu, Shouhuai Xu, Yusheng Feng

Abstract:

Tracking attack trajectories can be difficult with limited information about the nature of the attack, and even more so when attack information is collected by Intrusion Detection Systems (IDSs), since current IDSs have limitations in identifying malicious and anomalous traffic. Moreover, IDSs only point out suspicious events; they do not show how the events relate to each other or which event possibly caused another to happen. It is therefore important to investigate new methods capable of tracking attack trajectories quickly, with less attack information and less dependency on IDSs, in order to prioritize actions during incident response. This paper proposes a two-level graph causality framework for tracking attack trajectories in internet networks by leveraging observable malicious behaviors to detect the attack events most likely to cause other events in the system. Technically, given a time series of malicious events, the framework extracts events with useful features, such as attack time and port number, and applies conditional independence tests to detect relationships between attack events. Using academic datasets collected by IDSs, experimental results show that the framework can quickly detect causal pairs that offer meaningful insights into the nature of the network, given only reasonable restrictions on network size and structure. Without the framework's guidance, these insights could not be discovered by existing tools such as IDSs, and would cost expert human analysts significant time, if they could be found at all. The computational results from the proposed two-level graph network model reveal clear patterns and trends: more than 85% of causal pairs have an average time difference between the cause and effect events, in both computed and observed data, of within 5 minutes. This result can be used as a preventive measure against future attacks.
Although the forecast horizon may be short, from 0.24 seconds to 5 minutes, it is long enough to design a prevention protocol to block those attacks.
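The 5-minute window reported above suggests a simple first step for extracting candidate causal pairs from an event log. This sketch covers only that windowing step; the conditional independence tests the framework actually applies to prune the candidates are not shown, and the function name is illustrative:

```python
from datetime import datetime, timedelta

def candidate_causal_pairs(events, max_lag=timedelta(minutes=5)):
    """From a time-ordered list of (timestamp, event_name) records, return
    (cause, effect) pairs whose time difference falls within `max_lag`.
    This windowing precedes any conditional-independence testing."""
    pairs = []
    for i, (t1, e1) in enumerate(events):
        for t2, e2 in events[i + 1:]:
            if t2 - t1 > max_lag:
                break  # events are time-ordered, so later ones are farther away
            if e1 != e2:
                pairs.append((e1, e2))
    return pairs
```

Because the log is time-ordered, each inner loop stops at the first event outside the window, keeping the pass close to linear when windows are sparse.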

Keywords: causality, multilevel graph, cyber-attacks, prediction

Procedia PDF Downloads 154
7329 Predicting Bridge Pier Scour Depth with SVM

Authors: Arun Goel

Abstract:

Prediction of maximum local scour is necessary for the safe and economical design of bridges. A number of equations have been developed over the years to predict local scour depth using laboratory data, and a few pier equations have also been proposed using field data. Most of these equations are empirical in nature, as indicated by past publications. In this paper, attempts have been made to compute the local depth of scour around a bridge pier, in dimensional and non-dimensional form, using linear regression, simple regression, and SVM (polynomial and RBF kernel) techniques along with a few conventional empirical equations. The outcome of this study suggests that SVM-based modeling can be employed as an alternative to linear regression, simple regression, and the conventional empirical equations in predicting the scour depth of bridge piers. The results also indicate that SVM performance improves for the non-dimensional form of bridge pier scour in comparison to the dimensional form.

Keywords: modeling, pier scour, regression, prediction, SVM (Poly and Rbf kernels)

Procedia PDF Downloads 447
7328 On-Ice Force-Velocity Modeling Technical Considerations

Authors: Dan Geneau, Mary Claire Geneau, Seth Lenetsky, Ming -Chang Tsai, Marc Klimstra

Abstract:

Introduction: Horizontal force-velocity profiling (HFVP) involves modeling an athlete's linear sprint kinematics to estimate valuable maximum force and velocity metrics. This approach to performance modeling has been used in field-based team sports and has recently been introduced to ice hockey as a forward-skating performance assessment. While preliminary data have been collected on ice, the distance constraints of the on-ice test restrict the athletes' ability to reach maximal velocity, which limits the model's ability to estimate athlete performance effectively. This is especially true of more elite athletes. This report explores whether athletes on ice are able to reach a velocity plateau similar to what has been seen in overground trials. Fourteen male Major Junior ice hockey players (BW = 83.87 ± 7.30 kg, height = 188 ± 3.4 cm, age = 18 ± 1.2 years, n = 14) were recruited. For on-ice sprints, participants completed a standardized warm-up consisting of skating and dynamic stretching and a progression of three skating efforts from 50% to 95%. Following the warm-up, participants completed three on-ice 45 m sprints, with three minutes of rest between trials. For overground sprints, participants completed a similar dynamic warm-up, followed by three 40 m overground sprint trials. For each trial (on-ice and overground), radar (Stalker ATS II, Texas, USA), aimed at the participant's waist, was used to collect instantaneous velocity. Sprint velocities were modeled in a custom Python (version 3.2) script using a mono-exponential function, similar to previous work. To determine whether on-ice trials achieved a maximum velocity (plateau), the minimum acceleration values of the modeled data at the end of the sprint were compared between on-ice and overground trials using a paired t-test. Significant differences (P < 0.001) between overground and on-ice minimum accelerations were observed.
On-ice trials consistently reported higher final acceleration values, indicating that a maintained maximum velocity (plateau) had not been reached. Based on these preliminary findings, it is suggested that reliable HFVP metrics cannot yet be collected from all ice hockey populations using current methods. Elite male populations were not able to achieve a velocity plateau similar to that seen in overground trials, indicating the absence of a maximum velocity measure. With current velocity and acceleration modeling techniques, which depend on a velocity plateau, these results indicate the potential for error in on-ice HFVP measures. These findings therefore suggest that a greater on-ice sprint distance, or velocity modeling techniques that do not require maximal velocity, may be needed for a complete profile.
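The mono-exponential model referred to above, and the end-of-sprint acceleration used to check for a plateau, can be written down directly. The parameter values in the test below are illustrative, not the study's data:

```python
import math

def sprint_velocity(t, v_max, tau):
    """Mono-exponential sprint model: v(t) = v_max * (1 - exp(-t / tau)),
    with v_max the asymptotic maximal velocity and tau the time constant."""
    return v_max * (1.0 - math.exp(-t / tau))

def sprint_acceleration(t, v_max, tau):
    """Analytical derivative of the model: a(t) = (v_max / tau) * exp(-t / tau).
    A near-zero value late in the sprint indicates a velocity plateau."""
    return (v_max / tau) * math.exp(-t / tau)
```

For example, with an assumed v_max of 9 m/s and tau of 1.2 s, the residual acceleration at t = 6 s is about 0.05 m/s², an effective plateau; a sprint cut short well before that point ends with a visibly non-zero final acceleration, which is exactly the signature the on-ice trials showed.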

Keywords: ice-hockey, sprint, skating, power

Procedia PDF Downloads 94
7327 Estimation of the Upper Tail Dependence Coefficient for Insurance Loss Data Using an Empirical Copula-Based Approach

Authors: Adrian O'Hagan, Robert McLoughlin

Abstract:

Considerable focus in the world of insurance risk quantification is placed on modeling loss values from lines of business (LOBs) that possess upper tail dependence. Copulas such as the Joe, Gumbel and Student-t copula may be used for this purpose. The copula structure imparts a desired level of tail dependence on the joint distribution of claims from the different LOBs. Alternatively, practitioners may possess historical or simulated data that already exhibit upper tail dependence, through the impact of catastrophe events such as hurricanes or earthquakes. In these circumstances, it is not desirable to induce additional upper tail dependence when modeling the joint distribution of the loss values from the individual LOBs. Instead, it is of interest to accurately assess the degree of tail dependence already present in the data. The empirical copula and its associated upper tail dependence coefficient are presented in this paper as robust, efficient means of achieving this goal.
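The empirical upper tail dependence coefficient can be estimated directly from ranked data. This sketch assumes distinct values in each margin and a fixed quantile threshold q; the estimator form is the standard empirical one, not necessarily the exact variant the paper uses:

```python
def upper_tail_dependence(x, y, q=0.9):
    """Empirical estimate of the upper tail dependence coefficient:
    lambda_U(q) = P(U > q and V > q) / (1 - q),
    where U, V are the normalized ranks of x and y.
    Assumes the values within each margin are distinct."""
    n = len(x)
    def ranks(vals):
        return {v: (i + 1) / n for i, v in enumerate(sorted(vals))}
    rx, ry = ranks(x), ranks(y)
    joint = sum(1 for a, b in zip(x, y) if rx[a] > q and ry[b] > q) / n
    return joint / (1.0 - q)
```

Perfectly comonotonic losses give an estimate near 1, independent or counter-monotonic losses give a value near 0, and catastrophe-driven LOB data typically fall in between.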

Keywords: empirical copula, extreme events, insurance loss reserving, upper tail dependence coefficient

Procedia PDF Downloads 281
7326 A Machine Learning-Based Analysis of Autism Prevalence Rates across US States against Multiple Potential Explanatory Variables

Authors: Ronit Chakraborty, Sugata Banerji

Abstract:

There has been a marked increase in the reported prevalence of Autism Spectrum Disorder (ASD) among children in the US over the past two decades. This research has analyzed the growth in state-level ASD prevalence against 45 different potentially explanatory factors, including socio-economic, demographic, healthcare, public policy, and political factors. The goal was to understand whether these factors have adequate predictive power in modeling the differential growth in ASD prevalence across states and, if they do, which factors are the most influential. The key findings of this study include (1) confirmation that the chosen feature set has considerable power in predicting the growth in ASD prevalence, (2) identification of the most influential predictive factors, (3) given the nature of the most influential predictive variables, an indication that a considerable portion of the reported ASD prevalence differentials across states could be attributable to over- and under-diagnosis, and (4) identification of Florida as a key outlier state, pointing to a potential under-diagnosis of ASD there.
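A screening step of this kind, ranking candidate factors by correlation with prevalence growth, might look as follows. It is a simplified stand-in for the machine-learning analysis the study performs, with hypothetical factor names:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def rank_factors(factors, prevalence_growth):
    """Order candidate explanatory factors (name -> per-state values) by
    |correlation| with state-level prevalence growth, a screening step only."""
    return sorted(factors,
                  key=lambda name: -abs(pearson(factors[name], prevalence_growth)))
```

Univariate screening like this cannot capture the interactions a learned model would, but it gives a quick first cut of which of the 45 factors deserve a closer look.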

Keywords: autism spectrum disorder, clustering, machine learning, predictive modeling

Procedia PDF Downloads 95
7325 Study of the Relationship between the Roughness Configuration of Channel Bottom and the Creation of Vortices at the Rough Area: Numerical Modelling

Authors: Youb Said, Fourar Ali

Abstract:

To describe the influence of bottom roughness on free-surface flows through numerical modeling, a two-dimensional model was developed. The continuity and momentum (Navier-Stokes) equations are solved by the finite volume method. We considered turbulent flow in an open channel with bottom roughness, using the k-ε turbulence model for the simulations. After setting the initial and boundary conditions and solving the equation set, we obtained the following results: vortices form in the hollows, causing substantial energy dissipation in the obstacle areas that make up the bottom roughness. Comparison of our results with experimental ones shows good agreement in the rough area. In other areas, however, the differences were more or less significant, particularly far from the bottom, especially near the free surface just downstream of the roughness. These disagreements are probably due to the empirical constants used by the k-ε model.

Keywords: modeling, free surface flow, turbulence, bottom roughness, finite volume, K-ε model, energy dissipation

Procedia PDF Downloads 376
7324 Safety Analysis and Accident Modeling of Transportation in Srinagar City

Authors: Adinarayana Badveeti, Mohammad Shafi Mir

Abstract:

In Srinagar city, India, road safety is an important aspect of ecological balance and social well-being. A road accident creates a situation that leaves behind distress, sorrow, and suffering. Identifying the causes of road accidents therefore becomes essential for adopting preventive measures against critical events. The damage created by road accidents is to a large extent irreparable and demands attention to curb this continuously increasing, awful 'epidemic'. The road accident toll in India is among the highest in the world, with approximately 142,000 people killed on the roads each year. The Kashmir region is an ecologically sensitive place but lacks the necessary facilities and infrastructure for road transportation, ultimately resulting in road accidents that create a major problem for the people of the region. The objective of this project is to study the safety situation of Srinagar city, to model accidents against the different factors that cause them, and to suggest possible remedies for reducing or eliminating road accidents.

Keywords: road safety, road accident, road infrastructure, accident modeling

Procedia PDF Downloads 249
7323 Deterministic Modelling to Estimate Economic Impact from Implementation and Management of Large Infrastructure

Authors: Dimitrios J. Dimitriou

Abstract:

It is widely recognised that asset portfolio development helps enhance economic growth, productivity, and competitiveness. While numerous studies and reports certify the positive effect of large infrastructure investments on the local economy, the methodology for estimating their contribution to economic development remains a challenging issue for researchers and economists. The key question is how to estimate these economic impacts in a given economic system. This paper provides a compact and applicable methodological framework delivering quantitative results in terms of the overall jobs and income generated over the project life cycle. Following a deterministic mathematical approach, the key variables and the modelling framework are presented. A numerical case study highlights key results for a new motorway project in Greece, a country that has experienced economic stress for many years, providing the opportunity for comparison with similar cases.
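The headline outputs, jobs and income over the project life cycle, can be sketched deterministically. The single-multiplier structure below is a generic illustration of such a framework, not the paper's actual model, and all figures are hypothetical:

```python
def lifecycle_impact(annual_direct_jobs, job_multiplier, income_per_job_year):
    """Deterministic sketch of life-cycle economic impact.
    annual_direct_jobs: direct jobs supported in each year of the project;
    job_multiplier: regional multiplier turning direct into total jobs;
    income_per_job_year: average income generated per job-year."""
    job_years = sum(annual_direct_jobs) * job_multiplier
    return {"job_years": job_years, "income": job_years * income_per_job_year}
```

For instance, a three-year construction phase supporting 100, 150, and 50 direct jobs, with an assumed multiplier of 1.8 and 30,000 per job-year, yields 540 total job-years; richer frameworks would separate direct, indirect, and induced effects by year.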

Keywords: quantitative modelling, economic impact, large transport infrastructure, economic assessment

Procedia PDF Downloads 196
7322 Studying the Theoretical and Laboratory Design of a Concrete Frame and Optimizing Its Design for Impact and Earthquake Resistance

Authors: Mehrdad Azimzadeh, Seyed Mohammadreza Jabbari, Mohammadreza Hosseinzadeh Alherd

Abstract:

This paper includes experimental results and analytical studies on increasing the resistance of single-span reinforced concrete frames to impact loads, on modeling them according to optimization methods, and on optimizing their behavior under impact loads. During this study, about 30 designs for different frames were modeled and built using specialized software such as ANSYS and SAP, and their behavior was examined under variable impacts. Suitable strategies were then offered for the frames in terms of concrete mixing in order to optimize frame modeling. To reduce the weight of the frames, fine-grained stones were used. After designing about eight frames of each type, three samples were designed with the aim of controlling the impact strength parameters, and a frame shape well suited to impact resistance was obtained: a solid frame with sturdy legs, placed as far apart as possible, with a 3-degree gradient in the upper part of the beam.

Keywords: optimization, reinforced concrete, optimization methods, impact load, earthquake

Procedia PDF Downloads 177
7321 Comparison of Solar Radiation Models

Authors: O. Behar, A. Khellaf, K. Mohammedi, S. Ait Kaci

Abstract:

Up to now, most validation studies have been based on the MBE and RMSE, and have therefore focused only on long- and short-term performance to test and classify solar radiation models. This traditional analysis does not take into account the quality of modeling or linearity. In our analysis we have tested 22 solar radiation models that are capable of providing instantaneous direct and global radiation at any given location worldwide. We introduce a new indicator, the Global Accuracy Indicator (GAI), to examine the linear relationship between the measured and predicted values and the quality of modeling, in addition to long- and short-term performance. The quality of a model is represented by the t-statistical test, model linearity by the correlation coefficient, and long- and short-term performance by the MBE and RMSE, respectively. An important finding of this research is that using the GAI avoids the faulty validation that can result from the traditional methodology, which might lead to erroneous prediction of solar power conversion system performance.
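The four ingredients the GAI combines (MBE, RMSE, correlation coefficient, and the t-statistic) can be computed as follows. The t-statistic formula is the one commonly attributed to Stone in the solar validation literature; whether the paper uses exactly this form is an assumption:

```python
import math

def validation_metrics(measured, predicted):
    """MBE, RMSE, correlation coefficient r, and Stone's t-statistic
    for a solar radiation model, computed from paired observations."""
    n = len(measured)
    errors = [p - m for m, p in zip(measured, predicted)]
    mbe = sum(errors) / n
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    # Stone's t-statistic: t = sqrt((n - 1) * MBE^2 / (RMSE^2 - MBE^2))
    t_stat = (math.sqrt((n - 1) * mbe ** 2 / (rmse ** 2 - mbe ** 2))
              if rmse > abs(mbe) else float("inf"))
    mm, mp = sum(measured) / n, sum(predicted) / n
    sx = math.sqrt(sum((m - mm) ** 2 for m in measured))
    sy = math.sqrt(sum((p - mp) ** 2 for p in predicted))
    r = sum((m - mm) * (p - mp) for m, p in zip(measured, predicted)) / (sx * sy)
    return {"MBE": mbe, "RMSE": rmse, "r": r, "t": t_stat}
```

A composite indicator along the lines of the GAI would then aggregate these four values, so that a model cannot score well on bias alone while fitting the measured data poorly.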

Keywords: solar radiation model, parametric model, performance analysis, Global Accuracy Indicator (GAI)

Procedia PDF Downloads 342
7320 Multi-Level Framework for Effective Use of Stock Ordering System: Case Study of Small Enterprises in Kgautswane

Authors: Lethamaga Tladi, Ray Kekwaletswe

Abstract:

This study sought to conceptualise a multi-level framework for the effective use of a stock ordering system by small enterprises in a rural context. An interpretive research methodology enabled the researcher to analyse, in depth, the subjective meanings that small enterprises’ employees attach to using the stock ordering system. Empirical data were collected from 13 small-enterprise employees through semi-structured interviews and observations. An Interpretive Phenomenological Analysis (IPA) approach was used to analyse the employees’ own accounts of their lived experiences of stock ordering system use, in terms of their relatedness to, and cognitive engagement with, the system. A case study of Kgautswane, a rural area in Limpopo Province, South Africa, served as the social context in which the phenomenon manifested. Technology-Organisation-Environment theory (TOE), the Technology-to-Performance Chain model (TPC), and Representation Theory (RT) underpinned this study. In this multi-level study, the findings revealed that, at the organisational level, effective use of the stock ordering system was associated with organisational performance gains such as efficiency, productivity, quality, competitiveness, and market share. Equally, at the individual level, effective use of the stock ordering system minimised end-users’ effort and the time needed to accomplish their tasks, yielding improved individual performance. The multi-level framework for effective use of a stock ordering system is then presented.

Keywords: effective use, multi-dimensions of use, multi-level of use, multi-level research, small enterprises, stock ordering system

Procedia PDF Downloads 164
7319 From Design, Experience and Play Framework to Common Design Thinking Tools: Using Serious Modern Board Games

Authors: Micael Sousa

Abstract:

Board games (BGs) are thriving as new designs emerge from the hobby community and reach wider audiences around the world. Although digital games attract most of the attention in game studies and serious games research, the post-digital movement helps to explain why, in a world dominated by digital technologies, analog experiences remain unique and irreplaceable to users, allowing innovation in new hybrid environments. The new BG designs are part of these post-digital and hybrid movements because they result from the use of powerful digital tools that enable production and knowledge sharing about BGs and their unique face-to-face social experiences. These new BGs, defined as modern by many authors, provide innovative designs and game mechanics that are not yet fully explored by the main serious games (SG) approaches. Even the most established SG frameworks, which treat SGs as fun games implemented to achieve predefined goals, need further development, especially where modern BGs are concerned. Despite many anecdotal perceptions, researchers are only now starting to rediscover BGs and demonstrate their potential. They are proving that BGs are easy for non-expert players to grasp and adapt in experimental approaches, and can readily be adapted to players’ profiles and serious objectives even during gameplay. Although there are many design thinking (DT) models and practices, their relations with SG frameworks are also underdeveloped, mostly because this is a new research field that lacks theoretical development and systematization of experimental practice. Using BGs as case studies promises to help develop these frameworks.
Departing from the Design, Experience, and Play (DPE) framework and considering the Common Design Thinking Tools (CDST), this paper proposes a new experimental framework for adapting and developing modern BG design for DT: the Design, Experience, and Play for Think (DPET) experimental framework. This is done by systematizing the DPE and CDST approaches in two case studies, in which two different sequences of adapted BGs were employed to establish a collaborative DT process. The two sessions took place with different participants and in different contexts, and used different game sequences for the same DT approach. The first session took place at the Faculty of Economics of the University of Coimbra, in a training session on serious games for project development. The second took place at Casa do Impacto during The Great Village Design Jam light. Both sessions had the same duration and were designed to achieve DT goals progressively, using BGs as SGs in a collaborative process. The results show that a sequence of BGs, when properly adapted to the DPET framework, can generate a viable and innovative collaborative DT process that is productive, fun, and engaging. The proposed DPET framework intends to help establish how new SG solutions could be defined for new goals through flexible DT. Applications in other areas of research and development can also benefit from these findings.

Keywords: board games, design thinking, methodology, serious games

Procedia PDF Downloads 104
7318 Machine Learning Approach for Lateralization of Temporal Lobe Epilepsy

Authors: Samira-Sadat JamaliDinan, Haidar Almohri, Mohammad-Reza Nazem-Zadeh

Abstract:

Lateralization of temporal lobe epilepsy (TLE) is very important for positive surgical outcomes. We propose a machine learning framework to identify the epileptogenic hemisphere in TLE cases using magnetoencephalography (MEG) coherence source imaging (CSI) and diffusion tensor imaging (DTI). Unlike most studies, which use classification algorithms, we propose an effective clustering approach to distinguish between normal and TLE cases. We apply the well-known Minkowski weighted K-Means (MWK-Means) technique as the clustering framework. To overcome the poor-initialization problem of K-Means, we use particle swarm optimization (PSO) to select the initial cluster centroids before applying MWK-Means. We demonstrate that, compared with K-Means and MWK-Means applied independently, this approach improves clustering results on a benchmark data set.
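The PSO-initialized clustering step might be sketched as follows. The swarm parameters, the plain (unweighted) Lloyd refinement, and the synthetic two-blob data are illustrative assumptions, not the paper's MWK-Means implementation, which additionally learns per-feature Minkowski weights.

```python
import random

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def wcss(points, centroids):
    # within-cluster sum of squares: the objective both PSO and K-Means minimise
    return sum(min(dist2(p, c) for c in centroids) for p in points)

def pso_init(points, k, n_particles=10, iters=30, seed=0):
    # each particle is a full candidate set of k centroids
    rng = random.Random(seed)
    dim = len(points[0])
    swarm = [[list(rng.choice(points)) for _ in range(k)] for _ in range(n_particles)]
    vel = [[[0.0] * dim for _ in range(k)] for _ in range(n_particles)]
    pbest = [[c[:] for c in s] for s in swarm]
    pbest_f = [wcss(points, s) for s in swarm]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = [c[:] for c in swarm[g]], pbest_f[g]
    for _ in range(iters):
        for i, s in enumerate(swarm):
            for j in range(k):
                for d in range(dim):
                    r1, r2 = rng.random(), rng.random()
                    # classic velocity update: inertia + cognitive + social terms
                    vel[i][j][d] = (0.7 * vel[i][j][d]
                                    + 1.5 * r1 * (pbest[i][j][d] - s[j][d])
                                    + 1.5 * r2 * (gbest[j][d] - s[j][d]))
                    s[j][d] += vel[i][j][d]
            f = wcss(points, s)
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = [c[:] for c in s], f
                if f < gbest_f:
                    gbest, gbest_f = [c[:] for c in s], f
    return gbest

def kmeans(points, centroids, iters=20):
    # standard Lloyd iterations refining the PSO-chosen centroids
    k = len(centroids)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda j: dist2(p, centroids[j]))].append(p)
        centroids = [[sum(xs) / len(cl) for xs in zip(*cl)] if cl else centroids[j]
                     for j, cl in enumerate(clusters)]
    labels = [min(range(k), key=lambda j: dist2(p, centroids[j])) for p in points]
    return centroids, labels
```

Because PSO searches globally over candidate centroid sets before Lloyd's local refinement starts, the combination is far less sensitive to an unlucky random start than plain K-Means.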

Keywords: temporal lobe epilepsy, machine learning, clustering, magnetoencephalography

Procedia PDF Downloads 147
7317 Comparison of Two Maintenance Policies for a Two-Unit Series System Considering General Repair

Authors: Seyedvahid Najafi, Viliam Makis

Abstract:

In recent years, maintenance optimization has attracted special attention due to the growing complexity of industrial systems. Maintenance costs are high for many systems, and preventive maintenance is effective when it increases operational reliability and safety at a reduced cost. The novelty of this research is to consider general repair in the modeling of multi-unit series systems and to solve the maintenance problem for such systems within the semi-Markov decision process (SMDP) framework. We propose an opportunistic maintenance policy for a series system composed of two main units. Unit 1, which is more expensive than unit 2, is subjected to condition monitoring, and its deterioration is modeled by a gamma process. The unit 1 hazard rate is estimated by the proportional hazards model (PHM), and two hazard-rate control limits are considered as the thresholds for maintenance interventions on unit 1. Maintenance on unit 2 is governed by an age control limit. The objective is to find the optimal control limits that minimize the long-run expected average cost per unit time. The proposed algorithm is applied to a numerical example to compare the effectiveness of the proposed policy (policy Ⅰ) with policy Ⅱ, which is similar to policy Ⅰ but performs replacement instead of general repair. Results show that policy Ⅰ leads to a lower average cost than policy Ⅱ.
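The unit-1 side of the policy can be sketched as a short simulation: a gamma-process deterioration path feeds a PHM hazard rate, and maintenance triggers when the hazard crosses a control limit. The Weibull baseline hazard, the gamma-process parameters, the PHM coefficient, and the limit value are all illustrative assumptions (the abstract reports none of them), and only a single control limit is sketched here rather than the paper's two.

```python
import math
import random

def simulate_unit(horizon=100.0, dt=1.0, gamma_rate=0.3, gamma_scale=1.0,
                  beta=2.0, eta=80.0, coeff=0.08, h_limit=0.2, seed=0):
    """Gamma-process deterioration Z(t) feeding a PHM hazard rate
    h(t) = (beta/eta) * (t/eta)**(beta-1) * exp(coeff * Z(t)),
    with a Weibull baseline. Maintenance on unit 1 is triggered when
    h(t) crosses the control limit h_limit. All parameter values are
    illustrative assumptions, not the paper's."""
    rng = random.Random(seed)
    t, z, h = 0.0, 0.0, 0.0
    while t < horizon:
        t += dt
        # stationary gamma process: independent Gamma(rate*dt, scale) increments
        z += rng.gammavariate(gamma_rate * dt, gamma_scale)
        h = (beta / eta) * (t / eta) ** (beta - 1.0) * math.exp(coeff * z)
        if h >= h_limit:
            break  # control limit reached: intervene on unit 1
    return t, z, h  # intervention time, degradation level, hazard rate
```

Optimizing the policy then amounts to searching over control-limit values for the one minimizing the simulated long-run average cost, which the paper does within the SMDP framework rather than by simulation.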

Keywords: condition-based maintenance, proportional hazards model, semi-Markov decision process, two-unit series systems

Procedia PDF Downloads 113
7316 Feminist Evaluation: The Case of Mahatma Gandhi National Rural Employment Guarantee Act

Authors: Salam Abukhadrah

Abstract:

This research advocates the use of feminist evaluation (FE) as a tool of great potential for policy and program assessment in relation to women’s empowerment. It traces how women’s perspectives have entered evaluation and international development. Moreover, this research presents a case example of the use of FE on the Mahatma Gandhi National Rural Employment Guarantee Act (MGNREGA) in Ganaparthi, a village in rural Andhra Pradesh state (AP), India. The evaluation is built on a women’s empowerment framework that examines empowerment as a process and an end in itself, rather than as simplified quantifiable outcomes. This framework is used to conduct in-depth semi-structured interviews that are later cross-validated by a focus group discussion. In addition, the evaluation draws on secondary data from the MGNREGA website and on data extracted from the National Family Health Survey of AP.

Keywords: feminist evaluation, MGNREGA, women’s empowerment, case example, India

Procedia PDF Downloads 137
7315 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System

Authors: Karima Qayumi, Alex Norta

Abstract:

The rapid generation of high-volume and highly varied data from the application of new technologies poses challenges for generating business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods in order to develop their business. The increasingly decentralized data-management environment therefore relies on a distributed computing paradigm. While data are stored in highly distributed systems, implementing distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and from all the datasets residing on distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business-intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.
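The division of labor in distributed data mining can be sketched as follows: each agent mines its local data and ships only a summary model, which a coordinator merges into a global model. The agent classes and the frequency-counting "mining" task below are hypothetical stand-ins for the agent-oriented artifact described above.

```python
from collections import Counter

class MiningAgent:
    """Hypothetical local-mining agent: it ships a mined summary,
    never the raw records, which is the core idea of DDM."""
    def __init__(self, records):
        self.records = records

    def local_model(self):
        return Counter(self.records)  # local frequency mining

class CoordinatorAgent:
    """Merges the agents' partial models into one global model."""
    def aggregate(self, agents):
        global_model = Counter()
        for agent in agents:
            global_model += agent.local_model()
        return global_model

# two distributed data sources, mined locally and combined centrally
agents = [MiningAgent(["a", "b", "a"]), MiningAgent(["b", "b", "c"])]
top = CoordinatorAgent().aggregate(agents).most_common(1)[0]  # ("b", 3)
```

Because only the compact local models cross the network, this pattern scales to datasets too large or too sensitive to centralize, which is exactly the setting the abstract describes.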

Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)

Procedia PDF Downloads 424
7314 Addressing Coastal Community Vulnerabilities with Alternative Marine Energy Projects

Authors: Danielle Preziuso, Kamila Kazimierczuk, Annalise Stein, Bethel Tarekegne

Abstract:

Coastal communities experience a variety of distinct socioeconomic, technical, and environmental vulnerabilities, all of which accrue heightened risk with increasingly frequent and severe climate change impacts. Marine renewable energy (MRE) offers a potential solution for mitigating coastal community vulnerabilities, especially water-energy dependencies, while delivering promising co-benefits such as increased resilience and more sustainable energy outcomes. This paper explores coastal community vulnerabilities and service dependencies based on the local drivers that create them, with attention to climate change impacts and how they catalyze unmet water-energy needs in these communities. We examine the vulnerabilities through the lens of coastal Tribal communities (i.e., the Makah Tribe, the Kenaitze Tribe, and the Quinault Nation), as Indigenous communities often face compounded technical, economic, and environmental vulnerabilities due to underlying socio-demographic inequalities. We offer an environmental and energy justice indicators framework to understand how these vulnerabilities disproportionately manifest and impact the most vulnerable community members, and we subsequently use the framework to inform a weighted decision matrix tool that compares the viability of MRE-based alternative energy futures in addressing these vulnerabilities. The framework and complementary tool highlight opportunities for future MRE research and pilot demonstrations that respond directly to the vulnerabilities of coastal communities.
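A weighted decision matrix of the kind described can be sketched in a few lines. The criteria, weights, and MRE alternatives below are illustrative assumptions, not the paper's justice indicators or its actual weighting.

```python
def weighted_decision_matrix(alternatives, weights):
    """Minimal weighted decision matrix: each criterion is min-max
    normalised to [0, 1] (higher = better), then combined with the
    criterion weights into one score per alternative."""
    criteria = list(weights)
    lo = {c: min(a[c] for a in alternatives.values()) for c in criteria}
    hi = {c: max(a[c] for a in alternatives.values()) for c in criteria}

    def norm(c, v):
        return 0.0 if hi[c] == lo[c] else (v - lo[c]) / (hi[c] - lo[c])

    return {name: sum(weights[c] * norm(c, a[c]) for c in criteria)
            for name, a in alternatives.items()}

# hypothetical MRE alternatives scored against illustrative criteria
options = {
    "tidal":  {"resilience": 8, "cost_benefit": 5, "equity": 7},
    "wave":   {"resilience": 6, "cost_benefit": 7, "equity": 6},
    "hybrid": {"resilience": 9, "cost_benefit": 6, "equity": 9},
}
weights = {"resilience": 0.4, "cost_benefit": 0.2, "equity": 0.4}
scores = weighted_decision_matrix(options, weights)
best = max(scores, key=scores.get)
```

Sensitivity to the weights can then be examined by re-running the matrix under alternative weightings, which is how such a tool surfaces whose priorities drive the ranking.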

Keywords: coastal communities, decision matrix, energy equity, energy vulnerability, marine energy, service dependency

Procedia PDF Downloads 72
7313 Developing a Framework to Aid Sustainable Assessment in Indian Buildings

Authors: P. Amarnath, Albert Thomas

Abstract:

Buildings are among the largest consumers of energy and resources, urging designers, architects, and policy makers to invest great effort in developing and implementing sustainable building strategies in construction. Green building rating systems help a great deal here by measuring the effectiveness of these strategies, tracking building performance from social, environmental, and economic perspectives, and guiding the construction of new sustainable buildings. However, for a country like India, an enormous and rapidly growing population imposes an increasing burden on the country's limited and continuously degrading natural resource base, which includes the land available for construction. Overall, the number of rated sustainable buildings in India is very small, primarily due to the complexity and rigidity of the assessment systems and regulations, which restrict stakeholders and designers from properly implementing and utilizing them. This paper introduces a data-driven and user-friendly framework that cross-compares the prominent green building rating systems LEED, BREEAM, and GRIHA, and subsequently helps users rate a proposed building design under the regulations of each assessment framework. The framework is validated using input data collected from green buildings constructed globally. The proposed system promises to encourage users to test the efficiency of various sustainable construction practices and thereby promote more sustainable buildings in the country.
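The cross-comparison idea can be sketched as a weighted scoring table: one set of building category scores, evaluated under each rating system's weighting. The category weights below are heavily simplified illustrations and do not reflect the actual LEED, BREEAM, or GRIHA credit structures, which are far richer.

```python
# Hypothetical, heavily simplified category weights -- illustrative only,
# not the real LEED/BREEAM/GRIHA credit allocations.
SYSTEM_WEIGHTS = {
    "LEED":   {"energy": 0.35, "water": 0.15, "materials": 0.15, "site": 0.20, "indoor": 0.15},
    "BREEAM": {"energy": 0.30, "water": 0.10, "materials": 0.20, "site": 0.20, "indoor": 0.20},
    "GRIHA":  {"energy": 0.30, "water": 0.20, "materials": 0.15, "site": 0.20, "indoor": 0.15},
}

def cross_rate(design_scores):
    """Score one building design (category scores on a 0-100 scale)
    under each rating system's weighting, as the framework above does."""
    return {system: sum(w * design_scores[c] for c, w in weights.items())
            for system, weights in SYSTEM_WEIGHTS.items()}

design = {"energy": 80, "water": 60, "materials": 70, "site": 50, "indoor": 65}
results = cross_rate(design)
```

A single design thus receives a comparable score under every system at once, which is the convenience the framework aims to give stakeholders.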

Keywords: BREEAM, GRIHA, green building rating systems, LEED, sustainable buildings

Procedia PDF Downloads 130
7312 A Framework for Review Spam Detection Research

Authors: Mohammadali Tavakoli, Atefeh Heydari, Zuriati Ismail, Naomie Salim

Abstract:

With the increasing number of people reviewing products online in recent years, opinion-sharing websites have become the most important source of customer opinions. Unfortunately, spammers generate and post fake reviews in order to promote or demote brands and mislead potential customers. These reviews are notably destructive not only for potential customers but also for business owners and manufacturers. However, research in this area is not yet adequate, and many critical problems related to spam detection remain unsolved. To aid newcomers to the domain, in this paper we have attempted to create a high-quality framework that gives a clear view of review spam-detection methods. In addition, this report contains a comprehensive collection of the detection metrics used in proposed spam-detection approaches. These metrics are highly applicable to the development of novel detection methods.
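One detection metric of the kind such frameworks collect is content similarity between reviews: near-duplicate reviews posted under different accounts are a classic spam signal. A minimal sketch, with an illustrative similarity threshold:

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Cosine similarity between bag-of-words term-frequency vectors
    of two reviews (1.0 = identical word distribution)."""
    a = Counter(text_a.lower().split())
    b = Counter(text_b.lower().split())
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def flag_near_duplicates(reviews, threshold=0.9):
    # pairwise scan; returns index pairs of suspiciously similar reviews
    return [(i, j)
            for i in range(len(reviews))
            for j in range(i + 1, len(reviews))
            if cosine_similarity(reviews[i], reviews[j]) >= threshold]
```

In practice such a content feature would be combined with reviewer-behavior features (burstiness, rating deviation) rather than used alone.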

Keywords: fake reviews, feature collection, opinion spam, spam detection

Procedia PDF Downloads 407
7311 An Integrated Approach to the Carbonate Reservoir Modeling: Case Study of the Eastern Siberia Field

Authors: Yana Snegireva

Abstract:

Carbonate reservoirs are known for their heterogeneity, which results from geological processes such as diagenesis and fracturing. These complexities pose great challenges for understanding fluid flow behavior and predicting the production performance of naturally fractured reservoirs. Investigating carbonate reservoirs is crucial, as many petroleum reservoirs are naturally fractured, yet it is difficult because of the complexity of their fracture networks, which introduces geological uncertainties that matter for global petroleum reserves. The key challenges in carbonate reservoir modeling include the accurate representation of fractures and their connectivity, as well as capturing the impact of fractures on fluid flow and production. Traditional reservoir modeling techniques often oversimplify fracture networks, leading to inaccurate predictions. There is therefore a need for a modern approach that can capture the complexities of carbonate reservoirs and provide reliable predictions for effective reservoir management and production optimization. The modern approach to carbonate reservoir modeling involves a hybrid fracture modeling approach, combining the discrete fracture network (DFN) method with an implicit fracture network, which offers enhanced accuracy and reliability in characterizing the complex fracture systems within these reservoirs. This study focuses on the application of the hybrid method in the Nepsko-Botuobinskaya anticline of the Eastern Siberia field, aiming to demonstrate the suitability of this method under these geological conditions. The DFN method models fractures as discrete entities, capturing their geometry, orientation, and connectivity. The method has a significant drawback, however, since the number of fractures in a field can be very high.
Because of main-memory limitations, it is very difficult to represent all of these fractures explicitly. By integrating data from image logs (formation micro-imager), core data, and fracture density logs, a DFN model can be constructed to represent the characteristics of the hydraulically relevant fractures. The results obtained from the DFN modeling approach provide valuable insights into the behavior of the Eastern Siberia field's carbonate reservoir. The DFN model accurately captures the fracture system, allowing for a better understanding of fluid flow pathways, connectivity, and potential production zones. Analysis of the simulation results enables the identification of zones of increased fracturing and of optimization opportunities for reservoir development, including the potential application of enhanced oil recovery techniques, which were considered in further simulations on dual-porosity and dual-permeability models. This approach treats fractures as separate, interconnected flow paths within the reservoir matrix, allowing dual-porosity media to be characterized. The case study of the Eastern Siberia field demonstrates the effectiveness of the hybrid modeling method in accurately representing fracture systems and predicting reservoir behavior. The findings of this study contribute to improved reservoir management and production optimization in carbonate reservoirs through enhanced and improved oil recovery methods.
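The two core ingredients of a DFN model, stochastic fracture generation and connectivity analysis, can be sketched in 2-D as follows. The fracture-length and orientation distributions and the domain size are illustrative assumptions, and the intersection test ignores collinear overlaps for brevity.

```python
import math
import random

def segments_intersect(p1, p2, p3, p4):
    # standard orientation test for 2-D segment intersection
    def orient(a, b, c):
        v = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
        return (v > 0) - (v < 0)
    o1, o2 = orient(p1, p2, p3), orient(p1, p2, p4)
    o3, o4 = orient(p3, p4, p1), orient(p3, p4, p2)
    return o1 != o2 and o3 != o4  # collinear-overlap cases ignored

def generate_dfn(n_fractures, domain=1.0, mean_len=0.4, seed=0):
    # each fracture: a segment with random midpoint, orientation, and
    # exponentially distributed length (illustrative distributions)
    rng = random.Random(seed)
    fracs = []
    for _ in range(n_fractures):
        cx, cy = rng.uniform(0, domain), rng.uniform(0, domain)
        theta = rng.uniform(0, math.pi)
        half = rng.expovariate(1 / mean_len) / 2
        dx, dy = half * math.cos(theta), half * math.sin(theta)
        fracs.append(((cx - dx, cy - dy), (cx + dx, cy + dy)))
    return fracs

def connected_clusters(fracs):
    # union-find over pairwise intersections: hydraulically connected clusters
    parent = list(range(len(fracs)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(len(fracs)):
        for j in range(i + 1, len(fracs)):
            if segments_intersect(*fracs[i], *fracs[j]):
                parent[find(i)] = find(j)
    return len({find(i) for i in range(len(fracs))})
```

A production DFN (conditioned on image-log orientations and fracture-density logs, as in the study) works the same way in 3-D with fracture planes instead of segments, and the cluster count indicates whether the network percolates enough to matter for flow.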

Keywords: carbonate reservoir, discrete fracture network, fracture modeling, dual porosity, enhanced oil recovery, implicit fracture model, hybrid fracture model

Procedia PDF Downloads 72