Search results for: probability bivariant models

7163 Employers’ Preferences When Employing Solo Self-Employed: A Vignette Study in the Netherlands

Authors: Lian Kösters, Wendy Smits, Raymond Montizaan

Abstract:

The number of solo self-employed in the Netherlands has been increasing for years, and the relative increase is among the largest in the EU. To explain this increase, most studies have focused on the supply side, the workers who offer themselves as solo self-employed. Studies that focus on the demand side, the employers who hire the solo self-employed, are still scarce. The studies of employer behaviour conducted so far show that employers mainly choose self-employed workers when they have a temporary need for specialist knowledge, but also during projects or production peaks. These studies do not provide insight into employers’ considerations across different contract types. In this study, interviews with employers were conducted and the available literature was consulted to provide an overview of the factors employers use to compare contract types. That input was used to set up a vignette study, carried out at the end of 2021 among almost 1,000 business owners, HR managers, and business leaders of Dutch companies. Each respondent was given two sets of five fictitious candidates for two possible positions in their organization and was asked to rank these candidates. The positions varied with regard to the type of tasks (core tasks or support tasks) and the time it took to train new people for the position. The respondents were asked additional questions about the positions, such as the required level of education, the duration, and the degree of predictability of the tasks. The fictitious candidates varied, among other things, in the type of contract on which they would come to work for the organization. The results were analyzed using a rank-ordered logit analysis. This vignette setup makes it possible to see which factors matter most to employers when choosing to hire a solo self-employed person rather than workers on other contracts. The results show no indications that employers would want to hire solo self-employed workers en masse; they prefer regular employee contracts. The probability that a solo self-employed candidate is chosen over one who would come to work as a temporary employee is 32 percent. This probability is even lower than for on-call and temporary agency workers. For a permanent contract, this probability is 46 percent. The results indicate that employers consider knowledge and skills more important than the contract type, and that these can compensate for a solo self-employed contract: a solo self-employed candidate with 10 years of work experience has a 63 percent probability of being found attractive by an employer compared to a temporary employee without work experience. This suggests that employers are willing to hire someone on a contract that is less attractive for the employer if the worker so wishes. The results also show that the probability that a solo self-employed person is preferred over a candidate with a temporary employee contract is somewhat higher in business economics, administrative and technical professions. No significant results were found for factors where solo self-employed workers were expected to be preferred more often, such as unpredictable or temporary work.
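
As a minimal illustration of the rank-ordered (exploded) logit used here, the sketch below writes its log-likelihood in plain NumPy: each ranking is decomposed into a sequence of conditional-logit choices over the candidates still unranked. Variable names are hypothetical; the study's actual estimation setup is not published in the abstract.

```python
import numpy as np

def rank_ordered_logit_loglik(beta, X, rankings):
    """Exploded-logit log-likelihood.

    beta:     (n_features,) coefficient vector
    X:        (n_sets, n_alternatives, n_features) vignette attributes
    rankings: (n_sets, n_alternatives) candidate indices, best ranked first
    """
    ll = 0.0
    for X_set, order in zip(X, rankings):
        utilities = X_set @ beta        # systematic utility per candidate
        remaining = list(order)
        while len(remaining) > 1:       # the last pick carries no information
            u = utilities[remaining]
            u_max = u.max()             # stabilised log-sum-exp
            ll += utilities[remaining[0]] - u_max - np.log(np.exp(u - u_max).sum())
            remaining.pop(0)
    return ll

# estimation: maximise with e.g. scipy.optimize.minimize on the negative log-likelihood
```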

Keywords: employer behaviour, rank-ordered logit analysis, solo self-employment, temporary contract, vignette study

Procedia PDF Downloads 65
7162 Wireless Transmission of Big Data Using Novel Secure Algorithm

Authors: K. Thiagarajan, K. Saranya, A. Veeraiah, B. Sudha

Abstract:

This paper presents a novel algorithm for the secure, reliable and flexible transmission of big data in two-hop wireless networks using a cooperative jamming scheme. A two-hop wireless network consists of source, relay and destination nodes. Big data has to be transmitted from source to relay and from relay to destination with security deployed at the physical layer. The cooperative jamming scheme makes the transmission of big data more secure by protecting it from eavesdroppers and malicious nodes of unknown location. The novel algorithm, which ensures secure and energy-balanced transmission of big data, includes selecting the data transmission region, segmenting the selected region, determining a probability ratio for each node type (capture, non-capture and eavesdropper) in every segment, and evaluating that probability using a binary evaluation. If the transmission is secure, it resumes as a two-hop transmission of the big data; otherwise, the attackers are countered by the cooperative jamming scheme and the data is then transmitted in two hops.
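
The abstract enumerates the steps but not the exact decision rule, so the following sketch is only a guess at the control flow: per-segment node-type ratios feed a binary security check that switches between plain two-hop forwarding and cooperative jamming. The threshold and the ratio definition are illustrative assumptions.

```python
def transmission_plan(segments, ratio_threshold=2.0):
    """Per-segment security decision for two-hop big-data transfer.

    segments: list of dicts of node counts, e.g.
              {"capture": 12, "non_capture": 5, "eavesdropper": 2}
    Returns "forward" (plain two-hop) or "jam" (cooperative jamming first).
    """
    plan = []
    for seg in segments:
        total = sum(seg.values()) or 1
        p_capture = seg["capture"] / total
        p_eave = seg["eavesdropper"] / total
        # binary evaluation: is the legitimate-to-eavesdropper ratio high enough?
        secure = p_capture >= ratio_threshold * p_eave
        plan.append("forward" if secure else "jam")
    return plan

print(transmission_plan([{"capture": 12, "non_capture": 5, "eavesdropper": 2},
                         {"capture": 3, "non_capture": 4, "eavesdropper": 6}]))
```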

Keywords: big data, two-hop transmission, physical layer wireless security, cooperative jamming, energy balance

Procedia PDF Downloads 481
7161 D-Wave Quantum Computing Ising Model: A Case Study for Forecasting of Heat Waves

Authors: Dmytro Zubov, Francesco Volponi

Abstract:

In this paper, the D-Wave quantum computing Ising model is used to forecast positive extremes of daily mean air temperature. Forecast models are designed with two to five qubits, which represent 2-, 3-, 4-, and 5-day historical data, respectively. The Ising model’s real-valued weights and dimensionless coefficients are calculated using daily mean air temperatures from 119 places around the world, as well as sea-level data (Aburatsu, Japan). In comparison with current methods, this approach is better suited to predicting heat-wave values because it does not require the estimation of a probability distribution from scarce observations. The proposed quantum forecast algorithm is simulated on a traditional computer architecture, with combinatorial optimization of the Ising model parameters, for the Ronald Reagan Washington National Airport dataset with 1-day lead time on the learning sample (1975-2010). Analysis of the forecast accuracy (ratio of successful predictions to the total number of predictions) on the validation sample (2011-2014) shows that the Ising model with three qubits has 100% accuracy, which is quite significant compared to other methods; however, the number of identified heat waves is small (only one out of nineteen in this case). The models with 2, 4, and 5 qubits have 20%, 3.8%, and 3.8% accuracy, respectively. The presented three-qubit forecast model is applied to the prediction of heat waves at five other locations: Aurel Vlaicu, Romania (28.6% accuracy); Bratislava, Slovakia (21.7%); Brussels, Belgium (33.3%); Sofia, Bulgaria (50%); and Akhisar, Turkey (21.4%). These predictions are not ideal, but they are not zero either: they can be used independently or together with predictions generated by other methods. The loss of human life, as well as the environmental, economic, and material damage caused by extreme air temperatures, could be reduced if some heat waves are predicted. Even a small success rate implies a large socio-economic benefit.
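
For a handful of qubits, the Ising forecast can be emulated classically by exhaustive search over spin states. The sketch below clamps the qubits for past days to the observed heat-wave indicators and reads the prediction off the minimum-energy configuration; the weights h and couplings J are assumed to have been fitted beforehand, as described above.

```python
import numpy as np

def ising_heatwave_forecast(h, J, past_spins):
    """Predict tomorrow's heat-wave indicator (+1 / -1) from an n-qubit
    Ising model E(s) = -h.s - sum_{i<j} J_ij s_i s_j, with the first
    n-1 spins clamped to the observed history (past_spins)."""
    n = len(h)
    best_spin, best_energy = 0, np.inf
    for tomorrow in (-1, +1):
        s = np.append(past_spins, tomorrow)
        energy = -h @ s - sum(J[i, j] * s[i] * s[j]
                              for i in range(n) for j in range(i + 1, n))
        if energy < best_energy:
            best_spin, best_energy = tomorrow, energy
    return best_spin   # +1 predicts a heat-wave day
```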

Keywords: heat wave, D-wave, forecast, Ising model, quantum computing

Procedia PDF Downloads 492
7160 Geopotential Models Evaluation in Algeria Using Stochastic Method, GPS/Leveling and Topographic Data

Authors: M. A. Meslem

Abstract:

For precise geoid determination, a reference field is used to subtract the long and medium wavelengths of the gravity field from the observations when applying the remove-compute-restore technique. A comparison between candidate models should therefore be made in order to select the optimal reference gravity field. In this context, two recent global geopotential models have been selected for a comparison study over Northern Algeria: the Earth Gravitational Model EGM2008 and the GECO global gravity model, which combines EGM2008 with the anomalous potential derived from a GOCE satellite-only global model. Free-air gravity anomalies in the study area were used to compute residual data for both gravity field models, with a Digital Terrain Model (DTM) used to subtract the residual terrain effect from the gravity observations. The residual data were used to generate local empirical covariance functions, which were fitted to a closed form in order to compare their statistical behavior in both cases. Finally, height anomalies were computed from both geopotential models and compared to a set of GPS-levelled benchmarks using least-squares adjustment. The results, described in detail in this paper, point to a slight overall advantage of the GECO model, based on the comparison of error degree variances and on ground-truth evaluation.
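
For reference, the remove-compute-restore decomposition the abstract relies on has the standard form (GGM: global geopotential model contribution; RTM: residual terrain model contribution):

```latex
% remove: residual anomalies after subtracting model and terrain effects
\Delta g_{\mathrm{res}} = \Delta g_{\mathrm{obs}} - \Delta g_{\mathrm{GGM}} - \Delta g_{\mathrm{RTM}}
% restore: the height anomaly recomposed from the same three parts
\zeta = \zeta_{\mathrm{GGM}} + \zeta_{\mathrm{res}} + \zeta_{\mathrm{RTM}}
```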

Keywords: quasigeoid, gravity anomalies, covariance, GGM

Procedia PDF Downloads 131
7159 Machine Learning Driven Analysis of Kepler Objects of Interest to Identify Exoplanets

Authors: Akshat Kumar, Vidushi

Abstract:

This paper identifies 27 KOIs, 26 of which are currently classified as candidates and one as a false positive, that have a high probability of being confirmed. For this purpose, 11 machine learning algorithms were implemented on the cumulative Kepler dataset sourced from the NASA Exoplanet Archive. The best-performing models were HistGradientBoosting and XGBoost, with a test accuracy of 93.5%, and the lowest-performing model was Gaussian Naive Bayes, with a test accuracy of 54%; to assess model performance, the F1 score, cross-validation score and ROC curve were calculated. Based on the learned models, the characteristics significant for confirmed exoplanets were identified, with emphasis on the object’s transit and stellar properties; these characteristics were koi_count, koi_prad, koi_period, koi_dor, koi_ror, and koi_smass, and they were later used to filter out the potential KOIs. The paper also calculates an Earth Similarity Index, based on the planetary radius and equilibrium temperature, for each identified KOI to aid in its classification.
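
The Earth Similarity Index can be reproduced with the standard interpolated form of Schulze-Makuch et al. (2011). The sketch below uses the commonly cited weight exponents for radius and surface/equilibrium temperature; the abstract does not state which weights were actually adopted, so these are assumptions.

```python
def earth_similarity_index(radius_re, teq_k, w_radius=0.57, w_temp=5.58):
    """Two-parameter ESI from planetary radius (Earth radii) and
    equilibrium temperature (K); Earth references: 1.0 R_e, 288 K.
    ESI_i = (1 - |x - x_e| / (x + x_e)) ** (w_i / n), multiplied over i."""
    terms = (
        (1 - abs(radius_re - 1.0) / (radius_re + 1.0)) ** (w_radius / 2),
        (1 - abs(teq_k - 288.0) / (teq_k + 288.0)) ** (w_temp / 2),
    )
    return terms[0] * terms[1]

print(earth_similarity_index(1.0, 288.0))   # 1.0 for Earth itself
print(earth_similarity_index(1.6, 250.0))   # a super-Earth-like KOI
```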

Keywords: Kepler objects of interest, exoplanets, space exploration, machine learning, earth similarity index, transit photometry

Procedia PDF Downloads 61
7158 Plant Identification Using Convolution Neural Network and Vision Transformer-Based Models

Authors: Virender Singh, Mathew Rees, Simon Hampton, Sivaram Annadurai

Abstract:

Plant identification is a challenging task that aims to identify the family, genus, and species according to plant morphological features. Automated deep learning-based computer vision algorithms are widely used for identifying plants and can help users narrow down the possibilities. However, numerous morphological similarities between and within species render correct classification difficult. In this paper, we tested custom convolution neural network (CNN) and vision transformer (ViT) based models using the PyTorch framework to classify plants. We used a large dataset of 88,000 images provided by the Royal Horticultural Society (RHS) and a smaller dataset of 16,000 images from the PlantClef 2015 dataset for classifying plants at the genus and species levels, respectively. Our results show that for classifying plants at the genus level, ViT models perform better than the CNN-based models ResNet50 and ResNet-RS-420 and other state-of-the-art CNN-based models suggested in previous studies on a similar dataset. The ViT model achieved a top accuracy of 83.3% for classifying plants at the genus level. For classifying plants at the species level, ViT models again perform better than ResNet50 and ResNet-RS-420, with a top accuracy of 92.5%. We show that the correct set of augmentation techniques plays an important role in classification success. In conclusion, these results could help end users, professionals and the general public alike identify plants more quickly and with improved accuracy.
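
A minimal PyTorch sketch of the ViT fine-tuning setup described; the paper's exact architectures and hyperparameters are not given, so torchvision's pretrained ViT-B/16 and the class count stand in here as assumptions.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 500   # hypothetical number of plant genera in the target dataset

# Load an ImageNet-pretrained ViT-B/16 and swap the classification head.
model = models.vit_b_16(weights=models.ViT_B_16_Weights.IMAGENET1K_V1)
model.heads.head = nn.Linear(model.heads.head.in_features, NUM_CLASSES)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

def train_step(images, labels):
    """One fine-tuning step on a batch of augmented plant images."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```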

Keywords: plant identification, CNN, image processing, vision transformer, classification

Procedia PDF Downloads 90
7157 Study of Seismic Damage Reinforced Concrete Frames in Variable Height with Logistic Statistic Function Distribution

Authors: P. Zarfam, M. Mansouri Baghbaderani

Abstract:

In seismic design, the proper reaction to an earthquake and the correct, accurate prediction of its subsequent effects on the structure are critical. Choosing a proper probability distribution, one that gives a realistic probability for the structure’s damage rate, is essential in damage assessment. With the development of performance-based design, modal pushover analysis, an inexpensive, effective, and quick method for estimating the seismic response of structures, is broadly used in engineering practice. In this research, three concrete frames of 3, 6, and 13 stories are analyzed by nonlinear modal pushover under 30 different earthquake records using OpenSees software; the damage indices of roof displacement and relative story displacement ratio are then calculated against two parameters: peak ground acceleration and spectral acceleration. These indices are used to establish damage relations with the log-normal distribution and the logistic distribution. Finally, the values of these relations are compared, and the effect of height on the damage relations is studied as well.
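
A minimal sketch of the distribution comparison: fit P(damage exceedance | intensity) by maximum likelihood with a log-normal CDF and with a logistic CDF, then compare the two fits. Variable names and starting values are illustrative.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

def fit_fragility(im, damaged, dist="lognormal"):
    """Fit P(damage | IM) by maximum likelihood.

    im:      intensity measure per record (e.g. PGA or spectral acceleration)
    damaged: 0/1 indicator that the damage index exceeded its threshold
    """
    def nll(theta):
        mu, sigma = theta[0], abs(theta[1]) + 1e-9
        z = (np.log(im) - mu) / sigma
        p = stats.norm.cdf(z) if dist == "lognormal" else 1.0 / (1.0 + np.exp(-z))
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -np.sum(damaged * np.log(p) + (1 - damaged) * np.log(1 - p))

    return minimize(nll, x0=[0.0, 0.5], method="Nelder-Mead").x

# compare fit_fragility(im, d, "lognormal") vs fit_fragility(im, d, "logistic")
```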

Keywords: modal pushover analysis, concrete structure, seismic damage, log-normal distribution, logistic distribution

Procedia PDF Downloads 242
7156 Measuring Environmental Efficiency of Energy in OPEC Countries

Authors: Bahram Fathi, Seyedhossein Sajadifar, Naser Khiabani

Abstract:

Data envelopment analysis (DEA) has recently gained popularity in energy efficiency analysis. A common feature of the previously proposed DEA models for measuring energy efficiency performance is that they treat energy consumption as an input within a production framework without considering undesirable outputs. However, energy use results in the generation of undesirable outputs as byproducts of producing desirable outputs. Within a joint production framework of both desirable and undesirable outputs, this paper presents several DEA-type linear programming models for measuring energy efficiency performance. In addition to considering undesirable outputs, our models treat different energy sources as different inputs so that changes in energy mix could be accounted for in evaluating energy efficiency. The proposed models are applied to measure the energy efficiency performances of 12 OPEC countries and the results obtained are presented.
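
One common DEA-type linear program of this kind, not necessarily the paper's exact model, treats energy inputs as contractable, desirable outputs as freely disposable, and undesirable outputs as weakly disposable. A SciPy sketch under constant returns to scale:

```python
import numpy as np
from scipy.optimize import linprog

def energy_efficiency(E, Y, B, j0):
    """CRS DEA energy-efficiency score for DMU j0.

    E: (n_energy_inputs, n_dmus) energy inputs by source (oil, gas, ...)
    Y: (n_good_outputs, n_dmus) desirable outputs (e.g. GDP)
    B: (n_bad_outputs, n_dmus) undesirable outputs (e.g. CO2)
    Solves: min theta  s.t.  E@lam <= theta*E[:,j0], Y@lam >= Y[:,j0],
                             B@lam == B[:,j0], lam >= 0
    """
    m, n = E.shape
    c = np.r_[1.0, np.zeros(n)]                        # minimise theta
    A_ub = np.vstack([np.c_[-E[:, [j0]], E],           # energy rows
                      np.c_[np.zeros((Y.shape[0], 1)), -Y]])
    b_ub = np.r_[np.zeros(m), -Y[:, j0]]
    A_eq = np.c_[np.zeros((B.shape[0], 1)), B]         # weak disposability
    b_eq = B[:, j0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun    # theta <= 1; theta == 1 means energy-efficient
```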

Keywords: energy efficiency, undesirable outputs, data envelopment analysis

Procedia PDF Downloads 730
7155 Enhancing Model Interoperability and Reuse by Designing and Developing a Unified Metamodel Standard

Authors: Arash Gharibi

Abstract:

Mankind has always used models to solve problems. Essentially, models are simplified versions of reality, and the need for them stems from having to deal with complexity: many processes or phenomena are too complex to be described completely. A fundamental model requirement is therefore that it contain the characteristic features that are essential in the context of the problem to be solved or described. Models are used in virtually every scientific domain to deal with various problems, and during recent decades their number has increased exponentially. Publication of models as part of original research has traditionally been in scientific periodicals, series, monographs, agency reports, national journals and laboratory reports, which makes it difficult for interested groups and communities to stay informed about the state of the art. During the modeling process, many important decisions are made that impact the final form of the model. Without a record of these considerations, the final model remains ill-defined and open to varying interpretations. Unfortunately, the details of these considerations are often lost, and where information about a model does exist, it is likely to be written informally, in different layouts and at different levels of detail. To overcome these issues, different domains have attempted to implement their own approaches to preserving model information in the form of model documentation. The most frequently cited model documentation approaches are domain-specific, are not applicable to existing models, and do not allow evolutionary flexibility or intrinsic corrections and improvements. These issues all stem from a lack of unified standards for model documentation. As a way forward, this research proposes a new standard for capturing and managing model information in a unified way, so that interoperability and reusability of models become possible. This standard will also be evolutionary, meaning that members of the modeling community can contribute to its ongoing development and improvement. In this paper, three of the most common current metamodels are reviewed, and based on the pros and cons of each, a new metamodel is proposed.

Keywords: metamodel, modeling, interoperability, reuse

Procedia PDF Downloads 194
7154 Implied Adjusted Volatility by Leland Option Pricing Models: Evidence from Australian Index Options

Authors: Mimi Hafizah Abdullah, Hanani Farhah Harun, Nik Ruzni Nik Idris

Abstract:

With implied volatility being an important factor in financial decision-making, in particular in option pricing, and given that the pricing biases of Leland option pricing models and the implied volatility structure of the options are related, this study examines the implied adjusted volatility smile patterns and term structures in S&P/ASX 200 index options using the different Leland option pricing models. The examination of the implied adjusted volatility smiles and term structures in the Australian index options market covers the global financial crisis that began in mid-2007. The implied adjusted volatility was found to escalate to approximately triple its pre-crisis level.
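
For context, Leland's (1985) adjustment inflates the Black-Scholes volatility for a proportional transaction cost k and a rebalancing interval Δt. A sketch for backing out the implied adjusted volatility by inverting the adjusted price; all inputs are hypothetical.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def bs_call(S, K, T, r, sigma):
    """Black-Scholes European call price."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def leland_sigma(sigma, k, dt):
    """Leland (1985) adjusted volatility for hedging at intervals dt
    with round-trip proportional transaction cost k."""
    return sigma * np.sqrt(1.0 + np.sqrt(2.0 / np.pi) * k / (sigma * np.sqrt(dt)))

def implied_adjusted_vol(price, S, K, T, r, k, dt):
    """Root-find the sigma whose Leland-adjusted price matches the market."""
    return brentq(lambda s: bs_call(S, K, T, r, leland_sigma(s, k, dt)) - price,
                  1e-4, 5.0)

print(implied_adjusted_vol(price=12.0, S=100, K=100, T=0.5,
                           r=0.04, k=0.005, dt=1 / 52))
```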

Keywords: implied adjusted volatility, financial crisis, Leland option pricing models, Australian index options

Procedia PDF Downloads 372
7153 Convex Restrictions for Outage Constrained MU-MISO Downlink under Imperfect Channel State Information

Authors: A. Preetha Priyadharshini, S. B. M. Priya

Abstract:

In this paper, we consider the MU-MISO downlink scenario under imperfect channel state information (CSI). The main issue under imperfect CSI is keeping each user’s rate-outage probability below a given threshold level. Such rate-outage constraints present significant analytical challenges, and many probabilistic methods have been used to solve the transmit optimization problem under imperfect CSI. Here, two convex restriction methods, one based on a decomposition-based large deviation inequality and one on a Bernstein-type inequality, are used to solve the optimization problem. These methods achieve improved output quality with lower complexity and provide a safe, tractable approximation of the original rate-outage constraints. Based on implementations of these methods, performance has been evaluated in terms of feasible rate and average transmission power. The simulation results show that both methods offer significantly improved outage quality and lower computational complexity.
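
In generic form, the rate-outage constraint that both restriction methods conservatively approximate reads as follows (w_j: beamformers, e_k: CSI error of user k, r_k: target rate, ρ_k: tolerated outage probability):

```latex
\Pr\left\{ \log_2\left( 1 + \mathrm{SINR}_k\!\left(\{\mathbf{w}_j\}, \mathbf{e}_k\right) \right) \ge r_k \right\} \;\ge\; 1 - \rho_k , \qquad k = 1, \dots, K
```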

Keywords: imperfect channel state information, outage probability, multiuser multi-input single-output, channel state information

Procedia PDF Downloads 809
7152 Evaluation of Environmental, Technical, and Economic Indicators of a Fused Deposition Modeling Process

Authors: M. Yosofi, S. Ezeddini, A. Ollivier, V. Lavaste, C. Mayousse

Abstract:

Additive manufacturing processes have developed significantly in a wide range of industries, and their application has progressed from rapid prototyping to the production of end-use products. However, their environmental impact is still a rather open question. In order to support the growth of this technology in the industrial sector, environmental aspects should be considered, and predictive models may help monitor and reduce the environmental footprint of the processes. This work presents predictive models based on a previously developed methodology for environmental impact evaluation, combined with a technical and economic assessment. Here we apply the methodology to the Fused Deposition Modeling process. First, we present the predictive models for different types of machines. Then, we present a decision-making tool designed to identify the optimum manufacturing strategy with regard to technical, economic, and environmental criteria.

Keywords: additive manufacturing, decision-making, environmental impact, predictive models

Procedia PDF Downloads 122
7151 Leveraging Unannotated Data to Improve Question Answering for French Contract Analysis

Authors: Touila Ahmed, Elie Louis, Hamza Gharbi

Abstract:

State-of-the-art question answering models have recently shown impressive performance, especially in a zero-shot setting. This approach is particularly useful when confronted with a highly diverse domain such as the legal field, in which it is increasingly difficult to build a dataset covering every notion and concept. In this work, we propose a flexible generative question answering approach to contract analysis, as well as a weakly supervised procedure to leverage unannotated data and boost our models’ performance in general, and their zero-shot performance in particular.
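
A rough sketch of how such a weakly supervised loop might look with a generative QA model. The Hugging Face pipeline usage is real, but the checkpoint name is a placeholder, and the paper's actual models and filtering criteria are not specified in the abstract.

```python
from transformers import pipeline

# placeholder checkpoint; any French-capable generative QA model would do
qa = pipeline("text2text-generation", model="google/mt5-small")

def pseudo_label(contracts, questions):
    """Answer questions over unannotated contracts and keep the outputs as
    silver training data; a real pipeline would filter by confidence or
    round-trip consistency before fine-tuning on the result."""
    silver = []
    for text in contracts:
        for q in questions:
            ans = qa(f"question: {q} context: {text}")[0]["generated_text"]
            silver.append({"context": text, "question": q, "answer": ans})
    return silver
```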

Keywords: question answering, contract analysis, zero-shot, natural language processing, generative models, self-supervision

Procedia PDF Downloads 178
7150 Dow Polyols near Infrared Chemometric Model Reduction Based on Clustering: Reducing Thirty Global Hydroxyl Number (OH) Models to Less Than Five

Authors: Wendy Flory, Kazi Czarnecki, Matthijs Mercy, Mark Joswiak, Mary Beth Seasholtz

Abstract:

Polyurethane materials are present in a wide range of industrial segments such as furniture, building and construction, composites, automotive, electronics, and more. Dow is one of the leading manufacturers of the two main raw materials, isocyanates and polyols, used to produce polyurethane products, and is also a key player in the manufacture of polyurethane systems/formulations designed for targeted applications. In 1990, the first analytical chemometric models were developed and deployed in the Dow QC labs of the polyols business for the quantification of OH, water, cloud point, and viscosity. Over the years many models have been added; there are now over 140 models for quantification and hundreds for product identification, too many to be reasonably supported. There are 29 global models alone for the quantification of OH across more than 70 products at many sites. An attempt was made to consolidate these into a single model. While the consolidated model showed good statistics across the entire OH range, several products showed a bias under ASTM E1655 in individual product validation. This project summary presents the strategy for global model updates for OH, reducing the number of quantification models using chemometric methods. To gain an understanding of the best product groupings, clusters were identified by reducing the spectra to a few dimensions via Principal Component Analysis (PCA) and Uniform Manifold Approximation and Projection (UMAP). Results from these cluster analyses, together with a separate validation set, allowed Dow to reduce the number of models for predicting OH from 29 to 3 without loss of accuracy.
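
A condensed sketch of the clustering step; the file name and cluster count are hypothetical, and the umap-learn package is assumed to be available.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
import umap  # umap-learn package

spectra = np.load("polyol_nir_spectra.npy")      # (n_samples, n_wavelengths)

X = StandardScaler().fit_transform(spectra)
scores = PCA(n_components=10).fit_transform(X)   # compress / denoise spectra
coords = umap.UMAP(n_components=2, random_state=0).fit_transform(scores)
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(coords)
# groups suggests which products can share a single global OH model
```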

Keywords: hydroxyl, global model, model maintenance, near infrared, polyol

Procedia PDF Downloads 129
7149 Benchmarking Machine Learning Approaches for Forecasting Hotel Revenue

Authors: Rachel Y. Zhang, Christopher K. Anderson

Abstract:

A critical aspect of revenue management is a firm’s ability to predict demand as a function of price. Historically, hotels have used simple time series models (regression and/or pick-up based models) owing to the complexity of trying to build causal models of demand. Machine learning approaches are slowly attracting attention owing to their flexibility in modeling relationships. This study provides an overview of approaches to forecasting hospitality demand, focusing on the opportunities created by machine learning approaches, including K-Nearest-Neighbors, Support Vector Machine, Regression Tree, and Artificial Neural Network algorithms. The out-of-sample performance of these approaches to forecasting hotel demand is illustrated using a proprietary sample of market-level (24 properties) transactional data for Las Vegas, NV. Causal predictive models can be built and evaluated owing to the availability of market-level (versus firm-level) data. This research also compares and contrasts the accuracy of firm-level models (i.e., predictive models for hotel A using only hotel A’s data) with models using market-level data (prices, review scores, location, chain scale, etc., for all hotels within the market). The resulting models will be valuable for predicting a hotel’s revenue given the basic characteristics of the property, and can be applied to performance evaluation of an existing hotel. The findings unveil the features that play key roles in a hotel’s revenue performance, which has considerable potential usefulness in both revenue prediction and evaluation.
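
A bare-bones scikit-learn version of such a benchmark; data loading is omitted, and X and y stand for the market-level predictors and observed demand (assumed already prepared).

```python
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_percentage_error

models = {
    "kNN": KNeighborsRegressor(n_neighbors=5),
    "SVM": SVR(C=10.0),
    "RegressionTree": DecisionTreeRegressor(max_depth=6),
    "ANN": MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000),
}

# keep time order so the test set is genuinely out-of-sample
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    shuffle=False)
for name, mdl in models.items():
    mdl.fit(X_train, y_train)
    print(name, mean_absolute_percentage_error(y_test, mdl.predict(X_test)))
```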

Keywords: hotel revenue, k-nearest-neighbors, machine learning, neural network, prediction model, regression tree, support vector machine

Procedia PDF Downloads 124
7148 Text Similarity in Vector Space Models: A Comparative Study

Authors: Omid Shahmirzadi, Adam Lugowski, Kenneth Younge

Abstract:

Automatic measurement of semantic text similarity is an important task in natural language processing. In this paper, we evaluate the performance of different vector space models on this task. We address the real-world problem of modeling patent-to-patent similarity and compare TFIDF (and related extensions), topic models (e.g., latent semantic indexing), and neural models (e.g., paragraph vectors). Contrary to expectations, the added computational cost of text embedding methods is justified only when: 1) the target text is condensed; and 2) the similarity comparison is trivial. TFIDF performs surprisingly well in other cases, in particular for longer and more technical texts or for making finer-grained distinctions between nearest neighbors. Unexpectedly, extensions to the TFIDF method, such as adding noun phrases or calculating term weights incrementally, were not helpful in our context.
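
The TFIDF baseline that performs so well here takes only a few lines of scikit-learn; the patent texts below are toy stand-ins.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

patents = ["a method for wireless power transfer between coupled coils",
           "an apparatus for inductive charging of portable devices",
           "a crispr-based technique for targeted gene editing"]

tfidf = TfidfVectorizer(stop_words="english", sublinear_tf=True)
X = tfidf.fit_transform(patents)

sim = cosine_similarity(X)   # sim[i, j] = patent-to-patent similarity
print(sim.round(2))          # the two charging patents score highest
```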

Keywords: big data, patent, text embedding, text similarity, vector space model

Procedia PDF Downloads 165
7147 Pawn or Potentates: Corporate Governance Structure in Indian Central Public Sector Enterprises

Authors: Ritika Jain, Rajnish Kumar

Abstract:

The Department of Public Enterprises has made the submission of Self Evaluation Reports, for the purpose of corporate governance, mandatory for all central-government-owned enterprises. Despite this, an alarming 40% of the enterprises did not do so. This study examines the impact of external policy tools and internal firm-specific factors on the corporate governance of central public sector enterprises (CPSEs). We use a dataset of all manufacturing and non-financial services enterprises owned by the central government of India for the year 2010-11. Using probit, ordered logit and Heckman sample selection models, the study finds that the probability and quality of corporate governance are positively influenced by the CPSE entering into a Memorandum of Understanding (MoU) with the central government of India, and hence enjoying more autonomy in day-to-day operations. Beyond this, internal factors, including larger size and lower debt, contribute significantly to better corporate governance.
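
A minimal statsmodels sketch of the first-stage probit; the dataframe and column names are hypothetical stand-ins for the variables named above.

```python
import statsmodels.api as sm

# df columns (illustrative): filed_ser = 1 if the CPSE filed a Self
# Evaluation Report; mou_signed, log_assets (size), leverage (debt)
X = sm.add_constant(df[["mou_signed", "log_assets", "leverage"]])
probit = sm.Probit(df["filed_ser"], X).fit()

print(probit.summary())
print(probit.get_margeff().summary())  # marginal effects on the probability
```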

Keywords: corporate governance, central public sector enterprises (CPSEs), sample selection, Memorandum of Understanding (MoU), ordered logit, disinvestment

Procedia PDF Downloads 250
7146 Mathematical Model of Corporate Bond Portfolio and Effective Border Preview

Authors: Sergey Podluzhnyy

Abstract:

One of the most important tasks of investment and pension fund management is building a decision support system that helps make the right decisions on corporate bond portfolio formation. Today there are several basic methods of bond portfolio management: duration management, immunization and convexity management. These methods share a serious disadvantage: they do not take into account the credit risk, or insolvency risk, of the issuer, so they can be applied only to the management and evaluation of high-quality sovereign bonds. This article proposes a mathematical model for building a corporate bond portfolio that is optimal with respect to risk and yield. The proposed model incorporates the default probability into the bond valuation formula, which results in a more correct evaluation of bond prices. Moreover, the model provides tools for visualizing the efficient frontier of a corporate bond portfolio taking into account the exposure to credit risk, which will increase the quality of portfolio managers’ investment decisions.
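
A simple default-adjusted valuation in the spirit described, assuming a flat annual default probability and a fixed recovery of face value; all numbers are illustrative, not the paper's own model.

```python
import numpy as np

def default_adjusted_price(cashflows, times, y, p_default, recovery=0.4, face=100):
    """Expected present value of a corporate bond when the issuer defaults
    each year with probability p_default, paying recovery * face on default."""
    times = np.asarray(times, dtype=float)
    survival = (1.0 - p_default) ** times            # P(no default by t)
    pv_paid = sum(cf * np.exp(-y * t) * s
                  for cf, t, s in zip(cashflows, times, survival))
    # default during year t has probability (1-p)^(t-1) * p
    pv_recovery = sum((1 - p_default) ** (t - 1) * p_default
                      * recovery * face * np.exp(-y * t)
                      for t in times)
    return pv_paid + pv_recovery

# 5-year 6% annual coupon, 3% yield, 2% annual default probability
print(default_adjusted_price([6, 6, 6, 6, 106], [1, 2, 3, 4, 5], 0.03, 0.02))
```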

Keywords: corporate bond portfolio, default probability, effective boundary, portfolio optimization task

Procedia PDF Downloads 316
7145 Geographic Information System for District Level Energy Performance Simulations

Authors: Avichal Malhotra, Jerome Frisch, Christoph van Treeck

Abstract:

The utilization of semantic, cadastral and topological data from geographic information systems (GIS) has increased exponentially for building- and urban-scale energy performance simulations. Urban planners, simulation scientists, and researchers use virtual 3D city models for energy analyses, algorithms and simulation tools. For dynamic energy simulations at the city and district level, this paper provides an overview of the available GIS data models and their levels of detail. Adhering to different norms and standards, these models also intend to describe building and construction industry data. For further investigation, CityGML data models are considered for simulations. Though geographical information modelling has many different implementations, virtual city data can also be extended for domain-specific applications. Highlighting the use of extended CityGML models for energy research, a brief introduction to the Energy Application Domain Extension (ADE), along with its significance, is given. Consequently, addressing specific simulation input data, a workflow using Modelica is presented, underlining the usage of GIS information and quantifying its significance for the annual heating energy demand.

Keywords: CityGML, EnergyADE, energy performance simulation, GIS

Procedia PDF Downloads 162
7144 Groundwater Potential Mapping using Frequency Ratio and Shannon’s Entropy Models in Lesser Himalaya Zone, Nepal

Authors: Yagya Murti Aryal, Bipin Adhikari, Pradeep Gyawali

Abstract:

The Lesser Himalaya zone of Nepal consists of thrusting and folding belts, which play an important role in the sustainable management of groundwater in the Himalayan regions. The study area is located in the Dolakha and Ramechhap Districts of Bagmati Province, Nepal. Geologically, these districts are situated in the Lesser Himalayas and partly encompass the Higher Himalayan rock sequence, which includes low-grade to high-grade metamorphic rocks. Following the Gorkha Earthquake in 2015, numerous springs dried up, and many others are currently experiencing depletion due to the distortion of the natural groundwater flow. The primary objective of this study is to identify potential groundwater areas and determine suitable sites for artificial groundwater recharge. Two distinct statistical approaches were used to develop models: the Frequency Ratio (FR) and Shannon Entropy (SE) methods. The study utilized both primary and secondary datasets and incorporated significant role and controlling factors derived from fieldwork and literature reviews. Field data collection involved a spring inventory and soil, lithology, and hydro-geomorphology studies. Additionally, slope, aspect, drainage density, and lineament density were extracted from a Digital Elevation Model (DEM) using GIS and transformed into thematic layers. For training and validation, 114 springs were divided in a 70/30 ratio, together with an equal number of non-spring pixels. After assigning weights to each class based on the two proposed models, a groundwater potential map was generated using GIS, classifying the area into five levels: very low, low, moderate, high, and very high. The models’ outcomes reveal that over 41% of the area falls into the low and very low potential categories, while only 30% of the area demonstrates a high probability of groundwater potential. To evaluate model performance, accuracy was assessed using the Area Under the Curve (AUC). The success-rate AUC values for the FR and SE methods were 78.73% and 77.09%, respectively, and the prediction-rate AUC values were 76.31% and 74.08%. The results indicate that the FR model exhibits greater prediction capability than the SE model in this case study.
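
The frequency-ratio weight per thematic-layer class is straightforward to compute; a NumPy sketch over hypothetical raster arrays.

```python
import numpy as np

def frequency_ratio(class_ids, spring_mask):
    """FR per class of one thematic layer (slope, lithology, ...):
    FR = (% of springs falling in the class) / (% of area in the class).

    class_ids:   2-D raster of class labels
    spring_mask: 2-D boolean raster marking spring locations
    """
    fr = {}
    n_pix, n_springs = class_ids.size, spring_mask.sum()
    for c in np.unique(class_ids):
        in_class = class_ids == c
        pct_springs = spring_mask[in_class].sum() / n_springs
        pct_area = in_class.sum() / n_pix
        fr[c] = pct_springs / pct_area
    return fr

# groundwater potential index = sum of FR values across all thematic layers
```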

Keywords: groundwater potential mapping, frequency ratio, Shannon’s Entropy, Lesser Himalaya Zone, sustainable groundwater management

Procedia PDF Downloads 70
7143 Fast and Accurate Model to Detect Ictal Waveforms in Electroencephalogram Signals

Authors: Piyush Swami, Bijaya Ketan Panigrahi, Sneh Anand, Manvir Bhatia, Tapan Gandhi

Abstract:

Visual inspection of electroencephalogram (EEG) signals to detect epileptic activity is a very challenging and time-consuming task, even for an expert neurophysiologist. The problem is most acute in under-developed and developing countries due to the shortage of skilled neurophysiologists. In the past, notable research efforts have gone into trying to automate the seizure detection process. However, high false-alarm rates and the complexity of the models developed so far have vastly limited their practical implementation. In this paper, we present a novel scheme for epileptic seizure detection using the empirical mode decomposition technique. The standard deviations of the obtained intrinsic mode functions were used as features, followed by a probability-density-based classifier to discriminate between non-ictal and ictal patterns in EEG signals. The model presented here demonstrated very high classification rates (> 97%) without compromising statistical performance. The computation time for each testing phase was also very low (< 0.029 s), which makes this model ideal for practical applications.
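
A compact sketch of the feature-extraction stage using the PyEMD (EMD-signal) package; the library choice is an assumption, as the paper's own implementation is not specified.

```python
import numpy as np
from PyEMD import EMD   # EMD-signal package (assumed installed)

def ictal_features(eeg_segment):
    """Decompose an EEG segment into intrinsic mode functions and take the
    standard deviation of each IMF as the feature vector, as described."""
    imfs = EMD().emd(np.asarray(eeg_segment, dtype=float))
    return imfs.std(axis=1)   # one feature per IMF

# these features feed a probability-density-based classifier, e.g. kernel
# density estimates of the ictal vs non-ictal feature distributions
```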

Keywords: electroencephalogram (EEG), epilepsy, ictal patterns, empirical mode decomposition

Procedia PDF Downloads 399
7142 Talent-to-Vec: Using Network Graphs to Validate Models with Data Sparsity

Authors: Shaan Khosla, Jon Krohn

Abstract:

In a recruiting context, machine learning models are valuable for recommendations: to predict the best candidates for a vacancy, to match the best vacancies for a candidate, and to compile a set of similar candidates for any given candidate. While it is useful to create these models, validating their accuracy in a recommendation context is difficult due to data sparsity. In this report, we use network graph data to generate useful representations for candidates and vacancies. We use candidates and vacancies as network nodes and designate a bidirectional link between them when the candidate interviewed for the vacancy. After applying node2vec, the embeddings are used to construct a validation dataset with a ranked order, which will help validate new recommender systems.
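
A minimal sketch with the node2vec pip package on a toy interview graph; the package choice and node names are assumptions for illustration.

```python
import networkx as nx
from node2vec import Node2Vec   # pip package "node2vec" (assumed)

G = nx.Graph()
# bidirectional candidate-vacancy links derived from interview events
G.add_edges_from([("cand_1", "vac_9"), ("cand_1", "vac_4"),
                  ("cand_2", "vac_9"), ("cand_3", "vac_4")])

n2v = Node2Vec(G, dimensions=64, walk_length=30, num_walks=200, workers=2)
model = n2v.fit(window=10, min_count=1)   # gensim Word2Vec under the hood

# similar candidates for cand_1, ranked by embedding cosine similarity
print(model.wv.most_similar("cand_1"))
```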

Keywords: AI, machine learning, NLP, recruiting

Procedia PDF Downloads 81
7141 Bridging the Gap between Different Interfaces for Business Process Modeling

Authors: Katalina Grigorova, Kaloyan Mironov

Abstract:

The paper focuses on the benefits of business process modeling. Although this discipline has been developing for many years, there is still a need to create new opportunities to meet ever-increasing user needs. Because one of these needs is the conversion of business process models from one standard to another, the authors have developed a converter between the BPMN and EPC standards using workflow patterns as an intermediate tool. Nowadays there are many systems for business process modeling, and the variety of output formats is almost as large as the variety of systems themselves. This diversity additionally hampers the conversion of models. The presented study is aimed at discussing problems arising from differences in the output formats of various modeling environments.

Keywords: business process modeling, business process modeling standards, workflow patterns, converting models

Procedia PDF Downloads 573
7140 Challenges for IoT Adoption in India: A Study Based on Foresight Analysis for 2025

Authors: Shruti Chopra, Vikas Rao Vadi

Abstract:

In the era of the digital world, the Internet of Things (IoT) has been receiving significant attention. Its ubiquitous connectivity between humans, machine to machine (M2M), and machine to human gives it the potential to transform society and establish an ecosystem that serves new dimensions of the country’s economy. This study therefore attempts to identify the challenges that seem prevalent in IoT adoption in India through a literature survey. Further, data were collected from expert opinions to conduct a foresight analysis, which was analyzed with the help of the scenario planning process: Micmac, Mactor, Multipol, and Smic-Prob. Methodologically, the study identified the relationships between variables through variable analysis using Micmac and actor analysis using Mactor, then attempted to generate the entire field of possibilities in terms of hypotheses and to construct various scenarios through Multipol. Lastly, the findings of the study include the final scenarios, selected using Smic-Prob by assigning a probability to every scenario (including conditional probabilities). This study may help practitioners and policymakers remove the obstacles to successfully implementing IoT in India.

Keywords: Internet of Thing (IoT), foresight analysis, scenario planning, challenges, policymaking

Procedia PDF Downloads 143
7139 The Predictive Significance of Metastasis Associated in Colon Cancer-1 (MACC1) in Primary Breast Cancer

Authors: Jasminka Mujic, Karin Milde-Langosch, Volkmar Mueller, Mirza Suljagic, Tea Becirevic, Jozo Coric, Daria Ler

Abstract:

MACC1 (metastasis associated in colon cancer-1) is a prognostic biomarker for tumor progression, metastasis, and survival in a variety of solid cancers. MACC1 also causes tumor growth in xenograft models and acts as a master regulator of the HGF/MET signaling pathway. In breast cancer, the expression of MACC1, determined by immunohistochemistry, was significantly associated with positive lymph node status and advanced clinical stage. The aim of the present study was to further investigate the prognostic and predictive value of MACC1 expression in breast cancer using western blot analysis and immunohistochemistry. The results of our study show that high MACC1 expression in breast cancer is associated with shorter disease-free survival, especially in node-negative tumors. MACC1 might be a suitable biomarker for selecting patients with a higher probability of recurrence who might benefit from adjuvant chemotherapy. Our results support a biological role for MACC1 and potentially open the perspective of using it as a predictive biomarker for treatment decisions in breast cancer patients.

Keywords: breast cancer, biomarker, HGF/MET, MACC1

Procedia PDF Downloads 224
7138 Hybrid Project Management Model Based on Lean and Agile Approach

Authors: Fatima-Zahra Eddoug, Jamal Benhra, Rajaa Benabbou

Abstract:

Several project management models exist in the literature, and hybrid models are the most widely used because of their multiple advantages. Our objective in this paper is to analyze the existing models based on the Lean and Agile approaches and to propose a novel framework, with convenient tools, that allows efficient management of a general project. To create the desired framework, we drew essentially on seven existing models. Among the agile tools, only Scrum was identified by several authors as appropriate for project management; in contrast, multiple lean tools were proposed for different phases of the project.

Keywords: agility, hybrid project management, lean, scrum

Procedia PDF Downloads 129
7137 The Impact of COVID-19 on Antibiotic Prescribing in Primary Care in England: Evaluation and Risk Prediction of the Appropriateness of Type and Repeat Prescribing

Authors: Xiaomin Zhong, Alexander Pate, Ya-Ting Yang, Ali Fahmi, Darren M. Ashcroft, Ben Goldacre, Brian Mackenna, Amir Mehrkar, Sebastian C. J. Bacon, Jon Massey, Louis Fisher, Peter Inglesby, Kieran Hand, Tjeerd van Staa, Victoria Palin

Abstract:

Background: This study aimed to predict risks of potentially inappropriate antibiotic type and repeat prescribing and assess changes during COVID-19. Methods: With the approval of NHS England, we used the OpenSAFELY platform to access the TPP SystmOne electronic health record (EHR) system and selected patients prescribed antibiotics from 2019 to 2021. Multinomial logistic regression models predicted the patient’s probability of receiving an inappropriate antibiotic type or repeating the antibiotic course for each common infection. Findings: The population included 9.1 million patients with 29.2 million antibiotic prescriptions. 29.1% of prescriptions were identified as repeat prescribing. Those with same-day incident infection coded in the EHR had considerably lower rates of repeat prescribing (18.0%), and 8.6% had a potentially inappropriate type. No major changes in the rates of repeat antibiotic prescribing during COVID-19 were found. In the ten risk prediction models, good levels of calibration and moderate levels of discrimination were found. Important predictors included age, prior antibiotic prescribing, and region. Patients varied in their predicted risks. For sore throat, the range from 2.5 to 97.5th percentile was 2.7 to 23.5% (inappropriate type) and 6.0 to 27.2% (repeat prescription). For otitis externa, these numbers were 25.9 to 63.9% and 8.5 to 37.1%, respectively. Interpretation: Our study found no evidence of changes in the level of inappropriate or repeat antibiotic prescribing after the start of COVID-19. Repeat antibiotic prescribing was frequent and varied according to regional and patient characteristics. There is a need for treatment guidelines to be developed around antibiotic failure and clinicians provided with individualised patient information.
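
Schematically, the per-infection risk models can be reproduced with a multinomial logistic regression. Here scikit-learn stands in for the study's actual modelling stack, and the feature and label names are illustrative assumptions.

```python
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# X_*: age, prior antibiotic prescribing, region, ... (per prescription)
# y_*: outcome in {appropriate, inappropriate_type, repeat}
clf = LogisticRegression(multi_class="multinomial", max_iter=1000)
clf.fit(X_train, y_train)

risks = clf.predict_proba(X_test)                       # per-patient risks
print(roc_auc_score(y_test, risks, multi_class="ovr"))  # discrimination
# calibration would be checked by binning predicted vs observed event rates
```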

Keywords: antibiotics, infection, COVID-19 pandemic, antibiotic stewardship, primary care

Procedia PDF Downloads 111
7136 Statistical Correlation between Ply Mechanical Properties of Composite and Its Effect on Structure Reliability

Authors: S. Zhang, L. Zhang, X. Chen

Abstract:

Due to the large uncertainty in the mechanical properties of FRP (fibre reinforced plastic), the reliability evaluation of FRP structures is currently receiving much attention in industry. However, possible statistical correlation between ply mechanical properties has so far been overlooked, and the properties are mostly assumed to be independent random variables. In this study, the statistical correlation between the ply mechanical properties of uni-directional and plain weave composites is first analyzed by a combination of Monte-Carlo simulation and finite element modeling of the FRP unit cell. Large linear correlation coefficients between the in-plane mechanical properties are observed, and the correlation coefficients depend heavily on the uncertainty of the fibre volume ratio. It is also observed that the correlation coefficients related to Poisson’s ratio are negative while the others are positive. To experimentally obtain the statistical correlation coefficients between the in-plane mechanical properties of FRP, all the in-plane mechanical properties of the same specimen need to be known. The in-plane shear modulus of FRP is experimentally derived by the approach suggested in the ASTM standard D5379M. Tensile tests are conducted using the same specimens used for the shear test, and due to non-uniform tensile deformation a modification factor is derived by finite element modeling. Digital image correlation is adopted to characterize the non-uniform deformation of the specimen. The preliminary experimental results show good agreement with the numerical analysis of the statistical correlation. The failure probability of laminate plates is then calculated for cases with and without the statistical correlation, using the Monte-Carlo and Markov Chain Monte-Carlo methods, respectively. The results highlight the importance of accounting for the statistical correlation between ply mechanical properties in order to achieve an accurate failure probability for laminate plates. Furthermore, it is found that for a multi-layer laminate plate, the statistical correlation between the ply elastic properties significantly affects the laminate reliability, while the effect of statistical correlation between the ply strengths is minimal.
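
The effect of the correlation can be probed with a short correlated Monte-Carlo sketch: sample log-normal ply properties through a Cholesky factor of the log-space covariance, then compare failure rates with and without the off-diagonal terms. The limit-state function g is a user-supplied placeholder.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_properties(mean, cov_log, n=100_000):
    """Correlated log-normal samples of ply properties (E1, E2, G12, ...).
    mean: property means; cov_log: covariance of the log-transformed values."""
    L = np.linalg.cholesky(cov_log)
    z = rng.standard_normal((n, len(mean)))
    return np.exp(np.log(mean) + z @ L.T)

def failure_probability(props, g):
    """Monte-Carlo estimate of P[g(X) < 0] for limit-state function g."""
    return np.mean(g(props) < 0.0)

# independence case: zero the off-diagonal terms of cov_log and re-run,
# then compare the two failure-probability estimates
```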

Keywords: failure probability, FRP, reliability, statistical correlation

Procedia PDF Downloads 154
7135 Multiple Linear Regression for Rapid Estimation of Subsurface Resistivity from Apparent Resistivity Measurements

Authors: Sabiu Bala Muhammad, Rosli Saad

Abstract:

Multiple linear regression (MLR) models for the fast estimation of true subsurface resistivity from apparent resistivity field measurements are developed and assessed in this study. The parameters investigated were apparent resistivity (ρₐ), horizontal location (X) and depth (Z) of measurement as the independent variables, and true resistivity (ρₜ) as the dependent variable. To achieve linearity in both resistivity variables, the datasets were first transformed into the logarithmic domain, following diagnostic checks of the normality of the dependent variable and of heteroscedasticity to ensure accurate models. Four MLR models were developed based on hierarchical combinations of the independent variables. The generated MLR coefficients were applied to another dataset to estimate ρₜ values for validation. Contours of the estimated ρₜ values were plotted and compared to plots of the observed data at the same colour scale and blanking for visual assessment. The accuracy of the models was assessed using the coefficient of determination (R²), the standard error (SE) and the weighted mean absolute percentage error (wMAPE). It is concluded that the MLR models can estimate ρₜ with a high level of accuracy.
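
The log-domain regression itself is one statsmodels call; rho_a, X, Z and rho_t below stand for the field measurements described and are assumed to be already loaded as arrays.

```python
import numpy as np
import statsmodels.api as sm

# log-transform both resistivity variables to achieve linearity, then fit
# log10(rho_t) ~ log10(rho_a) + X + Z (the fullest of the four models)
design = sm.add_constant(np.column_stack([np.log10(rho_a), X, Z]))
ols = sm.OLS(np.log10(rho_t), design).fit()
print(ols.summary())                       # R², standard errors

rho_t_hat = 10 ** ols.predict(design)      # back-transform the predictions
```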

Keywords: apparent resistivity, depth, horizontal location, multiple linear regression, true resistivity

Procedia PDF Downloads 269
7134 Evaluation of Newly Synthesized Steroid Derivatives Using In silico Molecular Descriptors and Chemometric Techniques

Authors: Milica Ž. Karadžić, Lidija R. Jevrić, Sanja Podunavac-Kuzmanović, Strahinja Z. Kovačević, Anamarija I. Mandić, Katarina Penov-Gaši, Andrea R. Nikolić, Aleksandar M. Oklješa

Abstract:

This study considered the selection of in silico molecular descriptors and models for the description of newly synthesized steroid derivatives and their characterization using chemometric techniques. Multiple linear regression (MLR) models were established and gave the best molecular descriptors for quantitative structure-retention relationship (QSRR) modeling of the retention of the investigated molecules. The MLR models were free of multicollinearity among the selected molecular descriptors according to the variance inflation factor (VIF) values. The molecular descriptors used were ranked by the generalized pair correlation method (GPCM); with this method, significant differences between independent variables can be detected even when their correlations with the dependent variable are almost equal. The generated MLR models were statistically validated and cross-validated, and the best models were kept. The models were then ranked using the sum of ranking differences (SRD) method, which identifies the most consistent QSRR model and reveals similarities or dissimilarities between the models. In this study, SRD was performed using the average values of the experimentally observed data as the golden standard. The chemometric analysis was conducted in order to characterize the newly synthesized steroid derivatives for further investigation regarding their potential biological activity and further synthesis. This article is based upon work from COST Action CM1105, supported by COST (European Cooperation in Science and Technology).
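
The VIF screening mentioned is easily reproduced; a statsmodels sketch over a descriptor table whose column contents are hypothetical.

```python
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.tools.tools import add_constant

# X: DataFrame with one column per selected molecular descriptor
Xc = add_constant(X)
vif = pd.Series(
    [variance_inflation_factor(Xc.values, i) for i in range(Xc.shape[1])],
    index=Xc.columns,
)
print(vif.drop("const"))   # VIF below ~10 (often ~5) suggests no multicollinearity
```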

Keywords: generalized pair correlation method, molecular descriptors, regression analysis, steroids, sum of ranking differences

Procedia PDF Downloads 342