Search results for: demand models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9702

9132 Models of Copyright Systems

Authors: A. G. Matveev

Abstract:

The copyright system is a combination of different elements; the number, content, and correlation of these elements differ across legal orders. Models of copyright systems describe the system in terms of the interaction between economic rights and the author's moral rights. The monistic and dualistic models are the most popular. The article deals with different points of view on monism and dualism in copyright systems. A specific model of copyright in Switzerland in the 20th century is analyzed, and the evolution of the French dualistic model of copyright is shown. The author argues that one should speak not of a single form of dualism in copyright systems, but of a number of such forms.

Keywords: copyright, exclusive copyright, economic rights, author's moral rights, rights of personality, monistic model, dualistic model

Procedia PDF Downloads 420
9131 Semantic Textual Similarity on Contracts: Exploring Multiple Negative Ranking Losses for Sentence Transformers

Authors: Yogendra Sisodia

Abstract:

Researchers are becoming more interested in extracting useful information from legal documents thanks to the development of large-scale language models in natural language processing (NLP), and deep learning has accelerated the creation of powerful text mining models. Legal fields like contracts benefit greatly from semantic text search, since it makes it quick and easy to find related clauses. Once sentence embeddings have been collected, it is relatively simple to locate sentences with a comparable meaning throughout the entire legal corpus. The author investigated two pre-trained language models for this task, MiniLM and RoBERTa, and further fine-tuned them on legal contracts, using Multiple Negatives Ranking Loss to create the sentence transformers. The fine-tuned language models and sentence transformers showed promising results.
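
As an illustration of the training setup described above, the following minimal sketch shows how a sentence transformer might be fine-tuned with Multiple Negatives Ranking Loss using the sentence-transformers library; the base checkpoint and the contract clause pairs are hypothetical stand-ins, not taken from the paper.

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses, util

# Hypothetical (anchor, positive) clause pairs from a contracts corpus
train_examples = [
    InputExample(texts=["Either party may terminate this Agreement upon thirty days notice.",
                        "This Agreement may be ended by either party with 30 days written notice."]),
    InputExample(texts=["The Receiving Party shall keep all Confidential Information secret.",
                        "Confidential Information must not be disclosed by the recipient."]),
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # MiniLM-based baseline
loader = DataLoader(train_examples, shuffle=True, batch_size=16)
# In-batch positives of the other pairs serve as negatives for each anchor
loss = losses.MultipleNegativesRankingLoss(model)
model.fit(train_objectives=[(loader, loss)], epochs=1, warmup_steps=10)

# Semantic clause search with the fine-tuned embeddings
corpus = ["Confidential Information must not be disclosed by the recipient."]
hits = util.cos_sim(model.encode("non-disclosure obligations", convert_to_tensor=True),
                    model.encode(corpus, convert_to_tensor=True))
print(hits)
```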

Keywords: legal contracts, multiple negative ranking loss, natural language inference, sentence transformers, semantic textual similarity

Procedia PDF Downloads 108
9130 Pilot Induced Oscillations Adaptive Suppression in Fly-By-Wire Systems

Authors: Herlandson C. Moura, Jorge H. Bidinotto, Eduardo M. Belo

Abstract:

The present work proposes an adaptive control system that enables the suppression of pilot-induced oscillations (PIO) in digital fly-by-wire (DFBW) aircraft. The proposed system consists of a Modified Model Reference Adaptive Control (M-MRAC) scheme integrated with the gain scheduling technique. PIO events are detected using a Real Time Oscillation Verifier (ROVER) algorithm, which enables the system to switch between two reference models: one for the PIO condition, with low proneness to the phenomenon, and another for the normal condition, with high (or medium) proneness. The reference models are defined in closed loop using the Linear Quadratic Regulator (LQR) control methodology for Multiple-Input-Multiple-Output (MIMO) systems. The implemented algorithms are simulated in software, with state-space models and commercial flight simulators as the controlled elements and with pilot dynamics models. A sequence of pitch angles, termed the Synthetic Task (Syntask), is used as the reference signal to be tracked by the pilot models. Initial outcomes show that the proposed system can detect and suppress (or mitigate) PIO in real time before the oscillations reach high amplitudes.
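
The reference models above are defined via LQR; as a hedged sketch, a continuous-time LQR gain for a small state-space model (single-input here for brevity, though the method extends to MIMO) can be computed with SciPy as follows. The matrices are illustrative short-period pitch dynamics, not the aircraft models used in the paper.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    """Solve the continuous algebraic Riccati equation and return the
    state-feedback gain K such that u = -K @ x minimizes the LQR cost."""
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)

# Illustrative short-period pitch dynamics (states: angle of attack, pitch rate)
A = np.array([[-0.7, 1.0],
              [-5.0, -0.9]])
B = np.array([[0.0],
              [-10.0]])
Q = np.diag([10.0, 1.0])  # weight the tracking state more heavily than the rate
R = np.array([[1.0]])

K = lqr_gain(A, B, Q, R)
print("LQR gain K =", K)
```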

Keywords: adaptive control, digital Fly-By-Wire, oscillations suppression, PIO

Procedia PDF Downloads 134
9129 The Use of AI to Measure Gross National Happiness

Authors: Riona Dighe

Abstract:

This research identifies an alternative approach to the measurement of Gross National Happiness (GNH) using artificial intelligence (AI), incorporating natural language processing (NLP) and sentiment analysis. We use 'off the shelf' NLP models for sentence-level sentiment analysis as the building block of this research. We constructed an algorithm that uses these NLP models to derive sentiment scores for sentences, and tested it against a sample of 20 respondents; the scores generated resembled human responses. Using an MLP classifier, a decision tree, a linear model, and k-nearest neighbors, we obtained test accuracies of 89.97%, 54.63%, 52.13%, and 47.9%, respectively. This gave us the confidence to apply the NLP models to sentences from websites to measure the GNH of a country.
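
The classifier comparison reported above can be reproduced in outline with scikit-learn; the sketch below uses synthetic features as a stand-in for the sentiment-derived features, so the accuracies it prints are illustrative only.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for sentiment-analysis features and happiness labels
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "MLP classifier": MLPClassifier(max_iter=2000, random_state=0),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "linear model": LogisticRegression(max_iter=2000),
    "k-nearest neighbors": KNeighborsClassifier(),
}
for name, clf in models.items():
    clf.fit(X_train, y_train)
    print(f"{name}: {accuracy_score(y_test, clf.predict(X_test)):.3f}")
```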

Keywords: artificial intelligence, NLP, sentiment analysis, gross national happiness

Procedia PDF Downloads 119
9128 Deep Learning for Renewable Power Forecasting: An Approach Using LSTM Neural Networks

Authors: Fazıl Gökgöz, Fahrettin Filiz

Abstract:

Load forecasting has become crucial in recent years and is now a popular research topic; many different power forecasting models have been tried for this purpose. Electricity load forecasting is necessary for energy policies and for healthy, reliable grid systems. Effective forecasting of renewable energy load helps decision makers minimize the costs of electric utilities and power plants, and forecasting tools are required to predict how much renewable energy can be utilized. The purpose of this study is to explore the effectiveness of LSTM-based neural networks for estimating renewable energy loads. We present models for predicting renewable energy loads based on deep neural networks, especially the Long Short-Term Memory (LSTM) algorithm. Deep learning allows multiple layers of models to learn representations of data, and LSTM networks are able to store information for long periods of time. Deep learning models have recently been used to forecast renewable energy sources, such as wind and solar power. Historical load and weather information are the most important input variables for power forecasting models. The dataset contains power consumption measurements gathered between January 2016 and December 2017 at one-hour resolution, using publicly available data from the Turkish Renewable Energy Resources Support Mechanism. Forecasting studies were carried out with these data for the Turkish electricity market using a deep neural network approach that includes the LSTM technique. 432 different models were created by varying the number of layers, the cell counts, and the dropout rates. The adaptive moment estimation (ADAM) algorithm was used for training as a gradient-based optimizer instead of stochastic gradient descent (SGD); ADAM performed better than SGD in terms of faster convergence and lower error rates. Model performance is compared according to MAE (mean absolute error) and MSE (mean squared error). The best MAE results among the 432 tested models are 0.66, 0.74, 0.85, and 1.09. The forecasting performance of the proposed LSTM models compares favorably with results reported in the literature.
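
A minimal sketch of one such LSTM forecaster, assuming hourly load data and Keras; the window length, layer size, and dropout rate below stand in for the hyperparameters varied across the study's 432 models, and the series is synthetic.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, Dense

LOOKBACK = 24  # assumed window: one day of hourly history per sample

def make_windows(series, lookback=LOOKBACK):
    """Slice a 1D hourly series into (samples, timesteps, 1) windows."""
    X, y = [], []
    for i in range(len(series) - lookback):
        X.append(series[i:i + lookback])
        y.append(series[i + lookback])
    return np.asarray(X)[..., np.newaxis], np.asarray(y)

# Synthetic stand-in for the hourly renewable load series
series = np.sin(np.linspace(0, 120, 2000)) + 0.1 * np.random.randn(2000)
X, y = make_windows(series)

model = Sequential([
    LSTM(64, input_shape=(LOOKBACK, 1)),
    Dropout(0.2),  # dropout was one of the varied hyperparameters
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")  # ADAM optimizer, as in the study
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print("in-sample MSE:", model.evaluate(X, y, verbose=0))
```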

Keywords: deep learning, long short-term memory, energy, renewable energy load forecasting

Procedia PDF Downloads 266
9127 Treatment of Poultry Slaughterhouse Wastewater by Mesophilic Static Granular Bed Reactor (SGBR) Coupled with UF Membrane

Authors: Moses Basitere, Marshal Sherene Sheldon, Seteno Karabo Obed Ntwampe, Debbie Dejager

Abstract:

In South Africa, poultry slaughterhouses consume large amounts of freshwater and discharge high-strength wastewater, which can be treated successfully at low cost using anaerobic digesters. In this study, the performance of a bench-scale mesophilic Static Granular Bed Reactor (SGBR) containing fully anaerobic granules, coupled with an ultrafiltration (UF) membrane as post-treatment, was investigated for poultry slaughterhouse wastewater. The wastewater was characterized by a chemical oxygen demand (COD) between 2000 and 6000 mg/l, an average biological oxygen demand (BOD) of 2375 mg/l, and average fats, oil, and grease (FOG) of 554 mg/l. The continuous SGBR anaerobic reactor was operated for 6 weeks at different hydraulic retention times (HRT) and organic loading rates. The results showed average COD removal greater than 90% for both the SGBR anaerobic digester and the ultrafiltration membrane, while removal of total suspended solids and FOG was greater than 95%. The SGBR reactor coupled with the UF membrane showed great potential for treating poultry slaughterhouse wastewater.

Keywords: chemical oxygen demand, poultry slaughterhouse wastewater, static granular bed reactor, ultrafiltration, wastewater

Procedia PDF Downloads 387
9126 Predicting Suspended Sediment Concentration Using Artificial Neural Network Techniques: Case Study of Oued El Abiod Watershed, Algeria

Authors: Adel Bougamouza, Boualam Remini, Abd El Hadi Ammari, Feteh Sakhraoui

Abstract:

The assessment of the sediment load carried by a river is important for the planning and design of various water resources projects. In this study, artificial neural network techniques are used to estimate the daily suspended sediment concentration from the corresponding daily discharge upstream of the Foum El Gherza dam, Biskra, Algeria. FFNN, GRNN, and RBNN models are established for estimating current suspended sediment values. Statistics including the RMSE and R² were used to evaluate the performance of the applied models. The comparison of the three AI models showed that the RBNN model performed better than the FFNN and GRNN models, with R² = 0.967 and RMSE = 5.313 mg/l. The ANN approach was therefore capable of capturing the nonlinear relationship between discharge and suspended sediment with reasonable precision.
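
As a sketch of the feedforward (FFNN) variant with the evaluation statistics used above, the following fits scikit-learn's MLPRegressor to synthetic discharge-sediment pairs and reports RMSE and R²; the data and network size are illustrative assumptions, not the study's.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error, r2_score

# Hypothetical daily discharge (m3/s) vs. suspended sediment (mg/l) pairs
rng = np.random.default_rng(0)
discharge = rng.uniform(1, 50, 300)
sediment = 4.0 * discharge ** 1.2 + rng.normal(0, 10, 300)

X = discharge.reshape(-1, 1)
ffnn = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                    random_state=0).fit(X, sediment)

pred = ffnn.predict(X)
rmse = np.sqrt(mean_squared_error(sediment, pred))
print(f"RMSE = {rmse:.3f} mg/l, R2 = {r2_score(sediment, pred):.3f}")
```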

Keywords: artificial neural network, Oued Abiod watershed, feedforward network, generalized regression network, radial basis network, sediment concentration

Procedia PDF Downloads 418
9125 Kinetic Façade Design Using 3D Scanning to Convert Physical Models into Digital Models

Authors: Do-Jin Jang, Sung-Ah Kim

Abstract:

In designing a kinetic façade, it is hard for the designer to make digital models because of the façade's complex geometry and motion. This paper presents a methodology for converting the point cloud of a physical model into a single digital model with a defined topology and motion. The method uses a Microsoft Kinect sensor with color markers and was applied to three paper-folding-inspired designs. Although the resulting digital model cannot represent the whole folding range of the physical model, the method supports the designer in conducting a performance-oriented design process with a rough physical model within the reduced folding range.

Keywords: design media, kinetic facades, tangible user interface, 3D scanning

Procedia PDF Downloads 413
9124 Animal Models of Surgical or Other External Causes of Traumatic Wound Infection

Authors: Ojoniyi Oluwafeyekikunmi Okiki

Abstract:

Notwithstanding advances in traumatic wound care and management, infections remain a major cause of mortality, morbidity, and economic disruption in tens of millions of wound patients around the world. Animal models have become popular tools for studying a wide variety of external traumatic wound infections and for testing new antimicrobial strategies. This review covers experimental infections in animal models of surgical wounds, skin abrasions, burns, lacerations, excisional wounds, and open fractures. Animal models of external traumatic wound infections reported by different investigators vary in the animal species used, the microorganism strains, the number of microorganisms applied, the size of the wounds, and, for burn infections, the length of time the heated object or liquid is in contact with the skin. As antibiotic resistance continues to grow, new antimicrobial approaches are urgently needed; these should be tested using standard protocols for infections in external traumatic wounds in animal models.

Keywords: surgical wounds, animals, wound infections, burns, wound models, colony-forming units, lacerated wounds

Procedia PDF Downloads 8
9123 A Framework for Auditing Multilevel Models Using Explainability Methods

Authors: Debarati Bhaumik, Diptish Dey

Abstract:

Multilevel models, increasingly deployed in industries such as insurance, food production, and entertainment within functions such as marketing and supply chain management, need to be transparent and ethical. Applications usually result in binary classification within groups or hierarchies based on a set of input features. Using open-source datasets, we demonstrate that popular explainability methods, such as SHAP and LIME, consistently underperform in accuracy when interpreting these models: they fail to predict the order of feature importance, the magnitudes, and occasionally even the direction of the feature contribution (negative versus positive contribution to the outcome). Besides accuracy, the computational intractability of SHAP for binomial classification is a cause for concern. For transparent and ethical applications of these hierarchical statistical models, sound audit frameworks need to be developed. In this paper, we propose an audit framework for the technical assessment of multilevel regression models focusing on three aspects: (i) model assumptions and statistical properties, (ii) model transparency using different explainability methods, and (iii) discrimination assessment. To this end, we take a quantitative approach and compare intrinsic model methods with SHAP and LIME. The framework comprises a shortlist of KPIs for each of these three aspects, such as PoCE (Percentage of Correct Explanations) and MDG (Mean Discriminatory Gap) per feature, and a traffic-light risk assessment method is coupled to these KPIs. The audit framework will assist regulatory bodies in performing conformity assessments of AI systems that use multilevel binomial classification models. It will also help businesses deploying multilevel models to be future-proof and aligned with the European Commission's proposed Regulation on Artificial Intelligence.
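
A minimal sketch of the kind of intrinsic-versus-SHAP comparison underlying a PoCE-style check, using a plain logistic regression as a stand-in for a single group of a multilevel model; the data are synthetic and the check is illustrative, not the paper's full framework.

```python
import numpy as np
import shap
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic binary classification standing in for one group/hierarchy level
X, y = make_classification(n_samples=300, n_features=5, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Intrinsic feature ranking from the model's own coefficients
intrinsic_order = np.argsort(-np.abs(clf.coef_[0]))

# Ranking implied by mean absolute SHAP values
explainer = shap.LinearExplainer(clf, X)
sv = explainer.shap_values(X)
shap_order = np.argsort(-np.abs(sv).mean(axis=0))

# Do the two rankings agree? (the paper finds they often do not)
print("intrinsic:", intrinsic_order, "SHAP:", shap_order)
```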

Keywords: audit, multilevel model, model transparency, model explainability, discrimination, ethics

Procedia PDF Downloads 95
9122 Probabilistic Models to Evaluate Seismic Liquefaction in Gravelly Soil Using Dynamic Penetration Test and Shear Wave Velocity

Authors: Nima Pirhadi, Shao Yong Bo, Xusheng Wan, Jianguo Lu, Jilei Hu

Abstract:

Gravels and gravelly soils are often assumed to be non-liquefiable because of their high conductivity and small modulus; however, the occurrence of liquefaction in some historical earthquakes, notably 2008 Wenchuan (Mw 7.9), 2014 Cephalonia, Greece (Mw 6.1), and 2016 Kaikoura, New Zealand (Mw 7.8), has prompted serious attention to risk assessment and hazard analysis of seismic gravelly soil liquefaction. Because of the limitations on sampling and laboratory testing of this type of soil, in situ tests and site exploration of case histories are the most accepted procedures. Of all in situ tests, the dynamic penetration test (DPT), well known as the Chinese dynamic penetration test, and the shear wave velocity (Vs) test have demonstrated high performance in evaluating seismic gravelly soil liquefaction. However, the small number of available case histories is an essential limitation for developing new models. This study first investigates recent earthquakes that caused liquefaction in gravelly soils to collect new data; it then adds these data to the dataset available in the literature and finally develops new models to assess seismic gravelly soil liquefaction. To validate the presented models, their results are compared with those of other available models. The results show reasonable performance of the proposed models and a critical effect of gravel content (GC, %) on the assessment.

Keywords: liquefaction, gravel, dynamic penetration test, shear wave velocity

Procedia PDF Downloads 201
9121 Predictive Models for Compressive Strength of High Performance Fly Ash Cement Concrete for Pavements

Authors: S. M. Gupta, Vanita Aggarwal, Som Nath Sachdeva

Abstract:

This paper reports experimental work on High Performance Concrete (HPC) with superplasticizer, with the aim of developing models suitable for predicting the compressive strength of HPC mixes. The effect of varying proportions of fly ash (0% to 50%, in 10% increments) on the compressive strength of high performance concrete was evaluated. The mix designs studied were M30, M40, and M50, to compare the effect of fly ash addition on the properties of these concrete mixes. In all, eighteen concrete mixes were designed: three conventional concretes for the three grades under discussion, and fifteen HPC mixes with varying percentages of fly ash. The concrete mix designs followed the Indian standard recommended guidelines, IS: 10262. All concrete mixes were studied in terms of compressive strength at 7 days, 28 days, 90 days, and 365 days. All materials were kept the same throughout the study to allow a direct comparison of results. The models for compressive strength prediction were developed using the Linear Regression method (LR), Artificial Neural Networks (ANN), and Leave One Out Validation (LOOV).
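
A sketch of the LR-plus-LOOV part of the modeling, assuming fly ash percentage and curing age as predictors; the mix data below are fabricated placeholders for the paper's measured strengths.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import r2_score

# Hypothetical design matrix: fly ash replacement (%) and curing age (days)
X = np.array([[fa, age] for fa in range(0, 60, 10) for age in (7, 28, 90, 365)])
rng = np.random.default_rng(0)
strength = 30 + 0.05 * X[:, 1] - 0.1 * X[:, 0] + rng.normal(0, 1.5, len(X))

model = LinearRegression()
# Leave-one-out validation: each mix is predicted from all the others
pred = cross_val_predict(model, X, strength, cv=LeaveOneOut())
print("LOOV R2:", round(r2_score(strength, pred), 3))
```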

Keywords: high performance concrete, fly ash, concrete mixes, compressive strength, strength prediction models, linear regression, ANN

Procedia PDF Downloads 445
9120 Evaluating the Suitability and Performance of Dynamic Modulus Predictive Models for North Dakota’s Asphalt Mixtures

Authors: Duncan Oteki, Andebut Yeneneh, Daba Gedafa, Nabil Suleiman

Abstract:

Most agencies lack the equipment required to measure the dynamic modulus (|E*|) of asphalt mixtures, necessitating the use of predictive models. This study compared measured |E*| values for nine North Dakota asphalt mixes with predictions from the original Witczak, modified Witczak, and Hirsch models. The influence of temperature on the |E*| models was investigated, and Pavement ME simulations were conducted using both measured |E*| values and predictions from the most accurate |E*| model. The results revealed that the original Witczak model yielded the lowest Se/Sy and highest R² values, indicating the lowest bias and highest accuracy, while the Hirsch model exhibited the poorest overall performance. Using predicted |E*| as inputs in Pavement ME generated conservative distress predictions compared to using measured |E*|. The original Witczak model was recommended for predicting |E*| for low-reliability pavements in North Dakota.

Keywords: asphalt mixture, binder, dynamic modulus, MEPDG, pavement ME, performance, prediction

Procedia PDF Downloads 48
9119 Domain-Specific Ontology-Based Knowledge Extraction Using R-GNN and Large Language Models

Authors: Andrey Khalov

Abstract:

The rapid proliferation of unstructured data in IT infrastructure management demands innovative approaches for extracting actionable knowledge. This paper presents a framework for ontology-based knowledge extraction that combines relational graph neural networks (R-GNN) with large language models (LLMs). The proposed method leverages the DOLCE framework as the foundational ontology, extending it with concepts from ITSMO for domain-specific applications in IT service management and outsourcing. A key component of this research is the use of transformer-based models, such as DeBERTa-v3-large, for automatic entity and relationship extraction from unstructured texts. Furthermore, the paper explores how transfer learning techniques can be applied to fine-tune large language models (LLaMA) to generate synthetic datasets that improve precision in BERT-based entity recognition and ontology alignment. The resulting IT Ontology (ITO) serves as a comprehensive knowledge base that integrates domain-specific insights from ITIL processes, enabling more efficient decision-making. Experimental results demonstrate significant improvements in knowledge extraction and relationship mapping, offering a cutting-edge solution for enhancing cognitive computing in IT service environments.
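
As a sketch of the entity extraction step, the Hugging Face transformers pipeline can run token classification out of the box; a generic public NER checkpoint is used below as a stand-in, since the paper's fine-tuned DeBERTa-v3-large model is not assumed to be available.

```python
from transformers import pipeline

# Generic public NER checkpoint as a stand-in for the fine-tuned model
ner = pipeline("token-classification",
               model="dslim/bert-base-NER",
               aggregation_strategy="simple")

text = ("The incident management team escalated the outage ticket "
        "to Microsoft support engineers in Redmond.")
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```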

Keywords: ontology mapping, R-GNN, knowledge extraction, large language models, NER, knowledge graph

Procedia PDF Downloads 16
9118 Exploring the Impacts of Field of View on 3D Game Experiences and Task Performances

Authors: Jiunde Lee, Meng-Yu Wun

Abstract:

The present study explored how differences in the range of the geometric field of view (GFOV) and in camera control in a 3D simulation game, OMSI: The Bus Simulator (2013 PC version), affected players' cognitive load, anxiety, and task performance. The study employed a between-subjects factorial experimental design; a total of 80 subjects completed the experiment and yielded data eligible for further analysis. The results showed that, with respect to field of view, players performed better with a spacious view. Although the spacious view consumed more of the players' cognitive resources in terms of 'mental demand', 'physical demand', and 'temporal demand', players performed better in the experiment and their anxiety was effectively reduced. With the narrow GFOV, by contrast, players reported spending more cognitive resources on 'effort' and 'frustration' and performed worse, without a significant reduction in anxiety. In terms of camera control, players performed worse with a fixed lens, since it restricted their dexterous control, but there was no significant difference in the players' subjective cognitive resources or anxiety. The results further illustrate that task performance was affected by the interaction of GFOV and camera control.

Keywords: geometric field of view, camera lens, cognitive load, anxiety

Procedia PDF Downloads 149
9117 Circular Economy Maturity Models: A Systematic Literature Review

Authors: Dennis Kreutzer, Sarah Müller-Abdelrazeq, Ingrid Isenhardt

Abstract:

Resource scarcity, energy transition and the planned climate neutrality pose enormous challenges for manufacturing companies. In order to achieve these goals and a holistic sustainable development, the European Union has listed the circular economy as part of the Circular Economy Action Plan. In addition to a reduction in resource consumption, reduced emissions of greenhouse gases and a reduced volume of waste, the principles of the circular economy also offer enormous economic potential for companies, such as the generation of new circular business models. However, many manufacturing companies, especially small and medium-sized enterprises, do not have the necessary capacity to plan their transformation. They need support and strategies on the path to circular transformation, because this change affects not only production but also the entire company. Maturity models offer an approach, as they enable companies to determine the current status of their transformation processes. In addition, companies can use the models to identify transformation strategies and thus promote the transformation process. While maturity models are established in other areas, e.g. IT or project management, only a few circular economy maturity models can be found in the scientific literature. The aim of this paper is to analyse the identified maturity models of the circular economy through a systematic literature review (SLR) and, among other aspects, to assess their completeness and quality. Since the terms "maturity model" and "readiness model" are often used to assess the transformation process, this paper considers both types of models to provide a more comprehensive result. For this purpose, circular economy maturity models at the company (micro) level were identified from the literature, compared, and analysed with regard to their theoretical and methodological structure. A specific focus was placed, on the one hand, on the analysis of the business units considered in the respective models and, on the other hand, on the underlying metrics and indicators used to determine the maturity level of the entire company. The results of the literature review show, for instance, significant differences in the holism of the models' assessment frameworks. Only a few models include the entire company with supporting areas outside the value-creating core process, e.g. strategy and vision. Additionally, there are large differences in the number and type of indicators as well as their metrics; for example, most models rely on subjective indicators and use very few objective indicators in their surveys. It was also found that well-founded thresholds between the levels are rare. Based on these results, concrete ideas and proposals for a research agenda in the field of circular economy maturity models are made.

Keywords: maturity model, circular economy, transformation, metric, assessment

Procedia PDF Downloads 114
9116 Accuracy of Peak Demand Estimates for Office Buildings Using Quick Energy Simulation Tool

Authors: Mahdiyeh Zafaranchi, Ethan S. Cantor, William T. Riddell, Jess W. Everett

Abstract:

The New Jersey Department of Military and Veteran's Affairs (NJ DMAVA) operates over 50 facilities throughout the state of New Jersey, U.S. NJ DMAVA is under a mandate to move toward decarbonization, which will eventually include eliminating the use of natural gas and other fossil fuels for heating. At the same time, the organization requires increased resiliency against electric grid disruption. These competing goals necessitate adopting on-site renewables such as photovoltaic and geothermal power, as well as implementing power control strategies through microgrids. Planning for these changes requires a detailed understanding of current and future electricity use on yearly, monthly, and shorter time scales, as well as a breakdown of consumption by heating, ventilation, and air conditioning (HVAC) equipment. This paper discusses case studies of two buildings that were simulated using the QUick Energy Simulation Tool (eQUEST). Both buildings use electricity from the grid and photovoltaics; one building also uses natural gas. Although electricity use data are available in hourly intervals and natural gas data in monthly intervals, the simulations were developed using monthly and yearly totals. This approach was chosen to reflect the information available for most NJ DMAVA facilities. Once completed, simulation results were compared to metrics recommended by several organizations for validating energy use simulations. In addition to yearly and monthly totals, the simulated peak demands were compared to measured monthly peak demand values; the simulations produced monthly peak demand values within 30% of the measured values. These benchmarks will help to assess future energy planning efforts for NJ DMAVA.
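
The abstract does not name the validation metrics; two commonly recommended calibration metrics for building energy models are NMBE and CV(RMSE) (e.g., in ASHRAE Guideline 14), sketched below in one common formulation on hypothetical monthly peak demand values.

```python
import numpy as np

def calibration_metrics(measured, simulated):
    """NMBE and CV(RMSE) in percent, a common formulation of the
    calibration metrics used to validate building energy models."""
    m, s = np.asarray(measured, float), np.asarray(simulated, float)
    nmbe = (m - s).sum() / (len(m) * m.mean()) * 100
    cvrmse = np.sqrt(((m - s) ** 2).mean()) / m.mean() * 100
    return nmbe, cvrmse

# Hypothetical monthly peak demand (kW): measured vs. simulated
measured = [112, 108, 120, 131, 150, 172, 185, 181, 160, 138, 121, 115]
simulated = [105, 112, 118, 125, 158, 180, 176, 190, 152, 130, 125, 110]
nmbe, cvrmse = calibration_metrics(measured, simulated)
print(f"NMBE = {nmbe:+.1f}%, CV(RMSE) = {cvrmse:.1f}%")
```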

Keywords: building energy modeling, eQUEST, peak demand, smart meters

Procedia PDF Downloads 68
9115 PM10 Prediction and Forecasting Using CART: A Case Study for Pleven, Bulgaria

Authors: Snezhana G. Gocheva-Ilieva, Maya P. Stoimenova

Abstract:

Ambient air pollution with fine particulate matter (PM10) is a persistent, systematic problem in many countries around the world. The accumulation of a large number of measurements of both PM10 concentrations and the accompanying atmospheric factors allows for statistical modeling to detect dependencies and forecast future pollution. This study applies the classification and regression trees (CART) method to building and analyzing PM10 models. In the empirical study, average daily air data for the city of Pleven, Bulgaria, over a period of 5 years are used. Predictors in the models are seven meteorological variables and time variables, as well as lagged PM10 variables and some lagged meteorological variables, delayed by one or two days with respect to the initial time series. The degree of influence of the predictors in the models is determined. The selected best CART models are used to forecast PM10 concentrations two days beyond the last date in the modeling procedure and show very accurate results.
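
A sketch of the lagged-predictor CART setup with scikit-learn, using a synthetic daily series in place of the Pleven measurements; the lag construction mirrors the one- and two-day delays described above.

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import cross_val_score

# Synthetic daily stand-ins for PM10 and one meteorological variable
rng = np.random.default_rng(1)
n = 1500
temp = 10 + 8 * np.sin(np.arange(n) * 2 * np.pi / 365) + rng.normal(0, 2, n)
pm10 = 40 - 1.5 * temp + rng.normal(0, 5, n)

df = pd.DataFrame({"pm10": pm10, "temp": temp})
for lag in (1, 2):  # lagged predictors delayed by one and two days
    df[f"pm10_lag{lag}"] = df["pm10"].shift(lag)
    df[f"temp_lag{lag}"] = df["temp"].shift(lag)
df = df.dropna()

X, y = df.drop(columns="pm10"), df["pm10"]
tree = DecisionTreeRegressor(max_depth=6, random_state=0)
print("CV R2:", cross_val_score(tree, X, y, cv=5, scoring="r2").mean().round(3))
```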

Keywords: cross-validation, decision tree, lagged variables, short-term forecasting

Procedia PDF Downloads 194
9114 Improving Sales through Inventory Reduction: A Retail Chain Case Study

Authors: M. G. Mattos, J. E. Pécora Jr, T. A. Briso

Abstract:

Today's challenging business environment, with unpredictable demand and volatility, requires a supply chain strategy that handles uncertainty and risk in the right way. Even though inventory models have been explored previously, this paper applies these concepts to a practical situation. The study addresses the inventory replenishment problem, applying techniques based mainly on mathematical assumptions and modeling. The primary goal is to improve the retailer's supply chain processes by taking store differences into account when setting the various target stock levels. Through an inventory review policy, picking-piece implementation, and minimum exposure definition, we were able not only to reduce inventory but also to improve sales results. The inventory management theory from the literature review was then tested in a single case study of a particular department in one of the largest Latin American retail chains.
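
One textbook building block behind such target stock levels is the reorder point with safety stock under normally distributed demand; the sketch below uses hypothetical store-level parameters, not the retailer's actual data.

```python
from math import sqrt
from scipy.stats import norm

def reorder_point(mean_daily_demand, std_daily_demand,
                  lead_time_days, service_level):
    """Reorder point = lead-time demand + safety stock, assuming
    i.i.d. normal daily demand and a fixed replenishment lead time."""
    z = norm.ppf(service_level)
    safety_stock = z * std_daily_demand * sqrt(lead_time_days)
    return mean_daily_demand * lead_time_days + safety_stock, safety_stock

# Hypothetical store-level parameters
rop, ss = reorder_point(mean_daily_demand=20, std_daily_demand=6,
                        lead_time_days=4, service_level=0.95)
print(f"reorder point: {rop:.0f} units (safety stock: {ss:.0f})")
```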

Keywords: inventory, distribution, retail, risk, safety stock, sales, uncertainty

Procedia PDF Downloads 268
9113 JaCoText: A Pretrained Model for Java Code-Text Generation

Authors: Jessica Lopez Espejel, Mahaman Sanoussi Yahaya Alassan, Walid Dahhane, El Hassane Ettifouri

Abstract:

Pretrained transformer-based models have shown high performance in natural language generation tasks. However, a new wave of interest has surged: automatic programming language code generation, the task of translating natural language instructions into source code. Although well-known pre-trained models for language generation have achieved good performance in learning programming languages, effort is still needed in automatic code generation. In this paper, we introduce JaCoText, a model based on the Transformer neural network that aims to generate Java source code from natural language text. JaCoText leverages the advantages of both natural language and code generation models. More specifically, we study findings from the state of the art and use them to (1) initialize our model from powerful pre-trained models, (2) explore additional pretraining on our Java dataset, (3) conduct experiments combining unimodal and bimodal data in training, and (4) scale the input and output length during fine-tuning of the model. Experiments conducted on the CONCODE dataset show that JaCoText achieves new state-of-the-art results.
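
The JaCoText checkpoint itself is not assumed to be publicly available here; as a sketch of the same Transformer encoder-decoder generation interface, the following uses the public CodeT5 checkpoint, with no claim about the quality of its untuned output.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Public CodeT5 checkpoint as a stand-in for a Java text-to-code model
name = "Salesforce/codet5-base"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSeq2SeqLM.from_pretrained(name)

prompt = "generate java: return the maximum of two integers a and b"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_length=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```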

Keywords: java code generation, natural language processing, sequence-to-sequence models, transformer neural networks

Procedia PDF Downloads 285
9112 Vulnerability of Steel Moment-Frame Buildings with Pinned and, Alternatively, with Semi-Rigid Connections

Authors: Daniel Llanes, Alfredo Reyes, Sonia E. Ruiz, Federico Valenzuela Beltran

Abstract:

Steel frames have been used in building construction for more than one hundred years. Beams may be connected to columns using either stiffened or unstiffened angles at the top and bottom beam flanges. Designers often assume that these assemblies act as "pinned" connections for gravity loads and that the stiffened connections act as "fixed" connections for lateral loads. Observation of damage sustained by buildings during the 1994 Northridge earthquake indicated that, contrary to the intended behavior, in many cases brittle fractures initiated within the connections at very low levels of plastic demand, in some cases while the structures remained essentially elastic. Because of the damage observed in these buildings, alternative types of connections have been proposed. According to research funded by the Federal Emergency Management Agency (FEMA), screwed connections perform better when subjected to cyclic loads, but at the same time these connections have some degree of flexibility. This has led some researchers to study semi-rigid connections. In the present study, three steel buildings composed of regular frames are analyzed, considering two types of connections: pinned and semi-rigid. To estimate their structural capacity, a number of incremental dynamic analyses are performed on 3D structural models. The seismic ground motions were recorded at sites near Los Angeles, California, where the structures are assumed to be located. The vulnerability curves of the buildings are obtained in terms of maximum inter-story drifts. The vulnerability curves corresponding to the models with the two different types of connections are compared, and the implications for structural design and performance are discussed.

Keywords: steel frame buildings, vulnerability curves, semi-rigid connections, pinned connections

Procedia PDF Downloads 225
9111 Development of a Turbulent Boundary Layer Wall-Pressure Fluctuations Power Spectrum Model Using a Stepwise Regression Algorithm

Authors: Zachary Huffman, Joana Rocha

Abstract:

Wall-pressure fluctuations induced by the turbulent boundary layer (TBL) developed over aircraft are a significant source of aircraft cabin noise. Since the power spectral density (PSD) of these pressure fluctuations is directly correlated with the amount of sound radiated into the cabin, the development of accurate empirical models that predict the PSD has been an important ongoing research topic. The sound emitted can be represented by the pressure fluctuation term in the Reynolds-averaged Navier-Stokes (RANS) equations. Therefore, early TBL empirical models (including those of Lowson, Robertson, Chase, and Howe) were primarily derived by simplifying and solving the RANS equations for the pressure fluctuation and adding appropriate scales. Most subsequent models (including the Goody, Efimtsov, Laganelli, Smol'yakov, and Rackl and Weston models) were derived by modifying these early models or from physical principles. Overall, these models have had varying levels of accuracy; in general, they are most accurate at the specific Reynolds and Mach numbers for which they were developed and less accurate under other flow conditions. Despite this, recent research into alternative methods for deriving the models has been rather limited. Recent studies have demonstrated that an artificial neural network model was more accurate than traditional models and could be applied more generally, but the accuracy of other machine learning techniques has not been explored. In the current study, an original model is derived using a stepwise regression algorithm in the statistical programming language R and TBL wall-pressure fluctuation PSD data gathered at the Carleton University wind tunnel. The theoretical advantage of a stepwise regression approach is that it automatically filters out redundant or uncorrelated input variables (through feature selection) and is computationally faster than machine learning; the main disadvantage is the potential risk of overfitting. The accuracy of the developed model is assessed by comparing it to independently sourced datasets.
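
The paper implements stepwise regression in R; a comparable forward-selection loop, re-expressed in Python with statsmodels for consistency with the other sketches in this listing, looks as follows on synthetic stand-ins for the TBL flow variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(X, y, p_enter=0.05):
    """Forward feature selection: repeatedly add the candidate whose
    OLS coefficient is most significant, until none passes p_enter."""
    selected, remaining = [], list(X.columns)
    while remaining:
        pvals = {}
        for cand in remaining:
            fit = sm.OLS(y, sm.add_constant(X[selected + [cand]])).fit()
            pvals[cand] = fit.pvalues[cand]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= p_enter:
            break
        selected.append(best)
        remaining.remove(best)
    return selected

# Synthetic stand-ins for flow parameters and a PSD-level response
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(200, 4)),
                 columns=["Ue", "delta", "tau_w", "freq"])
y = 3.0 * X["tau_w"] - 1.5 * X["freq"] + rng.normal(0, 0.5, 200)
print(forward_stepwise(X, y))  # expected to pick tau_w and freq
```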

Keywords: aircraft noise, machine learning, power spectral density models, regression models, turbulent boundary layer wall-pressure fluctuations

Procedia PDF Downloads 135
9110 Human Resource Utilization Models for Graceful Ageing

Authors: Chuang-Chun Chiou

Abstract:

In this study, a systematic framework of graceful ageing is used to explore possible human resource utilization models for graceful ageing. This framework is based on Chinese culture and is called the 'Nine-old' target: ageing gracefully with feeding, accomplishment, usefulness, learning, entertainment, care, protection, dignity, and termination. This study focuses on two of these areas: accomplishment and usefulness. We examine current initiatives and laws promoting labor participation, focusing on how to increase the labor force participation rate of the middle-aged and the elderly, and on how to help the elderly achieve graceful ageing. We then present possible models that support graceful ageing.

Keywords: human resource utilization model, labor participation, graceful ageing, employment

Procedia PDF Downloads 390
9109 Development of a Computational Approach for Calculation of Hydrogen Solubility in Hydrocarbons for Treatment of Petroleum

Authors: Abdulrahman Sumayli, Saad M. AlShahrani

Abstract:

For the hydrogenation process, knowing the solubility of hydrogen (H2) in hydrocarbons is critical to improving the efficiency of the process. We investigated the computation of H2 solubility in four heavy crude oil feedstocks using machine learning techniques, with temperature, pressure, and feedstock type as the inputs to the models and hydrogen solubility as the sole response. Specifically, we employed three different models: Support Vector Regression (SVR), Gaussian Process Regression (GPR), and Bayesian Ridge Regression (BRR). To achieve the best performance, the hyperparameters of these models were optimized using the whale optimization algorithm (WOA). We evaluated the models using a dataset of solubility measurements in various feedstocks and compared their performance on several metrics. Our results show that the SVR model tuned with WOA achieves the best overall performance, with an RMSE of 1.38 × 10⁻² and an R-squared of 0.991. These findings suggest that machine learning techniques can provide accurate predictions of hydrogen solubility in different feedstocks, which could be useful in the development of hydrogen-related technologies. In addition, the solubility of hydrogen in the four heavy oil fractions is estimated over temperature and pressure ranges of 150 °C–350 °C and 1.2 MPa–10.8 MPa, respectively.
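
A sketch of the SVR tuning step on fabricated data within the stated temperature and pressure ranges; scikit-learn's randomized search is substituted here for the paper's whale optimization algorithm, which has no standard library implementation.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import RandomizedSearchCV

# Fabricated (T, P) -> solubility data within the stated ranges
rng = np.random.default_rng(0)
T = rng.uniform(150, 350, 200)   # degrees C
P = rng.uniform(1.2, 10.8, 200)  # MPa
solubility = 0.002 * P * np.exp(0.004 * T) + rng.normal(0, 0.01, 200)

X = np.column_stack([T, P])
search = RandomizedSearchCV(
    SVR(kernel="rbf"),
    param_distributions={"C": np.logspace(-2, 3, 50),
                         "gamma": np.logspace(-4, 1, 50),
                         "epsilon": np.linspace(0.001, 0.1, 20)},
    n_iter=40, cv=5, random_state=0)
search.fit(X, solubility)
print(search.best_params_, round(search.best_score_, 3))
```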

Keywords: temperature, pressure variations, machine learning, oil treatment

Procedia PDF Downloads 69
9108 Environmental Modeling of Storm Water Channels

Authors: L. Grinis

Abstract:

Turbulent flow in complex geometries receives considerable attention due to its importance in many engineering applications, including the design of storm water channels, and has been the subject of interest for many researchers. The design of these channels requires testing through physical models. The main practical limitation of physical models is the so-called "scale effect": in many cases only primary physical mechanisms can be correctly represented, while secondary mechanisms are often distorted. These observations form the basis of our study, which centered on problems associated with the design of storm water channels near the Dead Sea in Israel. To help reach a final design decision, we used different physical models. Our research showed good agreement with the results of laboratory tests and theoretical calculations and allowed us to study different effects of fluid flow in an open channel. We determined that problems of this nature cannot be solved by theoretical calculation and computer simulation alone. This study demonstrates the use of physical models to help resolve very complicated problems of fluid flow through baffles and similar structures, applying these models and observations to different construction scenarios and multiphase water flows, among them flows that include sand and stone particles, in a significant attempt to bring laboratory testing into closer association with reality.

Keywords: open channel, physical modeling, baffles, turbulent flow

Procedia PDF Downloads 284
9107 The Rise of Halal Banking and Financial Products in Post-Soviet Central Asia: A Study of Causative Factors

Authors: Bilal Ahmad Malik

Abstract:

With the fall of the Soviet Union in 1991, the whole Central Asian region saw a dramatic rise in Muslim identity, a call back to the Islamic legacy. Today, many Central Asian Muslims demand what Islam has termed legal (halal) and avoid what Islam has termed illegal (haram). The process of Islamic resurgence accelerated soon after the integration of the Central Asian republics with other Muslim geographies through membership in the Organization of Islamic Conference (OIC) and similar organizations. This interaction proved to be a vital push factor for the already existing indigenous revival trends and sentiments. As a result, along with many other requirements, Muslim customer demand emerged as a novel trend in the market in general and in the banking and financial sector in particular. To meet this demand, the governments of CIS states such as Kazakhstan, Uzbekistan, Azerbaijan, Turkmenistan, Kyrgyzstan, and Tajikistan introduced halal banking and financial products to the market. The paper first briefly discusses the core composition of halal banking and financial products; then, coming to its major theme, it identifies and analyzes the causes that led to the emergence of the Islamic banking and finance industry in the Muslim-majority post-Soviet CIS states.

Keywords: causes, Central Asia, interest-free banking, Islamic Revival

Procedia PDF Downloads 399
9106 Application of the Least Squares Method in the Adjustment of Chlorodifluoromethane (HCFC-142b) Regression Models

Authors: L. J. de Bessa Neto, V. S. Filho, J. V. Ferreira Nunes, G. C. Bergamo

Abstract:

There are many situations in which human activities have significant effects on the environment; damage to the ozone layer is one of them. The objective of this work is to use the least squares method, considering linear, exponential, logarithmic, power, and second-degree polynomial models, to analyze, through the coefficient of determination (R²), which model best fits the behavior of chlorodifluoromethane (HCFC-142b) concentrations, in parts per trillion, between 1992 and 2018, and to estimate future concentrations 5 and 10 periods ahead, i.e., the concentration of this pollutant in the years 2023 and 2028, for each of the fitted models. A total of 809 observations of the HCFC-142b concentration at one of the monitoring stations for gases implicated in the deterioration of the ozone layer during the period studied were selected, and the statistical software Excel was used to make scatter plots for each of the fitted models. It was observed that the logarithmic fit was the model that best fit the dataset: besides having a significant R², its fitted curve was compatible with the natural trend curve of the phenomenon.
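
A sketch of the model-comparison step with SciPy least squares fitting; the series below is a synthetic stand-in for the HCFC-142b data, so the R² values printed are illustrative only.

```python
import numpy as np
from scipy.optimize import curve_fit

models = {  # candidate model families and starting parameter guesses
    "linear":      (lambda t, a, b: a * t + b, (1.0, 1.0)),
    "exponential": (lambda t, a, b: a * np.exp(b * t), (1.0, 0.05)),
    "logarithmic": (lambda t, a, b: a * np.log(t) + b, (1.0, 1.0)),
    "power":       (lambda t, a, b: a * t ** b, (1.0, 0.5)),
    "quadratic":   (lambda t, a, b, c: a * t**2 + b * t + c, (1.0, 1.0, 1.0)),
}

# Synthetic stand-in: period index (1 = 1992) vs. concentration (ppt)
t = np.arange(1, 28, dtype=float)
conc = 8 * np.log(t) + 3 + np.random.default_rng(0).normal(0, 0.5, len(t))

for name, (f, p0) in models.items():
    params, _ = curve_fit(f, t, conc, p0=p0, maxfev=10000)
    resid = conc - f(t, *params)
    r2 = 1 - (resid ** 2).sum() / ((conc - conc.mean()) ** 2).sum()
    print(f"{name:12s} R2 = {r2:.4f}")
```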

Keywords: chlorodifluoromethane (HCFC-142b), ozone, least squares method, regression models

Procedia PDF Downloads 124
9105 The Potential of On-Demand Shuttle Services to Reduce Private Car Use

Authors: B. Mack, K. Tampe-Mai, E. Diesch

Abstract:

Findings of an ongoing discrete choice study of future transport mode choice will be presented. Many urban centers face the triple challenge of coping with ever-increasing traffic congestion, environmental pollution, and greenhouse gas emissions brought about by private car use. In principle, private car use may be diminished by extending public transport systems such as bus lines, trams, tubes, and trains. However, there are limits to increasing the (perceived) spatial and temporal flexibility and reducing peak-time crowding of classical public transport systems. An emerging type of system, publicly or privately operated on-demand shuttle bus services, seems suitable to ameliorate the situation. A fleet of on-demand shuttle buses operates without fixed stops and schedules and may be deployed efficiently in that each bus picks up passengers whose itineraries may be combined into an optimized route; crowding may be minimized by limiting the number of seats and the inter-seat distance for each bus. The study is conducted as a discrete choice experiment. The choice between private car, public transport, and shuttle service is registered as a function of several push and pull factors (financial costs, travel time, walking distances, mobility tax/congestion charge, and waiting time/parking space search time). After completing the discrete choice items, participants rate the three modes of transport with regard to the pull factors of comfort, safety, privacy, and the opportunity to engage in activities such as reading or surfing the internet; these ratings are entered as additional predictors into the discrete choice regression model. The study is conducted in the region of Stuttgart in southern Germany, with N = 1000 participants being recruited. Participants are between 18 and 69 years of age, hold a driver's license, and live in the city or the surrounding region of Stuttgart. In the discrete choice experiment, participants are asked to assume that they live within the Stuttgart region, but outside the city, and are planning the journey from their apartment to their place of work, training, or education during the morning peak. For each item, they then choose between private car, public transport, and on-demand shuttle in light of particular values of the push and pull factors studied. The study will provide valuable information on the potential of switching from private car use to on-demand shuttles, but also on the less desirable potential of switching from public transport to on-demand shuttle services, as well as on the modulation of these switching potentials by push and pull factors.
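
A hedged sketch of the estimation side of such a study: a multinomial logit fit with statsmodels on simulated respondents, with commute distance and a congestion-charge indicator as stand-ins for the push and pull factors. A full conditional logit with alternative-specific attributes would require a dedicated discrete choice package; this simplified version uses respondent-level covariates only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
# Hypothetical respondent-level predictors
dist = rng.uniform(2, 40, n)      # commute distance (km)
charge = rng.integers(0, 2, n)    # congestion charge applies? (push factor)

# Simulated latent utilities relative to car (alternative 0)
u_pt = -1.0 + 0.04 * dist + 0.9 * charge   # public transport
u_sh = -0.5 + 0.02 * dist + 0.6 * charge   # on-demand shuttle
expu = np.column_stack([np.ones(n), np.exp(u_pt), np.exp(u_sh)])
probs = expu / expu.sum(axis=1, keepdims=True)
choice = np.array([rng.choice(3, p=p) for p in probs])

X = sm.add_constant(pd.DataFrame({"dist": dist, "charge": charge}))
fit = sm.MNLogit(choice, X).fit(disp=0)
print(fit.params)  # coefficients for PT and shuttle vs. the car baseline
```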

Keywords: determinants of travel mode choice, on-demand shuttle services, private car use, public transport

Procedia PDF Downloads 183
9104 Production Optimization under Geological Uncertainty Using Distance-Based Clustering

Authors: Byeongcheol Kang, Junyi Kim, Hyungsik Jung, Hyungjun Yang, Jaewoo An, Jonggeun Choe

Abstract:

It is important to characterize reservoir properties for better production management, but limited information leaves geological uncertainty in very heterogeneous or channel reservoirs. One solution is to generate multiple equiprobable realizations using geostatistical methods; however, some of the resulting models have wrong properties and need to be excluded for simulation efficiency and reliability. We propose a novel model selection scheme based on distance-based clustering for reliable application of a production optimization algorithm. Distance is defined as a degree of dissimilarity between the data; we calculate the Hausdorff distance, which is useful for shape matching of reservoir models, to classify the models based on their similarity. We use multi-dimensional scaling (MDS) to describe the models in a two-dimensional space and group them by K-means clustering. Rather than simulating all models, we choose one representative model from each cluster and find the best model, the one whose production rates are most similar to the true values. From this process, we can select good reservoir models near the best model with high confidence. We generate 100 channel reservoir models using single normal equation simulation (SNESIM). Since oil and gas prefer to flow through the sand facies, it is critical to characterize the pattern and connectivity of the channels in the reservoir. After calculating the Hausdorff distances and projecting the models by MDS, we can see that the models group according to their channel patterns. These channel distributions affect the operating controls of each production well, so the model selection scheme improves the management optimization process. For production optimization we use particle swarm optimization (PSO), a useful global search algorithm. PSO is good at finding the global optimum of an objective function, but it takes much time because it uses many particles and iterations; if multiple reservoir models are used, the simulation time for PSO soars. By using the proposed method, we can select good, reliable models that already match the production data. Considering the geological uncertainty of the reservoir, we can obtain well-optimized production controls for maximum net present value. The proposed method offers a novel solution for selecting good cases among the various possibilities, and the model selection scheme can be applied not only to production optimization but also to history matching and other ensemble-based methods for efficient simulation.
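
A compact sketch of the distance-based clustering pipeline described above: pairwise Hausdorff distances between point-set stand-ins for channel models, MDS projection, and K-means grouping. The random point sets replace the SNESIM realizations.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff
from sklearn.manifold import MDS
from sklearn.cluster import KMeans

# Random point sets standing in for channel-cell maps of SNESIM models
rng = np.random.default_rng(0)
models = [rng.uniform(0, 100, size=(200, 2)) + rng.normal(0, 5)
          for _ in range(30)]

n = len(models)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        # Symmetric Hausdorff distance between two point sets
        d = max(directed_hausdorff(models[i], models[j])[0],
                directed_hausdorff(models[j], models[i])[0])
        D[i, j] = D[j, i] = d

# Project the distance matrix to 2D with MDS, then cluster with K-means
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(D)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(coords)
print(labels)  # pick one representative model per cluster label
```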

Keywords: distance-based clustering, geological uncertainty, particle swarm optimization (PSO), production optimization

Procedia PDF Downloads 144
9103 Designing a Model to Increase the Flow of Circular Economy Startups Using a Systemic and Multi-Generational Approach

Authors: Luís Marques, João Rocha, Andreia Fernandes, Maria Moura, Cláudia Caseiro, Filipa Figueiredo, João Nunes

Abstract:

The implementation of circularity strategies other than recycling, such as reducing the amount of raw material and reusing or sharing existing products, remains marginal. The European Commission announced that the transition towards a more circular economy could lead to the net creation of about 700,000 jobs in Europe by 2030, through additional labour demand from recycling plants, repair services, and other circular activities. Efforts to create new circular business models following completely circular, as opposed to linear, processes have increased considerably in recent years. Creating a societal circular economy transition model requires innovative solutions, and startups play a key role; yet early-stage startups based on circular business models often struggle to create enough impact. The StartUp Zero Program designs a model and approach to increase the flow of startups in the circular economy field, focusing on systemic decision analysis and a multi-generational approach. Multi-Criteria Decision Analysis supports the decision-making tool, using a combination of the Analytic Hierarchy Process (AHP) and Multi-Attribute Value Theory methods. We define principles, criteria, and indicators for evaluating startup prerogatives, quantifying the evaluation process into a single result. Additionally, this entrepreneurship program, spanning 16 months, involved more than 2400 young people, aged 14 to 23, in more than 200 interaction activities.
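
As a sketch of the AHP component, criterion weights can be derived from a pairwise comparison matrix via its principal eigenvector; the three criteria and comparison values below are hypothetical, not the program's actual evaluation framework.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a reciprocal pairwise comparison matrix,
    using the principal-eigenvector method standard in AHP."""
    vals, vecs = np.linalg.eig(pairwise)
    principal = np.real(vecs[:, np.argmax(np.real(vals))])
    return principal / principal.sum()

# Hypothetical criteria: circularity of business model, team, market
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 2.0],
                     [1/5, 1/2, 1.0]])
print("criterion weights:", np.round(ahp_weights(pairwise), 3))
```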

Keywords: circular economy, entrepreneurship, startups, multi-criteria decision analysis

Procedia PDF Downloads 105