Search results for: estimating of trajectory
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1112

542 Estimating Future Solar Potential in Evolving High-Density Urban Areas for the Mid-Latitude City of Mendoza, Argentina

Authors: Mariela Edith Arboit

Abstract:

The main goal of the project is to explore the evolution possibilities of the morphological indicators of the built environment, including those resulting from progressive soil occupation, due to the relentless growth of the city’s population and the subsequent increase in building density and reduction of solar access per built unit. Two alternative normative proposals, a Conventional Proposal (CP) and an Alternative Proposal (AP), are compared. In addition, temporal scenarios of the city’s evolution process are analyzed, starting from the reference situation of existing high-density built-up areas and simulating their possible morphological outcomes over theoretical medium (30 yr.) and long (60 yr.) terms, as a result of the massive implementation of either regulation in the long run. The results obtained demonstrate that the Alternative Proposal (AP) presents higher mean values of predicted solar potential, expressed by the total Volumetric Insolation Factor (VIFtot), for both the medium- and long-term scenarios. Regarding environmental aspects, the different impacts of either alternative on urban landscape quality also seem to favor the AP. A detailed assessment of the AP is currently being developed through a quantitative-qualitative methodology.

Keywords: building morphology, environmental quality, solar energy, urban sustainability

Procedia PDF Downloads 157
541 Operationalizing the Concept of Community Resilience through Community Capitals Framework-Based Index

Authors: Warda Ajaz

Abstract:

This study uses the ‘Community Capitals Framework’ (CCF) to develop a community resilience index that can serve as a useful tool for measuring the resilience of communities in diverse contexts and backgrounds. The CCF is an important analytical tool for assessing holistic community change. This framework identifies seven major types of community capitals: natural, cultural, human, social, political, financial and built, and claims that the communities that have been successful in supporting healthy, sustainable community and economic development have paid attention to all of these capitals. The framework therefore proposes to study community development through identification of assets in these major capitals (stock), investment in these capitals (flow), and the interaction between these capitals. Capital-based approaches have been extensively used to assess community resilience, especially in the context of natural disasters and extreme events. Therefore, this study identifies key indicators for estimating each of the seven capitals through an extensive literature review and then develops an index to calculate a community resilience score. The CCF-based community resilience index presents an innovative way of operationalizing the concept of community resilience and will contribute toward decision-relevant research regarding adaptation and mitigation of community vulnerabilities to climate change-induced and other adverse events.
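The index construction described in the abstract can be sketched in a few lines: each capital holds (value, min-benchmark, max-benchmark) triples for its indicators, which are min-max normalised, averaged within a capital, and then averaged across the seven capitals. All indicator values, benchmarks, and the equal weighting below are hypothetical assumptions, not the authors' method.

```python
def resilience_score(capitals):
    """Equal-weight CCF index: min-max normalise each indicator, average
    within each capital, then average across the seven capitals."""
    capital_means = []
    for indicators in capitals.values():
        norm = [(v - lo) / (hi - lo) for v, lo, hi in indicators]
        capital_means.append(sum(norm) / len(norm))
    return sum(capital_means) / len(capital_means)

# Hypothetical community: (observed value, benchmark min, benchmark max)
community = {
    "natural":   [(60, 0, 100)],                # e.g. % green cover
    "cultural":  [(0.7, 0, 1)],
    "human":     [(85, 0, 100), (12, 0, 20)],   # e.g. literacy %, schooling years
    "social":    [(0.5, 0, 1)],
    "political": [(0.4, 0, 1)],
    "financial": [(30000, 0, 60000)],           # e.g. median income
    "built":     [(0.8, 0, 1)],
}
score = resilience_score(community)
```

A weighted variant would simply replace the final average with a weighted sum if some capitals are judged more important for a given hazard.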

Keywords: adverse events, community capitals, community resilience, climate change, economic development, sustainability

Procedia PDF Downloads 267
540 Chatter Prediction of Curved Thin-walled Parts Considering Variation of Dynamic Characteristics Based on Acoustic Signals Acquisition

Authors: Damous Mohamed, Zeroudi Nasredine

Abstract:

High-speed milling of thin-walled parts with complex curvilinear profiles often encounters machining instability, commonly referred to as chatter. This phenomenon arises due to the dynamic interaction between the cutting tool and the part, exacerbated by the part's low rigidity and varying dynamic characteristics along the tool path. This research presents a dynamic model specifically developed to predict machining stability for such curved thin-walled components. The model employs the semi-discretization method, segmenting the tool trajectory into small, straight elements to locally approximate the behavior of an inclined plane. Dynamic characteristics for each segment are extracted through experimental modal analysis and incorporated into the simulation model to generate global stability lobe diagrams. Validation of the model is conducted through cutting tests where acoustic intensity is measured to detect instabilities. The experimental data align closely with the predicted stability limits, confirming the model's accuracy and effectiveness. This work provides a comprehensive approach to enhancing machining stability predictions, thereby improving the efficiency and quality of high-speed milling operations for thin-walled parts.
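The segmentation step described above, approximating a curved tool path by small straight elements, each treated locally as an inclined plane, can be sketched as follows. The quarter-circle wall, radius, and segment count are hypothetical illustrations, not the paper's geometry.

```python
import math

def segment_path(curve, n):
    """Sample a parametric tool path at n+1 points and return the straight
    segments with their local inclination angles -- the discretisation a
    semi-discretization model applies along a curved wall."""
    pts = [curve(k / n) for k in range(n + 1)]
    segs = []
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        segs.append(((x0, y0), (x1, y1), math.atan2(y1 - y0, x1 - x0)))
    return segs

# Hypothetical quarter-circle wall of radius 50 mm, split into 8 elements
quarter = lambda t: (50 * math.cos(t * math.pi / 2), 50 * math.sin(t * math.pi / 2))
segments = segment_path(quarter, 8)
```

Each element would then carry its own experimentally identified modal parameters when assembling the global stability lobe diagram.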

Keywords: chatter, curved thin-walled part, semi-discretization method, stability lobe diagrams

Procedia PDF Downloads 26
539 Evaluation of the Impact of Information and Communications Technology (ICT) on the Accuracy of Preliminary Cost Estimates of Building Projects in Nigeria

Authors: Nofiu A. Musa, Olubola Babalola

Abstract:

The study explored the effect of ICT on the accuracy of Preliminary Cost Estimates (PCEs) prepared by quantity surveying consulting firms in Nigeria for building projects, with a view to determining the desirability of adopting and using this technological innovation for preliminary estimating. Data pertinent to the study were obtained through a questionnaire survey conducted on a sample of one hundred and eight (108) quantity surveying firms selected by systematic random sampling from the list of registered firms compiled by the Nigerian Institute of Quantity Surveyors (NIQS), Lagos State Chapter. The data obtained were analyzed with SPSS version 17 using Student's t-tests at the 5% significance level. The results revealed that the mean bias and coefficient of variation of the firms' PCEs are significantly lower in the post-ICT-adoption period than in the pre-ICT-adoption period (p < 0.05 in each case). The paper concluded that the adoption and use of ICT has significantly improved the accuracy of the Preliminary Cost Estimates (PCEs) of building projects; hence, it is desirable.
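The pre/post comparison in the abstract can be illustrated with a paired Student's t-test. The percentage-bias figures below are hypothetical stand-ins (the study's firm-level data are not public); this sketches the test, not the study's dataset.

```python
import math

def paired_t(pre, post):
    """Paired Student's t statistic for matched pre/post measurements.
    Returns (t, degrees of freedom)."""
    assert len(pre) == len(post)
    d = [a - b for a, b in zip(pre, post)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n), n - 1

# Hypothetical percentage bias for the same firms before and after ICT adoption
pre_ict  = [18.2, 15.7, 20.1, 17.3, 16.8, 19.4]
post_ict = [ 9.1,  8.4, 11.0,  7.9, 10.2,  9.6]
t, df = paired_t(pre_ict, post_ict)
```

With df = 5 the two-sided 5% critical value is about 2.571, so a t statistic above that rejects the hypothesis of equal mean bias.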

Keywords: accepted tender price, accuracy, bias, building projects, consistency, information and communications technology, preliminary cost estimates

Procedia PDF Downloads 428
538 From Ride-Hailing App to Diversified and Sustainable Platform Business Model

Authors: Ridwan Dewayanto Rusli

Abstract:

We show how prisoner's dilemma-type competition problems can be mitigated through rapid platform diversification and ecosystem expansion. We analyze a ride-hailing company in Southeast Asia, Gojek, whose network grew to more than 170 million users comprising consumers, partner drivers, merchants, and complementors within a few years and has already achieved higher contribution margins than ride-hailing peers Uber and Lyft. Its ecosystem integrates ride-hailing, food delivery and logistics, merchant solutions, e-commerce, marketplace and advertising, payments, and fintech offerings. The company continues growing its network of complementors and App developers, expanding content and gaining critical mass in consumer data analytics and advertising. We compare the company's growth and diversification trajectory with those of its main international rivals and peers. The company's rapid growth and future potential are analyzed using Cusumano's (2012) Staying Power and Six Principles, Hax and Wilde's (2003) and Hax's (2010) The Delta Model as well as Santos' (2016) home-market advantages frameworks. The recently announced multi-billion-dollar merger with one of Southeast Asia's largest e-commerce majors lends additional support to the above arguments.

Keywords: ride-hailing, prisoner's dilemma, platform and ecosystem strategy, digital applications, diversification, home market advantages, e-commerce

Procedia PDF Downloads 93
537 Estimating View-Through Ad Attribution from User Surveys Using Convex Optimization

Authors: Yuhan Lin, Rohan Kekatpure, Cassidy Yeung

Abstract:

In Digital Marketing, robust quantification of View-through attribution (VTA) is necessary for evaluating channel effectiveness. VTA occurs when a product purchase is aided by an Ad but without an explicit click (e.g. a TV ad). A lack of a tracking mechanism makes VTA estimation challenging. Most prevalent VTA estimation techniques rely on post-purchase in-product user surveys. User surveys enable the calculation of channel multipliers, which are the ratio of the view-attributed to the click-attributed purchases of each marketing channel. Channel multipliers thus provide a way to estimate the unknown VTA for a channel from its known click attribution. In this work, we use Convex Optimization to compute channel multipliers in a way that enables a mathematical encoding of the expected channel behavior. Large fluctuations in channel attributions often result from overfitting the calculations to user surveys. Casting channel attribution as a Convex Optimization problem allows an introduction of constraints that limit such fluctuations. The result of our study is a distribution of channel multipliers across the entire marketing funnel, with important implications for marketing spend optimization. Our technique can be broadly applied to estimate Ad effectiveness in a privacy-centric world that increasingly limits user tracking.
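A minimal sketch of the idea: each channel multiplier is the least-squares ratio of survey-implied view purchases to click-attributed purchases, subject to box constraints that cap survey-driven fluctuations. For this one-dimensional convex quadratic, projecting the unconstrained optimum onto the box is exact, so no solver is needed. The channel data and bounds below are hypothetical, and this is a simplification of the authors' formulation.

```python
def fit_multiplier(click_purchases, survey_view_purchases, lo=0.0, hi=5.0):
    """Least-squares channel multiplier m minimising sum_t (m*c_t - v_t)^2
    subject to lo <= m <= hi. For a 1-D convex quadratic, clipping the
    unconstrained optimum to the box gives the exact constrained solution."""
    num = sum(c * v for c, v in zip(click_purchases, survey_view_purchases))
    den = sum(c * c for c in click_purchases)
    m = num / den if den else lo
    return min(max(m, lo), hi)

# Hypothetical weekly click-attributed purchases and survey-implied view purchases
clicks = [100, 120, 90, 110]
views  = [210, 230, 200, 215]
m = fit_multiplier(clicks, views)
vta_estimate = [m * c for c in clicks]   # view-through attribution per week
```

With multiple interacting channels and funnel-level constraints, the same objective becomes a genuine multivariate convex program, which is where a solver earns its keep.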

Keywords: digital marketing, survey analysis, operational research, convex optimization, channel attribution

Procedia PDF Downloads 199
536 An Engineered Epidemic: Big Pharma's Role in the Opioid Crisis

Authors: Donna L. Roberts

Abstract:

2019 marked 23 years since Purdue Pharma launched its flagship drug, OxyContin, that unleashed an unprecedented epidemic touching both celebrities and common citizens, metropolitan, suburbia and rural areas and all levels of socioeconomic status. From rural Appalachia to East LA individuals, families and communities have been devastated by a trajectory of addiction that often began with the legitimate prescription of a pain killer for anything from a tooth extraction to a sports injury to recovery from surgery or chronic arthritis. Far from being a serendipitous progression of events, the proliferation of this new breed of 'miracle drug' was instead a carefully crafted marketing program aimed at both the medical community and common citizens. This research represents and in-depth investigation of the evolution of the marketing, distribution and promotion of prescription opioids by pharmaceutical companies and its relationship to the propagation of the opioid crisis. Specifically, key components of Purdue Pharma’s aggressive marketing campaign, including its bonus system and sales incentives, were analyzed in the context of the sociopolitical environment that essential created the proverbial 'perfect storm' for the changing manner in which pain is treated in the U.S. The analyses of these series of events clearly indicate their role in first, the increase in prescription of opioids for non-terminal pain relief and subsequently, the incidence of related addiction, overdose, and death. Through this examination of the conditions that facilitated and maintained this drug crisis, perhaps we can begin to chart a course toward its resolution.

Keywords: addiction, opioid, opioid crisis, Purdue Pharma

Procedia PDF Downloads 121
535 Morphological Transformations and Variations in Architectural Language from Tombs to Mausoleums: From Ottoman Empire to the Turkish Republic

Authors: Uğur Tuztaşi, Mehmet Uysal, Yavuz Arat

Abstract:

The tomb (grave) structures that have influenced the architectural culture from the Seljuk times to the Ottoman throughout Anatolia are members of a continuing building tradition in terms of monumental expression and styles. This building typology which has religious and cultural permeability in view of spatial traces and structural formations follows the entire trajectory of the respect to death and the deceased from the Seljuks to the Ottomans and also the changing burial traditions epitomised in the form of mausoleums in the Turkish Republic. Although the cultural layers have the same contents with regards to the cult of monument this architectural tradition which evolved from tombs to mausoleums changed in both typological formation and structural size. In short, the tomb tradition with unique examples of architectural functions and typological formations has been encountered from 13th century onwards and continued during the Ottoman period with changes in form and has transformed to mausoleums during the 20th century. This study analyses the process of transformation from complex structures to simple structures and then to monumental graves in terms of architectural expression. Moreover, the study interrogates the architectural language of Anatolian Seljuk tombs to Ottoman tombs and monumental graves built during the republican period in terms of spatial and structural contexts.

Keywords: death and space in Turks, monumental graves, language of architectural style, morphological transformations

Procedia PDF Downloads 356
534 Visibility Measurements Using a Novel Open-Path Optical Extinction Analyzer

Authors: Nabil Saad, David Morgan, Manish Gupta

Abstract:

Visibility has become a key component of air quality and is regulated in many areas by environmental laws such as the EPA Clean Air Act and Regional Haze Rule. Typically, visibility is calculated by estimating the optical absorption and scattering of both gases and aerosols. A major component of the aerosols’ climatic effect is due to their scattering and absorption of solar radiation, which are governed by their optical and physical properties. However, the accurate assessment of this effect on global warming, climate change, and air quality is made difficult due to uncertainties in the calculation of single scattering albedo (SSA). Experimental complications arise in the determination of the single scattering albedo of an aerosol particle since it requires the simultaneous measurement of both scattering and extinction. In fact, aerosol optical absorption in particular is a difficult measurement to perform, and it is often associated with large uncertainties when using filter methods or difference methods. In this presentation, we demonstrate the use of a new open-path Optical Extinction Analyzer (OEA) in conjunction with a nephelometer and two particle sizers, emphasizing the benefits that co-employment of the OEA offers to derive the complex refractive index of aerosols and their single scattering albedo parameter. Various use cases, data reproducibility, and instrument calibration will also be presented to highlight the value proposition of this novel Open-Path OEA.
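Since an extinction analyzer measures extinction directly and a nephelometer measures scattering, the single scattering albedo follows by definition, SSA = b_scat / b_ext, and absorption by difference. A minimal sketch with hypothetical coefficients:

```python
def single_scattering_albedo(b_scat, b_ext):
    """SSA = scattering / extinction. With extinction measured directly
    (e.g. by an open-path analyzer) and scattering by a nephelometer,
    absorption follows as b_ext - b_scat without filter-based methods."""
    if b_ext <= 0 or b_scat < 0 or b_scat > b_ext:
        raise ValueError("need 0 <= b_scat <= b_ext and b_ext > 0")
    return b_scat / b_ext

# Hypothetical coefficients in inverse megametres (Mm^-1)
b_scattering, b_extinction = 85.0, 100.0
ssa = single_scattering_albedo(b_scattering, b_extinction)
b_absorption = b_extinction - b_scattering
```

The uncertainty of the difference b_ext - b_scat grows sharply as SSA approaches 1, which is why direct, simultaneous measurement of both quantities matters.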

Keywords: aerosols, extinction, visibility, albedo

Procedia PDF Downloads 90
533 Analysis of Ionosphere Anomaly before the Great Earthquake in Java in 2009 Using GPS TEC Data

Authors: Aldilla Damayanti Purnama Ratri, Hendri Subakti, Buldan Muslim

Abstract:

Ionospheric anomalies caused by earthquake activity are a phenomenon now being studied under seismo-ionospheric coupling. Generally, variation in the ionosphere caused by earthquake activity is weaker than the disturbances generated by other sources, such as geomagnetic storms. However, disturbances from geomagnetic storms show a more global behavior, while seismo-ionospheric anomalies occur only locally, in an area largely determined by the magnitude of the earthquake. This makes earthquake-related anomalies unique, and because of this uniqueness much research has been done in the hope of obtaining clues that could serve as early warnings before earthquakes. One approach developed for this purpose is seismo-ionospheric coupling, which relates the state of the lithosphere, atmosphere, and ionosphere before and during an earthquake. This paper chooses the vertical total electron content (VTEC) of the ionosphere as a parameter. Total Electron Content (TEC) is defined as the number of electrons in a vertical column (cylinder) with a cross-section of 1 m² along the GPS signal trajectory in the ionosphere at around 350 km of height. Based on the analysis of data obtained from the LAPAN agency to identify abnormal signals by statistical methods, we find an anomaly in the ionosphere characterized by a decrease of the ionospheric electron content of 1 TECU before the earthquake occurred. This decrease of VTEC is not associated with a magnetic storm, so it is indicated as an earthquake precursor. This is supported by the Dst index, which showed no magnetic interference.
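The statistical flagging described above, marking days whose VTEC falls well below the series mean, can be sketched as follows. The daily values and the 2-sigma threshold are hypothetical illustrations, not the study's actual criterion.

```python
import statistics

def flag_vtec_drops(vtec, k=2.0):
    """Flag indices whose VTEC falls below mean - k*sigma of the series,
    a simple statistical bound for precursor-candidate depletions."""
    mu = statistics.mean(vtec)
    sigma = statistics.stdev(vtec)
    lower = mu - k * sigma
    return [i for i, v in enumerate(vtec) if v < lower]

# Hypothetical daily VTEC (TECU): quiet background with a ~1 TECU dip on day 8
series = [20.1, 20.3, 19.9, 20.0, 20.2, 20.1, 19.8, 20.0, 18.9, 20.1, 20.2, 20.0]
anomalous_days = flag_vtec_drops(series, k=2.0)
```

In practice the bound would be computed over a sliding window of quiet days, and flagged days cross-checked against the Dst index to exclude geomagnetic storms.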

Keywords: earthquake, DST Index, ionosphere, seismoionospheric coupling, VTEC

Procedia PDF Downloads 586
532 Optimal 3D Deployment and Path Planning of Multiple UAVs for Maximum Coverage and Autonomy

Authors: Indu Chandran, Shubham Sharma, Rohan Mehta, Vipin Kizheppatt

Abstract:

Unmanned aerial vehicles are increasingly being explored as the most promising solution to disaster monitoring, assessment, and recovery. Current relief operations heavily rely on intelligent robot swarms to capture the damage caused, provide timely rescue, and create road maps for the victims. To perform these time-critical missions, efficient path planning that ensures quick coverage of the area is vital. This study aims to develop a technically balanced approach that provides maximum coverage of the affected area in minimum time using the optimal number of UAVs. A coverage trajectory is designed through area decomposition and task assignment. To perform efficient and autonomous coverage missions, a solution to a TSP-based optimization problem using meta-heuristic approaches is designed to allocate waypoints to UAVs of different flight capacities. The study exploits multi-agent simulations like PX4-SITL and QGroundControl through the ROS framework and visualizes the dynamics of UAV deployment along different search paths in a 3D Gazebo environment. Through detailed theoretical analysis and simulation tests, we illustrate the optimality and efficiency of the proposed methodologies.
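A cheap stand-in for the waypoint-ordering step is a greedy nearest-neighbour TSP tour; the meta-heuristic solvers mentioned in the abstract would refine such an initial tour. The decomposed-cell centroids below are hypothetical.

```python
import math

def nearest_neighbour_tour(start, waypoints):
    """Greedy TSP heuristic: from `start`, repeatedly fly to the closest
    unvisited waypoint. Often used to seed meta-heuristic refinement."""
    tour, remaining = [start], list(waypoints)
    while remaining:
        cur = tour[-1]
        nxt = min(remaining, key=lambda p: math.dist(cur, p))
        remaining.remove(nxt)
        tour.append(nxt)
    return tour

# Hypothetical decomposed-cell centroids (x, y) assigned to one UAV
cells = [(4.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
tour = nearest_neighbour_tour((0.0, 0.0), cells)
```

Splitting the cell set among UAVs by flight capacity before ordering each subset turns this into the multi-vehicle allocation the abstract describes.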

Keywords: area coverage, coverage path planning, heuristic algorithm, mission monitoring, optimization, task assignment, unmanned aerial vehicles

Procedia PDF Downloads 215
531 Improving Access to Palliative Care for Heart Failure Patients in England Using a Health Systems Approach

Authors: Alex Hughes

Abstract:

Patients with advanced heart failure develop specific palliative care needs due to the progressive symptom burden and unpredictable disease trajectory. NICE guidance advises that palliative care should be provided to patients with both cancer and non-cancer conditions as and when required. However, there is some way to go before this guidance is consistently and effectively implemented nationwide in conditions such as heart failure. The Ambitions for Palliative and End of Life Care: A national framework for local action in England provides a set of foundations and ambitions which outline a vision for what high-quality palliative and end-of-life care looks like in England. This poster aims to critically consider how to improve access to palliative care for heart failure patients in England by analysing the foundations taken from this framework to generate specific recommendations using Soft Systems Methodology (SSM). The eight foundations analysed are: ‘Personalised care planning’, ‘Shared records’, ‘Evidence and information’, ‘Involving, supporting and caring for those important to the dying person’, ‘Education and training’, ‘24/7 access’, ‘Co-design’ and ‘Leadership.’ A number of specific recommendations have been generated which highlight a need to close the evidence-policy gap and implement policy backed by sufficient evidence. These recommendations, alongside the creation of an evidence-based national strategy for palliative care and heart failure, should improve access to palliative care for heart failure patients in England. Once implemented, it will be necessary to evaluate the effect of these proposals to understand whether access to palliative care for heart failure patients actually improves.

Keywords: access, health systems, heart failure, palliative care

Procedia PDF Downloads 128
530 The Prognostic Value of Dynamic Changes of Hematological Indices in Oropharyngeal Cancer Patients Treated with Radiotherapy

Authors: Yao Song, Danni Cheng, Jianjun Ren

Abstract:

Objectives: We aimed to explore the prognostic effects of the absolute values and dynamic changes of common hematological indices on oropharynx squamous cell carcinoma (OPSCC) patients treated with radiation. Methods and materials: The absolute values of white blood cell count (WBC), absolute neutrophil count (ANC), absolute lymphocyte count (ALC), hemoglobin (Hb), platelet count (Plt), albumin (Alb), neutrophil-to-lymphocyte ratio (NLR) and platelet-to-lymphocyte ratio (PLR) at baseline (within 45 days before radiation) and 1, 3, 6 and 12 months after the start of radiotherapy were retrospectively collected. Locally estimated scatterplot smoothing (LOESS) was used to describe the smooth trajectory of each index. A mixed-effect model with a random slope was fitted to describe the changing rate and trend of the indices over time. Cox proportional hazard analysis was conducted to assess the correlation between hematological indices and treatment outcomes. Results: Among the 85 enrolled OPSCC patients, inflammatory indices such as WBC and ALC dropped rapidly during acute treatment and gradually recovered, while NLR and PLR increased in the first three months and subsequently declined within 3-12 months. A higher absolute value or an increasing trend of the nutritional indices (Alb and Hb) was associated with better prognosis (all p<0.05). In contrast, patients with a higher absolute value or an upward trend of the inflammatory indices (WBC, ANC, Plt, PLR and NLR) had worse survival (all p<0.05). Conclusions: The absolute values and dynamic changes of hematological indices were valuable prognostic factors for OPSCC patients who underwent radiotherapy.
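A per-subject least-squares slope gives the intuition behind the random slopes of a mixed-effect model (the real model pools patients and estimates the slope variance jointly). The time points below follow the abstract's schedule; the hemoglobin values are hypothetical.

```python
def ols_slope(times, values):
    """Least-squares slope of an index over time for one patient --
    a per-subject stand-in for a mixed-effect model's random slope."""
    n = len(times)
    tbar = sum(times) / n
    vbar = sum(values) / n
    num = sum((t - tbar) * (v - vbar) for t, v in zip(times, values))
    den = sum((t - tbar) ** 2 for t in times)
    return num / den

# Hypothetical hemoglobin (g/dL) at 0, 1, 3, 6 and 12 months from radiotherapy start
months = [0, 1, 3, 6, 12]
hb = [13.5, 12.8, 12.9, 13.2, 13.8]
slope = ols_slope(months, hb)  # positive slope = recovering (favourable) trend
```

Per-patient slopes computed this way could then enter a Cox model as a "dynamic change" covariate alongside the absolute baseline values.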

Keywords: hematological indices, oropharyngeal cancer, radiotherapy, NLR, PLR

Procedia PDF Downloads 183
529 Design and Analysis of Adaptive Type-I Progressive Hybrid Censoring Plan under Step Stress Partially Accelerated Life Testing Using Competing Risk

Authors: Ariful Islam, Showkat Ahmad Lone

Abstract:

Statistical distributions have long been employed in the assessment of semiconductor devices and product reliability. The power function distribution is one of the most important distributions in modern reliability practice and is frequently preferred over mathematically more complex distributions, such as the Weibull and the lognormal, because of its simplicity. Moreover, it may exhibit a better fit for failure data and provide more appropriate information about reliability and hazard rates in some circumstances. This study deals with estimating information about the failure times of items under step-stress partially accelerated life tests for competing risks based on an adaptive type-I progressive hybrid censoring criterion. The life data of the units under test are assumed to follow the Mukherjee-Islam distribution. Point and interval maximum-likelihood estimates are obtained for the distribution parameters and the tampering coefficient. The performance of the resulting estimators of the developed model parameters is evaluated and investigated by using a simulation algorithm.

Keywords: adaptive progressive hybrid censoring, competing risk, Mukherjee-Islam distribution, partially accelerated life testing, simulation study

Procedia PDF Downloads 347
528 Design and Implementation of PD-NN Controller Optimized Neural Networks for a Quad-Rotor

Authors: Chiraz Ben Jabeur, Hassene Seddik

Abstract:

In this paper, a full approach to the modeling and control of a four-rotor unmanned air vehicle (UAV), known as a quad-rotor aircraft, is presented. In fact, a PD controller and a PD optimized Neural Networks approach (PD-NN) are developed and applied to control a quad-rotor. The goal of this work is to design a smart self-tuning PD controller based on neural networks, able to supervise the quad-rotor for optimized behavior while tracking the desired trajectory. Many challenges can arise if the quad-rotor is navigating in hostile environments presenting irregular disturbances in the form of wind added to the model on each axis. Thus, the quad-rotor is subject to three-dimensional unknown static/varying wind disturbances. The quad-rotor has to quickly perform tasks while ensuring stability and accuracy and must react rapidly in its decision-making when facing disturbances. This technique offers some advantages over conventional control methods such as the PD controller. Simulation results are obtained in the Matlab/Simulink environment and are based on a comparative study between the PD and PD-NN controllers under wind disturbances. The latter are applied with several degrees of strength to test the quad-rotor's behavior. The simulation results are satisfactory and demonstrate the effectiveness of the proposed PD-NN approach. In fact, this controller has relatively smaller errors than the PD controller and a better capability to reject disturbances. In addition, it has proven to be highly robust and efficient when facing turbulence in the form of wind disturbances.
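The baseline PD law that the PD-NN approach self-tunes can be sketched on a toy altitude-hold problem. The double-integrator plant, gains, and Euler stepping below are hypothetical illustrations, not the paper's quad-rotor model.

```python
def pd_step(kp, kd, error, prev_error, dt):
    """Classic PD law: u = Kp*e + Kd*de/dt (finite-difference derivative)."""
    return kp * error + kd * (error - prev_error) / dt

# Hypothetical altitude hold: double integrator z'' = u, semi-implicit Euler
z, vz, target, dt = 0.0, 0.0, 1.0, 0.01
prev_e = target - z
for _ in range(2000):           # 20 s of simulated flight
    e = target - z
    u = pd_step(kp=8.0, kd=4.0, error=e, prev_error=prev_e, dt=dt)
    prev_e = e
    vz += u * dt
    z += vz * dt
```

With Kp = 8 and Kd = 4 the closed loop is well damped (damping ratio about 0.7), so the altitude settles at the 1 m target; the PD-NN idea is to let a network adjust such gains online as disturbances change.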

Keywords: hostile environment, PD and PD-NN controllers, quad-rotor control, robustness against disturbance

Procedia PDF Downloads 136
527 Estimation of Transition and Emission Probabilities

Authors: Aakansha Gupta, Neha Vadnere, Tapasvi Soni, M. Anbarsi

Abstract:

Protein secondary structure prediction is one of the most important goals pursued by bioinformatics and theoretical chemistry; it is highly important in medicine and biotechnology. Some aspects of protein function and genome analysis can be predicted by secondary structure prediction, which is used to help annotate sequences, classify proteins, identify domains, and recognize functional motifs. In this paper, we represent protein secondary structure as a mathematical model. To extract and predict the protein secondary structure from the primary structure, we require a set of parameters. Any constants appearing in the model are specified by these parameters, which also provide a mechanism for efficient and accurate use of data. Many algorithms exist to estimate these model parameters, of which the most popular is the Expectation-Maximization (EM) algorithm. The model parameters are estimated with the use of protein datasets such as RS126 by using the Bayesian probabilistic method (the data set being categorical). This work can be extended to compare the efficiency of the EM algorithm with that of other algorithms for estimating the model parameters, which will in turn lead to an efficient component for protein secondary structure prediction. Further, this paper provides scope to use these parameters for predicting the secondary structure of proteins using machine learning techniques such as neural networks and fuzzy logic. The ultimate objective is to obtain accuracy greater than previously achieved.
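When the state sequence is observed (labelled training data), transition and emission probabilities have closed-form maximum-likelihood estimates by counting — the quantities EM iterates toward when the states are hidden. A minimal sketch with a hypothetical labelled fragment (states as secondary-structure classes H/E/C, observations as residues):

```python
from collections import Counter

def estimate_hmm_params(state_seq, obs_seq):
    """Maximum-likelihood transition and emission probabilities from a
    labelled sequence: count and normalise per source state."""
    trans, emit = Counter(), Counter()
    for a, b in zip(state_seq, state_seq[1:]):
        trans[(a, b)] += 1
    for s, o in zip(state_seq, obs_seq):
        emit[(s, o)] += 1

    def normalise(counts):
        totals = Counter()
        for (s, _), c in counts.items():
            totals[s] += c
        return {k: c / totals[k[0]] for k, c in counts.items()}

    return normalise(trans), normalise(emit)

# Hypothetical labelled fragment: helix/coil/strand states over residues
states = list("HHHCCE")
observed = list("AALGGV")
A, B = estimate_hmm_params(states, observed)
```

With hidden states, EM (Baum-Welch) replaces these hard counts with expected counts computed from the forward-backward posteriors and re-normalises in exactly the same way.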

Keywords: model parameters, expectation maximization algorithm, protein secondary structure prediction, bioinformatics

Procedia PDF Downloads 481
526 Forecast of Polyethylene Properties in the Gas Phase Polymerization Aided by Neural Network

Authors: Nasrin Bakhshizadeh, Ashkan Forootan

Abstract:

A major problem that affects quality control in industrial polymerization is the lack of suitable on-line measurement tools to evaluate polymer properties such as the melt and density indices. Conventionally, the polymerization is controlled manually by taking samples, measuring the quality of the polymer in the lab, and recording the results. This method is highly time-consuming and leads to producing a large number of off-specification products. The online application for estimating melt index and density proposed in this study is a neural network based on the input-output data of the polyethylene production plant. Temperature, the level of the reactors' bed, the mass flow rates of ethylene, hydrogen, and butene-1, and the molar concentrations of ethylene, hydrogen, and butene-1 are used to establish the neural model of the process. The neural network is trained on actual operational data using back-propagation and Levenberg-Marquardt techniques. The simulated results indicate that the neural network process model, established with three layers (one hidden layer) for forecasting the density and four layers for the melt index, is able to successfully predict those quality properties.
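A minimal back-propagation sketch in the spirit of the abstract's model: one hidden tanh layer trained by stochastic gradient descent on a toy two-input target. The target function, network size, and learning rate are hypothetical stand-ins for the plant data, and plain SGD stands in for Levenberg-Marquardt.

```python
import math
import random

random.seed(0)

# Toy target: a "melt-index-like" response y = 0.6*x1 - 0.3*x2
# from two normalised process inputs.
H = 4                                    # hidden units
w1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

xs = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(200)]
data = [(x, 0.6 * x[0] - 0.3 * x[1]) for x in xs]

def forward(x):
    h = [math.tanh(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(H)]
    return h, sum(w2[j] * h[j] for j in range(H)) + b2

def mse():
    return sum((forward(x)[1] - y) ** 2 for x, y in data) / len(data)

loss_before = mse()
lr = 0.05
for _ in range(300):                     # epochs of per-sample SGD
    for x, y in data:
        h, yhat = forward(x)
        err = yhat - y
        for j in range(H):               # chain rule through tanh
            dh = err * w2[j] * (1 - h[j] ** 2)
            w2[j] -= lr * err * h[j]
            w1[j][0] -= lr * dh * x[0]
            w1[j][1] -= lr * dh * x[1]
            b1[j] -= lr * dh
        b2 -= lr * err
loss_after = mse()
```

In the plant setting, the seven process inputs listed in the abstract replace the two toy inputs, and validation against held-out operational data guards against overfitting.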

Keywords: polyethylene, polymerization, density, melt index, neural network

Procedia PDF Downloads 144
525 Ensemble Sampler For Infinite-Dimensional Inverse Problems

Authors: Jeremie Coullon, Robert J. Webber

Abstract:

We introduce a Markov chain Monte Carlo (MCMC) sampler for infinite-dimensional inverse problems. Our sampler is based on the affine invariant ensemble sampler, which uses interacting walkers to adapt to the covariance structure of the target distribution. We extend this ensemble sampler for the first time to infinite-dimensional function spaces, yielding a highly efficient gradient-free MCMC algorithm. Because our ensemble sampler does not require gradients or posterior covariance estimates, it is simple to implement and broadly applicable. In many Bayesian inverse problems, Markov chain Monte Carlo (MCMC) methods are needed to approximate distributions on infinite-dimensional function spaces, for example, in groundwater flow, medical imaging, and traffic flow. Yet designing efficient MCMC methods for function spaces has proved challenging. Recent gradient-based MCMC methods, preconditioned MCMC methods, and SMC methods have improved the computational efficiency of functional random walk. However, these samplers require gradients or posterior covariance estimates that may be challenging to obtain. Calculating gradients is difficult or impossible in many high-dimensional inverse problems involving a numerical integrator with a black-box code base. Additionally, accurately estimating posterior covariances can require a lengthy pilot run or adaptation period. These concerns raise the question: is there a functional sampler that outperforms functional random walk without requiring gradients or posterior covariance estimates? To address this question, we consider a gradient-free sampler that avoids explicit covariance estimation yet adapts naturally to the covariance structure of the sampled distribution. This sampler works by considering an ensemble of walkers and interpolating and extrapolating between walkers to make a proposal. This is called the affine invariant ensemble sampler (AIES), which is easy to tune, easy to parallelize, and efficient at sampling spaces of moderate dimensionality (less than 20). The main contribution of this work is to propose a functional ensemble sampler (FES) that combines functional random walk and AIES. To apply this sampler, we first calculate the Karhunen–Loeve (KL) expansion for the Bayesian prior distribution, assumed to be Gaussian and trace-class. Then, we use AIES to sample the posterior distribution on the low-wavenumber KL components and use the functional random walk to sample the posterior distribution on the high-wavenumber KL components. Alternating between AIES and functional random walk updates, we obtain our functional ensemble sampler that is efficient and easy to use without requiring detailed knowledge of the target distribution. In past work, several authors have proposed splitting the Bayesian posterior into low-wavenumber and high-wavenumber components and then applying enhanced sampling to the low-wavenumber components. Yet compared to these other samplers, FES is unique in its simplicity and broad applicability. FES does not require any derivatives, and the need for derivative-free samplers has previously been emphasized. FES also eliminates the requirement for posterior covariance estimates. Lastly, FES is more efficient than other gradient-free samplers in our tests. In two numerical examples, we apply FES to challenging inverse problems that involve estimating a functional parameter and one or more scalar parameters. We compare the performance of functional random walk, FES, and an alternative derivative-free sampler that explicitly estimates the posterior covariance matrix. We conclude that FES is the fastest available gradient-free sampler for these challenging and multimodal test problems.
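The stretch move at the heart of AIES can be sketched in a few lines for a low-dimensional target (a standard 2-D Gaussian here; the paper's setting is a function-space posterior). The walker count, step scale a = 2, and target are illustrative defaults, not the authors' configuration.

```python
import math
import random

random.seed(1)

def log_prob(x):
    """Target: standard 2-D Gaussian (stand-in for a posterior)."""
    return -0.5 * (x[0] ** 2 + x[1] ** 2)

def stretch_move_step(walkers, a=2.0):
    """One sweep of the affine-invariant 'stretch move': each walker moves
    along the line through a random partner, scaled by z drawn from
    g(z) proportional to 1/sqrt(z) on [1/a, a]; acceptance carries the
    z^(d-1) Jacobian factor."""
    d = len(walkers[0])
    for i in range(len(walkers)):
        j = random.choice([k for k in range(len(walkers)) if k != i])
        z = ((a - 1.0) * random.random() + 1.0) ** 2 / a   # inverse-CDF draw
        proposal = [walkers[j][m] + z * (walkers[i][m] - walkers[j][m])
                    for m in range(d)]
        log_accept = (d - 1) * math.log(z) + log_prob(proposal) - log_prob(walkers[i])
        if math.log(random.random()) < log_accept:
            walkers[i] = proposal

walkers = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(10)]
samples = []
for _ in range(2000):
    stretch_move_step(walkers)
    samples.extend([w[:] for w in walkers])
mean_x = sum(s[0] for s in samples) / len(samples)
```

In the FES construction, this move would act only on the low-wavenumber KL coefficients, with a preconditioned functional random walk handling the remaining components.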

Keywords: Bayesian inverse problems, Markov chain Monte Carlo, infinite-dimensional inverse problems, dimensionality reduction

Procedia PDF Downloads 154
524 Exploring Public Trust in Democracy

Authors: Yaron Katz

Abstract:

The investigation of immigrants' electoral choices has remained relatively uncharted territory despite the fact that numerous nations extend political rights to their expatriates. This paper centers its attention on the matter of public trust in democracy, with a focus on the intricacies of Israeli politics as a divided system. It delves into the potential implications of political and social transformations stemming from the involvement of expatriate voters in elections taking place in their country of origin. In doing so, the article endeavors to explore a pathway for resolving a persistent challenge facing the stability of the Israeli political landscape over the past decade: the difficulty in forming a resilient government that genuinely represents the majority of voters. An examination is conducted into the role played by a demographic with the capacity to exert significant influence on election outcomes, namely, individuals residing outside of Israel. The objective of this research is to delve into this subject, dissecting social developments and political prospects that may shape the country's trajectory in the coming decades. This inquiry is especially pertinent given the extensive engagement of migrants in Israeli politics and the link between Israelis living abroad and their home country. Nevertheless, the study's findings reveal that while former citizens exhibit extensive involvement in Israeli politics and are cognizant of the potential consequences of permitting them to participate in elections, they maintain steadfastly unfavorable views regarding the inclusion of Israelis living overseas in their home country's electoral processes.

Keywords: trust, globalization, policy, democracy

Procedia PDF Downloads 45
523 Modeling Pan Evaporation Using Intelligent Methods of ANN, LSSVM and Tree Model M5 (Case Study: Shahroud and Mayamey Stations)

Authors: Hamidreza Ghazvinian, Khosro Ghazvinian, Touba Khodaiean

Abstract:

The importance of evaporation estimation in water resources and agricultural studies is undeniable. Pan evaporation is used worldwide as an indicator of evaporation from lakes and reservoirs because its data are easy to interpret. In this research, intelligent models were investigated for estimating pan evaporation on a daily basis. Shahroud and Mayamey, two cities located in Semnan province, Iran, were studied. Both cities have dry climates with high evaporation potential. Eleven years of meteorological data from the synoptic stations of Shahroud and Mayamey were used. The intelligent models applied in this study are the Artificial Neural Network (ANN), Least Squares Support Vector Machine (LSSVM), and M5 tree model. The meteorological parameters of minimum and maximum air temperature (Tmin, Tmax), wind speed (WS), sunshine hours (SH), air pressure (PA), and relative humidity (RH) were selected as inputs, and pan evaporation (EP) as the output. Of the data, 70% was used for training and 30% for testing. The models were evaluated using the coefficient of determination (R2), root mean square error (RMSE), and mean absolute error (MAE). The results for the Shahroud and Mayamey stations showed that all three models performed reasonably well.
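The ANN branch of this workflow can be sketched with scikit-learn. The snippet below uses synthetic stand-in data (the feature-target relationship, network size, and sample count are illustrative assumptions, not the study's measurements), but it reproduces the 70/30 split and the three evaluation metrics named in the abstract.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

rng = np.random.default_rng(42)
n = 500
# synthetic stand-ins for Tmax, Tmin, WS, SH, PA, RH (standardized units)
X = rng.normal(size=(n, 6))
# hypothetical dependence of pan evaporation on temperature, humidity, wind
y = 2.0 * X[:, 0] - 1.0 * X[:, 5] + 0.5 * X[:, 2] + rng.normal(scale=0.3, size=n)

# 70/30 train-test split, as in the study
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000, random_state=0)
ann.fit(X_tr, y_tr)
pred = ann.predict(X_te)

# the three evaluation metrics used in the study
r2 = r2_score(y_te, pred)
rmse = mean_squared_error(y_te, pred) ** 0.5
mae = mean_absolute_error(y_te, pred)
```

The LSSVM and M5 models would slot into the same pipeline with only the estimator swapped.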

Keywords: pan evaporation, intelligent methods, Shahroud, Mayamey

Procedia PDF Downloads 74
522 Anthropometric Data Variation within Gari-Frying Population

Authors: T. M. Samuel, O. O. Aremu, I. O. Ismaila, L. I. Onu, B. O. Adetifa, S. E. Adegbite, O. O. Olokoshe

Abstract:

The imperative of anthropometry in designing to fit cannot be overemphasized. Of essence is the variability of measurements among the population for which data is collected. In this paper, anthropometric data were collected for the design of a gari-frying facility, such that the work system would be designed to fit the gari-frying population in the Southwestern states of Nigeria, comprising Lagos, Ogun, Oyo, Osun, Ondo, and Ekiti. Twenty-seven body dimensions were measured among 120 gari-frying processors. Statistical analysis was performed using the SPSS package to determine the mean, standard deviation, minimum value, maximum value, and percentiles (2nd, 5th, 25th, 50th, 75th, 95th, and 98th) of the different anthropometric parameters. A one-sample t-test was conducted to determine the variation within the population. The 50th percentiles of some of the anthropometric parameters were compared with those from other populations in the literature. The correlation between the workers' age and body anthropometry was also investigated. The mean weight, height, shoulder height (sitting), eye height (standing), and eye height (sitting) are 63.37 kg, 1.57 m, 0.55 m, 1.45 m, and 0.67 m, respectively. Results also show a high correlation with other populations and a statistically significant difference in variability within the population for all the body dimensions measured. With a mean age of 42.36 years, the results show that age would be a poor indicator for estimating anthropometry for this population.
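The descriptive statistics described above can be sketched with NumPy and SciPy. The sample below uses synthetic stand-in measurements for one body dimension; only the reported means (63.37 kg weight, 42.36 yr age) are taken from the abstract, and the spreads and reference mean are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# synthetic stand-in for one body dimension: weight (kg) of 120 processors
weight = rng.normal(loc=63.37, scale=9.0, size=120)

# the seven percentiles reported in the study
pcts = [2, 5, 25, 50, 75, 95, 98]
table = dict(zip(pcts, np.percentile(weight, pcts)))

# one-sample t-test against an assumed reference mean from another population
t_stat, p_val = stats.ttest_1samp(weight, popmean=60.0)

# correlation between age and the body dimension
age = rng.normal(loc=42.36, scale=8.0, size=120)
r, p_corr = stats.pearsonr(age, weight)
```

Repeating this per dimension yields the full percentile table and the age-anthropometry correlation matrix.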

Keywords: anthropometry, cassava processing, design to fit, gari-frying, workstation design

Procedia PDF Downloads 253
521 Linguistic Features for Sentence Difficulty Prediction in Aspect-Based Sentiment Analysis

Authors: Adrian-Gabriel Chifu, Sebastien Fournier

Abstract:

One of the challenges of natural language understanding is to deal with the subjectivity of sentences, which may express opinions and emotions that add layers of complexity and nuance. Sentiment analysis is a field that aims to extract and analyze these subjective elements from text, and it can be applied at different levels of granularity, such as document, paragraph, sentence, or aspect. Aspect-based sentiment analysis is a well-studied topic with many available data sets and models. However, there is no clear definition of what makes a sentence difficult for aspect-based sentiment analysis. In this paper, we explore this question by conducting an experiment with three data sets, "Laptops", "Restaurants", and "MTSC" (Multi-Target-dependent Sentiment Classification), as well as a merged version of these three datasets. We study the impact of domain diversity and syntactic diversity on difficulty. We use a combination of classifiers to identify the most difficult sentences and analyze their characteristics. We employ two ways of defining sentence difficulty. The first is binary and labels a sentence as difficult if the classifiers fail to correctly predict the sentiment polarity. The second is a six-level scale based on how many of the top five best-performing classifiers can correctly predict the sentiment polarity. We also define nine linguistic features that, combined, aim at estimating the difficulty at the sentence level.
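The six-level difficulty scale lends itself to a direct implementation: per sentence, count how many of the five best classifiers mispredict the polarity. The classifier names and labels below are invented for illustration.

```python
def difficulty_scores(predictions, gold):
    """Six-level difficulty per sentence: 0 (all five classifiers correct)
    up to 5 (all five wrong).

    predictions: {classifier_name: [predicted_label, ...]} for five classifiers
    gold:        [true_label, ...] per sentence
    """
    scores = []
    for i in range(len(gold)):
        wrong = sum(1 for preds in predictions.values() if preds[i] != gold[i])
        scores.append(wrong)
    return scores

# illustrative toy data (not from the paper's data sets)
gold = ["pos", "neg", "neu", "pos"]
preds = {
    "svm":  ["pos", "neg", "pos", "pos"],
    "lr":   ["pos", "pos", "pos", "pos"],
    "bert": ["pos", "neg", "neu", "pos"],
    "cnn":  ["pos", "neg", "pos", "neg"],
    "nb":   ["neg", "neg", "pos", "pos"],
}
print(difficulty_scores(preds, gold))  # → [1, 1, 4, 1]
```

The binary definition falls out of the same counts: a sentence is "difficult" when its score is above some threshold, e.g. nonzero.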

Keywords: sentiment analysis, difficulty, classification, machine learning

Procedia PDF Downloads 89
520 Numerical Response of Planar HPGe Detector for 241Am Contamination of Various Shapes

Authors: M. Manohari, Himanshu Gupta, S. Priyadharshini, R. Santhanam, S. Chandrasekaran, B. Venkatraman

Abstract:

Injection is one of the potential routes of intake in a radioactive facility. The internal dose due to this intake is monitored at the Radiation Emergency Medical Centre, IGCAR, using a portable planar HPGe detector. A contaminated wound may have different shapes, and in a reprocessing facility the potential for wound contamination with actinides is higher. Efficiency is one of the input parameters for the estimation of internal dose, and estimating these efficiencies experimentally would be tedious and cumbersome. Numerical estimation can supplement experiment. As an initial step in this study, 241Am contamination of different shapes is studied. The portable planar HPGe detector was modeled using the Monte Carlo code FLUKA, and the effects of parameters such as the distance of the contamination from the detector and the radius of circular contamination were studied. Efficiency values for point and surface contamination located at different distances were estimated. The effect of the source radius on efficiency was more pronounced when the source was at 1 cm from the detector than at 10 cm. At 1 cm, the efficiency decreased quadratically as the radius increased, while at 10 cm it decreased linearly. The point-source efficiency varied exponentially with the source-to-detector distance.
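The quadratic, linear, and exponential trends reported above can be captured with simple least-squares fits. The efficiency values below are invented to follow those functional forms (they are not the paper's FLUKA results); the point is the fitting recipe.

```python
import numpy as np

# hypothetical surface-source efficiencies vs radius (illustrative values)
radius = np.array([0.5, 1.0, 1.5, 2.0, 2.5])           # cm
eff_1cm = 0.30 - 0.02 * radius - 0.015 * radius**2     # quadratic fall-off at 1 cm
eff_10cm = 0.050 - 0.004 * radius                      # linear fall-off at 10 cm

q = np.polyfit(radius, eff_1cm, 2)    # quadratic fit: [a2, a1, a0]
l = np.polyfit(radius, eff_10cm, 1)   # linear fit:    [b1, b0]

# point-source efficiency vs source-to-detector distance: exponential model
d = np.array([1.0, 2.0, 4.0, 6.0, 10.0])               # cm
eff_pt = 0.35 * np.exp(-0.25 * d)
# linearize: ln(eff) = ln(a) + k*d, then fit a straight line
k, ln_a = np.polyfit(d, np.log(eff_pt), 1)
```

With noiseless data the fits recover the generating coefficients exactly; with simulated efficiencies the same calls give the best least-squares parameters.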

Keywords: Planar HPGe, efficiency value, injection, surface source

Procedia PDF Downloads 42
519 Features of Calculating Structures for Frequent Weak Earthquakes

Authors: M. S. Belashov, A. V. Benin, Lin Hong, Sh. Sh. Nazarova, O. B. Sabirova, A. M. Uzdin

Abstract:

The features of calculating structures for the action of weak earthquakes are analyzed. Earthquakes with recurrence periods of 30 years and 50 years are considered. In the first case, the structure should operate normally without damage after the earthquake. In the second case, damage is allowed provided it does not prevent the structure from operating. Three issues are emphasized: setting the elastic and damping characteristics of reinforced concrete, formalization of limit states, and combinations of loads. The dependence of damping on the reinforcement coefficient is estimated. When evaluating limit states, in addition to calculations for crack resistance and strength, a human factor, i.e., the possibility of panic among people, was considered. To avoid it, it is proposed to limit the floor-by-floor velocity level in certain octave ranges. Proposals have been developed for estimating the coefficients of combination of various loads with the seismic one. As an example, coefficients of combinations of seismic and ice loads are estimated. It is shown that for strong actions, the combination coefficients for different regions turn out to be close, while for weak actions, they may differ.

Keywords: weak earthquake, frequent earthquake, damage, limit state, reinforcement, crack resistance, strength, floor-by-floor velocity, combination coefficients

Procedia PDF Downloads 88
518 Experimental Study on Free and Forced Heat Transfer and Pressure Drop of Copper Oxide-Heat Transfer Oil Nanofluid in Horizontal and Inclined Microfin Tubes

Authors: F. Hekmatipour, M. A. Akhavan-Behabadi, B. Sajadi

Abstract:

In this paper, the combined free and forced convection heat transfer of Copper Oxide-Heat Transfer Oil (CuO-HTO) nanofluid flow in horizontal and inclined microfin tubes is studied experimentally. The flow regime is laminar, and the pipe surface temperature is constant. The effect of nanoparticles and the microfin tube on the heat transfer rate is investigated for Richardson numbers between 0.1 and 0.7. The results show that increasing the nanoparticle concentration from 0% to 1.5% enhances the combined free and forced convection heat transfer rate. Based on the results, five correlations are proposed for estimating the free and forced heat transfer rate as the Richardson number increases from 0.1 to 0.7; their maximum deviation is less than 16%. Moreover, four correlations are suggested for assessing the Nusselt number as a function of the Rayleigh number, from 1,800,000 to 7,000,000, in inclined tubes, with a maximum deviation of almost 16%. The Darcy friction factor of the CuO-HTO nanofluid flow in inclined microfin tubes has also been investigated.
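The governing parameter for this mixed-convection regime is the Richardson number, Ri = Gr/Re². As a quick sanity check that the studied 0.1-0.7 range corresponds to laminar oil flow, the sketch below evaluates Ri from assumed, oil-like property values (every number here is an illustrative assumption, not a measurement from the paper):

```python
def richardson(gr: float, re: float) -> float:
    """Mixed-convection Richardson number Ri = Gr / Re**2."""
    return gr / re**2

# assumed, oil-like property values (SI units)
g = 9.81       # gravitational acceleration, m/s^2
beta = 7e-4    # thermal expansion coefficient, 1/K
dT = 20.0      # wall-to-bulk temperature difference, K
D = 0.01       # tube inner diameter, m
nu = 3e-5      # kinematic viscosity of a heat transfer oil, m^2/s
V = 0.05       # mean flow velocity, m/s

gr = g * beta * dT * D**3 / nu**2   # Grashof number
re = V * D / nu                     # Reynolds number (laminar for these values)
ri = richardson(gr, re)             # falls inside the studied 0.1-0.7 range
```

Ri near the upper end of the range means buoyancy (free convection) contributes comparably to the forced flow, which is exactly the regime the correlations target.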

Keywords: nanofluid, heat transfer oil, mixed convection, inclined tube, laminar flow

Procedia PDF Downloads 255
517 A Machine Learning-based Study on the Estimation of the Threat Posed by Orbital Debris

Authors: Suhani Srivastava

Abstract:

This research delves into the classification of orbital debris through machine learning (ML): multiple ML models are used to categorize the intensity of the threat posed by orbital debris, providing insight into effectively estimating the danger that specific debris can pose to future space missions. As the space industry expands, orbital debris becomes a growing concern in Low Earth Orbit (LEO), where increasing debris pollution can endanger space missions. Moreover, detecting orbital debris and identifying its characteristics has become a major concern in Space Situational Awareness (SSA), and prior methods relying solely on physics can become inconvenient in the face of the growing issue. Thus, this research approaches orbital debris concerns through machine learning, an efficient and more convenient alternative for detecting the potential threat that certain orbital debris poses. We found that logistic regression worked best, with 98% accuracy, and this research provides insight into the accuracies of specific machine learning models when classifying orbital debris. Our work would help provide spacecraft manufacturers with guidelines for mitigating risks, and it would help aerospace engineers identify the kinds of protection that should be incorporated into objects traveling in LEO through the predictions our models provide.
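A minimal sketch of such a logistic-regression threat classifier with scikit-learn is shown below. The features, threat-labeling rule, and thresholds are invented for illustration; the study's actual data and features are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(7)
n = 1000
# synthetic stand-ins for debris features in LEO
altitude = rng.uniform(300, 2000, n)   # km
size = rng.uniform(0.1, 100, n)        # cm, characteristic dimension
speed = rng.uniform(6.5, 8.0, n)       # km/s, orbital speed
X = np.column_stack([altitude, size, speed])

# hypothetical threat rule: larger debris at lower altitude is higher-threat
y = (size - 0.01 * altitude > 5.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
```

Swapping `LogisticRegression` for other estimators in the same pipeline is how the accuracies of the competing models would be compared.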

Keywords: aerospace, orbital debris, machine learning, space, space situational awareness, NASA

Procedia PDF Downloads 20
516 Wellbore Stability Evaluation of Ratawi Shale Formation

Authors: Raed Hameed Allawi

Abstract:

Wellbore instability problems are a major challenge for several wells in the Ratawi shale formation, resulting in non-productive time (NPT) and increased well-drilling expenditures. This work aims to construct an integrated mechanical earth model (MEM) to predict wellbore failure and design the optimum mud weight to improve the drilling efficiency of future wells. The MEM was based on field data, including open-hole wireline logging and measurement data. Several failure criteria were applied, including Modified Lade, Mogi-Coulomb, and Mohr-Coulomb, which were used to calculate the proper mud weight and practical drilling paths and orientations. Results showed that the leading cause of wellbore instability problems was inadequate mud weight. Moreover, some improper drilling practices and the heterogeneity of the Ratawi formation further increased the risk of wellbore instability. Therefore, the suitable mud weight for safe drilling in the Ratawi shale formation should be 11.5-13.5 ppg, increased as required depending on the trajectory of the planned well. The outcome of this study serves as a practical tool to reduce non-productive time and well costs and to design future neighboring deviated wells for high drilling efficiency. In addition, the current results serve as a reference for similar fields in the region, given the lack of published studies on wellbore instability problems of the Ratawi Formation in southern Iraqi oilfields.

Keywords: wellbore stability, hole collapse, horizontal stress, MEM, mud window

Procedia PDF Downloads 191
515 Enhanced Method of Conceptual Sizing of Aircraft Electro-Thermal De-Icing System

Authors: Ahmed Shinkafi, Craig Lawson

Abstract:

There is great momentum toward All-Electric Aircraft (AEA) technology. The AEA concept assumes that all aircraft systems will be integrated into one electrical power source in the future. The principle of the electro-thermal system is to transfer the energy required for anti/de-icing to the protected areas in electrical form. However, powering a large aircraft anti-icing system electrically could incur excessive cost and system weight. Hence, maximising the anti/de-icing efficiency of the electro-thermal system in order to minimise its power demand has become crucial to electro-thermal de-icing system sizing. In this work, an enhanced methodology has been developed for the conceptual sizing of an aircraft electro-thermal de-icing system. The work factors in terms critical to de-icing energy consumption that were overlooked in previous studies. A case study of a typical large aircraft wing de-icing was used to test and validate the model. The model was used to optimise the system performance by a trade-off between the de-icing peak power and system energy consumption. The optimum melting surface temperatures and energy flux predicted enabled a reduction in the power required for de-icing. The weight penalty associated with the electro-thermal anti-icing/de-icing method could be eliminated using this method without underestimating the de-icing power requirement.

Keywords: aircraft, de-icing system, electro-thermal, in-flight icing

Procedia PDF Downloads 517
514 The Relevance of the U-Shaped Learning Model to the Acquisition of the Difference between C'est and Il Est in the English Learners of French Context

Authors: Pooja Booluck

Abstract:

A U-shaped learning curve entails a three-step process: a good performance, followed by a bad performance, followed by a good performance again. U-shaped curves have been observed not only in language acquisition but also in various fields such as temperature, face recognition, and object permanence, to name a few. Building on previous studies of the curve in child language acquisition and Second Language Acquisition (SLA), this empirical study seeks to investigate the relevance of the U-shaped learning model to the acquisition of the difference between c'est and il est in the English Learners of French (ELF) context. The present study was developed to assess whether older learners of French in the ELF context follow the same acquisition pattern. The empirical study was conducted over six weeks with 15 English learners of French. Compositions and questionnaires were collected from each subject at three time intervals (after one week, after three weeks, and after six weeks), after which the students' work was graded as either correct or incorrect. The data indicate that there is evidence of a U-shaped learning curve in the acquisition of c'est and il est, and students did follow the same acquisition pattern as children with regard to rote-learned terms and subject clitics. This paper also discusses the need to introduce modules on the U-shaped learning curve into teaching curricula, as many teachers are unaware of the trajectory learners follow while acquiring core components of grammar. In addition, this study also addresses the need for more research on the acquisition of rote-learned terms and subject clitics in SLA.

Keywords: child language acquisition, rote-learning, subject clitics, u-shaped learning model

Procedia PDF Downloads 293
513 Design of a Permanent Magnet Based Focusing Lens for a Miniature Klystron

Authors: Kumud Singh, Janvin Itteera, Priti Ukarde, Sanjay Malhotra, P. P. Marathe, Ayan Bandyopadhay, Rakesh Meena, Vikram Rawat, L. M. Joshi

Abstract:

The application of permanent magnet technology to high-frequency miniature klystron tubes intended for space applications improves the efficiency and operational reliability of these tubes. Nevertheless, the task of generating magnetic focusing forces to eliminate beam divergence, once the beam crosses the electrostatic focusing regime and enters the drift region in the RF section of the tube, poses several challenges. Building a high-quality magnetic focusing lens that meets the beam optics requirements in the cathode gun and RF interaction region is considered one of the critical issues for these high-frequency miniature tubes. In this paper, electromagnetic design and particle trajectory studies in combined electric and magnetic fields, aimed at optimizing the magnetic circuit using 3D finite element method (FEM) analysis software, are presented. A rectangular configuration of the magnet was constructed to accommodate apertures for the input and output waveguide sections and to facilitate the coupling of electromagnetic fields into the input klystron cavity and out from the output klystron cavity through coupling loops. Prototype lenses have been built and tested after integration with the klystron tube. We discuss the design requirements and challenges, and the results from beam transmission tests of the prototype lens.

Keywords: beam transmission, Brillouin, confined flow, miniature klystron

Procedia PDF Downloads 445