Search results for: daily probability model
19185 Implicit U-Net Enhanced Fourier Neural Operator for Long-Term Dynamics Prediction in Turbulence
Authors: Zhijie Li, Wenhui Peng, Zelong Yuan, Jianchun Wang
Abstract:
Turbulence is a complex phenomenon that plays a crucial role in various fields, such as engineering, atmospheric science, and fluid dynamics. Predicting and understanding its behavior over long time scales have been challenging tasks. Traditional methods, such as large-eddy simulation (LES), have provided valuable insights but are computationally expensive. In the past few years, machine learning methods have experienced rapid development, leading to significant improvements in computational speed. However, ensuring stable and accurate long-term predictions remains a challenging task for these methods. In this study, we introduce the implicit U-Net enhanced Fourier neural operator (IU-FNO) as a solution for stable and efficient long-term predictions of the nonlinear dynamics in three-dimensional (3D) turbulence. The IU-FNO model combines implicit recurrent Fourier layers to deepen the network and incorporates the U-Net architecture to accurately capture small-scale flow structures. We evaluate the performance of the IU-FNO model through extensive large-eddy simulations of three types of 3D turbulence: forced homogeneous isotropic turbulence (HIT), a temporally evolving turbulent mixing layer, and decaying homogeneous isotropic turbulence. The results demonstrate that the IU-FNO model outperforms other FNO-based models, including vanilla FNO, implicit FNO (IFNO), and U-Net enhanced FNO (U-FNO), as well as the dynamic Smagorinsky model (DSM), in predicting various turbulence statistics. Specifically, the IU-FNO model exhibits improved accuracy in predicting the velocity spectrum, probability density functions (PDFs) of vorticity and velocity increments, and instantaneous spatial structures of the flow field. Furthermore, the IU-FNO model addresses the stability issues encountered in long-term predictions, which were limitations of previous FNO models. 
In addition to its superior performance, the IU-FNO model offers faster computational speed compared to traditional large-eddy simulations using the DSM model. It also demonstrates generalization capabilities to higher Taylor-Reynolds numbers and unseen flow regimes, such as decaying turbulence. Overall, the IU-FNO model presents a promising approach for long-term dynamics prediction in 3D turbulence, providing improved accuracy, stability, and computational efficiency compared to existing methods.
Keywords: data-driven, Fourier neural operator, large eddy simulation, fluid dynamics
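The spectral-convolution building block shared by these FNO variants can be sketched in a few lines: transform the field to Fourier space, apply learned weights to a truncated set of low-frequency modes, and transform back. The following is a minimal 1D sketch with assumed shapes and random weights, not the authors' 3D implementation:

```python
import numpy as np

def fourier_layer(u, weights, n_modes):
    """One spectral convolution in the style of a Fourier neural operator.

    u: real-valued field on a uniform 1D grid, shape (n,)
    weights: complex spectral weights for the retained low modes, shape (n_modes,)
    n_modes: number of low-frequency Fourier modes to keep
    """
    u_hat = np.fft.rfft(u)                         # to Fourier space
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = u_hat[:n_modes] * weights  # linear map on low modes only
    return np.fft.irfft(out_hat, n=len(u))         # back to physical space

rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
u = np.sin(x) + 0.1 * rng.standard_normal(64)
w = rng.standard_normal(8) + 1j * rng.standard_normal(8)
v = fourier_layer(u, w, n_modes=8)
print(v.shape)  # same grid size as the input field
```

In the full operator, several such layers are stacked (implicitly and recurrently in IU-FNO) with pointwise nonlinearities in between; truncating to low modes is what keeps the layer cheap and resolution-independent.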
Procedia PDF Downloads 74
19184 Comparison of Receiver Operating Characteristic Curve Smoothing Methods
Authors: D. Sigirli
Abstract:
The Receiver Operating Characteristic (ROC) curve is a commonly used statistical tool for evaluating the diagnostic performance of screening and diagnostic tests with continuous or ordinal scale results, which aim to predict the presence or absence probability of a condition, usually a disease. When the test results are measured as numeric values, sensitivity and specificity can be computed across all possible threshold values which discriminate the subjects as diseased and non-diseased. There are infinitely many possible decision thresholds along the continuum of the test results. The ROC curve presents the trade-off between sensitivity and 1-specificity as the threshold changes. The empirical ROC curve, which is a non-parametric estimator of the ROC curve, is robust and represents data accurately. However, especially for small sample sizes, it suffers from variability, and as it is a step function, there can be different false positive rates for a single true positive rate value and vice versa. Moreover, because the estimated ROC curve is jagged while the true ROC curve is smooth, it underestimates the true ROC curve. Since the true ROC curve is assumed to be smooth, several smoothing methods have been explored. These include using kernel estimates, using log-concave densities, fitting a specified density function to the data by maximum-likelihood fitting of univariate distributions, or creating a probability distribution by fitting a specified distribution to the data and using smooth versions of the empirical distribution functions. In the present paper, we aimed to propose a smooth ROC curve estimation based on a boundary-corrected kernel function and to compare the performances of ROC curve smoothing methods for diagnostic test results coming from different distributions in different sample sizes. 
We performed a simulation study to compare the performances of the different methods under various scenarios with 1000 repetitions. The performance of the proposed method was typically better than that of the empirical ROC curve and only slightly worse than the binormal model when the underlying samples were in fact generated from the normal distribution.
Keywords: empirical estimator, kernel function, smoothing, receiver operating characteristic curve
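The contrast between the step-function empirical estimator and a kernel-smoothed one can be illustrated with a short sketch. It uses a plain Gaussian kernel without the boundary correction the paper proposes, and the data and bandwidth are invented:

```python
import numpy as np
from math import erf, sqrt

def empirical_roc(neg, pos, grid):
    """Empirical ROC: step-function FPR/TPR across candidate thresholds."""
    fpr = np.array([(neg >= t).mean() for t in grid])
    tpr = np.array([(pos >= t).mean() for t in grid])
    return fpr, tpr

def smoothed_roc(neg, pos, grid, h=0.3):
    """Kernel-smoothed ROC: each indicator 1{x >= t} is replaced by a
    Gaussian-kernel survival estimate, giving a smooth curve."""
    def sf(sample, t):  # smoothed P(X >= t)
        return np.mean([0.5 * (1 - erf((t - x) / (h * sqrt(2)))) for x in sample])
    fpr = np.array([sf(neg, t) for t in grid])
    tpr = np.array([sf(pos, t) for t in grid])
    return fpr, tpr

rng = np.random.default_rng(0)
neg, pos = rng.normal(0, 1, 50), rng.normal(1, 1, 50)  # non-diseased / diseased
grid = np.linspace(-10, 10, 200)
e_fpr, e_tpr = empirical_roc(neg, pos, grid)
s_fpr, s_tpr = smoothed_roc(neg, pos, grid)
```

Plotting `s_tpr` against `s_fpr` gives a smooth curve through the jagged empirical one; the bandwidth `h` controls the bias-variance trade-off that motivates boundary-corrected kernels near FPR of 0 and 1.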
Procedia PDF Downloads 152
19183 Joint Replenishment and Heterogeneous Vehicle Routing Problem with Cyclical Schedule
Authors: Ming-Jong Yao, Chin-Sum Shui, Chih-Han Wang
Abstract:
This paper is developed based on a real-world decision scenario in which an industrial gas company applies the Vendor Managed Inventory model and supplies liquid oxygen with a self-operated heterogeneous vehicle fleet to hospitals in nearby cities. We name this problem the Joint Replenishment and Heterogeneous Vehicle Routing Problem with Cyclical Schedule and formulate it as a non-linear mixed-integer programming problem which simultaneously determines the length of the planning cycle (PC), the length of the replenishment cycle, the dates of replenishment for each customer and the vehicle routes of each day within the PC, such that the average daily operation cost within the PC, including inventory holding cost, setup cost, transportation cost, and overtime labor cost, is minimized. A solution method based on a genetic algorithm, embedded with an encoding and decoding mechanism and local search operators, is then proposed, and a hash function is adopted to avoid repetitive fitness evaluation of identical solutions. Numerical experiments demonstrate that the proposed solution method can effectively solve the problem under different lengths of PC and numbers of customers. The method is also shown to be effective in determining whether the company should expand the storage capacity of a customer whose demand increases. Sensitivity analysis of the vehicle fleet composition shows that deploying a mixed fleet can reduce the daily operating cost.
Keywords: cyclic inventory routing problem, joint replenishment, heterogeneous vehicle, genetic algorithm
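The hash-based caching of fitness evaluations described above is easy to sketch. The cost function below is a placeholder, since the real objective requires the full inventory and routing data:

```python
import random

fitness_cache = {}

def fitness(schedule):
    """Evaluate a chromosome, cached by its hash so identical solutions
    are never re-evaluated (schedules are tuples, hence hashable)."""
    key = hash(schedule)
    if key not in fitness_cache:
        # placeholder cost: spread of replenishment days; stands in for the
        # real daily inventory + routing + overtime cost
        fitness_cache[key] = sum(abs(a - b) for a, b in zip(schedule, schedule[1:]))
    return fitness_cache[key]

random.seed(1)
# population of replenishment-day schedules for 5 customers over a 7-day PC
pop = [tuple(random.randint(1, 7) for _ in range(5)) for _ in range(20)]
best = min(pop, key=fitness)
print(best, len(fitness_cache))  # duplicate chromosomes share one cache entry
```

In a GA, crossover and mutation frequently regenerate chromosomes already seen, so this memoization saves the dominant cost of the search, which is fitness evaluation.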
Procedia PDF Downloads 87
19182 Agritourism Development Mode Study in Rural Area of Boshan China
Authors: Lingfei Sun
Abstract:
Based on the significant value of ecology, strategic planning for ecological civilization construction was mentioned in the 17th and 18th National Congress of the Communist Party of China. How to generate economic value within the environmental capacity is not only an economic decision but also a political one. Boshan made full use of “Ecology”, transforming it into an inexhaustible green resource to benefit people and reflecting the sustainable value of a new agriculture development mode. The Strawberry Harvest Festival and Blueberry Harvest Festival hosted approximately 96,000 and 54,000 leisure tourists respectively in 2014. The Kiwi Harvest Festival in August 2014 attracted about 4,600 tourists per day on average, generating daily kiwi sales of 50,000 lbs and 3 million RMB (about 476,000 USD) of daily revenue. The purpose of this study is to elaborate the modes of agritourism development by analyzing cases in the rural area of Boshan, China. Interviews with local government officers were conducted to discover the operating modes of agritourism. Financial data was used to demonstrate the strength of government policy and the improvement in rural incomes. The results indicate that there are mainly three modes: the Intensive Mode, the Model Mode and the Mixed Mode, each supported by a case study. With the boom of tourism, the development of agritourism in Boshan relies on China's agriculture-encouraging policy and the effort of local government; meanwhile, large-scale cultivation and product differentiation are the crucial elements for the success of rural agritourism projects.
Keywords: agriculture, agritourism, economy, rural area development
Procedia PDF Downloads 308
19181 Using Geo-Statistical Techniques and Machine Learning Algorithms to Model the Spatiotemporal Heterogeneity of Land Surface Temperature and its Relationship with Land Use Land Cover
Authors: Javed Mallick
Abstract:
In metropolitan areas, rapid changes in land use and land cover (LULC) have ecological and environmental consequences. Saudi Arabia's cities have experienced tremendous urban growth since the 1990s, resulting in urban heat islands, groundwater depletion, air pollution, loss of ecosystem services, and so on. From 1990 to 2020, this study examines the variance and heterogeneity in land surface temperature (LST) caused by LULC changes in Abha-Khamis Mushyet, Saudi Arabia. LULC was mapped using the support vector machine (SVM). The mono-window algorithm was used to calculate the land surface temperature (LST). To identify LST clusters, the local indicator of spatial associations (LISA) model was applied to spatiotemporal LST maps. In addition, the parallel coordinate (PCP) method was used to investigate the relationship between LST clusters and urban biophysical variables as a proxy for LULC. According to LULC maps, urban areas increased by more than 330% between 1990 and 2018. Between 1990 and 2018, built-up areas had an 83.6% transitional probability. Furthermore, between 1990 and 2020, vegetation and agricultural land were converted into built-up areas at a rate of 17.9% and 21.8%, respectively. Uneven LULC changes in built-up areas result in more LST hotspots. LST hotspots were associated with high NDBI but not NDWI or NDVI. This study could assist policymakers in developing mitigation strategies for urban heat islands.
Keywords: land use land cover mapping, land surface temperature, support vector machine, LISA model, parallel coordinate plot
Procedia PDF Downloads 78
19180 Non-zero θ_13 and δ_CP Phase with A_4 Flavor Symmetry and Deviations to Tri-Bi-Maximal Mixing via Z_2 × Z_2 Invariant Perturbations in the Neutrino Sector
Authors: Gayatri Ghosh
Abstract:
In this work, a flavour theory of a neutrino mass model based on A_4 symmetry is considered to explain the phenomenology of neutrino mixing. The spontaneous breaking of the A_4 symmetry in this model leads to tribimaximal mixing in the neutrino sector at leading order. We consider the effect of Z_2 × Z_2 invariant perturbations in the neutrino sector and find the allowed region of correction terms in the perturbation matrix that is consistent with the 3σ ranges of the experimental values of the mixing angles. We study the implications of this formalism for other phenomenological observables, such as the δ_CP phase, the neutrino oscillation probability P(νµ → νe), the effective Majorana mass |m_ee| and the effective electron neutrino mass |m^eff_νe|. The Z_2 × Z_2 invariant perturbations introduced in the neutrino sector lead to testable predictions of θ_13 and CP violation. By changing the magnitudes of the perturbations in the neutrino sector, one can generate viable values of δ_CP and the neutrino oscillation parameters. Next, we investigate the feasibility of charged lepton flavour violation in type-I seesaw models with leptonic flavour symmetries at high energy that lead to tribimaximal neutrino mixing. We consider an effective theory with an A_4 × Z_2 × Z_2 symmetry which, after spontaneous symmetry breaking at a scale much higher than the electroweak scale, leads to charged lepton flavour violation processes once the heavy Majorana neutrino mass degeneracy is lifted, either by renormalization group effects or by a soft breaking of the A_4 symmetry. In this context, the implications for charged lepton flavour violation processes like µ → eγ, τ → eγ, and τ → µγ are discussed.
Keywords: Z_2 × Z_2 invariant perturbations, CLFV, δ_CP phase, tribimaximal neutrino mixing
Procedia PDF Downloads 79
19179 Constructing the Joint Mean-Variance Regions for Univariate and Bivariate Normal Distributions: Approach Based on the Measure of Cumulative Distribution Functions
Authors: Valerii Dashuk
Abstract:
The usage of confidence intervals in economics and econometrics is widespread. To investigate a random variable more thoroughly, joint tests are applied; one such example is the joint mean-variance test. A new approach for testing such hypotheses and constructing confidence sets is introduced. Exploring both the value of the random variable and its deviation with this technique allows checking simultaneously the shift and the probability of that shift (i.e., portfolio risks). Another application is based on the normal distribution, which is fully defined by mean and variance and can therefore be tested using the introduced approach. The method is based on the difference of probability density functions. The starting point is two sets of normal distribution parameters that should be compared (whether they may be considered identical at a given significance level). Then the absolute difference in probabilities at each 'point' of the domain of these distributions is calculated. This measure is transformed to a function of cumulative distribution functions and compared to the critical values. The critical values table was designed from simulations. The approach was compared with other techniques for the univariate case. It differs qualitatively and quantitatively in ease of implementation, computation speed, and accuracy of the critical region (theoretical vs. real significance level). Stable results when working with outliers and non-normal distributions, as well as scaling possibilities, are further strengths of the method. The main advantage of this approach is the possibility to extend it to the infinite-dimensional case, which was not possible in most previous works. At the moment, the extension to the two-dimensional case is complete, allowing joint tests of up to 5 parameters. 
Therefore, the derived technique is equivalent to classic tests in standard situations but gives more efficient alternatives in nonstandard problems and on large amounts of data.
Keywords: confidence set, cumulative distribution function, hypotheses testing, normal distribution, probability density function
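For the univariate normal case, the 'absolute difference in probabilities, transformed to cumulative distribution functions' measure can be sketched as follows; the grid bounds and step count are arbitrary choices, not the paper's:

```python
from math import erf, sqrt

def norm_cdf(x, mu, sigma):
    """CDF of the normal distribution, via the error function."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

def max_cdf_gap(mu1, s1, mu2, s2, lo=-10.0, hi=10.0, steps=2000):
    """Largest absolute difference between two normal CDFs over a grid,
    the kind of distance a comparison statistic can be built from."""
    gap = 0.0
    for i in range(steps + 1):
        x = lo + (hi - lo) * i / steps
        gap = max(gap, abs(norm_cdf(x, mu1, s1) - norm_cdf(x, mu2, s2)))
    return gap

print(max_cdf_gap(0, 1, 0, 1))  # identical parameter sets give gap 0
print(max_cdf_gap(0, 1, 1, 1))  # a mean shift produces a positive gap
```

In the actual test, such a distance would be compared against simulated critical values to decide whether the two parameter sets can be considered identical at the chosen significance level.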
Procedia PDF Downloads 176
19178 Gamification Using Stochastic Processes: Engage Children to Have Healthy Habits
Authors: Andre M. Carvalho, Pedro Sebastiao
Abstract:
This article is based on a dissertation that analyzes and intelligently models algorithms based on stochastic processes for a gamification application applied to marketing. Gamification is used in our daily lives to engage us to perform certain actions in order to achieve goals and gain rewards. This strategy is an increasingly adopted way to encourage and retain customers through game elements. The application of gamification aims to encourage children between 6 and 10 years of age to have healthy habits, and it is also intended to serve as a model for use in marketing. The application was developed in Unity; we implemented intelligent algorithms based on stochastic processes, web services to respond to all requests of the application, a back-office website to manage the application, and the database. A behavioral analysis of the use of game elements and stochastic processes in children’s motivation was conducted. Applying algorithms based on stochastic processes to game elements is very important to promote cooperation and to ensure fair and friendly competition between users, which consequently stimulates the users’ interest and their involvement in the application and organization.
Keywords: engage, games, gamification, randomness, stochastic processes
Procedia PDF Downloads 331
19177 Approximation of the Time Series by Fractal Brownian Motion
Authors: Valeria Bondarenko
Abstract:
In this paper, we propose two problems related to fractional Brownian motion (fBm). The first problem is the simultaneous estimation of two parameters, the Hurst exponent and the volatility, that describe this random process. Numerical tests on simulated fBm provided an efficient method. The second problem is the approximation of the increments of an observed time series by increments of fractional Brownian motion via a power function. Approximation and estimation are demonstrated on real data, daily deposit interest rates.
Keywords: fractional Brownian motion, Gaussian processes, approximation, time series, estimation of properties of the model
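A standard way to estimate the Hurst exponent from increments, in the spirit of the first problem, is the scaling of increment spreads with lag; this sketch uses that textbook method, which may differ from the authors' estimator:

```python
import numpy as np

def hurst_from_increments(x, lags=range(2, 20)):
    """Estimate the Hurst exponent H from increment scaling: for fBm,
    std(x[t+k] - x[t]) ~ k**H, so H is the slope in log-log space."""
    tau = [np.std(x[k:] - x[:-k]) for k in lags]
    slope, _ = np.polyfit(np.log(list(lags)), np.log(tau), 1)
    return slope

rng = np.random.default_rng(42)
bm = np.cumsum(rng.standard_normal(5000))  # ordinary Brownian motion: H ≈ 0.5
print(hurst_from_increments(bm))
```

For a genuine fBm sample with H above (below) 0.5 the same estimator recovers the persistent (anti-persistent) exponent; the residual spread of the log-log fit indicates how well the power-function approximation fits a real series such as deposit rates.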
Procedia PDF Downloads 376
19176 Topic Modelling Using Latent Dirichlet Allocation and Latent Semantic Indexing on SA Telco Twitter Data
Authors: Phumelele Kubheka, Pius Owolawi, Gbolahan Aiyetoro
Abstract:
Twitter is one of the most popular social media platforms, where users share their opinions on different subjects. As of 2010, the Twitter platform generated more than 12 terabytes of data daily, ~4.3 petabytes in a single year. For this reason, Twitter is a great source for big data mining. Many industries, such as telecommunication companies, can leverage the availability of Twitter data to better understand their markets and make appropriate business decisions. This study performs topic modeling on Twitter data using Latent Dirichlet Allocation (LDA). The obtained results are benchmarked against another topic modeling technique, Latent Semantic Indexing (LSI). The study aims to retrieve topics from a Twitter dataset containing user tweets on South African Telcos. Results from this study show that LSI is much faster than LDA. However, LDA yields better results, with topic coherence higher by 8% for the best-performing model represented in Table 1. A higher topic coherence score indicates better performance of the model.
Keywords: big data, latent Dirichlet allocation, latent semantic indexing, telco, topic modeling, Twitter
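LSI itself is just a truncated SVD of the term-document matrix. A toy sketch with invented telco-style tweets follows; a real pipeline would add proper tokenization, weighting (e.g., TF-IDF), and a library implementation:

```python
import numpy as np

docs = ["network coverage is poor today",
        "poor network signal and dropped calls",
        "data bundle prices are too high",
        "high prices for data bundles"]

# term-document count matrix: rows are vocabulary terms, columns are tweets
vocab = sorted({w for d in docs for w in d.split()})
A = np.array([[d.split().count(w) for d in docs] for w in vocab], dtype=float)

# LSI: truncated SVD of the term-document matrix; each retained singular
# vector is a latent "topic" mixing the original terms
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_topics = (np.diag(s[:k]) @ Vt[:k]).T  # document coordinates in topic space
print(doc_topics.shape)  # one k-dimensional vector per tweet
```

LDA replaces this linear algebra with Bayesian inference over word-topic and topic-document distributions, which is why it is slower but tends to yield more coherent topics, as the study reports.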
Procedia PDF Downloads 150
19175 Multi-Level Security Measures in Cloud Computing
Authors: Shobha G. Ranjan
Abstract:
Cloud computing is an emerging, on-demand and internet-based technology. A variety of services, such as software, hardware, data storage and infrastructure, can be shared through cloud computing. This technology is highly reliable, cost-effective and scalable in nature. It is essential that only authorized users access these services. Further, the time granted to access these services should be taken into account for proper accounting purposes. Currently, many organizations implement security measures in many different ways to provide the best cloud infrastructure to their clients, but these measures have limitations. This paper presents a multi-level security measure technique that is in accordance with the OSI model. Details of the proposed multi-level security measures technique are presented along with the architecture, activities, algorithms and probability of success in breaking authentication.
Keywords: cloud computing, cloud security, integrity, multi-tenancy, security
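The 'probability of success in breaking authentication' across stacked levels can be illustrated under the simplest assumption of independent levels; the per-level probabilities below are invented for illustration:

```python
def break_probability(level_probs):
    """Probability an attacker defeats every level, assuming each level is an
    independent barrier with the given (hypothetical) defeat probability."""
    p = 1.0
    for q in level_probs:
        p *= q
    return p

# three stacked levels are far harder to break than any single one
single = break_probability([0.1])
stacked = break_probability([0.1, 0.05, 0.01])
print(single, stacked)
```

This multiplicative drop is the basic argument for layering security measures across levels, though real levels are rarely fully independent.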
Procedia PDF Downloads 501
19174 Effects of Polyvictimization in Suicidal Ideation among Children and Adolescents in Chile
Authors: Oscar E. Cariceo
Abstract:
In Chile, there is a lack of evidence about the impact of polyvictimization on the emergence of suicidal thoughts among children and young people. Thus, this study aims to explore the association between the episodes of polyvictimization suffered by Chilean children and young people and the manifestation of signs related to suicidal tendencies. To achieve this purpose, secondary data from the First Polyvictimization Survey on Children and Adolescents of 2017 were analyzed, and a binomial logistic regression model was applied to establish the probability that young people are experiencing suicidal ideation episodes. The main findings show that women between the ages of 13 and 15 years, who are in seventh grade or the second year of secondary school and attend subsidized schools, are more likely to express suicidal ideas, a likelihood that increases if they have suffered different types of victimization, particularly physical violence, psychological aggression, and sexual abuse.
Keywords: Chile, polyvictimization, suicidal ideation, youth
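A binomial logistic regression turns a linear score into a probability of suicidal ideation via the logistic function. The coefficients below are hypothetical placeholders, not the study's estimates:

```python
from math import exp

def ideation_probability(coefs, intercept, features):
    """Binomial logit sketch: P(ideation) = 1 / (1 + exp(-(b0 + b·x))).
    All coefficient values here are invented for illustration."""
    z = intercept + sum(b * x for b, x in zip(coefs, features))
    return 1 / (1 + exp(-z))

# hypothetical binary predictors: female, age 13-15, physical violence, sexual abuse
p = ideation_probability([0.8, 0.5, 0.9, 1.1], -2.0, [1, 1, 1, 0])
print(p)  # probability between 0 and 1
```

Fitting such a model to the survey data yields the coefficients whose signs and magnitudes support the reported risk profile.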
Procedia PDF Downloads 178
19173 Factors Associated with Ketamine Use in Pancreatic Cancer Patient in a Single Hospice Center
Authors: Kyung Min Kwom, Young Joo Lee
Abstract:
Purpose: Up to 90% of pancreatic cancer patients suffer from neuropathic pain. In the palliative care setting, pain control in pancreatic cancer patients is one of the major goals. Ketamine is an NMDA receptor antagonist effective in neuropathic pain, and there have been studies about the opioid-sparing effect of ketamine. This study was conducted in a palliative care unit among pancreatic cancer patients to identify the factors related to ketamine use and the opioid-sparing effect. Methods: Medical records of pancreatic cancer patients admitted to St. Mary’s hospital palliative care unit from January 2013 to December 2014 were reviewed. Patients were divided into two categories according to ketamine use. Also, opioid use before and after ketamine administration was compared in the ketamine group. Results: Compared to the non-ketamine group, patients in the ketamine group required a higher dose of opioid. Total opioid dose, daily opioid dose, number of daily rescue medications, and daily average rescue dose were statistically significantly higher in the ketamine group. Opioid requirement was increased after ketamine administration. Conclusion: In this study, the ketamine group required more opioid. Ketamine is frequently considered in patients with severe pain requiring high amounts of opioid. Also, ketamine did not have an opioid-sparing effect. Future studies about the palliative use of ketamine in a larger number of patients are required.
Keywords: ketamine, opioid sparing, palliative care, pancreatic cancer
Procedia PDF Downloads 234
19172 Amazon and Its AI Features
Authors: Leen Sulaimani, Maryam Hafiz, Naba Ali, Roba Alsharif
Abstract:
One of Amazon’s most crucial online systems is artificial intelligence. Amazon would not have a worldwide successful online store, an easy and secure way of payment, and other services if it weren’t for artificial intelligence and machine learning. Amazon uses AI to expand and enhance its operations by upgrading the website daily; having a strong base of artificial intelligence in a worldwide successful business can improve marketing, decision-making, feedback, and more. Building a rational AI system should be the start of any business process; Amazon is fortunate in that it keeps taking care of the base of its business by using modern artificial intelligence, making sure that it is stable, reaching its organizational goals, and continuing to thrive each and every day. Artificial intelligence is used daily in our current world and is still being amplified each day to reach consumer satisfaction and company short- and long-term goals.
Keywords: artificial intelligence, Amazon, business, customer, decision making
Procedia PDF Downloads 110
19171 Using Machine Learning as an Alternative for Predicting Exchange Rates
Authors: Pedro Paulo Galindo Francisco, Eli Dhadad Junior
Abstract:
This study addresses the Meese-Rogoff puzzle by introducing the latest machine learning techniques as alternatives for predicting exchange rates. Using RMSE as a comparison metric, Meese and Rogoff discovered that economic models are unable to outperform the random walk model as short-term exchange rate predictors. Decades after this study, no statistical prediction technique has proven effective in overcoming this obstacle; although there were positive results, they did not apply to all currencies and defined periods. Recent advancements in artificial intelligence technologies have paved the way for a new approach to exchange rate prediction. Leveraging this technology, we applied five machine learning techniques to attempt to overcome the Meese-Rogoff puzzle. We considered daily data for the real, yen, British pound, euro, and Chinese yuan against the US dollar over a time horizon from 2010 to 2023. Our results showed that none of the presented techniques was able to produce an RMSE lower than the random walk model. However, some models, particularly LSTM and N-BEATS, were able to outperform the ARIMA model. The results also suggest that machine learning models have untapped potential and could represent an effective long-term possibility for overcoming the Meese-Rogoff puzzle.
Keywords: exchange rate, prediction, machine learning, deep learning
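The random-walk benchmark at the heart of the puzzle is trivial to state in code: tomorrow's forecast is today's rate, and RMSE is computed on one-step-ahead errors. The series here is synthetic, standing in for the daily quotes the study used:

```python
import numpy as np

def rmse(pred, actual):
    """Root mean squared error between forecasts and realized values."""
    return float(np.sqrt(np.mean((pred - actual) ** 2)))

rng = np.random.default_rng(7)
rate = np.cumsum(rng.normal(0, 0.01, 500)) + 5.0  # synthetic daily exchange rate

# random walk benchmark: tomorrow's forecast = today's rate
rw_pred = rate[:-1]
actual = rate[1:]
print(rmse(rw_pred, actual))
```

Any candidate model (ARIMA, LSTM, N-BEATS, ...) is evaluated with the same one-step RMSE; beating the random walk means beating this number on out-of-sample data, which is exactly what the Meese-Rogoff results say is hard.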
Procedia PDF Downloads 32
19170 Perceptions of Community Members in Lephalale Area, Limpopo Province, Towards Water Conservation: Development of a Psychological Model
Authors: M. L. Seretlo-Rangata, T. Sodi, S. Govender
Abstract:
Despite interventions by various governments to regulate water demand and address water scarcity, literature shows that billions of people across the world continue to struggle with access because not everyone contributes equally to conservation efforts. Behavioral factors such as individual and collective aspects of cognition and commitment have been found to play an important role in water conservation. The aim of the present study was to explore the perceptions of community members in the Lephalale area, Limpopo province, towards water conservation with a view to developing an explanatory psychological model on water conservation. Twenty (20) participants who relied on communal taps to access water in Lephalale Local Municipality, Limpopo province, were selected through purposeful sampling. In-depth, semi-structured, individual face-to-face interviews were used to gather data and were analyzed utilizing thematic content analysis (TCA). The research findings revealed that there are various psychological effects of water scarcity on communities, such as emotional distress, interpersonal conflicts and disruptions of daily activities of living. Additionally, the study results showed that the coping strategies developed by participants to deal with water scarcity included adopting alternative water use behaviors as well as adjusting current behaviors and lifestyles. Derived from the study findings, a psychological model of water conservation was developed. The model incorporates some ideas from the Value-Belief-Norm (VBN) theory and the Afrocentric theory. The model suggests that people’s worldviews, including their values, beliefs and culture, are significant determinants of their pro-environmental behaviors. 
The study concludes by recommending that authorities and policymakers consider psychological factors when developing water management programs, strategies and interventions, in consultation with psychology experts.
Keywords: water conservation, psychological model, pro-environmental behaviour, conservation psychology, water-use behaviour
Procedia PDF Downloads 71
19169 Effect of Different Se Diets on Blood Se, TAC Levels in Dairy Cattle and Their Newborn Calves
Authors: Moshfeghi Sogand
Abstract:
Free radicals can be produced during the respiratory oxidation of different cells. These free radicals can damage various macromolecules, such as proteins, fats and nucleic acids, and are harmful to the body. The natural defence system that prevents and neutralizes free radical damage is termed total antioxidant capacity (TAC). Se is a main antioxidant component of TAC, because it is a core part of the structure of some antioxidant enzymes of the body, such as glutathione peroxidase (GPX). Blood Se, GPX and TAC levels can probably be changed by feeding different selenium-supplemented diets in late pregnancy, and selenium may be transported from maternal blood to the fetus or via colostrum after calving. In this respect, we selected 100 pregnant dairy cattle (under the same conditions of age, breed and number of parturitions), divided them into 4 groups, and fed them different selenium diets during the last 3 months of pregnancy. Group 1: control, no Se supplementation. Group 2: received 0.3 ppm of Saccharomyces cerevisiae in the daily diet. Group 3: received selenium-rich yeast (containing 200 ppm selenium) mixed with the total daily ration. Group 4: received selenium-rich yeast (containing 300 ppm selenium) mixed with the total daily ration. Blood Se, GPX and TAC levels were then measured in the cattle and in their 3-day-old newborn calves after calving. The results were analysed by Tukey's ANOVA test, and the highest blood Se, GPX and TAC levels were found in the cattle fed the fermented Se-yeast diet and in their 3-day-old newborn calves.
Keywords: Se, TAC, Se diets, FRAP
Procedia PDF Downloads 44
19168 Socioeconomic Status and Gender Influence on Linguistic Change: A Case Study on Language Competence and Confidence of Multilingual Minority Language Speakers
Authors: Stefanie Siebenhütter
Abstract:
Male and female speakers use language differently and with varying confidence levels. This paper contrasts gendered differences in language use with socioeconomic status and age factors. It specifically examines how Kui minority language use and competence are conditioned by the variable of gender and discusses potential reasons for this variation by examining gendered language awareness and sociolinguistic attitudes. Moreover, it discusses whether women in Kui society function as 'leaders of linguistic change', as represented in Labov’s sociolinguistic model, and whether societal role expectations in collectivistic cultures influence the model of linguistic change. The findings reveal current Kui speaking preferences and allow predictions about prospective language use, which is a stable situation of multilingualism, because the current Kui speakers will socialize and teach the prospective Kui speakers in the near future. It further confirms that Lao is losing importance in Kui speakers’ (particularly female speakers’) daily lives.
Keywords: gender, identity construction, language change, minority language, multilingualism, sociolinguistics, social networks
Procedia PDF Downloads 177
19167 Comparison of Home Ranges of Radio Collared Jaguars (Panthera onca L.) in the Dry Chaco and Wet Chaco of Paraguay
Authors: Juan Facetti, Rocky McBride, Karina Loup
Abstract:
The Chaco Region of Paraguay is a key biodiverse area for the conservation of jaguars (Panthera onca), the largest feline of the Americas. It comprises five eco-regions, which hold important but decreasing populations of this species. In recent decades, the expansion of soybean cultivation over the Atlantic Forest has forced the relocation of cattle ranches towards the Chaco. Few studies of jaguar population densities in the American hemisphere have been done until now. In the region, the species is listed as vulnerable or threatened, and more information is needed to implement any conservation policy. Among the factors that threaten the populations are land-use change, habitat fragmentation, prey depletion and illegal hunting. The two largest eco-regions were studied: the Wet Chaco and the Dry Chaco. From 2002, more than 20 jaguars were captured and fitted with GPS collars. Data collected from 11 GPS collars were processed, transformed numerically and finally converted into maps for analysis. 8,092 locations were determined for four adult females (AF) and one adult male (AM) in the Wet Chaco, and one AF, one juvenile male (JM) and four AM in the Dry Chaco, during 1,867 days. GIS and kernel methodology were used to calculate daily distance of movement, home range (HR, 95% isopleth) and core area (50% isopleth). In the Wet Chaco, HR were 56 km2 and 238 km2 for females and males respectively, while in the Dry Chaco, HR were 685 km2 and 844.5 km2 for females and males respectively, and 172 km2 for a juvenile. Core areas of individual activity for each jaguar were on average 11.5 km2 and 33.55 km2 for AF and AM respectively in the Wet Chaco, while in the Dry Chaco they were larger: 115 km2 for five AM, 225 km2 for an AF and 32.4 km2 for a JM. In both eco-regions, only one relevant overlap of HR of adults was reported: during the reproduction season, the HR (95% kernel) of one AM overlapped 49.83% with that of one AF. 
In the Wet Chaco, the maximum daily distance moved by an AF was 14.5 km and 11.6 km for the AM, while the maximum mean daily moved (MMDM) distance was 5.6 km for an AF and 3.1 km for an AM. In the Dry Chaco, the maximum daily distance was 61.7 km for an AF, 50.9 km for the AM and 6.6 km for the JM, while the MMDM distance was 13.2 km for an AM and 8.4 km for an AF. This study confirmed that increasing encroachment on jaguar habitat has produced fragmented landscapes that influence the spacing patterns of jaguars. Males used larger HRs than the smaller females and covered larger daily distances. There appeared to be marked spatial segregation not only among females but also among males. It is likely that the larger areas used by males are partly caused by the sexual dimorphism in body size, which entails differences in prey requirements; this could explain the larger distances travelled daily by males.
Keywords: Chaco ecoregions, Jaguar, home range, Panthera onca, Paraguay
Procedia PDF Downloads 302
19166 Two-Sided Information Dissemination in Takeovers: Disclosure and Media
Authors: Eda Orhun
Abstract:
Purpose: This paper analyzes a target firm’s decision to voluntarily disclose information during a takeover event and the effect of such disclosures on the outcome of the takeover. Voluntary disclosures, especially earnings forecasts made around takeover events, may affect shareholders’ assessment of the target firm’s value and, in turn, the takeover outcome. This study aims to shed light on this question. Design/methodology/approach: The paper examines, both theoretically and empirically, the role of voluntary disclosures by target firms during a takeover event in the likelihood of takeover success. A game-theoretical model is set up to analyze the voluntary disclosure decision of a target firm seeking to inform shareholders about its real worth. The empirical implication of the model is tested by employing binary outcome models, where the disclosure variable is obtained by identifying the target firms in the sample that provide positive news by issuing increasing management earnings forecasts. Findings: The model predicts that a voluntary disclosure of positive information by the target decreases the likelihood that the takeover succeeds. The empirical analysis confirms this prediction by showing that positive earnings forecasts by target firms during takeover events increase the probability of takeover failure. Overall, it is shown that information dissemination through voluntary disclosures by target firms is an important factor affecting takeover outcomes. Originality/Value: This study is, to the author's knowledge, the first to examine the impact of voluntary disclosures by the target firm during a takeover event on the likelihood of takeover success. The results contribute to the information economics, corporate finance and M&A literatures.
Keywords: takeovers, target firm, voluntary disclosures, earnings forecasts, takeover success
Procedia PDF Downloads 318
19165 Organizational Innovations of the 20th Century as High Tech of the 21st: Evidence from Patent Data
Authors: Valery Yakubovich, Shuping Wu
Abstract:
Organization theorists have long claimed that organizational innovations are nontechnological, in part because they are unpatentable. The claim rests on the assumption that organizational innovations are abstract ideas embodied in persons and contexts rather than in context-free practical tools. However, over the last three decades, organizational knowledge has been increasingly embodied in digital tools which, in principle, can be patented. To provide the first empirical evidence regarding the patentability of organizational innovations, we trained two machine learning algorithms to identify a population of 205,434 patent applications for organizational technologies (OrgTech) and, among them, 141,285 applications that use organizational innovations accumulated over the 20th century. Our event history analysis of the probability of patenting an OrgTech invention shows that ideas from organizational innovations decrease the probability of patent allowance unless they describe a practical tool. We conclude that the present-day digital transformation places organizational innovations in the realm of high tech and turns the debate about organizational technologies into the challenge of designing practical organizational tools that embody big ideas about organizing. We outline an agenda for patent-based research on OrgTech as an emerging phenomenon.
Keywords: organizational innovation, organizational technology, high tech, patents, machine learning
Procedia PDF Downloads 122
19164 A Development of a Weight-Balancing Control System Based On Android Operating System
Authors: Rattanathip Rattanachai, Piyachai Petchyen, Kunyanuth Kularbphettong
Abstract:
This paper describes the development of a weight-balancing control system based on the Android operating system. It provides recommendations on balancing a user's weight based on daily metabolic processes and needs, so that users can make informed weight-control decisions. The system also displays detailed nutrition information. Furthermore, it was designed to suggest what kinds of foods users should eat and how to exercise correctly. We describe the design methods and functional components of this prototype. To evaluate system performance, usability questionnaires and black-box testing were used to measure expert and user satisfaction. The results were satisfactory: mean scores for experts and users were 3.94 and 4.07, respectively.
Keywords: weight-balancing control, Android operating system, daily metabolism, black box testing
Procedia PDF Downloads 471
19163 Temperature Distribution for Asphalt Concrete-Concrete Composite Pavement
Authors: Tetsya Sok, Seong Jae Hong, Young Kyu Kim, Seung Woo Lee
Abstract:
The temperature distribution in asphalt concrete (AC)-concrete composite pavement is one of the main factors affecting the performance life of the pavement. The temperature gradient in the concrete slab underneath the AC layer produces critical curling stresses and can cause de-bonding at the AC-concrete interface. These stresses, when enhanced by repetitive axial loadings, also contribute to fatigue damage and eventual crack development within the slab. Moreover, temperature changes within the concrete slab cause it to contract and expand, which strongly induces reflective cracking in the AC layer. In this paper, the pavement temperature was predicted numerically using a one-dimensional finite difference method (FDM) in a fully explicit scheme. The numerical model provides a fundamental and clear understanding of the heat energy balance, including incoming and outgoing thermal energies as well as heat dissipated in the system. Using reliable meteorological data for daily air temperature, solar radiation and wind speed, together with variable pavement surface properties, the predicted pavement temperature profile was validated against field-measured data. Additionally, the effects of AC thickness and daily air temperature on the temperature profile in the underlying concrete were investigated. Based on the obtained results, the FDM-predicted temperature of the AC-concrete composite pavement showed good accuracy compared to field measurements, and a thicker AC layer significantly insulates the temperature distribution in the underlying concrete slab.
Keywords: asphalt concrete, finite difference method (FDM), curling effect, heat transfer, solar radiation
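The fully explicit one-dimensional FDM scheme described in the abstract can be sketched as follows. This is a minimal illustration only: the layer depth, grid spacing, thermal diffusivity, and boundary temperatures below are assumed values, not the authors' calibrated model.

```python
import numpy as np

def step_temperature(T, alpha, dx, dt):
    """Advance the 1D heat equation dT/dt = alpha * d2T/dx2 by one explicit step."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme is unstable for r > 0.5"
    T_new = T.copy()
    T_new[1:-1] = T[1:-1] + r * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    return T_new

# 0.5 m of pavement depth on a 1 cm grid; boundary temperatures are assumptions
T = np.full(51, 20.0)                # initial temperature profile, degrees C
T[0] = 45.0                          # pavement surface (assumed daily peak)
T[-1] = 15.0                         # subgrade temperature (assumed)
alpha, dx, dt = 1.0e-6, 0.01, 30.0   # diffusivity m^2/s, grid m, time step s
for _ in range(2880):                # march 24 hours in 30 s steps
    T = step_temperature(T, alpha, dx, dt)
```

In a full surface-energy-balance model, the fixed surface temperature would be replaced by a flux boundary condition built from the solar radiation, air temperature, and wind speed data the abstract mentions.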
Procedia PDF Downloads 269
19162 Evaluation of a Staffing to Workload Tool in a Multispecialty Clinic Setting
Authors: Kristin Thooft
Abstract:
Increasing pressure to manage healthcare costs has shifted care towards ambulatory settings and is driving a focus on cost transparency. Few nurse staffing-to-workload models have been developed for ambulatory settings, and fewer still for multi-specialty clinics. Of the existing models, few have been evaluated against outcomes to understand their impact. This evaluation took place after the AWARD model for nurse staffing to workload was implemented in a multi-specialty clinic at a regional healthcare system in the Midwest. The multi-specialty clinic houses 26 medical and surgical specialty practices. The AWARD model was implemented in two specialty practices in October 2020. Donabedian’s Structure-Process-Outcome (SPO) model was used to evaluate outcomes based on changes to the structure and processes of care provided. The AWARD model defined and quantified the processes and recommended changes to the structure of day-to-day nurse staffing. Cost of care per patient visit, total visits, and total nurse-performed visits were used as structural and process measures influencing the outcomes of cost of care and access to care. Independent t-tests were used to compare the differences in variables pre- and post-implementation. The SPO model was useful as an evaluation tool, providing a simple framework understood by a diverse care team. No statistically significant changes in the cost of care, total visits, or nurse visits were observed, but there were differences: cost of care increased and access to care decreased. Two weeks into the post-implementation period, the multi-specialty clinic paused all non-critical patient visits due to a second surge of the COVID-19 pandemic, and clinic nursing staff were reallocated to support the inpatient areas. This limited the Nurse Manager's ability to fully utilize the AWARD model in planning daily staffing. The SPO framework could be used for ongoing assessment of nurse staffing performance.
Additional variables could be measured, giving a more complete picture of the impact of nurse staffing. Going forward, there must be a continued focus on the outcomes of care and the value of nursing.
Keywords: ambulatory, clinic, evaluation, outcomes, staffing, staffing model, staffing to workload
Procedia PDF Downloads 173
19161 A Study on the Safety Evaluation of Pier According to the Water Level Change by the Monte-Carlo Method
Authors: Minho Kwon, Jeonghee Lim, Yeongseok Jeong, Donghoon Shin, Kiyoung Kim
Abstract:
In recent years, global warming has led to natural disasters driven by global environmental change, and abnormal weather events are increasing the frequency and intensity of heavy rainstorms and typhoons. It is therefore imperative to prepare for future heavy rainstorms and typhoons. This study selects arbitrary target bridges and performs numerical analysis to evaluate the safety of bridge piers under changing water levels. The numerical model is based on two-dimensional surface elements. Actual reinforced concrete was simulated by modeling concrete to include reinforcements, and a contact boundary model was applied between the ground and the concrete. The water level applied to the piers was considered at 18 levels between 7.5 m and 16.1 m. The elastic modulus, compressive strength, tensile strength, and yield strength of the reinforced concrete were sampled in 250 random combinations, and numerical analysis was carried out for each water level. The analysis showed that the bridge exceeded its limit state at a water level of 15.0 m. At the maximum water level of 16.1 m, the concrete’s failure rate was 35.2%, while the probability that the reinforcement would fail was 61.2%.
Keywords: Monte-Carlo method, pier, water level change, limit state
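The Monte-Carlo workflow of sampling material parameters and counting limit-state exceedances can be illustrated with a toy sketch. The normal strength distribution and scalar stress demand below are hypothetical stand-ins for the paper's finite-element analysis, which samples four material properties jointly.

```python
import random

def failure_probability(demand_mpa=24.0, n_trials=250, seed=42):
    """Toy Monte-Carlo estimate of P(strength < demand).

    Concrete compressive strength is sampled from an assumed normal
    distribution (mean 27 MPa, sd 4 MPa); both values are hypothetical.
    """
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n_trials)
                   if rng.gauss(27.0, 4.0) < demand_mpa)
    return failures / n_trials

p_f = failure_probability()  # fraction of trials exceeding the limit state
```

In the study's setting, the scalar comparison would be replaced by a full numerical analysis per sampled combination, repeated at each of the 18 water levels.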
Procedia PDF Downloads 286
19160 Determinants of Income Diversification among Support Zone Communities of National Parks in Nigeria
Authors: Daniel Etim Jacob, Samuel Onadeko, Edem A. Eniang, Imaobong Ufot Nelson
Abstract:
This paper examined the determinants of income diversification among households in support zone communities of national parks in Nigeria. It used household data collected through questionnaires administered randomly to 1,009 household heads in the study area. The data obtained were analyzed using probability and non-probability statistical analyses, such as regression and analysis of variance, to test for mean differences between parks. The results indicate that the majority of the household heads were male (92.57%), in the age class of 21 – 40 years (44.90%), had non-formal education (38.16%), were farmers (65.21%), owned land (95.44%), had a household size of 1 – 5 (36.67%) and an annual income in the range ₦401,000 - ₦600,000 (24.58%). The mean Simpson index of diversity showed a generally low (0.375) level of income diversification among the households. Income, age, off-farm dependence, education, household size and occupation were significant (p<0.01) factors affecting households’ income diversification. The study recommends improvement of the existing infrastructure and social capital in the communities as avenues to improve livelihoods and ensure positive conservation behaviors in the study area.
Keywords: income diversification, protected area, livelihood, poverty, Nigeria
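The Simpson index of diversity used to score household income diversification is straightforward to compute from income shares: it is 1 minus the sum of squared proportions, so 0 means a single income source and values near 1 mean many evenly sized sources. The income figures below are hypothetical.

```python
def simpson_diversity(income_shares):
    """Simpson index of diversity, D = 1 - sum(p_i**2).

    income_shares: amounts earned from each income source (any currency).
    Returns 0.0 for a single source; approaches 1.0 for many equal sources.
    """
    total = sum(income_shares)
    props = [s / total for s in income_shares]
    return 1.0 - sum(p * p for p in props)

# hypothetical farming-dominated household: 80% / 10% / 10% of annual income
d = simpson_diversity([800_000, 100_000, 100_000])  # -> 0.34, a low diversity
```

A household-level mean of such scores is how an aggregate figure like the study's 0.375 would be obtained.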
Procedia PDF Downloads 143
19159 Off-Line Text-Independent Arabic Writer Identification Using Optimum Codebooks
Authors: Ahmed Abdullah Ahmed
Abstract:
The task of recognizing the writer of a handwritten text has been an attractive research problem in the document analysis and recognition community, with applications in handwriting forensics, paleography, document examination and handwriting recognition. This research presents an automatic method for writer recognition from digitized images of unconstrained writings. Although great effort has been made by previous studies to devise various methods, their performance, especially in terms of accuracy, has fallen short, and there remains ample room for improvement. The proposed technique employs optimal-codebook-based writer characterization, where each writing sample is represented by a set of features computed from two codebooks, beginning and ending. Unlike most classical codebook-based approaches, which segment the writing into graphemes, this study fragments particular areas of the writing, namely the beginning and ending strokes. The proposed method starts with contour detection to extract significant information from the handwriting; curve fragmentation is then employed to divide the beginning and ending zones of the handwriting into small fragments. Similar fragments of beginning strokes are grouped together to create the Beginning cluster, and similarly, the ending strokes are grouped to create the Ending cluster. These two clusters lead to two codebooks (beginning and ending), formed by choosing the center of each group of similar fragments. Writings under study are then represented by computing the probability of occurrence of codebook patterns, and this probability distribution is used to characterize each writer. Two writings are compared by computing distances between their respective probability distributions. Evaluations were carried out on the standard ICFHR dataset of 206 writers using the Beginning and Ending codebooks separately.
Finally, the Ending codebook achieved the highest identification rate of 98.23%, which is the best result so far on the ICFHR dataset.
Keywords: off-line text-independent writer identification, feature extraction, codebook, fragments
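The final matching step — characterizing each writing by codebook-pattern occurrence probabilities and attributing a query to the nearest reference distribution — can be sketched like this. The chi-squared distance and the toy three-bin histograms are illustrative choices, since the abstract does not name its distance measure.

```python
def chi2_distance(p, q, eps=1e-12):
    """Chi-squared distance between two occurrence-probability histograms."""
    return 0.5 * sum((a - b) ** 2 / (a + b + eps) for a, b in zip(p, q))

def nearest_writer(query, reference_dists):
    """Attribute a query distribution to the reference writer at minimal distance."""
    return min(reference_dists,
               key=lambda writer: chi2_distance(query, reference_dists[writer]))

# hypothetical 3-pattern codebook occurrence distributions per known writer
refs = {"writer_A": [0.5, 0.3, 0.2],
        "writer_B": [0.1, 0.2, 0.7]}
best = nearest_writer([0.45, 0.35, 0.2], refs)  # closest to writer_A's profile
```

In the actual system each histogram would have one bin per codebook fragment pattern rather than three.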
Procedia PDF Downloads 512
19158 Modeling Soil Erosion and Sediment Yield in Geba Catchment, Ethiopia
Authors: Gebremedhin Kiros, Amba Shetty, Lakshman Nandagiri
Abstract:
Soil erosion is a major threat to the sustainability of land and water resources in the catchment, and there is a need to identify critical areas of erosion so that suitable conservation measures may be adopted. The present study was taken up to understand the temporal and spatial distribution of soil erosion and daily sediment yield in the Geba catchment (5,137 km2) located in the Northern Highlands of Ethiopia. The Soil and Water Assessment Tool (SWAT) was applied to the Geba catchment using data on rainfall, climate, soils, topography and land use/land cover (LU/LC) for the historical period 2000-2013. The LU/LC distribution in the catchment was characterized using LANDSAT satellite imagery and the GIS-based ArcSWAT version of the model. The model was calibrated and validated using sediment concentration measurements made at the catchment outlet. The catchment was divided into 13 sub-basins, which were prioritized on the basis of estimated susceptibility to soil erosion. Model results indicated that the estimated average sediment yield of the catchment was 12.23 tons/ha/yr. The generated soil loss map indicated that a large portion of the catchment has high erosion rates, resulting in a significantly large sediment yield at the outlet. Steep and unstable terrain, the occurrence of highly erodible soils and low vegetation cover appeared to favor high soil erosion. The results of this study prove useful in adopting targeted soil and water conservation measures and promoting sustainable management of natural resources in the Geba and similar catchments in the region.
Keywords: Ethiopia, Geba catchment, MUSLE, sediment yield, SWAT Model
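SWAT computes event sediment yield with the MUSLE equation named in the keywords. The sketch below follows the general form given in the SWAT theoretical documentation, sediment yield proportional to a power of the runoff-energy term times the USLE factors; the parameter values are purely illustrative, and exact units and the CFRG factor should be checked against the SWAT documentation.

```python
def musle_sediment_yield(q_surf_mm, q_peak_m3s, area_ha,
                         k_usle, c_usle, p_usle, ls_usle, cfrg=1.0):
    """MUSLE sediment yield (metric tons) for one runoff event.

    sed = 11.8 * (Q_surf * q_peak * A)**0.56 * K * C * P * LS * CFRG
    with Q_surf in mm, q_peak in m^3/s, A in ha (assumed SWAT conventions).
    """
    runoff_term = (q_surf_mm * q_peak_m3s * area_ha) ** 0.56
    return 11.8 * runoff_term * k_usle * c_usle * p_usle * ls_usle * cfrg

# illustrative event on a steep, sparsely vegetated sub-basin
sed = musle_sediment_yield(q_surf_mm=25.0, q_peak_m3s=3.0, area_ha=500.0,
                           k_usle=0.3, c_usle=0.2, p_usle=1.0, ls_usle=1.5)
```

Replacing rainfall erosivity with the runoff term is what lets MUSLE estimate per-event and hence daily sediment yield, as in the study.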
Procedia PDF Downloads 313
19157 A Novel Rapid Well Control Technique Modelled in Computational Fluid Dynamics Software
Authors: Michael Williams
Abstract:
The ability to control a flowing well is of the utmost importance. During the kill phase, heavy-weight kill mud is circulated around the well. While this increases bottom-hole pressure, it also increases damage to the near-wellbore formation. The addition of high-density spherical objects has the potential to minimise this near-wellbore damage, increase bottom-hole pressure and reduce the operational time needed to kill the well. The time saving comes from rapidly deploying high-density spherical objects instead of building high-density drilling fluid. The research aims to model the well kill process using computational fluid dynamics (CFD) software. A model has been created as a proof of concept to analyse the flow of micron-sized spherical objects in the drilling fluid. Initial results show that this new methodology of spherical objects in drilling fluid agrees with the traditional streamlines seen in particle-free flow. Additional models demonstrate that areas of higher flow rate around the bit can increase the probability of washout of formations but do not affect the flow of the micron-sized spherical objects. Interestingly, areas with dimensional changes, such as tool joints and various BHA components, do not at this initial stage experience increased velocity or create areas of turbulent flow, which bodes well for borehole stability. In conclusion, the initial models of this novel well control methodology have not demonstrated any adverse flow patterns, suggesting that the approach may be viable under field conditions.
Keywords: well control, fluid mechanics, safety, environment
Procedia PDF Downloads 171
19156 A Preliminary Study of Urban Resident Space Redundancy in the Context of Rapid Urbanization: Based on Urban Research of Hongkou District of Shanghai
Authors: Ziwei Chen, Yujiang Gao
Abstract:
Rapid urbanization has left much of the physical space in Chinese cities in a state of duplication and dislocation, forming many everyday spaces, such as illegal construction, that cannot be standardized, typed, or identified. This phenomenon is known as urban spatial redundancy and is often excluded from mainstream architectural discussions because of its derogatory labels of 'remaining' and 'excessive'. In recent years, some practicing architects have begun to pay attention to this phenomenon and to tap the value behind it. In this context, the author takes the redundancy of residential space as the research object and, based on an urban survey of redundant living space in Hongkou District, Shanghai, explores its implications for urban architectural renewal and innovative residential area models. On this basis, the study shows that changes accumulated over the long-term use of a building can be fed back into the goals set before design, an important link in, and justification for, the continued existence of a work of architecture.
Keywords: rapid urbanization, living space redundancy, architectural renewal, residential area model
Procedia PDF Downloads 135