Search results for: coverage probability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1880

1550 Convex Restrictions for Outage Constrained MU-MISO Downlink under Imperfect Channel State Information

Authors: A. Preetha Priyadharshini, S. B. M. Priya

Abstract:

In this paper, we consider the MU-MISO downlink scenario under imperfect channel state information (CSI). The main issue under imperfect CSI is to keep the probability of rate outage for each user below a given threshold. Such rate outage constraints present significant analytical challenges. Many probabilistic methods have been used to solve the transmit optimization problem under imperfect CSI. Here, two convex restriction methods, one based on a decomposition-based large deviation inequality and one on a Bernstein-type inequality, are used to solve the optimization problem under imperfect CSI. These methods achieve improved output quality at lower complexity, providing a safe tractable approximation of the original rate outage constraints. Based on implementations of these methods, performance has been evaluated in terms of feasible rate and average transmission power. The simulation results show that both methods offer significantly improved outage quality and lower computational complexity.
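The rate outage constraint bounds the probability that a user's achievable rate falls below a required threshold. A minimal Monte Carlo sketch of this quantity, assuming a simplified single-user channel with complex Gaussian CSI error (the paper's MU-MISO beamforming setting is considerably more involved):

```python
import numpy as np

def outage_probability(nominal_gain, err_std, rate_req, n_trials=100_000, seed=0):
    """Monte Carlo estimate of P(achievable rate < rate_req) when the true
    channel deviates from its estimate by a complex Gaussian CSI error."""
    rng = np.random.default_rng(seed)
    # complex Gaussian CSI error with total standard deviation err_std
    err = err_std * (rng.standard_normal(n_trials)
                     + 1j * rng.standard_normal(n_trials)) / np.sqrt(2)
    gain = np.abs(np.sqrt(nominal_gain) + err) ** 2  # perturbed channel gain
    rate = np.log2(1.0 + gain)                       # unit noise power
    return float(np.mean(rate < rate_req))

p_out = outage_probability(nominal_gain=4.0, err_std=0.3, rate_req=1.0)
```

A convex restriction replaces this sampled constraint with a deterministic one that is guaranteed to imply it, which is what makes the optimization tractable.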

Keywords: imperfect channel state information, outage probability, multiuser multi-input single-output, channel state information

Procedia PDF Downloads 814
1549 The Best Prediction Data Mining Model for Breast Cancer Probability in Women Residents in Kabul

Authors: Mina Jafari, Kobra Hamraee, Saied Hossein Hosseini

Abstract:

The prediction of breast cancer is one of the challenges in medicine. In this paper, we collected 528 records of women living in Kabul, including demographic, lifestyle, diet, and pregnancy data. There are many classification algorithms for breast cancer prediction, and we tried to find the model with the most accurate results and the lowest error rate. We evaluated several common supervised data mining algorithms to find the best model for predicting breast cancer among Afghan women living in Kabul, with the mammography result as the target variable. To evaluate these algorithms, we used cross-validation, a reliable method for measuring the performance of models. After comparing the error rate and accuracy of three models, Decision Tree, Naive Bayes, and Rule Induction, the Decision Tree, with an accuracy of 94.06% and an error rate of 15%, was found to be the best model for predicting breast cancer from the health care records.
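Cross-validation as used here can be sketched in a few lines. The snippet below implements plain k-fold cross-validation with a toy majority-class stand-in for the actual classifiers (the dataset and the Decision Tree, Naive Bayes, and Rule Induction models are not reproduced):

```python
import numpy as np

def kfold_accuracy(X, y, fit, predict, k=10, seed=0):
    """Plain k-fold cross-validation: returns mean accuracy and error rate."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    accs = []
    for i in range(k):
        test_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        model = fit(X[train_idx], y[train_idx])
        accs.append(np.mean(predict(model, X[test_idx]) == y[test_idx]))
    acc = float(np.mean(accs))
    return acc, 1.0 - acc

# toy stand-in classifier: always predict the training fold's majority class
fit = lambda X, y: np.bincount(y).argmax()
predict = lambda model, X: np.full(len(X), model)

X = np.zeros((100, 3))               # placeholder features
y = np.array([1] * 70 + [0] * 30)    # placeholder labels
acc, err_rate = kfold_accuracy(X, y, fit, predict)
```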

Keywords: decision tree, breast cancer, probability, data mining

Procedia PDF Downloads 140
1548 Women's Liberation: A Study of the Movement in Saudi Arabia

Authors: Rachel Hasan

Abstract:

The Kingdom of Saudi Arabia witnessed various significant social and political developments in 2018. The Crown Prince, Muhammad bin Salman, who also serves as Deputy Prime Minister, has made several social, cultural, and political changes in the country under his grand National Transformation Program. The program provides a vision of a more economically viable, culturally liberal, and politically pleasant Saudi Arabia. One of the most significant and groundbreaking changes made under this program is the granting of long-awaited rights to women. Legislative changes were made to allow women to drive. Seemingly basic on the surface, driving rights for women carry a much deeper meaning within the culture of Saudi Arabia and for the world outside. Ever since this right was awarded, world media have interpreted the change in various colors. This paper aims to investigate the portrayal of gender rights in various online media publications and websites. The methodology applied was quantitative content analysis, used to examine media coverage of various social and cultural changes with reference to women's rights. For the purpose of the research, convenience sampling was used to select eight international online articles from media websites. The articles discussed the lifting of the ban on women driving cars in Saudi Arabia as well as gender development for these women. The articles were analyzed for media frames, and various categories of analysis were developed, which highlighted the stance observed. Certain terms were conceptualized, operationalized, and explained for a better understanding of the context.

Keywords: gender rights, media coverage, political change, women's liberation

Procedia PDF Downloads 109
1547 Mathematical Model of Corporate Bond Portfolio and Effective Border Preview

Authors: Sergey Podluzhnyy

Abstract:

One of the most important tasks of investment and pension fund management is building a decision support system that helps make the right decisions on corporate bond portfolio formation. Today there are several basic methods of bond portfolio management: duration management, immunization, and convexity management. These methods have a serious disadvantage: they do not take into account the credit risk, or insolvency risk, of the issuer. They can therefore be applied only to the management and evaluation of high-quality sovereign bonds. This article proposes a mathematical model for building a corporate bond portfolio that is optimal in terms of risk and yield. The proposed model takes the default probability into account in the bond valuation formula, which results in a more accurate evaluation of bond prices. Moreover, the model provides tools for visualizing the efficient frontier of a corporate bond portfolio under exposure to credit risk, which will increase the quality of the investment decisions of portfolio managers.
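The idea of weighting a bond's cash flows by survival probability can be illustrated with a simple per-period default model. All parameter values below are hypothetical, and the formula is a generic reduced-form sketch rather than the paper's exact valuation formula:

```python
def bond_price(face, coupon_rate, yield_rate, n_periods, p_default, recovery=0.4):
    """Price a coupon bond with per-period default probability p_default:
    coupons and principal are weighted by survival probability, and on
    default the holder recovers `recovery` of face value that period."""
    price, survival = 0.0, 1.0
    for t in range(1, n_periods + 1):
        default_now = survival * p_default       # defaults during period t
        survival *= 1.0 - p_default              # still alive after period t
        disc = (1.0 + yield_rate) ** t
        price += face * coupon_rate * survival / disc   # coupon if alive
        price += face * recovery * default_now / disc   # recovery if default
    return price + face * survival / (1.0 + yield_rate) ** n_periods

risky = bond_price(100, 0.06, 0.05, 10, p_default=0.02)
riskless = bond_price(100, 0.06, 0.05, 10, p_default=0.0)
```

With `p_default=0` the formula reduces to the standard present-value price, so the difference `riskless - risky` isolates the credit-risk discount.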

Keywords: corporate bond portfolio, default probability, effective boundary, portfolio optimization task

Procedia PDF Downloads 318
1546 Challenges for IoT Adoption in India: A Study Based on Foresight Analysis for 2025

Authors: Shruti Chopra, Vikas Rao Vadi

Abstract:

In the era of the digital world, the Internet of Things (IoT) has been receiving significant attention. Its ubiquitous connectivity between humans, between machines (M2M), and between machines and humans gives it the potential to transform society and establish an ecosystem that opens new dimensions for the economy of the country. This study attempts to identify the challenges that seem most prevalent in IoT adoption in India through a literature survey. Data were then collected from expert opinions to conduct a foresight analysis, and analyzed with the scenario planning tools Micmac, Mactor, Multipol, and Smic-Prob. Methodologically, the study identified relationships between variables through variable analysis using Micmac and actor analysis using Mactor, generated the field of possibilities in terms of hypotheses, and constructed various scenarios through Multipol. Finally, the findings of the study include the final scenarios, selected using Smic-Prob by assigning a probability to every scenario (including conditional probabilities). This study may help practitioners and policymakers remove the obstacles to successfully implementing IoT in India.

Keywords: Internet of Things (IoT), foresight analysis, scenario planning, challenges, policymaking

Procedia PDF Downloads 148
1545 Statistical Correlation between Ply Mechanical Properties of Composite and Its Effect on Structure Reliability

Authors: S. Zhang, L. Zhang, X. Chen

Abstract:

Due to the large uncertainty in the mechanical properties of FRP (fibre reinforced plastic), the reliability evaluation of FRP structures is currently receiving much attention in industry. However, possible statistical correlation between ply mechanical properties has so far been overlooked, and the properties are mostly assumed to be independent random variables. In this study, the statistical correlation between the ply mechanical properties of uni-directional and plain weave composites is first analyzed by a combination of Monte Carlo simulation and finite element modeling of the FRP unit cell. Large linear correlation coefficients between the in-plane mechanical properties are observed, and the coefficients depend heavily on the uncertainty of the fibre volume ratio. It is also observed that the correlation coefficients related to Poisson's ratio are negative while the others are positive. To obtain the statistical correlation coefficients between the in-plane mechanical properties of FRP experimentally, all concerned in-plane mechanical properties of the same specimen need to be known. The in-plane shear modulus of FRP is derived experimentally by the approach suggested in ASTM standard D5379M. Tensile tests are conducted on the same specimens used for the shear test, and because of non-uniform tensile deformation, a modification factor is derived by finite element modeling. Digital image correlation is adopted to characterize the non-uniform deformation of the specimen. The preliminary experimental results show good agreement with the numerical analysis of the statistical correlation. The failure probability of laminate plates is then calculated with and without the statistical correlation, using the Monte Carlo and Markov chain Monte Carlo methods, respectively. The results highlight the importance of accounting for the statistical correlation between ply mechanical properties to achieve an accurate failure probability of laminate plates. Furthermore, it is found that for a multi-layer laminate plate, the statistical correlation between the ply elastic properties significantly affects the laminate reliability, while the effect of the statistical correlation between the ply strengths is minimal.
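The effect of correlation on failure probability can be illustrated with a toy Monte Carlo limit state. The linear stress mix, strength threshold, and parameter values below are illustrative stand-ins, not the paper's laminate model:

```python
import numpy as np

def failure_prob(mean, cov, strength, n=200_000, seed=0):
    """Monte Carlo failure probability for a toy limit state in which the
    'stress' is a fixed linear mix of two jointly normal ply properties."""
    rng = np.random.default_rng(seed)
    props = rng.multivariate_normal(mean, cov, size=n)
    stress = 0.6 * props[:, 0] + 0.4 * props[:, 1]
    return float(np.mean(stress > strength))

mean, sd, rho = [100.0, 100.0], 10.0, 0.8
cov_corr = [[sd**2, rho * sd**2], [rho * sd**2, sd**2]]   # correlated plies
cov_ind = [[sd**2, 0.0], [0.0, sd**2]]                    # independence assumed
p_corr = failure_prob(mean, cov_corr, strength=115.0)
p_ind = failure_prob(mean, cov_ind, strength=115.0)
```

Positive correlation widens the distribution of the combined stress, so assuming independence underestimates the failure probability in this toy case.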

Keywords: failure probability, FRP, reliability, statistical correlation

Procedia PDF Downloads 162
1544 A Contemporary Advertising Strategy on Social Networking Sites

Authors: M. S. Aparna, Pushparaj Shetty D.

Abstract:

Nowadays, social networking sites have become so popular that producers and sellers see them as one of the best options for targeting the right audience to market their products. Several tools are available to monitor and analyze social networks. Our task is to identify the right community web pages, analyze the behavior of their members using these tools, and formulate an appropriate strategy for marketing products or services to achieve the set goals. Advertising becomes more effective when information about a product or service comes from a known source; the strategy therefore exploits the strong buying influence of referral marketing on the audience. Our methodology proceeds with a critical budget analysis and promotes viral influence propagation. In this context, we address the vital elements of budget evaluation: the optimal number of seed nodes (primary influential users activated at the onset), the estimated coverage spread over the nodes, and the maximum influence propagation distance from an initial seed to an end node. Our Buyer Prediction mathematical model arises from the need to perform complex analysis when the probability density estimates of the relevant factors are unknown or difficult to calculate. Order statistics and the Buyer Prediction mapping function guarantee the selection of optimal influential users at each level. We make efficient use of community pages and user behavior to identify product enthusiasts on social networks. Our approach is promising and should be an elementary choice when there is little or no prior knowledge of the distribution of potential buyers on social networks. In this strategy, product news propagates to influential users on or surrounding the networks. By applying the same technique, a user can search for friends who are able to give better advice or referrals if a product interests them.
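Seed selection under a budget can be sketched with the standard independent cascade model and greedy spread maximization, a common baseline for viral influence propagation (the paper's Buyer Prediction model and order-statistics machinery are not reproduced here; the graph is hypothetical):

```python
import random

def simulate_cascade(graph, seeds, p, rng):
    """One independent-cascade run: each newly activated user tries once to
    activate each inactive neighbour with probability p. Returns spread."""
    active, frontier = set(seeds), list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in graph.get(u, []):
                if v not in active and rng.random() < p:
                    active.add(v)
                    nxt.append(v)
        frontier = nxt
    return len(active)

def greedy_seeds(graph, budget, p=0.1, trials=1000, seed=1):
    """Pick `budget` seed users greedily by estimated expected spread."""
    rng = random.Random(seed)
    seeds = []
    for _ in range(budget):
        best, best_spread = None, -1.0
        for v in graph:
            if v in seeds:
                continue
            spread = sum(simulate_cascade(graph, seeds + [v], p, rng)
                         for _ in range(trials)) / trials
            if spread > best_spread:
                best, best_spread = v, spread
        seeds.append(best)
    return seeds

# hypothetical follower graph: user 0 is the community hub
g = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0], 4: [5], 5: [4]}
chosen = greedy_seeds(g, budget=1)
```

With one seed allowed, the greedy step picks the hub, whose expected cascade is the largest.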

Keywords: viral marketing, social network analysis, community web pages, buyer prediction, influence propagation, budget constraints

Procedia PDF Downloads 263
1543 Cyber Security Enhancement via Software Defined Pseudo-Random Private IP Address Hopping

Authors: Andre Slonopas, Zona Kostic, Warren Thompson

Abstract:

Obfuscation is one of the most useful tools for preventing network compromise. Previous research focused on obfuscating the network communications between external-facing edge devices. This work proposes the use of two edge devices, external and internal facing, which communicate via private IPv4 addresses in a software-defined pseudo-random IP hopping scheme. This methodology does not require additional IP addresses or resources to implement. Statistical analyses demonstrate that the hopping surface must contain at least 1e3 IP addresses with a broad standard deviation to minimize the possibility of coincidence between monitored and communication IPs. Breaking the hopping algorithm requires collecting at least 1e6 samples, which for large hopping surfaces would take years. The probability of dropped packets is controlled via memory buffers and the frequency of hops, and can be reduced to levels acceptable for video streaming. This methodology provides a strong layer of security well suited to information systems and supervisory control and data acquisition (SCADA) systems.
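One way both edge devices can agree on the same pseudo-random private address without exchanging it is to derive each hop from a shared key and a time epoch. The following is an illustrative sketch of that idea, not the authors' algorithm:

```python
import hashlib
import ipaddress

def hop_address(shared_key: bytes, epoch: int, pool_size: int = 4096,
                base: str = "10.0.0.0") -> str:
    """Derive the private IPv4 address for a time epoch from a shared key,
    so both edge devices compute the same hop without coordination."""
    digest = hashlib.sha256(shared_key + epoch.to_bytes(8, "big")).digest()
    offset = int.from_bytes(digest[:4], "big") % pool_size
    return str(ipaddress.IPv4Address(int(ipaddress.IPv4Address(base)) + offset))

addr_tx = hop_address(b"pre-shared secret", epoch=42)
addr_rx = hop_address(b"pre-shared secret", epoch=42)
```

Both endpoints derive the same address for the same epoch, and the pool size here (4096 addresses, a /20) already exceeds the 1e3 minimum the abstract cites.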

Keywords: moving target defense, cybersecurity, network security, hopping randomization, software defined network, network security theory

Procedia PDF Downloads 187
1542 Constructing the Joint Mean-Variance Regions for Univariate and Bivariate Normal Distributions: Approach Based on the Measure of Cumulative Distribution Functions

Authors: Valerii Dashuk

Abstract:

The use of confidence intervals in economics and econometrics is widespread. To investigate a random variable more thoroughly, joint tests are applied; one such example is the joint mean-variance test. A new approach for testing such hypotheses and constructing confidence sets is introduced. Exploring both the value of the random variable and its deviation with this technique allows checking the shift and the probability of that shift (i.e., portfolio risks) simultaneously. Another application concerns the normal distribution, which is fully defined by its mean and variance and can therefore be tested using the introduced approach. The method is based on the difference of probability density functions. The starting point is two sets of normal distribution parameters to be compared (whether they may be considered identical at a given significance level). The absolute difference in probabilities at each 'point' of the domain of these distributions is then calculated. This measure is transformed into a function of cumulative distribution functions and compared to critical values; the table of critical values was constructed from simulations. The approach was compared with other techniques for the univariate case. It differs qualitatively and quantitatively in ease of implementation, computation speed, and accuracy of the critical region (theoretical vs. real significance level). Stable results when working with outliers and non-normal distributions, as well as scaling possibilities, are also strong sides of the method. The main advantage of this approach is the possibility of extending it to the infinite-dimensional case, which was not possible in most previous works. At the moment, the extension to the two-dimensional case has been completed, allowing up to 5 parameters to be tested jointly. The derived technique is therefore equivalent to classic tests in standard situations but gives more efficient alternatives in nonstandard problems and with large amounts of data.
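A simplified stand-in for a CDF-based distance between two candidate normal distributions is the largest absolute gap between their CDFs over a grid; the paper's exact measure and simulated critical-value table are not reproduced here:

```python
import math

def norm_cdf(x, mu, sigma):
    """Normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def max_cdf_gap(mu1, s1, mu2, s2, lo=-10.0, hi=10.0, steps=2000):
    """Largest absolute gap between two normal CDFs on a grid."""
    gap = 0.0
    for i in range(steps + 1):
        x = lo + (hi - lo) * i / steps
        gap = max(gap, abs(norm_cdf(x, mu1, s1) - norm_cdf(x, mu2, s2)))
    return gap

identical = max_cdf_gap(0.0, 1.0, 0.0, 1.0)   # same parameters -> gap 0
shifted = max_cdf_gap(0.0, 1.0, 1.0, 1.0)     # mean shifted by one sigma
```

In a test, the observed gap would be compared against a simulated critical value; here the one-sigma shift gives a gap of about 0.38.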

Keywords: confidence set, cumulative distribution function, hypotheses testing, normal distribution, probability density function

Procedia PDF Downloads 176
1541 Organizational Innovations of the 20th Century as High Tech of the 21st: Evidence from Patent Data

Authors: Valery Yakubovich, Shuping Wu

Abstract:

Organization theorists have long claimed that organizational innovations are nontechnological, in part because they are unpatentable. The claim rests on the assumption that organizational innovations are abstract ideas embodied in persons and contexts rather than in context-free practical tools. However, over the last three decades, organizational knowledge has been increasingly embodied in digital tools which, in principle, can be patented. To provide the first empirical evidence regarding the patentability of organizational innovations, we trained two machine learning algorithms to identify a population of 205,434 patent applications for organizational technologies (OrgTech) and, among them, 141,285 applications that use organizational innovations accumulated over the 20th century. Our event history analysis of the probability of patenting an OrgTech invention shows that ideas from organizational innovations decrease the probability of patent allowance unless they describe a practical tool. We conclude that the present-day digital transformation places organizational innovations in the realm of high tech and turns the debate about organizational technologies into the challenge of designing practical organizational tools that embody big ideas about organizing. We outline an agenda for patent-based research on OrgTech as an emerging phenomenon.

Keywords: organizational innovation, organizational technology, high tech, patents, machine learning

Procedia PDF Downloads 122
1540 Monitoring Urban Green Space Cover Change Using GIS and Remote Sensing in Two Rapidly Urbanizing Cities, Debre Berhan and Debre Markos, Ethiopia

Authors: Alemaw Kefale, Aramde Fetene, Hayal Desta

Abstract:

Monitoring the amount of green space in urban areas is important for ensuring sustainable development and proper management. This study analyzed changes in urban green space coverage over the past 20 years in two rapidly urbanizing cities in Ethiopia, Debre Berhan and Debre Markos, using GIS and remote sensing. Landsat 5 and 8 data with a spatial resolution of 30 m were used to determine land use and land cover classes, including urban green spaces, barren and croplands, built-up areas, and water bodies. The classification accuracy ranged between 90% and 91.4%, with a Kappa statistic of 0.85 to 0.88. The results showed that both cities experienced significant decreases in vegetation cover in their urban cores between 2000 and 2020, with radical conversion of green spaces and croplands to built-up areas. In Debre Berhan, barren and croplands decreased by 32.96%, while built-up areas and green spaces increased by 357.9% and 37.4%, respectively, by 2020. In Debre Markos, built-up areas increased by 224.2%, while green spaces and barren and croplands decreased by 41% and 5.71%, respectively. The spatial structure of the cities and planning policies were identified as the major factors behind the large change in green cover, which has implications for other rapidly urbanizing cities in Africa and Asia. Overall, rapid urbanization threatens green spaces and agricultural areas, highlighting the need for ecology-based spatial planning in rapidly urbanizing cities.
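The reported accuracy assessment rests on the Kappa statistic, which can be computed directly from a classification confusion matrix. The matrix below is hypothetical, chosen only to illustrate the calculation:

```python
def kappa(confusion):
    """Cohen's Kappa from a square confusion matrix
    (rows = reference data, columns = classified data)."""
    n = sum(sum(row) for row in confusion)
    observed = sum(confusion[i][i] for i in range(len(confusion))) / n
    expected = sum(sum(confusion[i]) * sum(row[i] for row in confusion)
                   for i in range(len(confusion))) / (n * n)
    return (observed - expected) / (1.0 - expected)

# hypothetical 3-class matrix (e.g. green space / built-up / cropland)
cm = [[45, 3, 2],
      [4, 40, 1],
      [1, 2, 52]]
k = kappa(cm)
```

Kappa discounts the agreement expected by chance, which is why it sits below the raw classification accuracy.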

Keywords: green space coverage, GIS and remote sensing, Landsat, LULC, Ethiopia

Procedia PDF Downloads 57
1539 The Implementation of Level of Service for Development of Kuala Lumpur Transit Information System using GIS

Authors: Mokhtar Azizi

Abstract:

Due to heavy traffic and congested roads, it is crucial that the most popular public transport services in Kuala Lumpur, i.e., Putra LRT, Star LRT, KTM Commuter, KL Monorail, and Rapid Bus, be continuously monitored, improved to fulfill riders' requirements, and kept updated by the transit agencies. The current status of the services has been evaluated by calculating the transit supportive area (TSA) and level of service (LOS) for each transit station. This research carried out the TSA and LOS mapping using GIS techniques. Detailed census data for the region along the service lines were collected from the Department of Statistics Malaysia for this purpose. Service coverage was defined by a 400-meter buffer zone around bus stations and an 800-meter zone around rail stations and railways when measuring quality of service along the lines. All required values were calculated using customized GIS software called the Kuala Lumpur Transit Information System (KLTIS). The transit supportive area was delineated using an employment density of at least 10 jobs/hectare or a household density of at least 7.5 units/hectare; the total area covered by the transit supportive area is 22,516 hectares, and the total area not supported by transit is 1,718 hectares in Kuala Lumpur. The level of service is calculated as the percentage of the transit supportive area served by transit for each station. Overall, the percentage of transit supportive area served by transit was less than 50% for all stations, which falls into the very low level of service category. This research has proven its benefit by providing current transit service operators with vital information for improving existing public transport services.
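The TSA and LOS calculations can be sketched as follows, using the density thresholds quoted above; the zone data are hypothetical stand-ins for the census polygons processed in KLTIS:

```python
def is_transit_supportive(job_density, household_density):
    """A zone counts as transit supportive if it meets either density
    threshold used in the study."""
    return job_density >= 10 or household_density >= 7.5

def level_of_service(zones, served):
    """Share of transit-supportive zones inside a station's service buffer;
    below 50% is treated as a very low level of service in the study."""
    supportive = [z for z in zones if is_transit_supportive(*z)]
    if not supportive:
        return 0.0
    return sum(1 for z in supportive if z in served) / len(supportive)

# hypothetical (jobs/ha, households/ha) zones; `served` = inside the buffer
zones = [(12, 0), (0, 8), (5, 2), (11, 9)]
served = {(12, 0), (11, 9)}
los = level_of_service(zones, served)
```

In the GIS workflow the "zones" are census polygons and "served" is their intersection with the 400 m or 800 m buffers.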

Keywords: service coverage, transit supportive area, level of service, transit system

Procedia PDF Downloads 377
1538 Determinants of Income Diversification among Support Zone Communities of National Parks in Nigeria

Authors: Daniel Etim Jacob, Samuel Onadeko, Edem A. Eniang, Imaobong Ufot Nelson

Abstract:

This paper examined the determinants of income diversification among households in the support zone communities of national parks in Nigeria. Household data were collected through questionnaires administered randomly to 1,009 household heads in the study area. The data obtained were analyzed using probability and non-probability statistical analyses, such as regression and analysis of variance, to test for mean differences between parks. The results indicate that the majority of the household heads were male (92.57%), between 21 and 40 years of age (44.90%), had non-formal education (38.16%), were farmers (65.21%), owned land (95.44%), had a household size of 1 to 5 (36.67%), and had an annual income in the range of ₦401,000 - ₦600,000 (24.58%). The mean Simpson index of diversity showed a generally low (0.375) level of income diversification among the households. Income, age, off-farm dependence, education, household size, and occupation were significant (p<0.01) factors affecting households' income diversification. The study recommends improving the existing infrastructure and social capital in the communities as avenues to improve livelihoods and ensure positive conservation behavior in the study area.
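The Simpson index of diversity used above can be computed directly from a household's income shares; the income figures below are hypothetical:

```python
def simpson_diversity(incomes):
    """Simpson index of income diversification: 1 minus the sum of squared
    income shares. 0 = one source only; values near 1 = highly diversified."""
    total = sum(incomes)
    if total == 0:
        return 0.0
    return 1.0 - sum((x / total) ** 2 for x in incomes)

single_source = simpson_diversity([500_000, 0, 0])       # farming only
mixed = simpson_diversity([300_000, 200_000, 100_000])   # three sources
```

A mean value of 0.375 across households, as reported, corresponds to most income being concentrated in one or two sources.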

Keywords: income diversification, protected area, livelihood, poverty, Nigeria

Procedia PDF Downloads 143
1537 Off-Line Text-Independent Arabic Writer Identification Using Optimum Codebooks

Authors: Ahmed Abdullah Ahmed

Abstract:

The task of recognizing the writer of a handwritten text has been an attractive research problem in the document analysis and recognition community, with applications in handwriting forensics, paleography, document examination, and handwriting recognition. This research presents an automatic method for writer recognition from digitized images of unconstrained writing. Although previous studies have made great efforts to devise various methods, their performance, especially in terms of accuracy, has fallen short, and room for improvement is still wide open. The proposed technique employs optimal codebook-based writer characterization, where each writing sample is represented by a set of features computed from two codebooks, beginning and ending. Unlike most classical codebook-based approaches, which segment the writing into graphemes, this study fragments particular areas of writing: the beginning and ending strokes. The proposed method starts with contour detection to extract significant information from the handwriting; curve fragmentation is then employed to divide the beginning and ending zones of the handwriting into small fragments. Similar fragments of beginning strokes are grouped together to create the beginning cluster, and similarly, the ending strokes are grouped to create the ending cluster. These two clusters lead to the development of two codebooks (beginning and ending) by choosing the center of every group of similar fragments. The writings under study are then represented by computing the probability of occurrence of the codebook patterns, and this probability distribution is used to characterize each writer. Two writings are compared by computing the distance between their respective probability distributions. Evaluations were carried out on the standard ICFHR dataset of 206 writers using the beginning and ending codebooks separately. The ending codebook achieved the highest identification rate, 98.23%, which is the best result so far on the ICFHR dataset.
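Comparing two writers then reduces to a distance between their codebook-pattern distributions. The sketch below normalizes fragment counts and uses a chi-square distance, a common choice for such comparisons (the paper does not specify its exact distance); the counts are hypothetical:

```python
def pattern_distribution(counts):
    """Normalise fragment-to-codeword counts into a probability
    distribution over codebook patterns."""
    total = sum(counts)
    return [c / total for c in counts]

def chi2_distance(p, q, eps=1e-12):
    """Chi-square distance between two pattern distributions."""
    return 0.5 * sum((a - b) ** 2 / (a + b + eps) for a, b in zip(p, q))

# hypothetical codeword occurrence counts for three writing samples
writer_a = pattern_distribution([12, 3, 9, 6])
writer_a2 = pattern_distribution([11, 4, 9, 6])   # same writer, second sample
writer_b = pattern_distribution([2, 15, 3, 10])   # different writer
d_same = chi2_distance(writer_a, writer_a2)
d_diff = chi2_distance(writer_a, writer_b)
```

Identification assigns a query sample to the enrolled writer with the smallest distance.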

Keywords: off-line text-independent writer identification, feature extraction, codebook, fragments

Procedia PDF Downloads 513
1536 Universal Health Coverage 2019 in Indonesia: The Integration of Family Planning Services in Current Functioning Health System

Authors: Fathonah Siti, Ardiana Irma

Abstract:

Indonesia is currently on track to achieve Universal Health Coverage (UHC) by 2019. The program aims to address disintegration in the implementation and coverage of various health insurance schemes and fragmented fund pooling. Family planning services are covered as one of the benefit packages under preventive care. However, little has been done to examine how family planning programs are managed across levels of government and how family planning services are delivered to the end user. The study was performed through focus group discussions with relevant policy makers and selected programmers at the central and district levels, and it also benefited from relevant studies on family planning in the UHC scheme and other supporting data. The study carefully investigates the programmatic implications of integrating family planning into the UHC program, encompassing the need to recalculate contraceptive logistics for beneficiaries (eligible couples); policy reformulation for contraceptive service provision, including supply chain management; establishment of a family planning standard of procedure; and a call to update the Management Information System. The study confirms a significant increase in the number of contraceptive commodities that need to be procured by the government. Assuming that the contraceptive prevalence rate and commodity costs increase at the expected 0.5% annually, the government needs to allocate almost IDR 5 billion by 2019, excluding fees for service. The government is shifting its focus to maintaining eligible health facilities within the National Population and Family Planning Board networks. By 2019, the government has set strategies to anticipate the provision of family planning services to 45,340 health facilities distributed across 514 districts and 7 thousand subdistricts. A clear division of authority has been established among the levels of government. Three models of contraceptive supply planning have been developed and are currently being institutionalized. Pre-service training for family planning services has been piloted in 10 prominent universities, and the position of private midwives has been recognized as part of the system. To ensure quality implementation and control of health expenditure, a family planning standard has been established as a reference for determining the set of services that must be delivered properly to clients and the types of health facilities that may conduct particular family planning services. Recognition of individual program participation status has been included in the Family Enumeration since 2015; the data are precisely recorded by name and address for each family and its members. This supplies valuable information to 15,131 Family Planning Field Workers (FPFWs), who provide information and education related to family planning in an attempt to generate demand and maintain the participation of family planning acceptors who are program beneficiaries. Despite the overwhelming efforts described above, some obstacles remain: the program suffers from poor socialization and has yet to remove geographical barriers for those living in remote areas, for whom family planning services are conducted outside the scheme as a complementary strategy. Nevertheless, the UHC program has brought remarkable improvement in access to and quality of family planning services.

Keywords: beneficiary, family planning services, national population and family planning board, universal health coverage

Procedia PDF Downloads 190
1535 Real-World Comparison of Adherence to and Persistence with Dulaglutide and Liraglutide in UAE e-Claims Database

Authors: Ibrahim Turfanda, Soniya Rai, Karan Vadher

Abstract:

Objectives: The study aims to compare real-world adherence to and persistence with dulaglutide and liraglutide in patients with type 2 diabetes (T2D) initiating treatment in the UAE. Methods: This was a retrospective, non-interventional study (observation period: 01 March 2017–31 August 2019) using the UAE Dubai e-Claims database. Included were adult patients initiating dulaglutide/liraglutide between 01 September 2017 and 31 August 2018 (index period) with: ≥1 claim for T2D in the 6 months before the index date (ID); ≥1 claim for dulaglutide/liraglutide during the index period; and continuous medical enrolment for ≥6 months before and ≥12 months after the ID. Key endpoints, assessed 3/6/12 months after the ID, were adherence to treatment (proportion of days covered [PDC; PDC ≥80% considered 'adherent'], per-group mean±standard deviation [SD] PDC) and persistence (number of continuous therapy days from the ID until discontinuation [i.e., a gap of >45 days] or the end of the observation period). Patients initiating dulaglutide/liraglutide were propensity score matched (1:1) on baseline characteristics. The between-group comparison of adherence was analysed using the McNemar test (α=0.025). Persistence was analysed using Kaplan–Meier estimates with log-rank tests (α=0.025) for between-group comparisons. This study presents 12-month outcomes. Results: Following propensity score matching, 263 patients were included in each group. Mean±SD PDC at 12 months was significantly higher in the dulaglutide group than in the liraglutide group (dulaglutide=0.48±0.30, liraglutide=0.39±0.28, p=0.0002). The proportion of adherent patients favored dulaglutide (dulaglutide=20.2%, liraglutide=12.9%, p=0.0302), as did the probability of being adherent to treatment (odds ratio [97.5% CI]: 1.70 [0.99, 2.91]; p=0.03). The proportion of persistent patients also favored dulaglutide (dulaglutide=15.2%, liraglutide=9.1%, p=0.0528), as did the probability of discontinuing treatment 12 months after the ID (p=0.027). Conclusions: Based on the UAE Dubai e-Claims database, dulaglutide initiators exhibited significantly greater adherence in terms of mean PDC than liraglutide initiators. The proportion of adherent patients and the probability of being adherent favored the dulaglutide group, as did treatment persistence.
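The PDC endpoint can be computed by counting the calendar days on which dispensed supply was on hand, as a fraction of the observation window. A minimal sketch with hypothetical fill dates (the study's 45-day-gap persistence logic is not included):

```python
from datetime import date, timedelta

def pdc(fills, start, end):
    """Proportion of days covered: the fraction of days in [start, end]
    with medication on hand, counting each calendar day at most once."""
    covered = set()
    for fill_date, days_supply in fills:
        for i in range(days_supply):
            day = fill_date + timedelta(days=i)
            if start <= day <= end:
                covered.add(day)
    return len(covered) / ((end - start).days + 1)

# hypothetical fills: two 30-day supplies with a gap in between
fills = [(date(2018, 1, 1), 30), (date(2018, 2, 15), 30)]
coverage = pdc(fills, date(2018, 1, 1), date(2018, 3, 31))
adherent = coverage >= 0.80   # the study's adherence threshold
```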

Keywords: adherence, dulaglutide, effectiveness, liraglutide, persistence

Procedia PDF Downloads 126
1534 Modeling of Anode Catalyst against CO in Fuel Cell Using Material Informatics

Authors: M. Khorshed Alam, H. Takaba

Abstract:

The catalytic properties of a metal typically change when it is alloyed with another metal in polymer electrolyte fuel cells. Pt-Ru is one of the most widely discussed alloys for enhancing CO oxidation. In this work, we have investigated the CO coverage on Pt2Ru3 nanoparticles with different atomic configurations of Pt and Ru, using a combination of materials informatics and computational chemistry. Density functional theory (DFT) calculations were used to describe the adsorption strength of CO and H for different Pt:Ru configurations on the Pt2Ru3 slab surface. Monte Carlo (MC) simulations were then used to examine the segregation behaviour of Pt as a function of the surface atom ratio, the subsurface atom ratio, and the particle size of the Pt2Ru3 nanoparticle. We constructed a regression equation that reproduces the DFT results from structural descriptors alone. The descriptors selected for the regression equation are: xa-b, the number of bonds between a targeted atom a and a neighboring atom b in the same layer (a, b = Pt or Ru); xa-H2 and xa-CO, the numbers of atoms a binding H2 and CO molecules, respectively; xa-S, the number of atoms a on the surface; and xa-b-, the number of bonds between atom a and a neighboring atom b located outside the layer. Surface segregation in alloy nanoparticles is influenced by the component elements, composition, crystal lattice, shape, size, and the nature, pressure, and temperature of the adsorbates. Simulations were performed on nanoparticles of different sizes (2.0 nm, 3.0 nm), with Pt and Ru atoms mixed in different configurations, at a temperature of 333 K. In addition to the Pt2Ru3 alloy, we also considered pure Pt and Ru nanoparticles to compare the surface coverage by the adsorbates (H2, CO). Both the pure and the Pt-Ru alloy nanoparticles were assumed to have an fcc crystal structure and a cubo-octahedral shape bounded by (111) and (100) facets.
Simulations were performed for up to 50 million MC steps. The MC results show that, in the presence of the gases (H2, CO), the surfaces are occupied by the gas molecules, and in the equilibrium structure the coverage of H and CO depends on the nature of the surface atoms. In the initial structures, the Pt/Ru ratios on the surface for the different cluster sizes were in the range 0.50-0.95. MC simulations were run with partial pressures of H2 (PH2) and CO (PCO) of 70 kPa and 100-500 ppm, respectively. The surface Pt/Ru ratio decreases as the CO concentration increases, with only minor exceptions for the small nanoparticle. CO adsorbs more strongly on Ru sites than on Pt sites, which is one likely reason for the decrease in the surface Pt/Ru ratio. Our study therefore shows that nanoparticle size, composition, the configuration of the alloying atoms, and the concentration and chemical potential of the adsorbates all affect the stability of alloy nanoparticles and, ultimately, their overall catalytic performance during operation.
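The segregation step described above can be sketched with a toy Metropolis Monte Carlo on a binary Pt/Ru lattice; the bond energies, 1D lattice, and step count below are illustrative assumptions, not the paper's values:

```python
import math, random

# Illustrative Metropolis Monte Carlo sketch of Pt/Ru surface segregation.
# The pair energies and ring lattice are toy assumptions for demonstration.
random.seed(0)

kB_T = 8.617e-5 * 333          # kT in eV at 333 K
E_BOND = {("Pt", "Pt"): -0.10,  # assumed nearest-neighbour energies (eV)
          ("Ru", "Ru"): -0.12,
          ("Pt", "Ru"): -0.11, ("Ru", "Pt"): -0.11}

def energy(ring):
    """Total nearest-neighbour bond energy on a 1D ring 'surface'."""
    return sum(E_BOND[(ring[i], ring[(i + 1) % len(ring)])]
               for i in range(len(ring)))

ring = ["Pt"] * 20 + ["Ru"] * 30   # 40% Pt / 60% Ru, as in Pt2Ru3
random.shuffle(ring)

for _ in range(50_000):                      # MC run with swap moves
    i, j = random.randrange(len(ring)), random.randrange(len(ring))
    e0 = energy(ring)
    ring[i], ring[j] = ring[j], ring[i]
    dE = energy(ring) - e0
    if dE > 0 and random.random() >= math.exp(-dE / kB_T):
        ring[i], ring[j] = ring[j], ring[i]  # reject: swap back

print(round(ring.count("Pt") / len(ring), 2))  # swaps conserve composition
```

A full simulation would use a 3D cubo-octahedral cluster and adsorbate-dependent site energies; the accept/reject logic is the same.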

Keywords: anode catalysts, fuel cells, material informatics, Monte Carlo

Procedia PDF Downloads 193
1533 Mobile Number Portability

Authors: R. Geetha, J. Arunkumar, P. Gopal, D. Loganathan, K. Pavithra, C. Vikashini

Abstract:

Mobile number portability is an attempt to let a mobile user switch from one network to another through an application; such a facility is not currently available on mobile handsets. The application is intended to help mobile networks and their service customers understand the criteria involved, serving as a universal set of requirements that customers must meet. It supports the user's network portability by obtaining permission from the network provider to enable services and by utilising the available network signals, enabling the user to make a temporary switch to another network. The main aim of this work is to make use of multiple networks when the user's own network has no coverage, which can occur in rural and remote geographical areas; the application achieves this by switching temporarily between the available networks. Both the service provider and the network user benefit: the provider charges a minimum fee for utilising another network, and a unique password provides security against unauthorized users and prevents loss of balance. The intended goal is full utilisation of the available networks in critical situations, with features that satisfy customer needs. The temporary switchover manages emergency calls when the user is in a rural or remote geographical area with very low network coverage. Since Android handsets are widely used, the application is designed as an Android app that can be freely downloaded and installed from the Play store. In the current scenario, the application enables users to change networks without changing their mobile number, offering a solution when they are stuck in a critical situation.
The application is built with Android 4.2 and SQLite version 3.

Keywords: mobile number, random number, alarm, imei number, call

Procedia PDF Downloads 363
1532 Modal Approach for Decoupling Damage Cost Dependencies in Building Stories

Authors: Haj Najafi Leila, Tehranizadeh Mohsen

Abstract:

Dependencies among the diverse factors involved in probabilistic seismic loss evaluation are recognized as an imperative issue in acquiring accurate loss estimates. Dependencies among component damage costs can be handled by treating component damage states as either fully independent or perfectly dependent; however, to the best of our knowledge, no procedure is available for accounting for loss dependencies at the story level. This paper presents a method, called the "modal cost superposition method", for decoupling story damage costs under earthquake ground motions. It is based on closed-form differential equations relating damage cost to engineering demand parameters, which are solved as a coupled system over all stories' cost equations by means of the introduced "substituted mass and stiffness matrices". Costs are treated as probabilistic variables with definite statistical parameters (median and standard deviation) and a presumed probability distribution. To illustrate the proposed procedure and demonstrate its straightforward application, a benchmark study was conducted. For the building as a whole, the damage costs estimated by the new modal approach agree acceptably with those from the frequently used stochastic approach. At the story level, however, a single modification factor proves insufficient for incorporating occurrence-probability dependencies between stories, because the degree of dependency between the damage costs of different stories varies. The results also indicate a greater contribution of dependency to the occurrence probability of loss in the higher stories, where the loss results are more consistent than in the lower ones, while truncating the number of retained cost modes still provides an acceptable level of accuracy and avoids the time-consuming calculations involved in keeping many cost modes.
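The underlying idea of decoupling a coupled system through a modal transformation can be sketched for an ordinary 3-story shear building; the mass and stiffness values below are illustrative assumptions, and the paper's "substituted" cost matrices would take the place of the structural ones:

```python
import numpy as np

# Minimal sketch of modal decoupling for a 3-story shear building.
# Mass and stiffness values are assumed, purely for illustration.
m = 1.0e3                                   # story mass (kg), assumed
k = 2.0e6                                   # story stiffness (N/m), assumed
M = np.diag([m, m, m])
K = k * np.array([[ 2, -1,  0],
                  [-1,  2, -1],
                  [ 0, -1,  1]], dtype=float)

# Generalized eigenproblem K.phi = w^2 M.phi, symmetrized via M^(-1/2)
Mi = np.diag(1.0 / np.sqrt(np.diag(M)))
w2, psi = np.linalg.eigh(Mi @ K @ Mi)
Phi = Mi @ psi                              # mass-normalized mode shapes

# In modal coordinates the system decouples: Phi^T K Phi = diag(w^2)
modal_K = Phi.T @ K @ Phi
print(np.allclose(modal_K, np.diag(w2)))    # True: equations are decoupled
```

Once the coupled equations are diagonalized this way, each "mode" can be solved independently and the story responses recovered by superposition, which is the mechanism the modal cost superposition method exploits.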

Keywords: dependency, story-cost, cost modes, engineering demand parameter

Procedia PDF Downloads 181
1531 A Theoretical Approach on Electoral Competition, Lobby Formation and Equilibrium Policy Platforms

Authors: Deepti Kohli, Meeta Keswani Mehra

Abstract:

The paper develops a theoretical model of electoral competition with purely opportunistic candidates and a uni-dimensional policy, using the probabilistic voting approach, and focuses on lobby formation in order to analyze the inherent complex interactions between centripetal and centrifugal forces and their effects on equilibrium policy platforms. There are three types of agents, namely Left-wing, Moderate, and Right-wing, who together comprise the voting population. It is assumed that the Left and Right agents are free to initiate a lobby of their choice. If initiated, these lobbies generate donations which in turn can be contributed to one (or both) electoral candidates in order to influence them to implement the lobby's preferred policy. Four lobby-formation scenarios are considered: no lobby, only Left, only Right, and both Left and Right. The equilibrium policy platforms, the individual donations by agents to their respective lobbies, and the contributions offered to the electoral candidates are solved for under each of these four cases. Since agents cannot coordinate each other's actions during the lobby-formation stage, a lobby forms only with some probability, which is also solved for in the model. The results indicate that the policy platforms of the two electoral candidates converge completely in the no-lobby and both-lobby cases but diverge when only one (Left or Right) lobby forms. This is because in the no-lobby case only the centripetal forces (emerging from the election-winning motive) are present, while when both extreme (Left-wing and Right-wing) lobbies form, the centrifugal forces (emerging from the lobby-formation motive) also arise but cancel each other out, again resulting in pure policy convergence.
In contrast, in case of only one lobby being formed, both centripetal and centrifugal forces interact strategically, leading the two electoral candidates to choose completely different policy platforms in equilibrium. Additionally, it is found that in equilibrium, while the donation by a specific agent type increases with the formation of both lobbies in comparison to when only one lobby is formed, the probability of implementation of the policy being advocated by that lobby group falls.

Keywords: electoral competition, equilibrium policy platforms, lobby formation, opportunistic candidates

Procedia PDF Downloads 334
1530 Damping Optimal Design of Sandwich Beams Partially Covered with Damping Patches

Authors: Guerich Mohamed, Assaf Samir

Abstract:

The application of viscoelastic materials in the form of constrained layers in mechanical structures is an efficient and cost-effective technique for solving noise and vibration problems. This technique requires a design tool to select the best location, type, and thickness of the damping treatment. This paper presents a finite element model for the vibration of beams partially or fully covered with a constrained viscoelastic damping material. The model is based on Bernoulli-Euler theory for the faces and Timoshenko beam theory for the core. It uses four variables: the through-thickness constant deflection, the axial displacements of the two faces, and the bending rotation of the beam. The sandwich beam finite element is compatible with the conventional C1 finite element for homogeneous beams. To validate the proposed model, several free vibration analyses of fully or partially covered beams, with different locations of the damping patches and different percentages of coverage, are studied. The results show that the proposed approach can be used as an effective tool to study the influence of the location and size of the treatment on the natural frequencies and the associated modal loss factors. A parametric study of the damping characteristics of partially covered beams has then been conducted, considering the effects of the core shear modulus, the patch size, the thicknesses of the constraining layer and the core, and the locations of the patches. In partial coverage, the spatial distribution of the added viscoelastic damping is as important as the thickness and material properties of the viscoelastic and constraining layers. Indeed, to limit the added mass and attain maximum damping, the damping patches should be placed at optimum locations. These locations are often selected using the modal strain energy indicator.
Following this approach, the damping patches are applied over regions of the base structure with the highest modal strain energy to target specific modes of vibration. In the present study, a more efficient indicator is proposed, which consists of placing the damping patches over regions of high energy dissipation through the viscoelastic layer of the fully covered sandwich beam. The presented approach is used in an optimization method to select the best location for the damping patches as well as the material thicknesses and material properties of the layers that will yield optimal damping with the minimum area of coverage.
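The modal strain energy indicator mentioned above can be sketched for a simple pinned-pinned beam: rank candidate patch locations by how much strain energy each element stores in the targeted mode. The beam data and mode shape below are illustrative assumptions, not the paper's sandwich model:

```python
import numpy as np

# Sketch of the modal strain energy indicator: rank candidate patch
# locations by the strain energy each element stores in a given mode.
# Beam length, stiffness, and mode shape are illustrative assumptions.
n, L, EI = 20, 1.0, 1.0                      # elements, length, stiffness
x = np.linspace(0.0, L, n + 1)
phi = np.sin(np.pi * x / L)                  # first mode of a pinned beam

# Element strain energy ~ (EI/2) * integral(curvature^2), via finite
# differences of the mode shape at the interior nodes
h = L / n
curv = (phi[:-2] - 2 * phi[1:-1] + phi[2:]) / h**2
U = 0.5 * EI * curv**2 * h                   # energy per interior element
ranking = np.argsort(U)[::-1]                # best patch locations first

print(int(ranking[0]))                       # mid-span element stores most
```

For the first mode of a pinned beam the indicator correctly points to mid-span; the paper's refined indicator replaces elastic strain energy with the energy dissipated through the viscoelastic layer of the fully covered beam.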

Keywords: finite element model, damping treatment, viscoelastic materials, sandwich beam

Procedia PDF Downloads 148
1529 Reliability Analysis of Glass Epoxy Composite Plate under Low Velocity Impact

Authors: Shivdayal Patel, Suhail Ahmad

Abstract:

Safety assurance and failure prediction of the composite components of an offshore structure under low-velocity impact are essential for the associated risk assessment. It is important to incorporate the uncertainties associated with the material properties and the impact load. The likelihood of this hazard causing a chain of failure events plays an important role in risk assessment. The material properties of composites mostly exhibit scatter due to their inhomogeneity and anisotropy, the brittleness of the matrix and fibers, and manufacturing defects. In fact, the probability of occurrence of such a scenario arises from the large uncertainties in the system. A probabilistic finite element analysis of composite plates under low-velocity impact is carried out, considering uncertainties in the material properties and the initial impact velocity. Impact-induced damage of a composite plate is a probabilistic phenomenon owing to the wide range of uncertainties in material and loading behavior. A typical failure crack initiates and propagates into the interface, causing delamination between dissimilar plies. Since individual cracks in a ply are difficult to track, a progressive damage model is implemented in the FE code through a user-defined material subroutine (VUMAT). The limit state function g(x) is established accordingly, such that the lamina is safe when g(x) > 0. The Gaussian process response surface method is adopted to determine the probability of failure. A comparative study is also carried out for different combinations of impactor masses and velocities. A sensitivity-based probabilistic design optimization procedure is investigated to achieve better strength and lighter weight for composite structures. A chain of failure events covering the different failure modes is considered to estimate the consequences of a failure scenario.
Frequencies of occurrence of specific impact hazards yield the expected risk due to economic loss.
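The probability-of-failure computation for a limit state g(x) > 0 can be sketched by plain Monte Carlo sampling (the paper uses a Gaussian process response surface to cut the sampling cost; the distributions below are illustrative assumptions, not calibrated values):

```python
import numpy as np

# Illustrative Monte Carlo estimate of failure probability for a limit
# state g(x) = strength - stress. Distributions are assumptions, not the
# paper's calibrated composite-ply values.
rng = np.random.default_rng(42)
n = 200_000

strength = rng.normal(600.0, 60.0, n)   # ply strength (MPa), assumed
stress = rng.normal(420.0, 50.0, n)     # impact-induced stress (MPa), assumed

g = strength - stress                   # failure occurs when g(x) <= 0
pf = np.mean(g <= 0.0)
print(f"P_f = {pf:.4f}")
```

With these assumed normals, g is itself normal (mean 180, standard deviation about 78 MPa), so the Monte Carlo estimate can be checked against the closed-form tail probability, roughly 1%.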

Keywords: composites, damage propagation, low velocity impact, probability of failure, uncertainty modeling

Procedia PDF Downloads 279
1528 The Use of STIMULAN Resorbable Antibiotic Beads in Conjunction with Autologous Tissue Transfer to Treat Recalcitrant Infections and Osteomyelitis in Diabetic Foot Wounds

Authors: Hayden R Schott, John M Felder III

Abstract:

Introduction: Chronic lower extremity wounds in the diabetic and vasculopathic populations are associated with a high degree of morbidity. When wounds require more extensive treatment than can be offered by wound care centers, more aggressive solutions involve local tissue transfer and microsurgical free tissue transfer for achieving definitive soft tissue coverage. These procedures of autologous tissue transfer (ATT) offer resilient soft tissue coverage of limb-threatening wounds and confer promising limb salvage rates. However, chronic osteomyelitis and recalcitrant soft tissue infections are common in severe diabetic foot wounds and serve to significantly complicate ATT procedures. Stimulan is a resorbable calcium sulfate antibiotic carrier. The use of Stimulan antibiotic beads to treat chronic osteomyelitis is well established in the orthopedic and plastic surgery literature. In these procedures, the beads are placed beneath the skin flap to deliver antibiotics directly to the infection site. The purpose of this study was to quantify the success of Stimulan antibiotic beads in treating recalcitrant infections in patients with diabetic foot wounds receiving ATT. Methods: A retrospective review of clinical and demographic information was performed on patients who underwent ATT with placement of Stimulan antibiotic beads for attempted limb salvage from 2018 to 2021. Patients were analyzed for preoperative wound characteristics, demographics, infection recurrence, and adverse outcomes resulting from product use. The primary endpoint was 90-day infection recurrence, with secondary endpoints including 90-day complications. Outcomes were compared using basic statistics and Fisher's exact tests. Results: In this time span, 14 patients were identified. At the time of surgery, all patients exhibited clinical signs of active infection, including positive cultures and erythema.
57% of patients (n=8) had chronic osteomyelitis prior to surgery, and 71% (n=10) had exposed bone at the wound base. Stimulan beads were placed beneath a free tissue flap in 57% of patients (n=8) and beneath a pedicled tissue flap in 42% (n=6). In all patients, Stimulan beads were applied only once. Recurrent infections were observed in 28% of patients (n=4) at 90 days post-op, and flap nonadherence was observed in 7% (n=1); these were the only Stimulan-related complications observed. Ultimately, lower limb salvage was successful in 85% of patients (n=12). Notably, there was no significant association between the preoperative presence of osteomyelitis and recurrent infections. Conclusions: The use of Stimulan antibiotic beads to treat recalcitrant infections in patients receiving definitive skin coverage of diabetic foot wounds does not appear to carry undue risk. Furthermore, the lack of a significant association between the preoperative presence of osteomyelitis and recurrent infections suggests that Stimulan successfully suppresses infection in patients with osteomyelitis, consistent with the literature. Further research, through future cohort and case-control studies with more patients, is needed to confirm Stimulan as a significant contributor to infection treatment. Nonetheless, the use of Stimulan antibiotic beads in patients with diabetic foot wounds demonstrates successful infection suppression and maintenance of definitive soft tissue coverage.

Keywords: wound care, stimulan antibiotic beads, free tissue transfer, plastic surgery, wound, infection

Procedia PDF Downloads 91
1527 Forecasting 24-Hour Ahead Electricity Load Using Time Series Models

Authors: Ramin Vafadary, Maryam Khanbaghi

Abstract:

Forecasting electricity load is important for purposes such as planning, operation, and control. Accurate forecasts can save operating and maintenance costs, increase the reliability of power supply and delivery systems, and support correct decisions for future development. This paper compares several time series methods for forecasting electricity load 24 hours ahead. The methods considered are Holt-Winters smoothing, SARIMA modeling, an LSTM network, Fbprophet, and TensorFlow Probability. The performance of each method is evaluated using the forecasting accuracy criteria of mean absolute error and root mean square error. The National Renewable Energy Laboratory (NREL) residential energy consumption data are used to train the models. The results of this study show that the SARIMA model is superior to the others for 24-hour-ahead forecasts. Furthermore, a bagging technique is used to make the predictions more robust. The results obtained show that by bagging multiple time series forecasts, we can improve the robustness of the models for 24-hour-ahead electricity load forecasting.
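The bagging step can be sketched as a simple average of several forecasts scored with MAE and RMSE; the "models" below are noisy stand-ins for the fitted SARIMA/LSTM/Holt-Winters forecasters, and the toy load profile is an assumption:

```python
import numpy as np

# Sketch of bagging time-series forecasts: average several forecasts of
# the same 24-hour horizon, then score with MAE/RMSE. The load profile
# and the five "model" forecasts are synthetic stand-ins.
rng = np.random.default_rng(7)
hours = np.arange(24)
actual = 50 + 20 * np.sin(2 * np.pi * hours / 24)   # toy daily load (kW)

forecasts = [actual + rng.normal(0.0, 5.0, 24) for _ in range(5)]
bagged = np.mean(forecasts, axis=0)                 # bagging = averaging

def mae(y, yhat):
    return float(np.mean(np.abs(y - yhat)))

def rmse(y, yhat):
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

print(f"single MAE {mae(actual, forecasts[0]):.2f}  "
      f"bagged MAE {mae(actual, bagged):.2f}")
```

Averaging five forecasts with independent errors shrinks the error standard deviation by roughly the square root of five, which is the robustness gain the abstract reports.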

Keywords: bagging, Fbprophet, Holt-Winters, LSTM, load forecast, SARIMA, TensorFlow probability, time series

Procedia PDF Downloads 96
1526 Social Networks Global Impact on Protest Movements and Human Rights Activism

Authors: Marcya Burden, Savonna Greer

Abstract:

In the wake of social unrest around the world, protest movements have been captured like never before. As protest movements have evolved, so too have their visibility and sources of coverage. Long gone are the days when print media was our only glimpse into the action surrounding a protest. Now, with social networks such as Facebook, Instagram, and Snapchat, we have access to real-time video footage of protest movements and human rights activism that can reach millions of people within seconds. This research paper investigated statistical usage data from various social media platforms in the areas of human rights activism and protest movements, drawing parallels with past forms of media coverage. The research demonstrates that social networks are extremely important to protest movements and human rights activism. With over 2.9 billion users across social media networks globally, these platforms are at the heart of most recent protests and human rights activism. The research traces the paradigm shift from the Selma March of 1965 to the more recent protests of Ferguson in 2014, Ni Una Menos in 2015, and End SARS in 2018. The findings demonstrate that today almost anyone may use their social networks to become a protest movement leader or human rights activist. From a student to an 80-year-old professor, the possibility of reaching billions of people all over the world is limitless. Findings show that social networks reach 82% of the world's internet population, which spends 1 in every 5 online minutes on them, and that over 65% of Americans believe social media highlights important issues. Thus, there is no need to have a formalized group of people or even be known online: a person simply needs to be engaged on their social media networks (Facebook, Twitter, Instagram, Snapchat) regarding any cause they are passionate about. Information can be exchanged in real time around the world, and a successful protest can begin.

Keywords: activism, protests, human rights, networks

Procedia PDF Downloads 96
1525 Exploring the Activity Fabric of an Intelligent Environment with Hierarchical Hidden Markov Theory

Authors: Chiung-Hui Chen

Abstract:

The Internet of Things (IoT) was designed for widespread convenience. With smart tags and sensing networks, a large quantity of dynamic information is immediately available in the IoT. Through internal communication and interaction, meaningful objects provide real-time services for users. Therefore, services with appropriate decision-making have become an essential issue. Based on the science of human behavior, this study employed an environment model to record the time sequences and locations of different behaviors, and adopted the probability module of the hierarchical Hidden Markov Model for inference. Statistical analysis was conducted to achieve the following objectives. First, define user behaviors and predict user behavior routes with the environment model to analyze user purposes. Second, construct the hierarchical Hidden Markov Model according to the logic framework, and establish the sequential intensity among behaviors to characterize the use and activity fabric of the intelligent environment. Third, establish the intensity of the relation between objects and the probability of their being used; this indicator can describe the possible limitations of the mechanism. As the process is recorded in the information system created in this study, these data can be reused to adjust the procedure of intelligent design services.
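Inference in such a model rests on the HMM forward recursion, sketched below for a flat two-state model; a hierarchical HMM nests chains like this one inside higher-level states. The activities, sensor observations, and probabilities are illustrative assumptions:

```python
import numpy as np

# Minimal forward-algorithm sketch for inferring a user activity from
# sensed observations with an HMM. States, observations, and all
# probabilities are toy assumptions.
states = ["resting", "cooking"]          # hidden activities
A = np.array([[0.8, 0.2],                # transition probabilities
              [0.3, 0.7]])
B = np.array([[0.9, 0.1],                # P(sensor | activity), columns:
              [0.2, 0.8]])               # [no_motion, motion]
pi = np.array([0.5, 0.5])                # initial activity distribution

obs = [1, 1, 0]                          # motion, motion, no_motion

alpha = pi * B[:, obs[0]]                # forward recursion
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]

posterior = alpha / alpha.sum()          # P(activity | all observations)
print(states[int(np.argmax(posterior))])
```

After two motion readings followed by one no-motion reading, the strong no-motion likelihood for "resting" dominates, so the most probable final activity is resting.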

Keywords: behavior, big data, hierarchical hidden Markov model, intelligent object

Procedia PDF Downloads 234
1524 Risk Assessment of Flood Defences by Utilising Condition Grade Based Probabilistic Approach

Authors: M. Bahari Mehrabani, Hua-Peng Chen

Abstract:

Management and maintenance of coastal defence structures over their expected life cycle have become a real challenge for decision makers and engineers. Accurate evaluation of the current condition and future performance of flood defence structures is essential for effective, practical maintenance strategies based on available field inspection data. Moreover, as coastal defence structures age, it becomes more challenging to implement maintenance and management plans that avoid structural failure. Condition inspection data are therefore essential for assessing damage and forecasting the deterioration of ageing flood defence structures in order to keep them in an acceptable condition. Inspection data for flood defence structures are often collected using discrete visual condition rating schemes. To evaluate the future condition of a structure, a probabilistic deterioration model must be utilised; however, existing deterioration models may not provide reliable long-term predictions of performance deterioration due to uncertainties. To tackle this limitation, a time-dependent, condition-based model with associated transition probabilities needs to be developed on the basis of a condition grade scheme for flood defences. This paper presents a probabilistic method for predicting future performance deterioration of coastal flood defence structures based on condition grading inspection data and deterioration curves estimated by expert judgement. In condition-based deterioration modelling, the main task is to estimate the transition probability matrices. The deterioration process over the transition states is modelled as a Markov chain, and a reliability-based approach is used to estimate the probability of structural failure. Visual inspection data according to the United Kingdom Condition Assessment Manual are used to obtain the initial condition grade curve of the coastal flood defences.
The initial curves are then modified to develop transition probabilities through non-linear regression-based optimisation algorithms. Monte Carlo simulations are then used to evaluate the future performance of the structure on the basis of the estimated transition probabilities. Finally, a case study is given to demonstrate the applicability of the proposed method under no-maintenance and medium-maintenance scenarios. The results show that the proposed method can provide an effective predictive model for a variety of situations in terms of available condition grading data. The proposed model also provides useful information on the time-dependent probability of failure of coastal flood defences.
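The condition-grade Markov chain can be sketched as follows; the five grades and the annual transition matrix are illustrative assumptions (a no-maintenance scenario where condition only worsens), not values from the UK Condition Assessment Manual:

```python
import numpy as np

# Sketch of condition-grade deterioration as a Markov chain: grades
# 1 (good) .. 5 (failed), with an assumed annual transition matrix.
# Upper-triangular rows: without maintenance, condition only worsens.
P = np.array([
    [0.90, 0.08, 0.02, 0.00, 0.00],
    [0.00, 0.88, 0.09, 0.03, 0.00],
    [0.00, 0.00, 0.85, 0.11, 0.04],
    [0.00, 0.00, 0.00, 0.80, 0.20],
    [0.00, 0.00, 0.00, 0.00, 1.00],   # grade 5 is absorbing (failure)
])

state = np.array([1.0, 0.0, 0.0, 0.0, 0.0])   # new structure, grade 1
for year in range(50):                         # propagate 50 years ahead
    state = state @ P

p_fail_50yr = state[-1]                        # probability of grade 5
print(f"P(failure within 50 years) = {p_fail_50yr:.2f}")
```

A maintenance scenario would add lower-triangular entries (repairs that move the structure back toward grade 1), and Monte Carlo sampling of individual grade trajectories gives the same distribution as this matrix propagation.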

Keywords: condition grading, flood defense, performance assessment, stochastic deterioration modelling

Procedia PDF Downloads 235
1523 Implicit Transaction Costs and the Fundamental Theorems of Asset Pricing

Authors: Erindi Allaj

Abstract:

This paper studies arbitrage pricing theory in financial markets with transaction costs. We extend the existing theory to include the more realistic possibility that the price at which investors trade depends on the traded volume. Investors in the market always buy at the ask and sell at the bid price. Transaction costs are composed of two terms: one captures the implicit transaction costs and the other the price impact. Moreover, a new definition of a self-financing portfolio is obtained. The self-financing condition implies that continuous trading is possible, but is restricted to predictable trading strategies that have left and right limits and finite quadratic variation; that is, predictable trading strategies of infinite variation but finite quadratic variation are allowed in our setting. Within this framework, the existence of an equivalent probability measure is equivalent to the absence of arbitrage opportunities, so that the first fundamental theorem of asset pricing (FFTAP) holds. It is also proved that, when this probability measure is unique, any contingent claim in the market is hedgeable in an L2-sense, and the price of any contingent claim equals its risk-neutral price. To illustrate how the proposed theory can be applied, we provide an example with linear transaction costs.
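The volume-dependent execution price with a bid-ask spread plus linear price impact can be sketched numerically; the half-spread and impact coefficient below are illustrative assumptions, not parameters from the paper:

```python
# Sketch of a linear-transaction-cost execution price: the traded price
# depends on volume through a half-spread (buy at ask, sell at bid) plus
# a linear price-impact term. Parameter values are assumptions.
def execution_cash(mid, volume, half_spread=0.05, impact=0.001):
    """Signed cash paid: positive for a buy, negative for a sell."""
    side = 1.0 if volume > 0 else -1.0
    price = mid + side * half_spread + impact * volume  # linear impact
    return price * volume

buy = execution_cash(100.0, 1_000)     # buying pushes the price up
sell = execution_cash(100.0, -1_000)   # selling pushes it down
round_trip_loss = buy + sell           # net cost of an immediate round trip
print(round(round_trip_loss, 2))       # strictly positive: the friction
```

The strictly positive round-trip cost is exactly the friction that forces the restricted class of self-financing strategies in the theory: strategies of infinite quadratic variation would accumulate unbounded costs of this kind.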

Keywords: arbitrage pricing theory, transaction costs, fundamental theorems of arbitrage, financial markets

Procedia PDF Downloads 361
1522 Disaggregation the Daily Rainfall Dataset into Sub-Daily Resolution in the Temperate Oceanic Climate Region

Authors: Mohammad Bakhshi, Firas Al Janabi

Abstract:

High-resolution rainfall data are very important as inputs to hydrological models. Among the methods for generating high-resolution rainfall data, temporal disaggregation was chosen for this study. The paper attempts to generate rainfall at three resolutions (4-hourly, hourly, and 10-minute) from daily data over a roughly 20-year record period. The process was carried out with the DiMoN tool, which is based on the random cascade model and the method of fragments. Differences between the observed and simulated rainfall datasets are evaluated with a variety of statistical and empirical methods: the Kolmogorov-Smirnov (K-S) test, summary statistics, and exceedance probability. The tool preserved the daily rainfall totals on wet days well; however, the generated rainfall is concentrated in shorter periods, producing stronger storms. It is demonstrated that the difference between the generated and observed cumulative distribution function curves of the 4-hourly datasets passes the K-S test criteria, while for the hourly and 10-minute datasets the p-value must be employed to show that the differences are reasonable. The results are encouraging, considering the tendency of the generated high-resolution rainfall data to be overestimated.
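The two-sample K-S statistic used to compare observed and disaggregated rainfall is the maximum distance between the two empirical CDFs; a minimal sketch with synthetic data (the exponential rain-depth samples are assumptions):

```python
import numpy as np

# Sketch of the two-sample Kolmogorov-Smirnov statistic: the maximum
# vertical distance between two empirical CDFs. Sample data are synthetic.
def ks_statistic(a, b):
    a, b = np.sort(a), np.sort(b)
    grid = np.concatenate([a, b])       # the sup is attained at data points
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

rng = np.random.default_rng(1)
observed = rng.exponential(2.0, 500)    # hourly rain depths (mm), assumed
generated = rng.exponential(2.2, 500)   # "slightly too-strong storms"

d = ks_statistic(observed, generated)
print(f"K-S distance D = {d:.3f}")
```

In practice the statistic D is compared against a critical value (or converted to a p-value, as the abstract does for the hourly and 10-minute datasets) that depends on the two sample sizes.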

Keywords: DiMoN Tool, disaggregation, exceedance probability, Kolmogorov-Smirnov test, rainfall

Procedia PDF Downloads 202
1521 An Insight into the Probabilistic Assessment of Reserves in Conventional Reservoirs

Authors: Sai Sudarshan, Harsh Vyas, Riddhiman Sherlekar

Abstract:

The oil and gas industry has been unwilling to adopt a stochastic definition of reserves. Nevertheless, Monte Carlo simulation methods have gained acceptance among engineers, geoscientists, and other professionals who want to evaluate prospects or otherwise analyze problems that involve uncertainty. One of the common applications of Monte Carlo simulation is the estimation of recoverable hydrocarbons from a reservoir. Monte Carlo simulation makes use of random samples of parameters or inputs to explore the behavior of a complex system or process, and finds application whenever one needs to make an estimate, forecast, or decision under significant uncertainty. First, the project performs Monte Carlo simulation on a given data set using the U.S. Department of Energy's MonteCarlo software, a freeware E&P tool. Further, a simulation algorithm has been developed in MATLAB; the program performs the simulation by prompting the user for input distributions and the parameters associated with each distribution (mean, standard deviation, minimum, maximum, most likely, etc.). It also prompts the user for the desired probability at which reserves are to be calculated. The algorithm developed and tested in MATLAB is then implemented in Python, where existing statistics and graph-plotting libraries are imported to generate better output. With PyQt Designer, code for a simple graphical user interface has also been written. The plotted results are then validated against the available results from the U.S. DOE MonteCarlo software.
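The volumetric reserves Monte Carlo can be sketched as follows; the input distributions and the formation volume factor are illustrative assumptions, not field data:

```python
import numpy as np

# Illustrative Monte Carlo estimate of recoverable oil, sampling the
# volumetric inputs from assumed distributions (not field data).
rng = np.random.default_rng(0)
n = 100_000

area = rng.triangular(800, 1000, 1400, n)     # drainage area (acres)
thickness = rng.triangular(20, 30, 45, n)     # net pay (ft)
porosity = rng.normal(0.22, 0.03, n)          # fraction
sw = rng.normal(0.30, 0.05, n)                # water saturation, fraction
rf = rng.uniform(0.25, 0.40, n)               # recovery factor
bo = 1.2                                      # formation volume factor (rb/STB)

# Standard volumetric formula: 7758 bbl per acre-ft of reservoir rock
reserves = 7758 * area * thickness * porosity * (1 - sw) * rf / bo

p90, p50, p10 = np.percentile(reserves, [10, 50, 90])
print(f"P90 {p90:.3e}  P50 {p50:.3e}  P10 {p10:.3e} STB")
```

Sorting the sampled reserves and reading off percentiles gives the P90/P50/P10 figures that a probabilistic reserves report quotes; the user-specified "desired probability" in the abstract corresponds to choosing which percentile to read.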

Keywords: simulation, probability, confidence interval, sensitivity analysis

Procedia PDF Downloads 383