Search results for: generalized estimators
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 832


262 Statistical Analysis of Extreme Flow (Regions of Chlef)

Authors: Bouthiba Amina

Abstract:

The estimation of statistics related to precipitation is a vast domain that poses numerous challenges to meteorologists and hydrologists. It is sometimes necessary to estimate extreme events and their return periods for sites where there are few or no data. The search for a frequency model of daily rainfall depths is of great importance in operational hydrology: it establishes a basis for predicting the frequency and intensity of floods from the precipitation amounts recorded in past years. The best-known and most common approach is the statistical one, which consists of looking for the probability law that best fits the observed values of the random variable "daily maximum rainfall" after comparing various probability laws and estimation methods by means of goodness-of-fit tests. Accordingly, a frequency analysis of the annual series of daily maximum rainfall was carried out on data from 54 rain-gauge stations of the upper and middle Chlef basin for the period 1970 to 2013, and was used to forecast quantiles. Five laws commonly applied to the analysis of maximum daily rainfall were considered: the three-parameter generalized extreme value law, the two-parameter extreme value laws (Gumbel and log-normal), and the three-parameter Pearson type III and Log-Pearson III laws. In Algeria, Gumbel's law has long been used to estimate the quantiles of maximum flows; here, we check and choose the most reliable law.
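As a rough illustration of the distribution-comparison step described above (not the study's code or data), the following sketch fits several of the named laws to a synthetic annual-maximum series with SciPy and reads off quantiles for a few return periods; the Log-Pearson III variant, which requires fitting Pearson III to the logarithms of the data, is omitted for brevity.

```python
# Illustrative sketch (not the study's code): fit candidate distributions to an
# annual-maximum daily rainfall series and compare return-period quantiles.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
annual_max_rain = stats.gumbel_r.rvs(loc=40.0, scale=12.0, size=44, random_state=rng)  # stand-in for a 1970-2013 series

return_periods = np.array([10, 50, 100])          # years
non_exceedance = 1.0 - 1.0 / return_periods       # F(x) for each return period

candidates = {
    "GEV": stats.genextreme,
    "Gumbel": stats.gumbel_r,
    "Log-normal": stats.lognorm,
    "Pearson III": stats.pearson3,
}

for name, dist in candidates.items():
    params = dist.fit(annual_max_rain)             # maximum-likelihood fit
    quantiles = dist.ppf(non_exceedance, *params)  # design rainfall for each return period
    # Kolmogorov-Smirnov statistic as a simple goodness-of-fit check
    ks = stats.kstest(annual_max_rain, dist.cdf, args=params).statistic
    print(f"{name:12s} KS={ks:.3f}  quantiles={np.round(quantiles, 1)}")
```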

Keywords: return period, extreme flow, statistics laws, Gumbel, estimation

Procedia PDF Downloads 45
261 MLProxy: SLA-Aware Reverse Proxy for Machine Learning Inference Serving on Serverless Computing Platforms

Authors: Nima Mahmoudi, Hamzeh Khazaei

Abstract:

Serving machine learning inference workloads on the cloud is still a challenging task at the production level. The optimal configuration of the inference workload to meet SLA requirements while optimizing the infrastructure costs is highly complicated due to the complex interaction between batch configuration, resource configurations, and a variable arrival process. Serverless computing has emerged in recent years to automate most infrastructure management tasks. Workload batching has revealed the potential to improve the response time and cost-effectiveness of machine learning serving workloads. However, it has not yet been supported out of the box by serverless computing platforms. Our experiments have shown that for various machine learning workloads, batching can hugely improve the system’s efficiency by reducing the processing overhead per request. In this work, we present MLProxy, an adaptive reverse proxy to support efficient machine learning serving workloads on serverless computing systems. MLProxy supports adaptive batching to ensure SLA compliance while optimizing serverless costs. We performed rigorous experiments on Knative to demonstrate the effectiveness of MLProxy. We showed that MLProxy could reduce the cost of serverless deployment by up to 92% while reducing SLA violations by up to 99%, improvements that can be generalized across state-of-the-art model serving frameworks.
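The abstract does not describe MLProxy's internals, so the sketch below only illustrates the general idea of SLA-aware adaptive batching with hypothetical names (it is not the MLProxy API): requests are buffered until either the batch is full or waiting any longer would risk the oldest request's SLA.

```python
# Minimal sketch of SLA-aware adaptive batching (hypothetical names, not the MLProxy API).
import time
from queue import Queue, Empty

def batching_loop(request_queue: Queue, run_inference, sla_seconds=1.0,
                  est_infer_seconds=0.2, max_batch=32):
    """Group requests into batches without letting the oldest one miss its SLA."""
    while True:
        batch, deadline = [], None
        while len(batch) < max_batch:
            now = time.monotonic()
            if deadline is not None and now >= deadline:
                break                                  # dispatch before the SLA is at risk
            timeout = 0.05 if deadline is None else deadline - now
            try:
                arrival_time, payload = request_queue.get(timeout=timeout)
            except Empty:
                if deadline is not None:
                    break                              # partial batch, but time is up
                continue                               # nothing queued yet; keep waiting
            batch.append(payload)
            if deadline is None:
                # Leave headroom for the (estimated) inference time itself.
                deadline = arrival_time + sla_seconds - est_infer_seconds
        if batch:
            run_inference(batch)                       # one batched call amortizes per-request overhead
```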

Keywords: serverless computing, machine learning, inference serving, Knative, Google Cloud Run, optimization

Procedia PDF Downloads 140
260 Asymmetrical Informative Estimation for Macroeconomic Model: Special Case in the Tourism Sector of Thailand

Authors: Chukiat Chaiboonsri, Satawat Wannapan

Abstract:

This paper applied an asymmetric information concept to the estimation of a macroeconomic model of the tourism sector in Thailand. The variables analyzed statistically are Thailand's international and domestic tourism revenues, the expenditures of foreign and domestic tourists, service investments by the private sector, service investments by the government of Thailand, Thailand's service imports and exports, and net service income transfers. All data are time-series indices observed between 2002 and 2015. Empirically, the tourism multiplier and accelerator were estimated by two statistical approaches. The first was the Generalized Method of Moments (GMM) model, based on the assumption that the tourism market in Thailand has perfect information (symmetrical data). The second was the Maximum Entropy Bootstrapping (MEboot) approach, based on a process that attempts to deal with imperfect information and to reduce uncertainty in data observations (asymmetrical data). In addition, tourism leakages were investigated with a simple model based on the injections and leakages concept. The empirical findings show that the parameters computed from the MEboot approach differ from those of the GMM method. However, both the MEboot estimation and the GMM model suggest that Thailand's tourism sector is in a period capable of stimulating the economy.

Keywords: Thailand tourism, Maximum Entropy Bootstrapping approach, macroeconomic model, asymmetric information

Procedia PDF Downloads 271
259 Effectiveness of Using Phonemic Awareness Based Activities in Improving Decoding Skills of Third Grade Students Referred for Reading Disabilities in Oman

Authors: Mahmoud Mohamed Emam

Abstract:

In Oman, the number of students referred for reading disabilities is on the rise. Schools serve these students by placement in the so-called learning disabilities unit. Recently, the author led a strategic project to train teachers on the use of curriculum-based measurement to identify students with reading disabilities in Oman. Additionally, the project involved training teachers to use phonemic awareness based activities to improve the reading skills of those students. Phonemic awareness refers to the ability to notice, think about, and work with the individual sounds in words, and a student's skill in phonemic awareness is a good predictor of later reading success or difficulty. Using a multiple baseline design across four participants, the current study investigated the effectiveness of phonemic awareness based activities in improving the decoding skills of third grade students referred for reading disabilities in Oman. During treatment, students received phonemic awareness based activities designed to reflect the idiosyncratic characteristics of Arabic phonology and orthography. Results indicated that the activities were effective in substantially increasing the number of correctly decoded words for all four participants. Maintenance of strategy effects was evident in the weeks following the termination of the intervention, and the effects of the intervention generalized to decoding novel words for all four participants.

Keywords: learning disabilities, phonemic awareness, third graders, Oman

Procedia PDF Downloads 617
258 Lennox-Gastaut Syndrome Associated with Dysgenesis of Corpus Callosum

Authors: A. Bruce Janati, Muhammad Umair Khan, Naif Alghassab, Ibrahim Alzeir, Assem Mahmoud, M. Sammour

Abstract:

Rationale: Lennox-Gastaut syndrome (LGS) is an electro-clinical syndrome composed of the triad of mental retardation, multiple seizure types, and the characteristic generalized slow spike-wave complexes in the EEG. In this article, we report on two patients with LGS whose brain MRI showed dysgenesis of the corpus callosum (CC). We review the literature and stress the role of the CC in the genesis of secondary bilateral synchrony (SBS). Method: This was a clinical study conducted at King Khalid Hospital. Results: The EEG was consistent with LGS in patient 1 and unilateral slow spike-wave complexes in patient 2. The MRI showed hypoplasia of the splenium of the CC in patient 1, and global hypoplasia of the CC combined with Joubert syndrome in patient 2. Conclusion: Based on the data, we proffer the following hypotheses: (1) Hypoplasia of the CC interferes with the functional integrity of this structure. (2) The genu of the CC plays a pivotal role in the genesis of secondary bilateral synchrony. (3) Electrodecremental seizures in LGS emanate from pacemakers generated in the brain stem, in particular the mesencephalon, projecting abnormal signals to the cortex via thalamic nuclei. (4) Unilateral slow spike-wave complexes in the context of mental retardation and multiple seizure types may represent a variant of LGS, justifying neuroimaging studies.

Keywords: EEG, Lennox-Gastaut syndrome, corpus callosum, MRI

Procedia PDF Downloads 410
257 The Role of Human Capital in the Evolution of Inequality and Economic Growth in Latin-America

Authors: Luis Felipe Brito-Gaona, Emma M. Iglesias

Abstract:

There is a growing literature that studies the main determinants and drivers of inequality and economic growth in several countries, using panel data and different estimation methods (fixed effects, the Generalized Method of Moments (GMM) and Two-Stage Least Squares (TSLS)). Recently, the evolution of these variables over the period 1980-2009 was studied in the 18 countries of Latin America, and it was found that one of the main variables explaining their evolution was Foreign Direct Investment (FDI). We extend this study to the year 2015 in the same 18 countries and find that FDI no longer plays a significant role, while schooling levels have a significant negative effect on inequality and a significant positive effect on economic growth. We also find that the point estimates associated with human capital are the largest among the variables included in the analysis, meaning that an increase in human capital (measured by secondary schooling levels) is the main determinant that can help reduce inequality and increase economic growth in Latin America. We therefore advise that economic policies in Latin America be directed towards increasing the level of education. We use fixed effects, GMM and TSLS estimation to check the robustness of our results, and our conclusion is the same regardless of the estimation method chosen. We also find that the 2008 international recession significantly reduced economic growth in the Latin American countries.
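As a rough sketch of the fixed-effects leg of such an analysis (the column names, file and specification are hypothetical, not the authors' own), a two-way fixed-effects regression of an inequality measure on schooling, FDI and growth, with country-clustered standard errors, could look like this:

```python
# Illustrative two-way fixed-effects panel regression (hypothetical variable names):
# country and year dummies absorb the fixed effects.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("latam_panel.csv")   # columns assumed: country, year, gini, schooling, fdi, growth

fe_model = smf.ols(
    "gini ~ schooling + fdi + growth + C(country) + C(year)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["country"]})   # cluster SEs by country

print(fe_model.params[["schooling", "fdi", "growth"]])
```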

Keywords: economic growth, human capital, inequality, Latin-America

Procedia PDF Downloads 200
256 Insecurity, Instability and Lack of Benefits: Factors Reasonable for Poor Performance among “Contract Workers” in South Africa

Authors: Charmaine Devinee Pillay

Abstract:

Employees in both the public and private sectors are expected to contribute significantly to the growth and development of the organization that employs them. Good working conditions are directly linked to the optimum output that comes from a workforce's excellent performance, whereas insecurity, instability and a lack of benefits negatively impact employees' commitment to their job. This is a qualitative case study comprising 40 “Contract Employees” (academic and support staff) in the Faculty of Health Sciences, Walter Sisulu University, Mthatha, Eastern Cape, South Africa. A questionnaire was used as the data collection instrument to obtain qualitative data, and the data collected were categorized into themes and sub-themes for analysis and discussion. Findings showed that “Contract Employees” are highly demoralized due to job insecurity and the absence of benefits, among other factors, which directly affects their overall output in discharging their duties. The case study at Walter Sisulu University typifies the general challenges faced by workers employed on a contract basis in South Africa. It is therefore recommended that employers hire their workforce on a permanent basis or, where “Contract Employment” is inevitable, that conditions similar to those of permanent employment be incorporated into the contract terms of “Contract Employees”. This would serve as an impetus for optimum performance.

Keywords: contract employee, insecurity, instability, risk factors

Procedia PDF Downloads 171
255 Generalized Limit Equilibrium Solution for the Lateral Pile Capacity Problem

Authors: Tomer Gans-Or, Shmulik Pinkert

Abstract:

The determination of lateral pile capacity per unit length is a key aspect of geotechnical engineering. Traditional approaches for assessing pile lateral capacity in cohesive soils involve the application of upper-bound and lower-bound plasticity theorems. However, a comprehensive solution encompassing the entire spectrum of soil strength parameters, particularly in frictional soils with or without cohesion, is still lacking. This research introduces an innovative implementation of the slice method limit equilibrium solution for lateral capacity assessment. For any given numerical discretization of the soil's domain around the pile, the lateral capacity evaluation is based on the mobilized strength concept. The critical failure geometry is then found by a unique optimization procedure that includes both factor-of-safety minimization and geometrical optimization. The robustness of the suggested methodology lies in the fact that the solution is independent of any predefined assumptions. Validation of the solution is accomplished through a comparison with established plasticity solutions for cohesive soils. Furthermore, the study demonstrates the applicability of the limit equilibrium method to address unresolved cases related to frictional and cohesive-frictional soils. Beyond providing capacity values, the method enables the use of the mobilized strength concept to generate safety-factor distributions for scenarios representing pre-failure states.

Keywords: lateral pile capacity, slice method, limit equilibrium, mobilized strength

Procedia PDF Downloads 29
254 Implications of Climate Change and World Uncertainty for Gender Inequality: Global Evidence

Authors: Kashif Nesar Rather, Mantu Kumar Mahalik

Abstract:

The discourse surrounding climate change has gained considerable traction, with a discernible emphasis on its nuanced and consequential impact on gender inequality. Concurrently, escalating global tensions are contributing to heightened uncertainty, potentially exerting influence on gender disparities. Within this framework, this study attempts to empirically investigate the implications of climate change and world uncertainty for gender inequality in a balanced panel of 100 economies between 1995 and 2021. The estimated models also control for the effects of globalisation, economic growth, and education expenditure. The panel cointegration tests establish a significant long-run relationship between the variables of the study. Furthermore, the PMG-ARDL (Pooled Mean Group Autoregressive Distributed Lag) estimation technique confirms that both climate change and world uncertainty perpetuate global gender inequality. Additionally, the results establish that globalisation, economic growth, and education expenditure exert a mitigating influence on gender inequality, signifying their role in diminishing gender disparities. These findings are further confirmed by the FGLS (Feasible Generalized Least Squares) and DKSE (Driscoll-Kraay Standard Errors) regression methods. Potential policy implications for mitigating the detrimental gender ramifications stemming from climate change and rising world uncertainty are also discussed.

Keywords: gender inequality, world uncertainty, climate change, globalisation, ecological footprint

Procedia PDF Downloads 5
253 Predicting Dose Level and Length of Time for Radiation Exposure Using Gene Expression

Authors: Chao Sima, Shanaz Ghandhi, Sally A. Amundson, Michael L. Bittner, David J. Brenner

Abstract:

In a large-scale radiologic emergency, potentially affected populations need to be triaged efficiently using various biomarkers, since personal dosimeters are unlikely to be worn by the individuals. It has long been established that radiation injury can be estimated effectively using panels of genetic biomarkers. Furthermore, the rate of radiation, in addition to the dose of radiation, plays a major role in determining biological responses. Therefore, a better and more accurate triage involves estimating both the dose level of the exposure and the length of time of that exposure. To that end, a large in vivo study was carried out on mice with the internal emitter caesium-137 (¹³⁷Cs). Four different injection doses of ¹³⁷Cs were used: 157.5 μCi, 191 μCi, 214.5 μCi, and 259 μCi. Cohorts of 6-7 mice from the control arm and from each of the dose levels were sacrificed, and blood was collected 2, 3, 5, 7 and 14 days after injection for microarray RNA gene expression analysis. Using a generalized linear model with penalized maximum likelihood, a panel of 244 genes was established, and both the injection dose and the number of days after injection were accurately predicted for all 155 subjects using this panel. This shows that microarray gene expression can be used effectively in radiation biodosimetry to predict both the dose level and the length of exposure time, which provides a more holistic view of radiation exposure and helps improve radiation damage assessment and treatment.
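The abstract names a generalized linear model with penalized maximum likelihood; the sketch below shows one common variant of that idea (L1-penalized multinomial logistic regression in scikit-learn) on synthetic stand-in data, predicting the dose group only. It is not the study's pipeline or its 244-gene panel.

```python
# Sketch of a penalized (L1) multinomial model selecting a gene panel and predicting
# dose group from expression data -- synthetic stand-in data, not the study's dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_samples, n_genes = 155, 500
X = rng.normal(size=(n_samples, n_genes))     # placeholder log2 expression values
y = rng.integers(0, 5, size=n_samples)        # dose group: control + 4 injection levels

clf = LogisticRegression(penalty="l1", solver="saga", C=0.1, max_iter=5000)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

clf.fit(X, y)
panel = np.flatnonzero(np.any(clf.coef_ != 0, axis=0))   # genes with nonzero weight in any class
print("selected panel size:", panel.size)
```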

Keywords: caesium-137, gene expression microarray, multivariate responses prediction, radiation biodosimetry

Procedia PDF Downloads 174
252 Effect of PMMA Shield on the Patient Dose Equivalent from Photoneutrons Produced by High Energy Medical Linacs

Authors: Seyed Mehdi Hashemi, Gholamreza Raisali, Mehran Taheri

Abstract:

One of the important problems of using high energy linacs in IMRT is the production of photoneutrons. Besides the clinically useful photon beams, high-energy photon beams from medical linacs produce secondary neutrons. These photoneutrons increase the patient dose and may cause secondary malignancies. The effect of a shield on the reduction of the photoneutron dose equivalent produced by a high energy medical linac at the patient plane is investigated in this study. To determine the photoneutron dose equivalent received by the patient, a Varian linac working in 18 MV photon mode was investigated. The photoneutron dose equivalent was measured with 0.25 mm thick polycarbonate (PC) films placed at distances of 0, 10, 20, and 50 cm from the center of the X-ray field on the patient couch. The results show that, with increasing distance from the center of the X-ray beam towards the periphery, the photoneutron dose equivalent decreases rapidly for both open and shielded fields, and that inserting the shield in the path of the X-ray beam clearly decreased the photoneutron dose equivalent compared to the open field. The shield therefore significantly reduces the photoneutron dose equivalent to the patient, and the results can be readily generalized to other models of medical linacs. It may be concluded that using this kind of shield can help make the employment of high energy linacs in radiotherapy and IMRT safer, less expensive and more efficient.

Keywords: photoneutron, Linac, PMMA shield, equivalent dose

Procedia PDF Downloads 455
251 Budd-Chiari Syndrome: Common Presentation, Rare Disease

Authors: Aadil Khan, Yasser Chomayil, P. P. Venugopalan

Abstract:

Background: Budd-Chiari syndrome is caused by thrombosis of the hepatic veins and/or thrombosis of the intrahepatic or suprahepatic IVC. The etiology remains idiopathic in 16%-35% of cases; malignancy, rheumatological disorder, myeloproliferative disease, inheritable coagulopathy, infection or a hyperestrogenic state can be identified in many cases. Methodology: Review of the case records of a patient who presented to the Emergency Department, Aster Medcity, Cochin. Case: A 17-year-old female presented to the ED with fever, jaundice and abdominal distention of 1 week's duration. O/E: pallor+, icterus+; abdomen: gross distension+, shifting dullness+, generalized anasarca+. Abdominal USG showed hepatomegaly with mildly coarse echotexture and moderate to gross ascites. CT of the abdomen and chest showed hepatomegaly with thrombosis of all three hepatic veins and moderate ascites, suggestive of Budd-Chiari syndrome. The patient was taken for catheter-directed venous thrombolysis. A venogram done the next day revealed almost >50% opening of the right hepatic vein, and concurrent Doppler showed colour and Doppler signals in the middle hepatic vein. She gradually improved and was discharged home on an anticoagulant with advice for regular follow-up. Conclusion: Although this disease is rare in this young population, a high index of suspicion is required when evaluating young patients with abdominal pain and jaundice.

Keywords: Budd-Chiari syndrome, rare disease, abdominal pain, India

Procedia PDF Downloads 246
250 Rising Velocity of Non-Newtonian Liquids in Capillary Tubes

Authors: Reza Sabbagh, Linda Hasanovich, Aleksey Baldygin, David S. Nobes, Prashant R. Waghmare

Abstract:

The capillary filling process is important to study for numerous applications such as the underfilling of material in electronic packaging or the seepage of liquid hydrocarbons through porous structures. The approximation of the fluid as Newtonian, i.e., a linear relationship between shear stress and deformation rate, cannot be justified in cases where the extent of the liquid's non-Newtonian behavior governs the surface-driven transport, i.e., capillary action. In this study, the capillary action of a non-Newtonian fluid is not only analyzed, but a modified generalized theoretical analysis of capillary transport is also proposed. The three commonly observed regimes are identified: surface-force dominant (travelling air-liquid interface), developing flow (viscous force dominant), and developed flow (interfacial, inertial and viscous forces comparable). The velocity field in each regime is quantified for Newtonian and non-Newtonian fluids in a square, vertically oriented channel. The theoretical understanding of the capillary imbibition process, particularly in the case of Newtonian fluids, relies on the simplified assumption of a fully developed velocity profile, which has been revisited here to develop a modified theory for the capillary transport of non-Newtonian fluids. Furthermore, the development of the velocity profile from the entrance regime to the developed regime is also investigated theoretically and experimentally for different power-law fluids.

Keywords: capillary, non-Newtonian flow, shadowgraphy, rising velocity

Procedia PDF Downloads 178
249 The Effect of Aerobic Exercise Training on the Improvement of Nursing Staff's Sleep Quality: A Randomized Controlled Study

Authors: Niu Shu Fen

Abstract:

Sleep disturbance is highly prevalent among shift-working nurses. We aimed to evaluate whether aerobic exercise (i.e., walking combined with jogging) improves objective sleep parameters among female nurses at the end of an 8-week exercise program and 4 weeks after study completion. This single-blinded, parallel-design, randomized controlled trial was conducted in the floor classroom of a would-be medical center in northern Taiwan. Sixty eligible female nurses were randomly assigned to either the aerobic exercise (n = 30) or the usual care (n = 30) group. The moderate-intensity aerobic exercise program was performed over 5 days (60 min per day) a week for 8 weeks after work hours. Objective sleep outcomes, including total sleep time (TST), sleep onset latency (SOL), wake after sleep onset (WASO), and sleep efficiency (SE), were retrieved using an Actigraph device. A generalized estimating equation model was used for data analyses. The aerobic exercise group had significant improvements in TST and SE at 4 weeks and 8 weeks compared with the baseline evaluation (TST: B = 70.49 and 55.96, both p < 0.001; SE: B = 5.21 and 3.98, p < 0.001 and 0.002). Significant between-group differences were observed in SOL and WASO at 4 weeks but not 8 weeks compared with the baseline evaluation (SOL: B = −7.18, p = 0.03; WASO: B = −11.38, p = 0.008). The positive lasting effects for TST were observed only until the 4-week follow-up. To improve sleep quality and quantity, we encourage female nurses to regularly perform moderate-intensity aerobic exercise.
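For readers unfamiliar with the analysis, this is roughly how a generalized estimating equation fit looks in statsmodels; the column names, the file, and the exchangeable working correlation are assumptions for illustration, not details taken from the trial.

```python
# Illustrative GEE fit for a repeated-measures sleep outcome (hypothetical column names;
# exchangeable working correlation within each nurse).
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("sleep_actigraphy.csv")   # columns assumed: id, group, week, TST

gee = smf.gee(
    "TST ~ C(group) * C(week)",            # group-by-time interaction
    groups="id",
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),
    family=sm.families.Gaussian(),
).fit()
print(gee.summary())
```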

Keywords: sleep quality, aerobic exercise, nurses, shift work

Procedia PDF Downloads 124
248 Modeling the Relation between Discretionary Accrual Earnings Management, International Financial Reporting Standards and Corporate Governance

Authors: Ikechukwu Ndu

Abstract:

This study examines the econometric modeling of the relation between discretionary accrual earnings management, International Financial Reporting Standards (IFRS), and certain corporate governance factors with regard to listed Nigerian non-financial firms. Although discretionary accrual earnings management is a well-known and global problem that has an adverse impact on users of financial statements, its relationship with IFRS and corporate governance has been neither adequately researched nor systematically investigated in Nigeria. This dearth of research has made it difficult for academics, practitioners, government standard-setting bodies, regulators and international bodies to achieve a clear understanding of how discretionary accrual earnings management relates to IFRS and certain corporate governance characteristics. To the author's best knowledge, this is the first study to date that makes research contributions which significantly add to the literature on discretionary accrual earnings management and its relation with corporate governance and IFRS in the Nigerian context. A comprehensive review is undertaken of the literature on discretionary total accrual earnings management, IFRS, and certain corporate governance characteristics, as well as of the data, models, methodologies, and different estimators used in the study. Secondary financial statement, IFRS, and corporate governance data are sourced from the Bloomberg database and the published financial statements of Nigerian non-financial firms for the period 2004 to 2016. The methodology uses both the total and working capital accrual bases. This study has a number of interesting preliminary findings. First, there is a negative relationship between the level of discretionary accrual earnings management and the adoption of IFRS; however, this relationship does not appear to be statistically significant. Second, there is a significant negative relationship between the size of the board of directors and discretionary accrual earnings management. Third, CEO separation of roles does not constrain earnings management, indicating the need to preserve relationships, personal connections, and bonded friendships between the CEO, Chairman, and executive directors. Fourth, there is a significant negative relationship between discretionary accrual earnings management and the use of a Big Four firm as auditor. Fifth, including shareholders in the audit committee leads to a reduction in discretionary accrual earnings management. Sixth, the debt and return on assets (ROA) variables are significant and positively related to discretionary accrual earnings management. Finally, the company size variable, measured by the log of assets, is surprisingly not found to be statistically significant, indicating that Nigerian companies of all sizes engage in discretionary accrual management. In conclusion, this study provides key insights that enable a better understanding of the relationship between discretionary accrual earnings management, IFRS, and corporate governance in the Nigerian context. It is expected that the results will be of interest to academics, practitioners, regulators, governments, international bodies and other parties involved in policy setting and economic development in the areas of financial reporting, securities regulation, accounting harmonization, and corporate governance.

Keywords: discretionary accrual earnings management, earnings manipulation, IFRS, corporate governance

Procedia PDF Downloads 111
247 Assessing Influence of End-Boundary Conditions on Stability and Second-Order Lateral Stiffness of Beam-Column Elements Embedded in Non-Homogeneous Soil

Authors: Carlos A. Vega-Posada, Jeisson Alejandro Higuita-Villa, Julio C. Saldarriaga-Molina

Abstract:

This paper presents a simplified analytical approach for conducting elastic stability and second-order lateral stiffness analyses of beam-column elements (i.e., piles) with generalized end-boundary conditions embedded in a homogeneous or non-homogeneous Pasternak foundation. The solution is derived using the well-known Differential Transformation Method (DTM), and it consists simply of solving a system of two linear algebraic equations. Using other conventional approaches to solve the governing differential equation of the proposed element can be cumbersome and the solution challenging to implement, especially when the non-homogeneity of the soil is considered. The proposed formulation includes the effects of (i) any rotational or lateral transverse spring at the ends of the pile, (ii) any external transverse load acting along the pile, (iii) soil non-homogeneity, and (iv) the second parameter of the elastic foundation (i.e., the shear layer connecting the springs at the top). A parametric study is conducted to investigate the effects of different moduli of subgrade reaction, degrees of non-homogeneity, and intermediate end-boundary conditions on the pile response. The same set of equations can be used to conduct both elastic stability and static analyses. Comprehensive examples are presented to show the simplicity and practicability of the proposed method.

Keywords: elastic stability, second-order lateral stiffness, soil-non-homogeneity, pile analysis

Procedia PDF Downloads 181
246 Fintech Credit and Bank Efficiency Two-way Relationship: A Comparison Study Across Country Groupings

Authors: Tan Swee Liang

Abstract:

This paper studies the two-way relationship between fintech credit and banking efficiency using panel Generalized Method of Moments (GMM) estimation in structural equation modeling (SEM). Banking system efficiency, defined as the ability to produce the existing level of outputs with minimal inputs, is measured using input-oriented data envelopment analysis (DEA), where the whole banking system of an economy is treated as a single DMU. Banks are considered intermediaries between depositors and borrowers, using inputs (deposits and overhead costs) to provide outputs (credit to the private sector and earnings). The interrelationship between fintech credit and bank efficiency is analyzed to determine the impact in different country groupings (ASEAN, Asia and OECD), in particular the banking system's response to fintech credit platforms. Our preliminary results show that banks do respond to the greater pressure from fintech platforms by enhancing their efficiency, but differently across the groups. The author's earlier research on the high bank overhead costs (as a share of total assets) of the ASEAN-5 as a determinant of economic growth suggests that expenses may not have been channeled efficiently into income-generating activities. One practical implication of the findings is that policymakers should enable alternative financing, such as fintech credit, as a warning or encouragement for banks to improve their efficiency.
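Because the efficiency scores come from input-oriented DEA, a bare-bones sketch of that envelopment linear program is given below on toy data, with each row treated as one banking system; the paper's actual input/output definitions, country panel and the GMM/SEM stage are not reproduced.

```python
# Sketch of input-oriented CCR DEA efficiency via linear programming; each row of
# X (inputs) and Y (outputs) is one DMU (e.g., one country's banking system).
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    n, m = X.shape            # n DMUs, m inputs
    s = Y.shape[1]            # s outputs
    scores = np.empty(n)
    for o in range(n):
        c = np.r_[1.0, np.zeros(n)]                     # minimize theta
        # inputs:  sum_j lam_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(-1, 1), X.T])
        b_in = np.zeros(m)
        # outputs: sum_j lam_j * y_rj >= y_ro  (written as -Y.T @ lam <= -y_o)
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[o]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[b_in, b_out], bounds=[(0, None)] * (n + 1))
        scores[o] = res.x[0]
    return scores

# Toy data: inputs = [deposits, overhead]; outputs = [private credit, earnings]
X = np.array([[100.0, 5.0], [120.0, 8.0], [90.0, 6.0]])
Y = np.array([[80.0, 10.0], [85.0, 9.0], [70.0, 11.0]])
print(dea_ccr_input(X, Y))   # a score of 1.0 marks an efficient banking system
```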

Keywords: fintech lending, banking efficiency, data envelopment analysis, structural equation modeling

Procedia PDF Downloads 61
245 Tenants Use Less Input on Rented Plots: Evidence from Northern Ethiopia

Authors: Desta Brhanu Gebrehiwot

Abstract:

The study aims to investigate the impact of land tenure arrangements on fertilizer use per hectare in Northern Ethiopia. Household- and plot-level data are used for the analysis. Land tenure contracts such as sharecropping and fixed-rent arrangements are endogenous, since different unobservable characteristics may affect renting-out decisions. Thus, the appropriate method of analysis was instrumental variable estimation. The family of instrumental variable estimators was therefore used: two-stage least squares (2SLS), the generalized method of moments (GMM), limited information maximum likelihood (LIML), and instrumental variable Tobit (IV-Tobit). In addition, a two-step method for handling a binary endogenous variable is applied: in the first step, a probit model includes the instruments, and in the second step, maximum likelihood estimation (MLE) is used (the “etregress” command in Stata 14). Fertilizer use per hectare was lower on sharecropped and fixed-rented plots relative to owner-operated plots, a result that supports the Marshallian inefficiency principle in sharecropping. The difference in fertilizer use per hectare could be explained by a lack of incentivized, detailed contract forms: giving a larger proportion of the output to the tenant under sharecropping contracts would motivate the use of more fertilizer on rented plots to maximize production, because most sharecropping arrangements share output equally between tenants and landlords.
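A bare-bones illustration of the 2SLS mechanics described above, with hypothetical variable names; in practice a packaged IV estimator (or Stata's etregress, as in the study) should be preferred so that the second-stage standard errors are computed correctly.

```python
# Manual two-stage least squares sketch (hypothetical variable names). This only
# illustrates the mechanics; the manual second stage does not correct its standard errors.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("plots.csv")   # columns assumed: fert_per_ha, rented, plot_size, distance, instrument

# Stage 1: predict the (endogenous) tenure indicator from the instrument and exogenous controls.
exog = sm.add_constant(df[["plot_size", "distance"]])
stage1 = sm.OLS(df["rented"], exog.join(df["instrument"])).fit()
df["rented_hat"] = stage1.fittedvalues

# Stage 2: fertilizer use per hectare on the predicted tenure status plus controls.
stage2 = sm.OLS(df["fert_per_ha"], exog.join(df["rented_hat"])).fit()
print(stage2.params)
```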

Keywords: tenure-contracts, endogeneity, plot-level data, Ethiopia, fertilizer

Procedia PDF Downloads 57
244 Waterborne Platooning: Cost and Logistic Analysis of Vessel Trains

Authors: Alina P. Colling, Robert G. Hekkenberg

Abstract:

Recent years have seen extensive technological advancement in truck platooning, as reflected in the literature. Its main benefits are the improvement of traffic stability and the reduction of air drag, resulting in lower fuel consumption compared with individual trucks. Platooning is now being adapted to the waterborne transport sector in the NOVIMAR project through the development of a Vessel Train (VT) concept. The main focus of VTs, as opposed to truck platoons, is the decrease in manning on board, ultimately working towards autonomous vessel operations. This crew reduction can prove to be an important selling point in achieving economic competitiveness of the waterborne approach compared with alternative modes of transport. This paper discusses the expected benefits and drawbacks of the VT concept in terms of technical logistic performance and generalized costs. More specifically, VTs can provide flexibility in destination choices for shippers but also add complexity when performing special manoeuvres in VT formation. In order to quantify the costs and performance, a model is developed and simulations are carried out for various case studies. These compare the application of VTs in short sea and inland water transport, with specific sailing regimes and technologies installed on board to allow different levels of autonomy. The results enable the identification of the most important boundary conditions for the successful operation of the waterborne platooning concept. These findings serve as a framework for future business applications of the VT.

Keywords: autonomous vessels, NOVIMAR, vessel trains, waterborne platooning

Procedia PDF Downloads 191
243 A Data Driven Methodological Approach to Economic Pre-Evaluation of Reuse Projects of Ancient Urban Centers

Authors: Pietro D'Ambrosio, Roberta D'Ambrosio

Abstract:

The upgrading of the architectural and urban heritage of historic city centers almost always involves planning for the reuse and refunctionalization of their structures. Such interventions are complex because they must take into account the urban and social context in which the structure is set, as well as its intrinsic characteristics such as historical and artistic value. To these, of course, must be added the need to make a preliminary estimate of the recovery costs and, more generally, to assess the economic and financial sustainability of the whole re-socialization project. Particular difficulties are encountered in the pre-assessment of costs, since analytical surveys and structural tests are often impossible to perform, both because of the condition of the structures and because of obvious cost and time constraints. The methodology proposed in this work, based on a multidisciplinary and data-driven approach, is aimed at obtaining, at very low cost, reasonable economic evaluations of the interventions to be carried out. In addition, the specific features of the approach, derived from the predictive analysis techniques typically applied in complex IT domains (big data analytics), yield as an indirect result of the evaluation process a shared database that can be used on a generalized basis to estimate other, similar projects. This makes the methodology particularly suitable for cases where large-scale interventions across entire areas of historic city centers are expected. The methodology was partially tested during a study aimed at assessing the feasibility of a project for the reuse of the monumental complex of San Massimo, located in the historic center of Salerno, and is being further investigated.

Keywords: evaluation, methodology, restoration, reuse

Procedia PDF Downloads 152
242 Optimization Based Extreme Learning Machine for Watermarking of an Image in DWT Domain

Authors: Ram Pal Singh, Vikash Chaudhary, Monika Verma

Abstract:

In this paper, we propose the implementation of an optimization-based Extreme Learning Machine (ELM) for watermarking the B-channel of a color image in the discrete wavelet transform (DWT) domain. ELM, a regularization algorithm, works on generalized single-hidden-layer feed-forward neural networks (SLFNs); however, the hidden layer parameters, generally called the feature mapping in the context of ELM, need not be tuned every time. This paper shows the embedding and extraction processes of the watermark with the help of ELM, and the results are compared with machine learning models already used for watermarking. Here, a cover image is divided into a suitable number of non-overlapping blocks of the required size, and the DWT is applied to each block to transform it into the low-frequency sub-band domain. Essentially, ELM gives a unified learning platform with a feature mapping, that is, a mapping between the hidden layer and the output layer of the SLFN, which is tried here for watermark embedding and extraction in a cover image. ELM has widespread application, from binary and multiclass classification to regression and function estimation. Unlike SVM-based algorithms, which achieve suboptimal solutions with high computational complexity, ELM can provide better generalization performance with very low complexity. The efficacy of the optimization-based ELM algorithm is measured using quantitative and qualitative parameters on the watermarked image, even when the image is subjected to different types of geometrical and conventional attacks.
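As a point of reference, here is a minimal regularized ELM regressor of the kind the abstract builds on: random, untuned hidden-layer weights and a closed-form (ridge) solution for the output weights. It is a sketch of the learner only, not of the paper's DWT watermark embedding and extraction pipeline.

```python
# Minimal regularized ELM regressor: random hidden-layer weights are fixed, and only the
# output weights are solved in closed form (ridge/pseudo-inverse).
import numpy as np

class ELMRegressor:
    def __init__(self, n_hidden=200, reg=1e-3, seed=0):
        self.n_hidden, self.reg, self.rng = n_hidden, reg, np.random.default_rng(seed)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)          # random feature mapping h(x)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # Regularized least squares: beta = (H'H + reg*I)^-1 H'y
        self.beta = np.linalg.solve(H.T @ H + self.reg * np.eye(self.n_hidden), H.T @ y)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Toy check: learn a smooth function from noisy samples.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)
print("train MSE:", np.mean((ELMRegressor().fit(X, y).predict(X) - y) ** 2))
```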

Keywords: BER, DWT, extreme learning machine (ELM), PSNR

Procedia PDF Downloads 284
241 Investigating the Determinants and Growth of Financial Technology Depth of Penetration among the Heterogeneous Africa Economies

Authors: Tochukwu Timothy Okoli, Devi Datt Tewari

Abstract:

The high rate of Fintech adoption has not translated into greater financial inclusion and development in Africa. This problem is attributed to poor Fintech diversification and usefulness on the continent, a concept referred to in this study as the Fintech depth of penetration. The study therefore assessed its determinants and growth process in a panel of three emerging, twenty-four frontier and five fragile African economies, disaggregated with dummies over the period 2004-2018 to allow for heterogeneity between groups. The System Generalized Method of Moments (GMM) technique reveals that the average depth of mobile banking and automated teller machine (ATM) use is a dynamic, heterogeneous process. Moreover, users' previous experience/compatibility, trialability/income, and financial development were the major factors raising its usefulness, whereas perceived risk, financial openness, and the inflation rate significantly limit it. The growth rates of mobile banking, ATM, and Internet banking in 2018 are, on average, 41.82, 0.4, and 20.8 per cent greater, respectively, than their average rates in 2004. These greater averages after the 2009 financial crisis suggest that countries resort to Fintech as a risk-mitigating tool. This study therefore recommends greater Fintech diversification through improved literacy, institutional development, financial liberalization, and continuous innovation.

Keywords: depth of fintech, emerging Africa, financial technology, internet banking, mobile banking

Procedia PDF Downloads 107
240 An Investigation of Performance Versus Security in Cognitive Radio Networks with Supporting Cloud Platforms

Authors: Kurniawan D. Irianto, Demetres D. Kouvatsos

Abstract:

The growth of wireless devices affects the availability of limited frequencies or spectrum bands, since spectrum bands are a natural resource that cannot be added to. Many studies of the available spectrum have been carried out, and they show that licensed frequencies are idle most of the time. Cognitive radio is one solution to this problem: it is a promising technology that allows unlicensed users, known as secondary users (SUs), to access licensed bands without causing interference to licensed users, or primary users (PUs). As cloud computing has become popular in recent years, cognitive radio networks (CRNs) can be integrated with cloud platforms. One of the important issues in CRNs is security; it is a problem because CRNs use radio frequencies as the transmission medium and therefore share the same issues as wireless communication systems. Another critical issue in CRNs is performance. Security has an adverse effect on performance, and there are trade-offs between them. The goal of this paper is to investigate the performance-versus-security trade-off in CRNs with supporting cloud platforms. Furthermore, queueing network models with preemptive resume and preemptive repeat identical priority are applied in this project to measure the impact of security on performance in CRNs with or without a cloud platform. The generalized exponential (GE) type distribution is used to reflect the bursty inter-arrival and service times at the servers. The results show that the best performance is obtained when security is disabled and the cloud platform is enabled.

Keywords: performance vs. security, cognitive radio networks, cloud platforms, GE-type distribution

Procedia PDF Downloads 325
239 Micro-Channel Flows Simulation Based on Nonlinear Coupled Constitutive Model

Authors: Qijiao He

Abstract:

Micro-Electro-Mechanical Systems (MEMS) are one of the most rapidly developing frontier research fields, both in theoretical study and in applied technology, and the micro-channel is a very important linking component of MEMS. With the research and development of MEMS, the sizes of micro-devices and micro-channels become ever smaller. Compared with macroscale flow, the flow characteristics of gas in a micro-channel change, and the rarefaction effect becomes apparent. For rarefied gas and microscale flows, the Navier-Stokes-Fourier (NSF) equations are no longer appropriate because the continuum hypothesis breaks down. A Nonlinear Coupled Constitutive Model (NCCM) has been derived from the Boltzmann equation to describe the characteristics of both continuum and rarefied gas flows. We apply the present scheme to simulate continuum and rarefied gas flows in a micro-channel structure. For comparison, we apply other widely used methods based on particle simulation or on the direct solution of the distribution function, such as Direct Simulation Monte Carlo (DSMC), the Unified Gas-Kinetic Scheme (UGKS) and the Lattice Boltzmann Method (LBM), to simulate the same flows. The results show that the present solution agrees better with the experimental data and with the DSMC, UGKS and LBM results than the NSF results do in the rarefied cases, while it is in good agreement with the NSF results in the continuum cases. Some characteristics of both continuum and rarefied gas flows are observed and analyzed.

Keywords: continuum and rarefied gas flows, discontinuous Galerkin method, generalized hydrodynamic equations, numerical simulation

Procedia PDF Downloads 143
238 A Varicella Outbreak in a Highly Vaccinated School Population in Voluntary 2-Dose Era in Beijing, China

Authors: Chengbin Wang, Li Lu, Luodan Suo, Qinghai Wang, Fan Yang, Xu Wang, Mona Marin

Abstract:

Background: Two-dose varicella vaccination has been recommended in Beijing since November 2012. We investigated a varicella outbreak in a highly vaccinated elementary school population to examine transmission patterns and risk factors for vaccine failure. Methods: A varicella case was defined as an acute generalized maculopapulovesicular rash without other apparent cause in a student attending the school from March 28 to May 17, 2015. Breakthrough varicella was defined as varicella >42 days after last vaccine dose. Vaccination information was collected from immunization records. Information on prior disease and clinical presentation was collected via survey of students’ parents. Results: Of the 1056 school students, 1028 (97.3%) reported no varicella history, of whom 364 (35.4%) had received 1-dose and 650 (63.2%) had received 2-dose varicella vaccine, for 98.6% school-wide vaccination coverage with ≥ 1 dose before the outbreak. A total of 20 cases were identified for an overall attack rate of 1.9%. The index case was in a 2-dose vaccinated student who was not isolated. The majority of cases were breakthrough (19/20, 95%) with attack rates of 7.1% (1/14), 1.6% (6/364) and 2.0% (13/650) among unvaccinated, 1-dose, and 2-dose students, respectively. Most cases had < 50 lesions (18/20, 90%). No difference was found between 1-dose and 2-dose breakthrough cases in disease severity or sociodemographic factors. Conclusion: Moderate 2-dose varicella vaccine coverage was insufficient to prevent a varicella outbreak. Two-dose breakthrough varicella is still contagious. High 2-dose varicella vaccine coverage and timely isolation of ill persons might be needed for varicella outbreak control in the 2-dose era.

Keywords: varicella, outbreak, breakthrough varicella, vaccination

Procedia PDF Downloads 303
237 The Brain’s Attenuation Coefficient as a Potential Estimator of Temperature Elevation during Intracranial High Intensity Focused Ultrasound Procedures

Authors: Daniel Dahis, Haim Azhari

Abstract:

Noninvasive image-guided intracranial treatments using high intensity focused ultrasound (HIFU) are in the course of translation into clinical applications. They include, among others, tumor ablation, hyperthermia, and blood-brain-barrier (BBB) penetration. Since many of these procedures are associated with local temperature elevation, thermal monitoring is essential. MRI constitutes an imaging method with high spatial resolution and thermal mapping capacity. It is currently the leading modality for temperature guidance, commonly under the name MRgHIFU (magnetic-resonance guided HIFU). Nevertheless, MRI is a very expensive non-portable modality, which jeopardizes its accessibility. Ultrasonic thermal monitoring, on the other hand, could provide a modular, cost-effective alternative with higher temporal resolution and accessibility. In order to assess the feasibility of ultrasonic brain thermal monitoring, this study investigated the use of temporal changes in the brain tissue attenuation coefficient (AC) as potential estimators of thermal changes. Newton's law of cooling describes a temporal exponential decay behavior for the temperature of a heated object immersed in a relatively cold surrounding. Similarly, in the case of cerebral HIFU treatments, the temperature in the region of interest, i.e., the focal zone, is suggested to follow the same law. Thus, it was hypothesized that the AC of the irradiated tissue may follow a temporal exponential behavior during the cool-down regime. Three ex-vivo bovine brain tissue specimens were inserted into plastic containers along with four thermocouple probes in each sample. The containers were placed inside a specially built ultrasonic tomograph and scanned at room temperature. The corresponding pixel-averaged AC was acquired for each specimen and used as a reference. Subsequently, the containers were placed in a beaker containing hot water and gradually heated to about 45°C. They were then repeatedly rescanned during cool down using an ultrasonic through-transmission raster trajectory until reaching about 30°C. From the obtained images, the normalized AC and its temporal derivative as a function of temperature and time were registered. The results demonstrated a high correlation (R² > 0.92) of both the brain AC and its temporal derivative with temperature. This indicates the validity of the hypothesis and the possibility of obtaining brain tissue temperature estimates from the temporal thermal changes of the AC. It is important to note that each brain yielded different AC values and slopes, which implies that a calibration step is required for each specimen. Thus, for practical acoustic monitoring of the brain, two steps are suggested. The first step consists of simply measuring the AC at normal body temperature. The second step entails measuring the AC after a small temperature elevation. In the face of the urgent need for a more accessible thermal monitoring technique for brain treatments, the proposed methodology enables cost-effective, high-temporal-resolution acoustic temperature estimation during HIFU treatments.
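A small sketch of the curve-fitting step implied by the hypothesis above: Newton's-law exponentials are fitted to synthetic cool-down data for both temperature and an AC that is assumed to track temperature linearly. The study's per-specimen calibration and imaging chain are not reproduced.

```python
# Sketch: fit Newton's-law-of-cooling exponentials to synthetic cool-down data, mirroring
# the hypothesized behavior of the attenuation coefficient (not the ex-vivo measurements).
import numpy as np
from scipy.optimize import curve_fit

def cooling(t, baseline, delta0, k):
    return baseline + delta0 * np.exp(-k * t)     # Newton's law of cooling

t = np.linspace(0, 60, 30)                        # minutes of cool down
temp = cooling(t, 30.0, 15.0, 0.06) + 0.1 * np.random.default_rng(2).normal(size=t.size)
ac = 0.5 + 0.01 * temp                            # assumption: AC tracks temperature linearly

popt_T, _ = curve_fit(cooling, t, temp, p0=(25.0, 10.0, 0.1))
popt_AC, _ = curve_fit(cooling, t, ac, p0=(0.7, 0.2, 0.1))
print(f"decay rate from temperature: {popt_T[2]:.3f} 1/min, from AC: {popt_AC[2]:.3f} 1/min")
# Comparable decay rates support using the AC's temporal change as a temperature proxy.
```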

Keywords: attenuation coefficient, brain, HIFU, image-guidance, temperature

Procedia PDF Downloads 140
236 Application of Nonparametric Geographically Weighted Regression to Evaluate the Unemployment Rate in East Java

Authors: Sifriyani Sifriyani, I Nyoman Budiantara, Sri Haryatmi, Gunardi Gunardi

Abstract:

East Java Province ranks first among Indonesian provinces in the number of counties and cities and has the largest population. In 2015, the population reached 38,847,561, a figure that reflects very high population growth. High population growth is feared to lead to increased levels of unemployment. In this study, the researchers mapped and modeled the unemployment rate with 6 variables that were hypothesized to influence it. Modeling was done by nonparametric geographically weighted regression with a truncated spline approach. This method was chosen because the spline is flexible: such models tend to follow the shape suggested by the data themselves. In this modeling there are knot points, i.e., points that mark changes in the behavior of the data, and the optimum knot points were selected as those giving the minimum value of the Generalized Cross Validation (GCV) criterion. Based on the research, 6 variables were found to affect the level of unemployment in East Java: the percentage of the population educated beyond high school, the rate of economic growth, the population density, the ratio of investment to the total labor force, the regional minimum wage, and the ratio of the number of large- and medium-scale industries to the work force. The nonparametric geographically weighted regression model with the truncated spline approach had a coefficient of determination of 98.95% and an MSE of 0.0047.
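To make the knot-selection idea concrete, the sketch below fits a truncated-linear spline to synthetic data and picks the knot that minimizes GCV; the geographic weighting of the full GWR model and the study's actual covariates are omitted.

```python
# Sketch of truncated-spline regression with the knot chosen by minimizing GCV
# (geographic weighting omitted; synthetic data with a change point at x = 4).
import numpy as np

def truncated_linear_basis(x, knot):
    return np.column_stack([np.ones_like(x), x, np.maximum(x - knot, 0.0)])

def gcv(x, y, knot):
    X = truncated_linear_basis(x, knot)
    H = X @ np.linalg.solve(X.T @ X, X.T)          # hat matrix
    resid = y - H @ y
    n = len(y)
    return n * np.sum(resid**2) / (n - np.trace(H)) ** 2

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, 200)
y = np.where(x < 4, 1.0 * x, 4.0 + 0.2 * (x - 4)) + rng.normal(scale=0.3, size=x.size)

candidate_knots = np.linspace(1, 9, 81)
best = min(candidate_knots, key=lambda k: gcv(x, y, k))
print("knot with minimum GCV:", round(best, 2))    # should land near the true change point at 4
```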

Keywords: East Java, nonparametric geographically weighted regression, spatial, spline approach, unemployed rate

Procedia PDF Downloads 291
235 A Multi-Release Software Reliability Growth Model Incorporating Imperfect Debugging and Change-Point under the Simulated Testing Environment and Software Release Time

Authors: Sujit Kumar Pradhan, Anil Kumar, Vijay Kumar

Abstract:

Testing the software during development is a crucial step, as it makes the software more efficient and dependable. To estimate software reliability through the mean value function, many software reliability growth models (SRGMs) have been developed under the assumption that the operating and testing environments are the same. In practice, this is not true, because the reliability of software differs when it operates in a natural field environment. This article discusses an SRGM comprising a change-point and imperfect debugging in a simulated testing environment, and then extends it in a multi-release direction. Initially, software is released to the market with a few features; according to market demand, the software company upgrades the current version by adding new features as time passes. We therefore propose a generalized multi-release SRGM in which the change-point and imperfect debugging concepts are addressed in a simulated testing environment. The failure-increasing-rate concept is adopted to determine the change point for each software release. Based on nine goodness-of-fit criteria, the proposed model is validated on two real datasets, and the results demonstrate that it fits the datasets better. We also discuss the optimal release time of the software through a cost model, assuming that the testing and debugging costs are time-dependent.
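As a simple reference point for the mean value function idea, the sketch below fits a basic Goel-Okumoto NHPP form to made-up cumulative failure counts; the paper's generalized multi-release, change-point, imperfect-debugging model is, of course, richer than this.

```python
# Sketch of fitting an NHPP mean value function to cumulative failure data
# (basic Goel-Okumoto form; synthetic counts, not the paper's datasets).
import numpy as np
from scipy.optimize import curve_fit

def mean_value(t, a, b):
    return a * (1.0 - np.exp(-b * t))     # expected cumulative failures by testing time t

weeks = np.arange(1, 21)
cum_failures = np.array([ 5, 11, 16, 22, 27, 31, 35, 38, 41, 43,
                         45, 47, 48, 50, 51, 52, 53, 53, 54, 54], dtype=float)

(a_hat, b_hat), _ = curve_fit(mean_value, weeks, cum_failures, p0=(60.0, 0.1))
print(f"estimated total faults a = {a_hat:.1f}, detection rate b = {b_hat:.3f}")
print("estimated remaining faults:", round(a_hat - cum_failures[-1], 1))
```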

Keywords: software reliability growth models, non-homogeneous Poisson process, multi-release software, mean value function, change-point, environmental factors

Procedia PDF Downloads 49
234 Effect of Particle Aspect Ratio and Shape Factor on Air Flow inside Pulmonary Region

Authors: Pratibha, Jyoti Kori

Abstract:

Particles in industry, harvesting, coal mines, etc. are not necessarily spherical in shape; in general, it is difficult to find a perfectly spherical particle. The prediction of the movement and deposition of non-spherical particles in distinct airway generations is much more difficult than for spherical particles. Moreover, there is considerable variability in deposition between ducts of a particular generation and within every alveolar duct, since particle concentrations can be much larger than the mean acinar concentration. Consequently, a large number of particles fail to be exhaled during expiration. This study presents a mathematical model for the movement and deposition of such non-spherical particles using the particle aspect ratio and shape factor. We analyse the pulsatile behavior under sinusoidal wall oscillation due to the periodic breathing condition through a non-Darcian porous medium, i.e., inside the pulmonary region. Since the fluid is viscous and Newtonian, the generalized Navier-Stokes equations in the two-dimensional coordinate system (r, z) are used with boundary-layer theory. Results are obtained for various values of the Reynolds number, Womersley number, Forchheimer number, particle aspect ratio and shape factor. Numerical computation is performed using a finite difference scheme on a very fine mesh in MATLAB. It is found that the overall air velocity is significantly increased by changes in aerodynamic diameter, aspect ratio, alveoli size, Reynolds number and pulse rate, while the velocity is decreased by increasing the Forchheimer number.

Keywords: deposition, interstitial lung diseases, non-Darcian medium, numerical simulation, shape factor

Procedia PDF Downloads 152
233 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test

Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman

Abstract:

At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can compare a number of the most commonly adopted probability distributions and parameter estimation methods relatively quickly using a Windows interface. The new version of FLIKE incorporates the multiple Grubbs and Beck test, which can identify multiple potentially influential low flows. This paper presents a case study of six catchments in eastern Australia which compares two outlier identification tests (the original Grubbs and Beck test and the multiple Grubbs and Beck test) and two commonly applied probability distributions (the Generalized Extreme Value (GEV) and Log Pearson type 3 (LP3)) using the FLIKE software. It has been found that the multiple Grubbs and Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than when the LP3 distribution is used with the original Grubbs and Beck test. Between these two methods, the differences in flood quantile estimates have been found to be up to 61% for the six study catchments. It has also been found that the GEV distribution (with L-moments) and the LP3 distribution with the multiple Grubbs and Beck test provide quite similar results in most cases; however, a difference of up to 38% has been noted in the flood quantile for an annual exceedance probability (AEP) of 1 in 100 for one catchment. These findings need to be confirmed with a greater number of stations across other Australian states.
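For orientation, a sketch of the original (single) Grubbs-Beck low-outlier screen is given below on made-up annual peaks; the K_N approximation used here is the one commonly cited for a 10% significance level (roughly 10 to 149 observations), and the multiple Grubbs-Beck test generalizes this by sweeping over candidate numbers of low outliers. It is not the FLIKE implementation.

```python
# Sketch of the original (single) Grubbs-Beck low-outlier screen as commonly applied in
# flood frequency analysis: flows below 10**(mean - K_N * std) of the log series are flagged.
import numpy as np

def grubbs_beck_threshold(flows):
    logq = np.log10(np.asarray(flows, dtype=float))
    n = logq.size
    # Commonly cited approximation of the 10%-level critical value K_N.
    k_n = -0.9043 + 3.345 * np.sqrt(np.log10(n)) - 0.4046 * np.log10(n)
    return 10.0 ** (logq.mean() - k_n * logq.std(ddof=1))

annual_peaks = np.array([820, 640, 1150, 90, 760, 980, 30, 700, 880, 1020,
                         540, 610, 930, 480, 1240, 870, 660, 720, 590, 810])
threshold = grubbs_beck_threshold(annual_peaks)
low_outliers = annual_peaks[annual_peaks < threshold]
print(f"threshold = {threshold:.0f}, potential low outliers = {low_outliers}")
```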

Keywords: floods, FLIKE, probability distributions, flood frequency, outlier

Procedia PDF Downloads 415