Search results for: standard normal variance
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8487

7977 The Influence of Interior Decoration on Customer's Perception of Hotels in Uyo, Akwa Ibom State Nigeria

Authors: O. B. Enemuo, A. A. Onubuogu

Abstract:

This work evaluated the influence of interior decoration on customers' perception of hotels in Uyo, Akwa Ibom State. Specifically, the study identified the various interior decorations used in hotels in the study area, determined which interior decorations appeal most to customers, ascertained the influence of interior decoration on the level of patronage of hotels in the study area, and suggested ways of improving the interior decoration of hotels in the study area for sustainability. The study was guided by four research questions and two hypotheses. It adopted a survey research design; a structured questionnaire was used for data collection. The sample for the study was four hundred (400) staff and managers from the various hotels in the study area. Data generated were analyzed using means and standard deviations, with analyses of variance (ANOVA) derived from regression analyses used to test the hypotheses. The results showed that satisfactory interior decoration has a positive influence on the sustainability of the hospitality establishments in Uyo. The hypothesis tests showed a significant relationship between gender and perception of the influence of interior decoration in the hotels. From the findings, it was recommended that hotels design an interior decorative service delivery system that supports customer satisfaction in the hospitality industry and maintain a healthy decorative environment to increase customer satisfaction.

Keywords: influence, interior decoration, customer’s perception, hotels

Procedia PDF Downloads 295
7976 A Conundrum of Teachability and Learnability of Deaf Adult English as Second Language Learners in Pakistani Mainstream Classrooms: Integration or Elimination

Authors: Amnah Moghees, Saima Abbas Dar, Muniba Saeed

Abstract:

Teaching a second language to deaf learners has always been a challenge in Pakistan. Different approaches and strategies have been followed, but they have resulted in partial or complete failure. The study aims to investigate the language problems faced by adult deaf learners of English as a second language in mainstream classrooms. Moreover, the study also determines the factors which are heavily involved in language teaching and learning in mainstream classes. To investigate the language problems, data will be collected through writing samples of ten deaf adult learners and ten normal ESL learners of the same class; observation in inclusive language teaching classrooms and interviews with five ESL teachers in inclusive classes will be conducted to identify the factors which are directly or indirectly involved in inclusive language education. In view of this, a qualitative research paradigm will be applied to analyse the corpus. The study finds that deaf ESL learners face severe language issues, such as odd sentence structures, violation of subject-verb agreement, and misuse of verb forms and tenses, as compared to normal ESL learners. The study also predicts that in mainstream classrooms multiple factors affect the smoothness of the teaching and learning procedure: the role of the mediator, the level of deaf learners, the empathy of normal learners towards deaf learners, and language teachers' training.

Keywords: deaf English language learner, empathy, mainstream classrooms, previous language knowledge of learners, role of mediator, language teachers' training

Procedia PDF Downloads 166
7975 Association Between Malnutrition and Dental Caries in Children

Authors: Mohammed Khalid Mahmood, Delphine Tardivo, Romain Lan

Abstract:

Dental caries is one of the most common diseases in the world, affecting billions of people and significantly lowering quality of life. Malnutrition, on the other hand, is an abnormal physiological condition defined as inadequate, imbalanced, or excessive consumption of macronutrients, micronutrients, or both. Oral health is impacted by malnutrition, and malnutrition can result from poor oral health. The objective of this paper was to study the association of serum vitamin D level and body mass index, as representatives of malnutrition at the micro and macro levels respectively, with dental caries. Results showed that: 1. The majority of the population studied (70%) are vitamin D deficient. 2. Having a normal or even a sufficient level of serum vitamin D and having a normal body mass index increase the chances of children being caries-free and having a lower caries index.

Keywords: children, dental caries, malnutrition, vitamin D

Procedia PDF Downloads 80
7974 The Effect of Non-Normality on CB-SEM and PLS-SEM Path Estimates

Authors: Z. Jannoo, B. W. Yap, N. Auchoybur, M. A. Lazim

Abstract:

The two common approaches to Structural Equation Modeling (SEM) are Covariance-Based SEM (CB-SEM) and Partial Least Squares SEM (PLS-SEM). There is much debate on the performance of CB-SEM and PLS-SEM for small sample sizes and when distributions are non-normal. This study evaluates the performance of CB-SEM and PLS-SEM under normality and non-normality conditions via a simulation. Monte Carlo simulation in the R programming language was employed to generate data based on a theoretical model with one endogenous and four exogenous variables; each latent variable has three indicators. For normal distributions, CB-SEM estimates were found to be inaccurate for small sample sizes, while PLS-SEM could still produce the path estimates. For larger sample sizes, CB-SEM estimates have lower variability than PLS-SEM. Under non-normality, CB-SEM path estimates were inaccurate for small sample sizes; however, CB-SEM estimates are more accurate than those of PLS-SEM for sample sizes of 50 and above. The PLS-SEM estimates are not accurate unless the sample size is very large.
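As a rough illustration of the data-generating step described above, the sketch below simulates indicator data for one endogenous and four exogenous latent variables with three indicators each. The loading (0.7) and path (0.3) values, the skewed non-normal alternative, and the use of Python rather than the authors' R code are illustrative assumptions, not the study's exact design.

```python
import numpy as np

rng = np.random.default_rng(42)

def generate_sem_data(n, loading=0.7, path=0.3, normal=True):
    """Simulate indicators for 4 exogenous and 1 endogenous latent variable,
    three indicators each (parameter values are illustrative)."""
    if normal:
        exo = rng.normal(size=(n, 4))
    else:
        exo = rng.exponential(size=(n, 4)) - 1.0     # skewed, mean-zero alternative
    endo = exo @ np.full(4, path) + rng.normal(size=n)  # structural equation
    latents = np.column_stack([exo, endo])
    indicators = np.repeat(latents, 3, axis=1) * loading
    indicators += rng.normal(scale=np.sqrt(1 - loading**2), size=indicators.shape)
    return indicators                                # n x 15 matrix of observed variables

X_small = generate_sem_data(50)                      # small-sample condition
X_nonnormal = generate_sem_data(500, normal=False)   # non-normality condition
```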

Keywords: CB-SEM, Monte Carlo simulation, normality conditions, non-normality, PLS-SEM

Procedia PDF Downloads 410
7973 Electrical and Magnetoelectric Properties of (y)Li0.5Ni0.7Zn0.05Fe2O4 + (1-y)Ba0.5Sr0.5TiO3 Magnetoelectric Composites

Authors: S. U. Durgadsimi, S. Chougule, S. Bellad

Abstract:

(y) Li0.5Ni0.7Zn0.05Fe2O4 + (1-y) Ba0.5Sr0.5TiO3 magnetoelectric composites with y = 0.1, 0.3 and 0.5 were prepared by a conventional standard double-sintering ceramic technique. X-ray diffraction analysis confirmed the phase formation of the ferrite, the ferroelectric and their composites. Plots of log ρ_dc vs. 1/T reveal that the dc resistivity decreases with increasing temperature, exhibiting semiconductor behavior. The plots of log σ_ac vs. log ω² are almost linear, indicating that the conductivity increases with increasing frequency, i.e., conduction in the composites is due to small polaron hopping. The dielectric constant (ε′) and dielectric loss (tan δ) were studied as a function of frequency in the range 100 Hz–1 MHz, revealing normal dielectric behavior except for the composite with y = 0.1, and as a function of temperature at four fixed frequencies (100 Hz, 1 kHz, 10 kHz, 100 kHz). The ME voltage coefficient decreases with increasing ferrite content and was observed to reach a maximum of about 7.495 mV/cm·Oe for the (0.1) Li0.5Ni0.7Zn0.05Fe2O4 + (0.9) Ba0.5Sr0.5TiO3 composite.

Keywords: XRD, dielectric constant, dielectric loss, DC and AC conductivity, ME voltage coefficient

Procedia PDF Downloads 344
7972 Classification of ECG Signal Based on Mixture of Linear and Non-Linear Features

Authors: Mohammad Karimi Moridani, Mohammad Abdi Zadeh, Zahra Shahiazar Mazraeh

Abstract:

In recent years, the use of intelligent systems in biomedical engineering has increased dramatically, especially in the diagnosis of various diseases. Because the electrocardiogram (ECG) signal is relatively simple to record, it is a good tool for showing the function of the heart and the diseases associated with it. The aim of this paper is to design an intelligent system for automatically distinguishing a normal electrocardiogram signal from an abnormal one. Using this diagnostic system, it is possible to identify a person's heart condition in a very short time and with high accuracy. The data used in this article are from the PhysioNet database, made available in 2016 for researchers seeking the best method for detecting normal signals from abnormal ones. The data come from both genders, the recording time varies from several seconds to several minutes, and all data are labeled normal or abnormal. Because of the limited positional accuracy and recording duration of the ECG signal, and the similarity of the signal in some diseases to the normal signal, the heart rate variability (HRV) signal was used. Measuring and analyzing heart rate variability over time to evaluate the activity of the heart, and differentiating different types of heart failure from one another, is of interest to experts. In the preprocessing stage, after noise cancellation by an adaptive Kalman filter and extraction of the R wave by the Pan-Tompkins algorithm, R-R intervals were extracted and the HRV signal was generated. In the processing stage, a new idea was presented: in addition to using the statistical characteristics of the signal, a return map was created and nonlinear characteristics of the HRV signal were extracted, owing to the nonlinear nature of the signal. Finally, artificial neural networks, widely used in the field of ECG signal processing, together with the distinctive features were used to classify the normal signals from the abnormal ones. To evaluate the efficiency of the proposed classifiers, the area under the ROC curve (AUC) was used. The results of the simulation in the MATLAB environment showed that the AUC of the MLP neural network and of the SVM was 0.893 and 0.947, respectively. The results also indicated that greater use of nonlinear characteristics in classifying normal versus patient signals gave better performance. Today, research aims to quantitatively analyze the linear and non-linear, or deterministic and random, nature of the heart rate variability signal, because it has been shown that these properties can indicate the health status of an individual's heart. The study of the nonlinear behavior and dynamics of the heart's neural control system in the short and long term provides new information on how the cardiovascular system functions and has driven further research in this field. Given that the ECG signal contains important information and is one of the common tools used by physicians to diagnose heart disease, but that some of this information is hidden from the physician's viewpoint, the intelligent system proposed in this paper can help physicians diagnose healthy and patient individuals with greater speed and accuracy and can be used as a complementary system in treatment centers.
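The return-map idea mentioned above is commonly realized as Poincaré-plot descriptors of the RR series; the sketch below computes the standard SD1/SD2 features under that assumption (the RR values shown are invented for illustration, not taken from the PhysioNet data).

```python
import numpy as np

def poincare_features(rr):
    """SD1/SD2 descriptors of the RR-interval return map (Poincaré plot).
    rr: 1-D array of successive RR intervals in seconds."""
    x, y = rr[:-1], rr[1:]                 # each interval plotted against the next
    sd1 = np.std((y - x) / np.sqrt(2.0), ddof=1)  # short-term (beat-to-beat) variability
    sd2 = np.std((y + x) / np.sqrt(2.0), ddof=1)  # long-term variability
    return sd1, sd2, sd1 / sd2

# illustrative RR series; real input would come from Pan-Tompkins R-peak detection
rr = np.array([0.80, 0.82, 0.78, 0.85, 0.81, 0.79, 0.83])
print(poincare_features(rr))
```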

Keywords: heart rate variability, signal processing, linear and non-linear features, classification methods, ROC curve

Procedia PDF Downloads 262
7971 Studies on Partial Replacement of Cement by Rice Husk Ash under Sodium Phosphate Medium

Authors: Dharmana Pradeep, Chandan Kumar Patnaikuni, N. V. S. Venugopal

Abstract:

Rice husk ash (RHA) is a green product that contains carbon and is rich in silica. Curing is very important for the development of the durability and strength of any concrete. In this communication, we report on the partial replacement of cement with RHA at percentages of 0%, 5%, 7.5%, 10%, 12.5% and 15% by weight under a sodium phosphate curing atmosphere. The mix is designed for M40 grade concrete with proportions of 1:2.2:3.72. The test conducted on the concrete was compressive strength, and the specimens were cured in normal water and exposed to the chemical solution for 7, 28 & 56 days. For chemical curing, 0.5% and 1% concentrated sodium phosphate solutions were used and compared with normal concrete strength results. The specimens under 1% sodium phosphate exposure showed that the compressive strength decreased with increasing RHA percentage.

Keywords: rice husk ash, compressive strength, sodium phosphate, curing

Procedia PDF Downloads 345
7970 Factors of Influence in Software Process Improvement: An ISO/IEC 29110 for Very-Small Entities

Authors: N. Wongsai, R. Wetprasit, V. Siddoo

Abstract:

The recently introduced ISO/IEC 29110 standard, Lifecycle Profile for Very Small Entities (VSEs), has been adopted and practiced in many small and medium software companies, including in Thailand's software industry. Many Thai companies complete their software process improvement (SPI) initiative program and have been certified. A number of participants, however, fail to succeed. This study was concerned with the factors that influence the accomplishment of the standard's implementation across various VSE characteristics. To achieve this goal, critical factors were explored and extracted from prior studies, and the obtained factors were then validated by standard experts. Data analysis of comments and recommendations was performed using a qualitative content analysis method. This paper presents an initial set of factors that positively or negatively influence ISO/IEC 29110 implementation, with the aim of helping SPI practitioners manage an appropriate adoption approach and achieve its implementation.

Keywords: barriers, critical success factors, ISO/IEC 29110, Software Process Improvement, SPI, Very-Small Entity, VSE

Procedia PDF Downloads 315
7969 Finding the Longest Common Subsequence in Normal DNA and Disease Affected Human DNA Using Self Organizing Map

Authors: G. Tamilpavai, C. Vishnuppriya

Abstract:

Bioinformatics is an active research area which combines biological subject matter with computer science research. The longest common subsequence (LCSS) is one of the major challenges in various bioinformatics applications. The computation of the LCSS plays a vital role in biomedicine, and it is an essential task in DNA sequence analysis in genetics, including a wide range of disease-diagnosis steps. The objective of this proposed system is to find the longest common subsequence present in a normal and in various disease-affected human DNA sequences using a Self Organizing Map (SOM) and LCSS. The human DNA sequences are collected from the National Center for Biotechnology Information (NCBI) database. Initially, the human DNA sequence is separated into k-mers using a k-mer separation rule. Mean and median values are calculated from each separated k-mer. These calculated values are fed as input to the Self Organizing Map for the purpose of clustering. The obtained clusters are then given to the longest common subsequence (LCSS) algorithm to find the common subsequence present in every cluster. It returns n(n-1)/2 subsequences for each cluster, where n is the number of k-mers in a specific cluster. Experimental outcomes of this proposed system produce the possible longest common subsequences of normal and disease-affected DNA data. Thus, the proposed system will be a good initial aid for finding disease-causing sequences. Finally, a performance analysis is carried out for different DNA sequences. The obtained values show that the retrieval of the LCSS is done in a shorter time than in the existing system.
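A minimal sketch of two of the building blocks named above, k-mer separation and longest-common-subsequence extraction via dynamic programming, follows; the toy sequences are illustrative, and the SOM clustering step is omitted.

```python
def kmers(seq, k):
    """Split a DNA string into overlapping k-mers."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def lcss(a, b):
    """Longest common subsequence of two strings via dynamic programming."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    # backtrack to recover one longest common subsequence
    out, i, j = [], m, n
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

print(kmers("ATGCGT", 3))        # ['ATG', 'TGC', 'GCG', 'CGT']
print(lcss("ATGCGT", "ATCCGT"))  # 'ATCGT' (length 5)
```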

Keywords: clustering, k-mers, longest common subsequence, SOM

Procedia PDF Downloads 267
7968 Use of McCloskey/Mueller Satisfaction Scale in Evaluating Satisfaction with Working Conditions of Nurses in Slovakia

Authors: Vladimir Siska, Lukas Kober

Abstract:

Introduction: The research deals with the work satisfaction of nurses working in healthcare institutions in the Slovak Republic, and the factors influencing it. Employers should create working conditions that are consonant with the requirements of their employees and make the most of motivation strategies to help them answer the employees' needs, in concordance with the various needs and motivation process theories. Methodology: In our research, we aimed to investigate the level of work satisfaction of nurses by carrying out a quantitative analysis using the standardized McCloskey/Mueller Satisfaction Scale questionnaire. We used descriptive statistics (mean, median, and measures of variability: standard deviation, minimum and maximum) to process the collected data and, to verify our hypotheses, we employed the two-sample Student's t-test, the Mann-Whitney U test, and a one-way analysis of variance (one-way ANOVA). Results: Nurses' satisfaction with external rewards is influenced by their age, years of experience, and level of completed education, with all of the abovementioned factors also impacting the nurses' satisfaction with their work schedule. The type of founding authority of the healthcare institution also influences the nurses' satisfaction with relationships in the workplace. Conclusion: The feeling of work dissatisfaction can influence employees in many ways, e.g., it can take the form of burn-out syndrome, absenteeism, or increased staff turnover. Therefore, it is important to pay increased attention to all employees of an organisation, regardless of their position.

Keywords: motivation, nurse, work satisfaction, McCloskey/Mueller satisfaction scale

Procedia PDF Downloads 129
7967 Robust Inference with a Skew T Distribution

Authors: M. Qamarul Islam, Ergun Dogan, Mehmet Yazici

Abstract:

There is a growing body of evidence that non-normal data is more prevalent in nature than normal data. Examples can be quoted from, but are not restricted to, the areas of Economics, Finance and Actuarial Science. The non-normality considered here is expressed in terms of fat-tailedness and asymmetry of the relevant distribution. In this study, a skew t distribution that can be used to model data exhibiting inherently non-normal behavior is considered. This distribution has tails fatter than a normal distribution and it also exhibits skewness. Although maximum likelihood estimates can be obtained by iteratively solving the likelihood equations, which are non-linear in form, this can be problematic in terms of convergence and in many other respects as well. Therefore, it is preferred to use the method of modified maximum likelihood, in which the likelihood estimates are derived by expressing the intractable non-linear likelihood equations in terms of standardized ordered variates and replacing the intractable terms by their linear approximations obtained from the first two terms of a Taylor series expansion about the quantiles of the distribution. These estimates, called modified maximum likelihood estimates, are obtained in closed form. Hence, they are easy to compute and to manipulate analytically. In fact, the modified maximum likelihood estimates are asymptotically equivalent to maximum likelihood estimates. Even in small samples, the modified maximum likelihood estimates are found to be approximately the same as the maximum likelihood estimates obtained iteratively. It is shown in this study that the modified maximum likelihood estimates are not only unbiased but substantially more efficient than the commonly used moment estimates or the least square estimates, which are known to be biased and inefficient in such cases. Furthermore, in conventional regression analysis it is assumed that the error terms are distributed normally and, hence, the well-known least square method is considered a suitable and preferred method for making the relevant statistical inferences. However, a number of empirical studies have shown that non-normal errors are more prevalent, and even transforming and/or filtering techniques may not produce normally distributed residuals. Here, a study is done for multiple linear regression models with random errors having a non-normal pattern. Through an extensive simulation it is shown that the modified maximum likelihood estimates of the regression parameters are plausibly robust to the distributional assumptions and to various data anomalies, as compared to the widely used least square estimates. Relevant tests of hypothesis are developed and explored for desirable properties in terms of their size and power. The tests based upon modified maximum likelihood estimates are found to be substantially more powerful than the tests based upon least square estimates. Several examples are provided from the areas of Economics and Finance where such distributions are interpretable in terms of the efficient market hypothesis with respect to asset pricing, portfolio selection, risk measurement and capital allocation, etc.
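The linearization device described above can be sketched generically as follows, where g denotes an intractable non-linear term in the likelihood equations, z_(i) is the i-th standardized ordered variate, and t_i is the population quantile about which the first-order Taylor expansion is taken (a sketch of the general device, not the authors' exact derivation for the skew t):

```latex
g\bigl(z_{(i)}\bigr) \;\approx\; \alpha_i + \beta_i\, z_{(i)},
\qquad
\beta_i = g'(t_i),
\qquad
\alpha_i = g(t_i) - t_i\, g'(t_i),
```

so that the modified likelihood equations become linear in the ordered variates and admit closed-form solutions.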

Keywords: least square estimates, linear regression, maximum likelihood estimates, modified maximum likelihood method, non-normality, robustness

Procedia PDF Downloads 397
7966 Impact of a Virtual Reality-Training on Real-World Hockey Skill: An Intervention Trial

Authors: Matthew Buns

Abstract:

Training specificity is imperative for successful performance by the elite athlete. Virtual reality (VR) has been successfully applied to a broad range of training domains. To date, however, there is little research investigating the use of VR for sport training. The purpose of this study was to address the question of whether virtual reality (VR) training can improve real-world hockey shooting performance. Twenty-four volunteers were recruited and randomly assigned either to complete the virtual training intervention or to enter a control group with no training. Four primary types of data were collected: 1) participants' experience with video games and hockey, 2) participants' motivation toward video game use, 3) participants' technical performance in real-world hockey, and 4) participants' technical performance in virtual hockey. A one-way multivariate analysis of variance (MANOVA) indicated that the intervention group demonstrated significantly greater real-world hockey accuracy [F(1,24) = 15.43, p < .01, E.S. = 0.56] while shooting on goal than their control group counterparts [intervention M accuracy = 54.17%, SD = 12.38; control M accuracy = 46.76%, SD = 13.45]. A one-way repeated-measures MANOVA indicated significantly higher outcome scores on real-world accuracy (35.42% versus 54.17%; ES = 1.52) and velocity (51.10 mph versus 65.50 mph; ES = 0.86) of hockey shooting on goal. This research supports the idea that virtual training is an effective tool for increasing real-world hockey skill.
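The reported effect sizes can be approximately reproduced from the group summaries with a pooled-standard-deviation Cohen's d; the sketch below assumes an even split of the 24 volunteers (n = 12 per group), which is not stated in the abstract.

```python
import numpy as np

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d with a pooled standard deviation (one common convention)."""
    pooled = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled

# accuracy means/SDs reported in the abstract; n = 12 per group is assumed
print(cohens_d(54.17, 12.38, 12, 46.76, 13.45, 12))  # ~0.57, close to the reported 0.56
```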

Keywords: virtual training, hockey skills, video game, esports

Procedia PDF Downloads 147
7965 Temperature Distribution Control for Baby Incubator System Using Arduino AT Mega 2560

Authors: W. Widhiada, D. N. K. P. Negara, P. A. Suryawan

Abstract:

Technological advances in the field of health are very important, especially for the safety of babies. Many premature infant deaths are caused by poorly managed health facilities; in particular, deaths of premature babies are often caused by bacteria when the temperature around the baby is not normal. Incubator equipment is therefore important, especially the means of controlling the temperature in the incubator. On/off control is used to regulate the temperature distribution in the incubator so that the desired temperature of 36 °C stays stable. The authors observed and analyzed the data to determine the temperature distribution in the incubator using MATLAB/Simulink. The desired output temperature distribution of 36 °C is reached in 400 seconds using an Arduino ATmega2560. This incubator is able to maintain the ambient temperature and the baby's body temperature within normal limits and to keep the humidity of the air within the limit values required for an infant incubator.
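A minimal sketch of the on/off (bang-bang) rule follows, written in Python for illustration (the actual controller runs on the Arduino ATmega2560); the 0.5 °C hysteresis band is an assumed value added to avoid rapid relay switching, not a figure from the paper.

```python
SETPOINT = 36.0      # target incubator temperature, deg C (from the abstract)
HYSTERESIS = 0.5     # assumed deadband to avoid rapid relay chatter

def heater_command(temperature, heater_on):
    """Return the next heater state given the measured temperature."""
    if temperature < SETPOINT - HYSTERESIS:
        return True                 # too cold: switch heater on
    if temperature > SETPOINT + HYSTERESIS:
        return False                # too warm: switch heater off
    return heater_on                # inside the deadband: keep current state

state = False
for t in [34.8, 35.2, 35.9, 36.3, 36.7, 36.4]:   # simulated sensor readings
    state = heater_command(t, state)
    print(t, "->", "ON" if state else "OFF")
```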

Keywords: on/off control, distribution temperature, Arduino AT 2560, baby incubator

Procedia PDF Downloads 502
7964 Epicardial Fat Necrosis in a Young Female: A Case Report

Authors: Tayyibah Shah Alam, Joe Thomas, Nayantara Shenoy

Abstract:

We present a case in which the answer is straightforward, but the path taken to reach the diagnosis is where it gets interesting. A 31-year-old lady presented to the Rheumatology outpatient department with left-sided chest pain associated with left-sided elbow joint pain intensifying over the last 2 days. She had a prolonged history of intermittent, low-intensity chest pain since 2016, aggravated by exertion, lifting heavy weights, and lying down, and relieved by sitting. Her physical examination and laboratory tests were within normal limits. An electrocardiogram (ECG) showed normal sinus rhythm, and a chest X-ray showed no significant abnormality. The primary suspicion was recurrent costochondritis. Cardiac inflammatory blood markers and echocardiography were normal, ruling out acute coronary syndrome. Contrast CT of the chest and MRI of the thorax showed a small, ill-defined STIR hyperintensity with thin peripheral enhancement in the left anterior mediastinum, posterior to the 5th costal cartilage and anterior to the pericardium, suggestive of fat changes: focal panniculitis. This confirmed the diagnosis of epicardial fat necrosis. She was started on colchicine and nonsteroidal anti-inflammatory drugs for 2-3 weeks, following which a repeat CT showed resolution of the lesion and improvement in her symptoms. Epicardial fat necrosis is often under-recognized or misdiagnosed; here, CT was used to establish the diagnosis. Making the correct diagnosis prospectively avoids unnecessary testing in favor of conservative management.

Keywords: EFN, panniculitis, unknown etiology, recurrent chest pain

Procedia PDF Downloads 97
7963 Experimental Investigation of On-Body Channel Modelling at 2.45 GHz

Authors: Hasliza A. Rahim, Fareq Malek, Nur A. M. Affendi, Azuwa Ali, Norshafinash Saudin, Latifah Mohamed

Abstract:

This paper presents an experimental investigation of on-body channel fading at 2.45 GHz considering two states of user body movement: stationary and mobile. A pair of body-worn antennas was utilized in this measurement campaign. A statistical analysis was performed by comparing the measured on-body path loss to five well-known distributions: lognormal, normal, Nakagami, Weibull and Rayleigh. The results showed that the average path loss for the moving arm was up to 3.5 dB higher than the path loss in the sitting position for the upper-arm-to-left-chest link. The analysis also concluded that the Nakagami distribution provided the best fit for most on-body static-link path loss in the standing-still and sitting positions, while arm movement is best described by the lognormal distribution.
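The distribution comparison described above can be sketched with SciPy's built-in fitters; the synthetic path-loss values below stand in for the measured data, and the Kolmogorov-Smirnov statistic is used here as one plausible goodness-of-fit criterion (the paper does not state its exact fitting procedure).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
path_loss_db = rng.normal(60.0, 4.0, size=500)   # placeholder for measured on-body path loss

candidates = {
    "lognormal": stats.lognorm,
    "normal": stats.norm,
    "nakagami": stats.nakagami,
    "weibull": stats.weibull_min,
    "rayleigh": stats.rayleigh,
}
for name, dist in candidates.items():
    params = dist.fit(path_loss_db)                         # maximum-likelihood fit
    ks = stats.kstest(path_loss_db, dist.cdf, args=params)  # goodness of fit
    print(f"{name:10s} KS statistic = {ks.statistic:.4f}")  # smaller = better fit
```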

Keywords: on-body channel communications, fading characteristics, statistical model, body movement

Procedia PDF Downloads 355
7962 Factors of Adoption of the International Financial Reporting Standard for Small and Medium Sized Entities

Authors: Uyanga Jadamba

Abstract:

Globalisation of the world economy has necessitated the development and implementation of a comparable and understandable reporting language suitable for use by all reporting entities. The International Accounting Standards Board (IASB) provides an international reporting language that lets all users understand the financial information of their business and potentially allows them to have access to finance at an international level. The study uses logistic regression analysis to investigate the factors behind the adoption of the International Financial Reporting Standard for Small and Medium-sized Entities (IFRS for SMEs). The study started with a list of 217 countries from World Bank data. Due to the lack of availability of data, the final sample consisted of 136 countries, including 60 countries that have adopted the IFRS for SMEs and 76 countries that have not yet adopted it. The study covered the period from 2010 to 2020 and obtained 1360 observations. The findings confirm that the adoption of the IFRS for SMEs is significantly related to the existence of national reporting standards, law enforcement quality, a common-law legal system, and the extent of disclosure. The likelihood of adoption of the IFRS for SMEs decreases if the country already has a national reporting standard for SMEs, which suggests that the implementation and transitional costs of changing reporting standards are relatively high. The results further suggest that adoption of the new standard is easier in countries with constructive law enforcement and effective application of laws. The findings also show that adoption increases if countries have a common-law system, which suggests that efficient reporting regulations are more widespread in these countries, and that countries with a high extent of financial disclosure are more likely to adopt the standard than others. Lastly, the findings show that audit quality and primary education level have no significant impact on adoption. One possible explanation could be that accounting professionals in developing countries lacked complete knowledge of the international reporting standards even though there was a requirement to comply with them. The study contributes to the literature by providing factors that impact the adoption of the IFRS for SMEs. It helps policymakers to better understand and apply the standard to improve the transparency of financial statements. The benefit of adopting the IFRS for SMEs is significant due to the relaxed and tailored reporting requirements for SMEs, the reduced burden on professionals to comply with the standard, and the transparent financial information provided to gain access to finance. The results of the study are useful to emerging economies where SMEs dominate the economy, informing their evaluation of the adoption of the IFRS for SMEs.
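A hedged sketch of the logistic-regression setup follows; the predictor columns mirror the factors named in the abstract, but the data are synthetic placeholders and the use of statsmodels is an assumption rather than the authors' actual software.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 136  # countries in the final sample

# synthetic stand-ins for the country-level predictors named in the abstract
X = np.column_stack([
    rng.integers(0, 2, n),      # has own national SME reporting standard (0/1)
    rng.normal(0, 1, n),        # law-enforcement quality index
    rng.integers(0, 2, n),      # common-law legal system (0/1)
    rng.normal(0, 1, n),        # extent-of-disclosure index
])
y = rng.integers(0, 2, n)       # adopted IFRS for SMEs (0/1)

model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
print(model.summary())          # coefficients and p-values per factor
```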

Keywords: IFRS for SMEs, international financial reporting standard, adoption, institutional factors

Procedia PDF Downloads 81
7961 The Practices and Challenges of Secondary School Cluster Supervisors in Implementing School Improvement Program in Saesie Tsaeda Emba Woreda, Eastern Zone of Tigray Region

Authors: Haftom Teshale Gebre

Abstract:

According to the Ministry of Education's school improvement program blueprint document (2007), the basic and timely aim of the program is to improve students' academic achievement by creating conducive teaching and learning environments with the active involvement of parents in the teaching and learning process. The general objective of the research is to examine the practices of cluster school supervisors in implementing school improvement programs and the major factors affecting them in the study area. The study used both primary and secondary sources, and the sample size was 93. Twelve people were chosen from each of the two clusters (Edaga Hamus and Adi-kelebes); the cluster Ferewyni comprises Tekli suwaat, Edaga robue, and Kiros Alemayo. In the analysis stage, several interrelated pieces of information were summarized and arranged to make the analysis easily manageable using Stata. Study findings revealed that the mean, standard deviation, and variance for the four major domains impacted by the school improvement program were 2.688, 1.053, and 1.108, respectively. The researcher also concludes that the major constraints on the school improvement program, and on cluster supervisors in particular, were the inadequate attention given to the supervision service and the lack of experience in the practice of supervision in the study area.

Keywords: cluster, eastern Tigray, Saesie Tsaeda Emba, SPI

Procedia PDF Downloads 32
7960 The Possibility of Using Somatosensory Evoked Potential (SSEP) as a Parameter for Cortical Vascular Dementia

Authors: Hyunsik Park

Abstract:

As the rate of cerebrovascular disease increases in older populations, the prevalence of vascular dementia is expected to rise. The authors therefore designed this study to explore the possibility of using somatosensory evoked potentials (SSEP) as a parameter for the early diagnosis and prognosis prediction of vascular dementia in cortical vascular dementia patients. Twenty-one patients who met the criteria for vascular dementia according to DSM-IV, ICD-10 and NINDS-AIREN, with a history of recent cognitive impairment, fluctuating progression, and neurologic deficit, were included. We subdivided these patients into two groups, a mild dementia group and a severe dementia group, by MMSE and CDR scores, and compared them with a normal control group and a patient control group with a history of cerebrovascular attack (CVA) but without dementia, using the N20 latency and amplitude of the median nerve. In this study, the mild dementia group showed significant differences in latency and amplitude from the normal control group (p < 0.05) but not from the patient control group (p > 0.05). The severe dementia group showed significant differences from both the normal control group and the patient control group (p < 0.05 and p < 0.001). Since no significant difference was found between the mild dementia group and the patient control group, SSEP has limitations as an early diagnostic test. However, the comparisons between the severe dementia group and the other groups showed significant results, which indicates that SSEP can predict the prognosis of vascular dementia in cortical vascular dementia patients.

Keywords: SSEP, cortical vascular dementia, N20 latency, N20 amplitude

Procedia PDF Downloads 304
7957 Trashing Customary International Law: A Comprehensive Evaluation

Authors: Hamid Vahidkia

Abstract:

Central to the World Court's mission is the ascertainment of international custom "as evidence of a general practice accepted as law." Students of the Court's jurisprudence have long been aware that the Court has been better at applying customary law than at defining it. Yet until Nicaragua v. United States, little harm was done. For in the intensely contested cases prior to Nicaragua, the Court managed to elicit commonalities in argumentative structure that drew its decisions toward the customary norms implicit in state practice. The Court's lack of theoretical explicitness simply meant that a career opportunity arose for some observers, like me, to attempt to supply the missing theory of custom.

Keywords: law, international law, jurisdiction, customary

Procedia PDF Downloads 61
7958 Effect of Palm Oil Mill Effluent on Microbial Composition in Soil Samples in Isiala Mbano LGA

Authors: Eze Catherine Chinwe, J. D. Njoku

Abstract:

Background: Palm oil mill effluent (POME) is the voluminous liquid waste that comes from the sterilization and clarification sections of the oil palm milling process. The raw effluent contains 90-95% water and includes residual oil, soil particles, and suspended solids. Palm oil mill effluent is a highly polluting material, and much research has been dedicated to means of alleviating its threat to the environment. Objectives: 1. To compare the physico-chemical and microbiological characteristics of soil samples from POME and non-POME sites. 2. To make recommendations on how best to handle POME in the study area. Methods: A quadrant approach was adopted for sampling POME (A) and non-POME (B) locations. Soil qualities were determined using standard analytical procedures. Conclusions: Results of the analysis fell in the following ranges: pH (3.940-7.435), dissolved oxygen (DO) (1.582-6.234 mg/l), biological oxygen demand (BOD) (50-5463 mg/l), etc. Across the various locations, the population of total heterotrophic bacteria (THB) ranged from 1.36x10^6 to 2.42x10^6 cfu/ml, and the total heterotrophic fungi (THF) ranged from 1.22x10^4 to 3.05x10^4 cfu/ml. The frequency of occurrence revealed Pseudomonas sp., Bacillus sp., and Staphylococcus sp. as the most frequently occurring isolates. Analysis of variance showed significant differences (P<0.05) in microbial populations among locations. The discharge of industrial effluents into the soil in Nigeria invariably results in high concentrations of pollutants in the soil environment.

Keywords: effluents, microbial composition, soil samples, Isiala Mbano

Procedia PDF Downloads 314
7957 Challenges and Opportunities in Computing Logistics Cost in E-Commerce Supply Chain

Authors: Pramod Ghadge, Swadesh Srivastava

Abstract:

Revenue generation of a logistics company depends on how the logistics cost of a shipment is calculated. The logistics cost of a shipment is a function of the distance and speed of the shipment's travel in a particular network, its volumetric size, and its dead weight. Logistics billing is based mainly on the consumption of the scarce resource (the space or weight-carrying capacity of a carrier). A shipment's size or dead weight is a function of product and packaging weight, dimensions, and flexibility. Hence, to arrive at a standard methodology for computing an accurate cost to bill the customer, the interplay among the above-mentioned physical attributes, along with their measurement, plays a key role. This becomes even more complex for an e-commerce company, like Flipkart, which caters to shipments from both warehouses and the marketplace in an unorganized, non-standard market like India. In this paper, we explore various methodologies for defining a standard way of billing non-standard shipments across a wide range of sizes, shapes and dead weights. These include using historical volumetric/dead-weight data to arrive at a factor for computing the logistics cost of a shipment, and calculating the real (contour) volume of a shipment to address irregular shipment shapes that conventional bounding-box volume measurements cannot handle. We also discuss certain key business practices and operational quality considerations needed to bring standardization and drive appropriate ownership in the ecosystem.
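The interplay of dead weight and volumetric size usually reduces to billing on a chargeable weight; the sketch below uses the common 5000 cm³/kg courier divisor as an assumed convention (the paper's contour-volume variant would replace the bounding-box volume with the measured real volume).

```python
def chargeable_weight(dead_weight_kg, length_cm, width_cm, height_cm,
                      divisor=5000.0):
    """Billable weight as the max of dead weight and volumetric weight.
    The 5000 cm^3/kg divisor is a common courier convention, assumed here;
    carriers may use other factors."""
    volumetric_kg = (length_cm * width_cm * height_cm) / divisor
    return max(dead_weight_kg, volumetric_kg)

print(chargeable_weight(2.0, 40, 30, 20))   # 4.8 kg: billed on volume, not mass
```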

Keywords: contour volume, logistics, real volume, volumetric weight

Procedia PDF Downloads 269
7956 Comparison of Volume of Fluid Model: Experimental and Empirical Results for Flows over Stacked Drop Manholes

Authors: Ramin Mansouri

Abstract:

The manhole is one of the structures installed at changes of flow direction or pipe diameter in sewer lines, as well as in steep-slope areas, to reduce the flow velocity. In this study, the flow characteristics of a manhole structure have been investigated with a numerical model. Coarse, medium, and fine computational grids were used for the simulation. To simulate the flow, the k-ε model (standard, RNG, Realizable) and the k-ω model (standard, SST) were used. Also, in order to find the best wall treatment, two types of wall function, standard and non-equilibrium, were investigated. Of all the models, the k-ε turbulence model has the highest correlation with the experimental results. In terms of boundary conditions, a constant velocity is set at the flow inlet boundary, an outlet pressure is set at the boundaries in contact with the air, and the standard wall function is used for the wall effect. In the numerical model, the depth at the outlet of the second manhole is estimated to be less than that observed in the laboratory, as is the outgoing jet. In the second regime, the jet flow collides with the manhole wall and divides into two parts, so the hydraulic characteristics are the same as those of a large vertical shaft. In this situation, the turbulence is high, and more energy loss can be seen. According to the results, the energy loss in the numerical model is estimated at 9.359%, which is more than in the experimental data.

Keywords: manhole, energy dissipation, turbulence model, wall function, flow

Procedia PDF Downloads 82
7955 Public Transport Planning System by Dijkstra Algorithm: Case Study Bangkok Metropolitan Area

Authors: Pimploi Tirastittam, Phutthiwat Waiyawuththanapoom

Abstract:

Nowadays, promotion of the public transportation system in the Bangkok Metropolitan Area has increased, for example through the "Free Bus for Thai Citizen" campaign and the prospect of several new MRT routes to increase convenience and comfort for Bangkok Metropolitan Area citizens. But citizens do not make full use of these systems because they lack data and information, as well as confidence in the public transportation system of Thailand, especially in the time and safety aspects. This research is a public transport planning system for the Bangkok Metropolitan Area based on Dijkstra's algorithm, focusing on bus, BTS and MRT schedules/routes to give the most information to passengers. They can choose their way and routes easily using Dijkstra's algorithm from graph theory, and the system also shows the fare of the trip. The application was evaluated by 30 typical users to find the mean and standard deviation of satisfaction with the developed system. Results of the evaluation showed that the system is at a good level of satisfaction (4.20 and 0.40). From these results, we can conclude that the system can be used properly and effectively according to the objective.
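A minimal sketch of Dijkstra's algorithm over a transit network follows; the stop names and travel times are hypothetical, and in the full system fares would be attached to the recovered routes.

```python
import heapq

def dijkstra(graph, source):
    """Shortest travel times from source over a weighted transit graph.
    graph: {stop: [(neighbor, minutes), ...]} adjacency list."""
    dist = {source: 0.0}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# toy network mixing bus/BTS/MRT legs (hypothetical stops and times in minutes)
network = {
    "Mo Chit":   [("Siam", 12), ("Chatuchak", 3)],
    "Chatuchak": [("Sukhumvit", 14)],
    "Siam":      [("Sukhumvit", 5)],
    "Sukhumvit": [],
}
print(dijkstra(network, "Mo Chit"))   # shortest time to every reachable stop
```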

Keywords: Dijkstra algorithm, graph theory, public transport, Bangkok metropolitan area

Procedia PDF Downloads 247
7954 Cardio Autonomic Response during Mental Stress in the Wards of Normal and Hypertensive Parents

Authors: Sheila R. Pai, Rekha D. Kini, Amrutha Mary

Abstract:

Objective: To assess and compare cardiac autonomic activity after mental stress among the wards of normal and hypertensive parents. Methods: The study included 67 subjects, 30 of whom had a parental history of hypertension, while the remaining 37 had normotensive parents. Subjects were divided into a control group (wards of normotensive parents) and a study group (wards of hypertensive parents). Height and weight were noted, and body mass index (BMI) was calculated. The mental stress test was carried out. Blood pressure (BP) and the electrocardiogram (ECG) were recorded during normal breathing and after the mental stress test. Heart rate variability (HRV) was recorded and analyzed by the time-domain method, using HRV analysis software version 1.1 (AIIMS, New Delhi). The data obtained were analyzed using Student's t-test followed by the Mann-Whitney U test, and P < 0.05 was considered significant. Results: There was no significant difference in systolic blood pressure or diastolic blood pressure (DBP) between the study group and the control group following mental stress. In the time-domain analysis, the mean values of pNN50 and RMSSD in the study group were not significantly different from those in the control group after the mental stress test. Conclusion: The study thus concluded that there was no significant difference in HRV between the study group and the control group following mental stress.
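The two time-domain indices compared above are computed directly from successive RR-interval differences; a minimal sketch (with invented RR values) follows.

```python
import numpy as np

def time_domain_hrv(rr_ms):
    """RMSSD and pNN50 from successive RR intervals in milliseconds."""
    d = np.diff(rr_ms)
    rmssd = np.sqrt(np.mean(d ** 2))            # root mean square of successive differences
    pnn50 = 100.0 * np.mean(np.abs(d) > 50.0)   # % of successive differences > 50 ms
    return rmssd, pnn50

rr = np.array([812, 790, 846, 802, 778, 860, 795], dtype=float)  # illustrative RR series
print(time_domain_hrv(rr))
```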

Keywords: heart rate variability, time domain analysis, mental stress, hypertensive

Procedia PDF Downloads 273
7953 Analysis of EEG Signals Using Wavelet Entropy and Approximate Entropy: A Case Study on Depression Patients

Authors: Subha D. Puthankattil, Paul K. Joseph

Abstract:

Analyzing the brain signals of patients suffering from depression may reveal signal parameters that are quite different from those of normal controls. The present study adopts two different methods, a time-frequency-domain method and a nonlinear method, for the analysis of EEG signals acquired from depression patients and age- and sex-matched normal controls. The time-frequency-domain analysis is realized using wavelet entropy, and approximate entropy is employed for the nonlinear analysis. The ability of the signal processing technique and the nonlinear method to differentiate the physiological aspects of the brain state is demonstrated using wavelet entropy and approximate entropy.
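Approximate entropy, the nonlinear measure used here, can be sketched compactly; the template length m = 2 and tolerance r = 0.2·SD below are common conventions, not necessarily the authors' settings.

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D signal (Pincus' definition)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)          # common tolerance choice
    def phi(m):
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])   # embedded template vectors
        # Chebyshev distance between every pair of templates
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = np.mean(dist <= r, axis=1)                   # match fractions (self-matches included)
        return np.mean(np.log(c))
    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
print(approximate_entropy(rng.normal(size=300)))   # higher for more irregular signals
```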

Keywords: EEG, depression, wavelet entropy, approximate entropy, relative wavelet energy, multiresolution decomposition

Procedia PDF Downloads 332
7952 Anomaly Detection Based Fuzzy K-Mode Clustering for Categorical Data

Authors: Murat Yazici

Abstract:

Anomalies are irregularities found in data that do not adhere to a well-defined standard of normal behavior. The identification of outliers or anomalies in data has been a subject of study within the statistics field since the 1800s. Over time, a variety of anomaly detection techniques have been developed in several research communities. Cluster analysis can be used to detect anomalies. It is the process of grouping data into clusters whose members are as similar as possible while the clusters themselves are as dissimilar as possible. Many of the traditional cluster algorithms have limitations in dealing with data sets containing categorical attributes. To detect anomalies in categorical data, a fuzzy clustering approach can be used to advantage. The fuzzy k-modes (FKM) clustering algorithm, one of the fuzzy clustering approaches and an extension of the k-means algorithm, is reported for clustering datasets with categorical values. It is a form of soft clustering: each point can be associated with more than one cluster. In this paper, anomaly detection is performed on two simulated data sets using the FKM clustering algorithm. As a significant feature of the study, the FKM clustering algorithm determines anomalies together with their degree of abnormality, in contrast to numerous anomaly detection algorithms. According to the results, the FKM clustering algorithm showed good performance in the anomaly detection of data containing both a single anomaly and more than one anomaly.
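A sketch of the abnormality-degree idea follows, assuming a Huang-Ng style fuzzy k-modes membership update with simple-matching dissimilarity; the fuzziness exponent α = 1.5, the toy data, and the fixed modes are illustrative (a full FKM run would iterate membership and mode updates until convergence).

```python
import numpy as np

def matching_dissimilarity(a, b):
    """Simple matching distance for categorical vectors: number of mismatched attributes."""
    return np.sum(a != b)

def fuzzy_memberships(X, modes, alpha=1.5):
    """One fuzzy k-modes membership update (assumed exponent alpha > 1)."""
    n, k = len(X), len(modes)
    U = np.zeros((n, k))
    for i, x in enumerate(X):
        d = np.array([matching_dissimilarity(x, z) for z in modes], dtype=float)
        if np.any(d == 0):                      # point coincides with a mode
            U[i, d == 0] = 1.0 / np.sum(d == 0)
        else:
            ratios = (d[:, None] / d[None, :]) ** (1.0 / (alpha - 1.0))
            U[i] = 1.0 / np.sum(ratios, axis=1)
    return U

X = np.array([["a", "x"], ["a", "y"], ["b", "y"], ["c", "z"]])   # toy categorical data
modes = np.array([["a", "x"], ["b", "y"]])
U = fuzzy_memberships(X, modes)
abnormality = 1.0 - U.max(axis=1)   # low maximum membership => higher abnormality degree
print(abnormality)                  # one abnormality degree per point
```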

Keywords: fuzzy k-mode clustering, anomaly detection, noise, categorical data

Procedia PDF Downloads 54
7951 A Comparative Analysis of Carbon Footprints of Households in Different Housing Types and Seasons

Authors: Taehyun Kim

Abstract:

As a result of rapid urbanization, energy demands for the lighting, heating and cooling of households have been concentrated in metropolitan areas. The energy resources for housing in urban areas are dominantly fossil fuels, whose use increases the cost of living and carbon dioxide (CO2) emissions. To achieve environmentally and economically sustainable residential development, it is important to know how energy use and the cost of living can be reduced through planning and design. The purpose of this study is to examine which type of building requires less energy for housing. To do so, a carbon footprint (CF) quiz survey was employed, which estimates the amount of carbon dioxide required to support a household's energy use for housing. The housing carbon footprints (HCF) of 500 households in Seoul, Korea, in summer and winter were estimated and compared across three major types of housing: single-family (detached), row-house and apartment. In addition, differences in HCF were estimated between the tower and flat types of apartment. The results of the t-test and analysis of variance (ANOVA) provide statistical evidence that housing type is related to housing energy use. The average HCF of detached houses was higher than that of the other housing types. Between the two types of apartment, the tower type shows a higher HCF than the flat type in winter. These findings may provide new perspectives on CF application in sustainable architecture and urban design.

Keywords: analysis of variance, carbon footprint, energy use, housing type

Procedia PDF Downloads 505
7950 Study of the Anti-Diabetic Activity of the Common Fig in the Region of the El Amra (Ain Defla), Algeria

Authors: Meliani Samiha, Hassaine Sarah

Abstract:

Figs are widely consumed in the Mediterranean region; they have high nutritional value and multiple therapeutic virtues. Our work contributes to the study of the antidiabetic activity of the common fig of the El Amra region (Ain Defla), Algeria. To do this, 20 female Wistar rats, divided into 4 groups, were used: Group 1: 5 normal controls; Group 2: 5 normal controls treated with 20% dry fig juice; Group 3: 5 diabetic controls; Group 4: 5 diabetic controls treated with 20% dry fig juice. The rats were rendered diabetic by intraperitoneal injection of a streptozotocin solution. Blood glucose was measured 1 hour, 2 hours, 3 hours and 4 hours after the administration of the fig juice; it was also measured on the 5th, 8th and 9th days of the experiment. Blood cholesterol and triglyceride levels were determined at the beginning and the end of the study. On the 9th day, we recorded a very significant decrease in the blood sugar level of the diabetic rats treated with dry fig juice; the blood glucose level normalized in 3 of 5 rats. We also recorded a decrease, though not significant, in blood cholesterol and triglyceride levels. In the short term, blood sugar increased one hour after administration in both normal and diabetic rats, probably due to the high sugar content of the preparation; the blood glucose level was then corrected four hours later, possibly reflecting the antihyperglycemic effect of the active ingredients contained in the figs.

Keywords: antidiabetic, figs, hypoglycemia, streptozotocin

Procedia PDF Downloads 218
7949 Performance Comparison of Wideband Covariance Matrix Sparse Representation (W-CMSR) with Other Wideband DOA Estimation Methods

Authors: Sandeep Santosh, O. P. Sahu

Abstract:

In this paper, a performance comparison of the wideband covariance matrix sparse representation (W-CMSR) method with other existing wideband direction of arrival (DOA) estimation methods is made. W-CMSR relies less on a priori information about the number of incident signals than ordinary subspace-based methods. Consider the perturbation-free covariance matrix of the wideband array output. The diagonal covariance elements are contaminated by an unknown noise variance. The covariance matrix of the array output is conjugate symmetric, i.e., its upper-right triangular elements can be represented by the lower-left triangular ones. Since the main diagonal elements are contaminated by the unknown noise variance, they are skipped, and the lower-left triangular elements are aligned column by column to obtain a measurement vector. Simulation results for W-CMSR are compared with those of other wideband DOA estimation methods, namely the coherent signal subspace method (CSSM), Capon, l1-SVD, and JLZA-DOA. W-CMSR separates two signals very clearly, while CSSM, Capon, l1-SVD and JLZA-DOA fail to separate the two signals clearly, and a number of pseudo-peaks exist in the spectrum of l1-SVD.
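The vectorization step described above, skipping the noise-contaminated diagonal and stacking the lower-left triangular elements column by column, can be sketched as follows (a real-valued stand-in matrix is used for illustration; an actual array covariance would be complex and conjugate symmetric).

```python
import numpy as np

def measurement_vector(R):
    """Stack the strictly lower-triangular covariance elements column by
    column, skipping the noise-contaminated main diagonal."""
    m = R.shape[0]
    cols = [R[j + 1:, j] for j in range(m - 1)]   # below-diagonal part of each column
    return np.concatenate(cols)

R = np.arange(16).reshape(4, 4).astype(float)     # stand-in matrix (values arbitrary)
print(measurement_vector(R))                      # [ 4.  8. 12.  9. 13. 14.]
```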

Keywords: W-CMSR, wideband direction of arrival (DOA), covariance matrix, electrical and computer engineering

Procedia PDF Downloads 471
7948 Discovering Event Outliers for Drug as Commercial Products

Authors: Arunas Burinskas, Aurelija Burinskiene

Abstract:

On average, ten percent of drugs (commercial products) are not available in pharmacies due to shortage. A shortage event unbalances sales and requires a recovery period, which is often too long. A critical issue is therefore that pharmacies do not record potential sales transactions during shortage and recovery periods. The authors suggest estimating outliers during shortage and recovery periods. To shorten the recovery period, the authors suggest predicting average sales per sales day, which helps protect the data from being biased downwards or upwards. The authors use an outlier visualization method across different drugs and apply the Grubbs test for significance evaluation. The researched sample is 100 drugs in a one-month time frame. The authors found that products with high demand variability had outliers. Among the analyzed drugs, which are commercial products: i) drugs with high demand variability have a one-week shortage period, and the probability of facing a shortage is 69.23%; ii) drugs with mid demand variability have a three-day shortage period, and the likelihood of falling into deficit is 34.62%. To avoid shortage events and minimize the recovery period, real data must be set up. Even though there are some outlier detection methods for drug data cleaning, they have not been used to minimize the recovery period once a shortage has occurred. The authors use Grubbs' test, a real-life data cleaning method, for outlier adjustment. In this paper, the outlier adjustment method is applied with a confidence level of 99%. In practice, Grubbs' test has been used to detect outliers for cancer drugs, with positive results reported. Grubbs' test detects outliers that exceed the boundaries of a normal distribution; the result is a probability that indicates the core data of actual sales. The test represents the difference between the most extreme data point and the sample mean relative to the standard deviation, and it detects one outlier at a time, with different probabilities, from a data set with an assumed normal distribution. Based on approximation data, the authors constructed a framework for scaling potential sales and estimating outliers with the Grubbs' test method. The suggested framework is applicable during shortage events and recovery periods. The proposed framework has practical value and could be used to minimize the recovery period required after a shortage event occurs.
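A sketch of the two-sided Grubbs' test at the 99% confidence level used in the paper follows; the daily-sales numbers are invented for illustration.

```python
import numpy as np
from scipy import stats

def grubbs_test(x, alpha=0.01):
    """Two-sided Grubbs' test for a single outlier (99% confidence by default).
    Returns (is_outlier, index, G, G_critical)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    idx = np.argmax(np.abs(x - x.mean()))              # most extreme observation
    G = np.abs(x[idx] - x.mean()) / x.std(ddof=1)
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)        # t critical value
    G_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return G > G_crit, idx, G, G_crit

daily_sales = np.array([12, 15, 11, 14, 13, 16, 12, 58], dtype=float)  # illustrative
print(grubbs_test(daily_sales))   # flags the 58-unit spike as an outlier
```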

Keywords: drugs, Grubbs' test, outlier, shortage event

Procedia PDF Downloads 134