Search results for: “User acceptance of computer technology: A comparison of two theoretical models”
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 23368

8068 Learning Object Repositories as Developmental Resources for Educational Institutions in the 21st Century

Authors: Hanan A. Algamdi, Huda Y. Alyami

Abstract:

Learning object repositories contribute to the development of the educational process through several advantages: they employ technology effectively to create new resources for effective learning, and they provide opportunities for collaboration on content by allowing it to be edited, modified, and developed. This supports relationships between the communities that benefit from these repositories and reflects positively on content quality. Therefore, this study aims to explore the most prominent learning topics of the 21st century that should be included in learning object repositories, and to identify the set of learning skills that the repositories should develop in today's students. The study employs the analytical descriptive method, and the sample comprises a group of leaders, experts, and specialists in curricula and e-learning at the Ministry of Education in the Kingdom of Saudi Arabia.

Keywords: learning object, repositories, 21st century, quality

Procedia PDF Downloads 291
8067 Health as an Agenda in Indian Politics: A Study of Election Manifestos in 16th General Elections

Authors: Kiran Bala

Abstract:

Health, education and employment opportunities available to the common citizen reflect the development status of a country, and the health of individuals affects the growth of a country in every aspect. According to a study by the WHO, India is estimated to lose more than $237 billion of its GDP over the period 2006-15 on account of premature death and morbidity from non-communicable diseases alone. Each year, 37 million people fall below the poverty line due to the high expenditure on health services they have to incur; falling sick puts a double burden on them in terms of loss of income and expenditure on health care, which pushes them further into debt and poverty. Adding to the gravity of the situation, public spending on health in India has itself declined after liberalization, from 1.3% of GDP in 1990 to 0.9% in 1999. The Approach Paper of the Government of India to the Twelfth Five Year Plan indicated that health expenditure alone as a per cent of GDP was about 1.4 per cent (B.E.) in 2011-12; it also mentioned that if one included expenditure on rural water supply and sanitation, the figure would be about 1.8 per cent. Given the abysmally low priority accorded to health in Indian economic policy, it becomes important to study the representation of health in the Indian public sphere. To this end, this study examines the prioritization of health in the public policy agenda of the national/regional political parties, as evidenced in their election manifestos at a time when the nation is poised for the General Elections. The paper also focuses on the prioritization of health in public perception, as evidenced in voters' stated reasons for preferring a particular party or individual contestant. To arrive at the reasons for the priority level accorded by political actors and citizens, the study uses focus groups of health policy makers, media persons, medical practitioners, and voters. The collected data will be analysed within the theoretical framework of the spiral of silence and agenda-setting theory.

Keywords: health, election manifestos, public perception, policies

Procedia PDF Downloads 339
8066 The Impact of Coffee Consumption to Body Mass Index and Body Composition

Authors: A.L. Tamm, N. Šott, J. Jürimäe, E. Lätt, A. Orav, Ü. Parm

Abstract:

Coffee is one of the most frequently consumed beverages in the world, yet its effects on the human organism are still not completely understood. Coffee has also been used as a method for weight loss, but its effectiveness has not been proved. There is likewise no consensus on whether body mass index (BMI) or fat percentage (fat%) should be used to classify overweight. The aim of the study was, first, to determine associations between coffee consumption and body composition and, second, to detect which measure (BMI or fat%) is more accurate for describing overweight. Altogether, 103 persons enrolled in the study and were divided into three groups: coffee non-consumers (n=39), average coffee drinkers, who consumed 1 to 4 cups (1 cup = ca 200 ml) of coffee per day (n=40), and excessive coffee consumers, who drank at least five cups of coffee per day (n=24). Body mass (medical electronic scale, A&D Instruments, Abingdon, UK) and height (Martin metal anthropometer, to the nearest 0.1 cm) were measured and BMI calculated (kg/m2). Participants' body composition was assessed with dual-energy X-ray absorptiometry (DXA, Hologic), and general data (including history of chronic diseases) and information about coffee consumption and physical activity level were collected with questionnaires. The results showed that excessive coffee consumption was associated with increased fat-free mass, most likely due to a greater physical activity level during school time or to the greater (though not significant) proportion of males in the excessive-consumption group. For estimating overweight, fat% is recommended over BMI, as it gives more accurate results when evaluating chronic disease risks. In conclusion, coffee consumption probably does not affect body composition, and fat% seems to be more accurate than BMI for estimating body composition.
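As a minimal illustration of the index used above, BMI is body mass in kilograms divided by height in metres squared; the values below are hypothetical, not study data.

```python
# BMI = mass (kg) / height (m)^2, as calculated in the study.
# The inputs here are hypothetical examples, not participant data.

def bmi(mass_kg, height_m):
    return mass_kg / height_m ** 2

print(round(bmi(70.0, 1.75), 1))  # 22.9
```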

Keywords: body composition, body fat percentage, body mass index, coffee consumption

Procedia PDF Downloads 400
8065 Civilization and Violence: Islam, the West, and the Rest

Authors: Imbesat Daudi

Abstract:

One of the most discussed topics of the last century is whether Islamic civilization is violent. Many Western intellectuals have promoted the notion that it is; citing 9/11, in which 3,000 civilians were killed, they argue that Muslims are prone to violence because Islam promotes it. Muslims, however, reject this notion as nonsense. The topic has not been properly addressed. First, the violence of a civilization cannot be proven by citing religious texts, which is how discussions of civilizational violence have typically been conducted. Second, the question of whether Muslims are violent is itself inappropriate, as it carries an implicit bias suggesting that Islamic civilization is violent; a proper question is which civilization is more violent. Third, whether Islamic civilization is indeed violent can only be established by documenting more war-related casualties within the borders of Islamic civilization than within those of other civilizations, which has never been done. Finally, the violent behavior of Muslim countries can be examined by comparing their acts of violence with those of groups of nations belonging to other civilizations, using appropriate parameters of violence. Accordingly, parameters reflecting group violence were defined; the violent conflicts of the various civilizations of the last two centuries were documented, quantified by number of conflicts and number of victims, and compared following established statistical principles. The results show that whereas 80% of genocides and massacres were conducted by Western nations, less than 5% of acts of violence were committed by Muslim countries. Furthermore, the West has the highest incidence (new) and prevalence (new and old) of violent conflicts among all groups of nations. The result is unambiguous and statistically significant. This paper follows a process of methodical collection of relevant data, objective analysis, and unbiased reporting.

Keywords: Islam and violence, demonization of Muslims, violence and the West, comparison of civilizational violence

Procedia PDF Downloads 39
8064 Effect of Large English Studies Classes on Linguistic Achievement and Classroom Discourse at Junior Secondary Level in Yobe State

Authors: Clifford Irikefe Gbeyonron

Abstract:

Applied linguists concur that there is low-level achievement in English language use among Nigerian secondary school students. One of the factors that exacerbate this is classroom features, of which large class size is the most obvious. This study investigated the impact of large classes on learning English as a second language (ESL) at junior secondary school (JSS) level in Yobe State. To achieve this, a Solomon four-group experimental design was used: 382 subjects were divided into four groups and taught ESL for thirteen weeks, and 356 subjects wrote the post-test. Data from the systematic observation and post-test were analyzed via chi-square and ANOVA. Results indicated that learners in large classes (LLC) attain lower linguistic progress than learners in small classes (LSC). Furthermore, LSC have more chances to access teacher evaluation and to participate actively in classroom discourse than LLC. In consequence, large classes have adverse effects on learning ESL in Yobe State. This is inimical to English language education, given that each learner of ESL has individual peculiarities within each class. It is recommended that strategies prioritizing individualization, grouping, the use of language teaching aids, and the development of innovative models for large classes be considered.

Keywords: large classes, achievement, classroom discourse

Procedia PDF Downloads 393
8063 Database Management System for Orphanages to Help Track of Orphans

Authors: Srivatsav Sanjay Sridhar, Asvitha Raja, Prathit Kalra, Soni Gupta

Abstract:

A database management system keeps track of details about the people in an organisation. Few orphanages have shifted to computer- and program-based systems; unfortunately, most still keep only pen-and-paper records, which consume space and are not eco-friendly. Retrieving a person's record is also a hassle, as staff must search through multiple paper records, which consumes time. This program organises all the data and can pull out any information about anyone whose data has been entered. It is also a safer form of storage, as physical records degrade over time or, worse, are destroyed in natural disasters. In this developing world, it makes sense to shift all data to an electronic storage system. The program comes with all the essential features, including creating, inserting, searching, and deleting records, as well as printing them.
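The record operations listed above can be sketched in a few lines. The paper's program is written in C++, but the following Python sketch, with hypothetical record fields, illustrates the same insert/search/delete logic.

```python
# Minimal sketch of the create/insert/search/delete operations the abstract
# describes. Record fields (name, age) are hypothetical illustrations.

class OrphanDatabase:
    def __init__(self):
        self.records = {}  # orphan_id -> record dict

    def insert(self, orphan_id, name, age):
        if orphan_id in self.records:
            raise KeyError(f"record {orphan_id} already exists")
        self.records[orphan_id] = {"name": name, "age": age}

    def search(self, orphan_id):
        # Returns the record, or None if no such id exists
        return self.records.get(orphan_id)

    def delete(self, orphan_id):
        self.records.pop(orphan_id, None)

db = OrphanDatabase()
db.insert(1, "A. Example", 9)
print(db.search(1)["name"])  # A. Example
```

Unlike paper records, a lookup here is a single dictionary access rather than a manual search through files.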

Keywords: database, orphans, programming, C++

Procedia PDF Downloads 129
8062 Current Status of Scaled-Up Synthesis/Purification and Characterization of a Potentially Translatable Tantalum Oxide Nanoparticle Intravenous CT Contrast Agent

Authors: John T. Leman, James Gibson, Peter J. Bonitatibus

Abstract:

There have been no potentially clinically translatable developments of intravenous CT contrast materials in decades, and iodinated contrast agents (ICA) remain the only FDA-approved media for CT. Small-molecule ICA used to highlight vascular anatomy have weak CT signals in large-to-obese patients due to their rapid redistribution from plasma into interstitial fluid, which dilutes their intravascular concentration, and because of a mismatch between iodine's K-edge and the high kVp settings needed to image this patient population. The use of ICA is also contraindicated in a growing population of renally impaired patients who are hypersensitive to these contrast agents; a transformative intravenous contrast agent with improved capabilities is urgently needed. Tantalum oxide nanoparticles (TaO NPs) with zwitterionic siloxane polymer coatings have high potential as clinically translatable general-purpose CT contrast agents because of (1) substantially improved imaging efficacy compared to ICA in swine/phantoms emulating medium-sized and larger adult abdomens, and superior contrast enhancement of thoracic arteries and veins in rabbit, (2) promising biological safety profiles showing near-complete renal clearance and low tissue retention at 3x the anticipated clinical dose (ACD), and (3) clinically acceptable physicochemical parameters as concentrated bulk solutions (250-300 mg Ta/mL). Here, we review requirements for general-purpose intravenous CT contrast agents in terms of patient safety, X-ray attenuating properties and contrast-producing capabilities, and physicochemical and pharmacokinetic properties. We report the current status of a TaO NP-based contrast agent, including chemical process technology developments and results of newly defined scaled-up processes for NP synthesis and purification, yielding reproducible formulations with appropriate size and concentration specifications. 
We discuss results of recent pre-clinical in vitro immunology, non-GLP high-dose tolerability in rats (10x ACD), non-GLP long-term biodistribution in rats at 3x ACD, and non-GLP repeat dosing in rats at ACD. We also discuss NP characterization, in particular size-stability testing under accelerated conditions (37°C), and insights into TaO NP purity, surface structure, and the bonding of the zwitterionic siloxane polymer coating, by multinuclear (1H, 13C, 29Si) and multidimensional (2D) solution NMR spectroscopy.

Keywords: nanoparticle, imaging, diagnostic, process technology, nanoparticle characterization

Procedia PDF Downloads 7
8061 Voice over IP Quality of Service Evaluation for Mobile Ad Hoc Network in an Indoor Environment for Different Voice Codecs

Authors: Lina Abou Haibeh, Nadir Hakem, Ousama Abu Safia

Abstract:

In this paper, the performance and quality of Voice over IP (VoIP) calls carried over a Mobile Ad Hoc Network (MANET) with a number of SIP nodes registered on a SIP proxy are analyzed. The testing campaigns are carried out in an indoor corridor structure with well-defined channel characteristics and models, for the voice codecs G.711, G.727, and G.723.1, which are commonly used in VoIP technology. Call quality is evaluated using four Quality of Service (QoS) metrics, namely mean opinion score (MOS), jitter, delay, and packet loss. The relationship between the wireless channel's parameters and the optimum codec is established. According to the experimental results, the codec G.711 has the best performance for the proposed MANET topology.
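Two of the QoS metrics named above can be computed directly from packet traces. The sketch below, with hypothetical inputs, shows packet loss as a simple ratio and interarrival jitter smoothed as in RFC 3550; the paper does not publish its measurement code, so this is only an illustration.

```python
# Hedged sketch of two QoS metrics from the abstract, applied to
# hypothetical packet traces (not the paper's measurements).

def packet_loss(sent, received):
    """Fraction of sent packets that never arrived."""
    return 1.0 - len(received) / len(sent)

def rfc3550_jitter(transit_times):
    """Smoothed interarrival jitter per RFC 3550: J += (|D| - J) / 16,
    where D is the difference between consecutive transit times."""
    j = 0.0
    for prev, cur in zip(transit_times, transit_times[1:]):
        d = abs(cur - prev)
        j += (d - j) / 16.0
    return j
```

MOS, by contrast, is a perceptual score and is usually derived from listening tests or an E-model-style estimate rather than computed directly from the trace.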

Keywords: wireless channel modelling, VoIP, MANET, session initiation protocol (SIP), QoS

Procedia PDF Downloads 211
8060 Paradigm Shift in Classical Drug Research: Challenges to Modern Pharmaceutical Sciences

Authors: Riddhi Shukla, Rajeshri Patel, Prakruti Buch, Tejas Sharma, Mihir Raval, Navin Sheth

Abstract:

Many classical drugs are claimed to have blood-sugar-lowering properties that make them valuable for people with, or at high risk of, type 2 diabetes. Vijaysar (Pterocarpus marsupium) and Gaumutra (Indian cow urine) have both been credited with antidiabetic properties since ancient times and show a synergistic hypoglycaemic effect in combination. This study was undertaken to investigate the hypoglycaemic and anti-diabetic effects of the combination of Vijaysar and Gaumutra, a classical preparation mentioned in Ayurveda named Pramehari ark. Type 2 diabetes was induced in rats by streptozotocin (STZ, 35 mg/kg) together with a high-fat diet for one month, and the animals were compared with normal rats. Diabetic rats showed raised body weight, triglyceride (TG), total cholesterol, HDL, LDL, and D-glucose concentrations, as well as altered serum, cardiac, and hypertrophic parameters, in comparison with normal rats. After treatment with different doses of the drug, the levels of parameters such as TG, total cholesterol, HDL, LDL, and D-glucose concentration decreased in the standard as well as the treatment groups; the treatment groups also showed decreased levels of serum markers, cardiac markers, and hypertrophic parameters. The findings demonstrate that Pramehari ark prevented the pathological progression of type 2 diabetes in rats.

Keywords: cow urine, hypoglycemic effect, synergic effect, type 2 diabetes, vijaysar

Procedia PDF Downloads 265
8059 Perforation Analysis of the Aluminum Alloy Sheets Subjected to High Rate of Loading and Heated Using Thermal Chamber: Experimental and Numerical Approach

Authors: A. Bendarma, T. Jankowiak, A. Rusinek, T. Lodygowski, M. Klósak, S. Bouslikhane

Abstract:

An analysis of the mechanical characteristics and dynamic behavior of aluminum alloy sheet in perforation tests, based on experiments coupled with numerical simulation, is presented. Impact problems (penetration and perforation) of metallic plates have been of interest for a long time, and experimental, analytical, and numerical studies have been carried out to analyze the perforation process in detail; based on these approaches, the ballistic properties of the material have been studied. A laser sensor measures the initial and residual velocities during the experiments, yielding the ballistic curve and the ballistic limit. The energy balance, including the energy absorbed by the aluminum, is also reported. A high-speed camera helps to estimate the failure time and to calculate the impact force. A wide range of initial impact velocities, from 40 up to 180 m/s, has been covered during the tests. The mass of the conical-nose projectile is 28 g, its diameter is 12 mm, and the thickness of the aluminum sheet is 1.0 mm. The ABAQUS/Explicit finite element code has been used to simulate the perforation process. The ballistic curves obtained numerically were verified against the experiments, and the failure patterns are presented using the mesh densities that ensure stability of the results. Good agreement between the numerical and experimental results is observed.
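The energy absorbed by the plate follows directly from the measured initial and residual projectile velocities: E = (m/2)(v0² − vr²). A minimal sketch, using the 28 g projectile mass stated in the abstract but hypothetical velocities:

```python
# Energy absorbed by the plate from initial and residual projectile
# velocities: E = (m/2) * (v0^2 - vr^2).
# Mass (28 g) is from the abstract; the velocities are hypothetical.

def absorbed_energy(mass_kg, v0, vr):
    """Kinetic energy lost by the projectile, in joules."""
    return 0.5 * mass_kg * (v0 ** 2 - vr ** 2)

print(round(absorbed_energy(0.028, 120.0, 80.0), 1))  # 112.0
```

At the ballistic limit the residual velocity vr goes to zero, so the absorbed energy equals the projectile's full kinetic energy.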

Keywords: aluminum alloy, ballistic behavior, failure criterion, numerical simulation

Procedia PDF Downloads 298
8058 Iterative Method for Lung Tumor Localization in 4D CT

Authors: Sarah K. Hagi, Majdi Alnowaimi

Abstract:

The last decade saw immense advancements in medical imaging modalities, which can now scan the whole lung volume in high-resolution images within a short time. As a result, physicians can clearly identify complicated anatomical and pathological structures of the lung, opening large opportunities to advance all available types of lung cancer treatment and increase the survival rate. However, lung cancer is still one of the major causes of death, involving around 19% of all cancer patients. Several factors may affect the survival rate; one serious factor is the breathing process, which can affect the accuracy of diagnosis and of the lung tumor treatment plan. We have therefore developed a semi-automated algorithm to localize 3D lung tumor positions across all respiratory data during respiratory motion. The algorithm can be divided into two stages. First, the lung tumor is segmented in the first phase of the 4D computed tomography (CT) scan using an active contours method. Then, the tumor's 3D position is localized across all subsequent phases using a 12-degrees-of-freedom affine transformation. Two data sets were used in this study: a simulated 4D CT based on the extended cardiac-torso (XCAT) phantom, and clinical 4D CT data sets. Errors are reported as the root mean square error (RMSE); the average error across the data sets is 0.94 ± 0.36 mm. Finally, an evaluation and quantitative comparison of the results with a state-of-the-art registration algorithm is presented. The results obtained from the proposed localization algorithm show promise for localizing a lung tumor in 4D CT data.
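The reported error metric can be sketched directly: RMSE over paired predicted and reference 3D tumor positions. The function below is a generic illustration with made-up points, not the authors' code.

```python
import math

# RMSE between predicted and reference 3D tumor positions (in mm),
# the metric behind the paper's reported 0.94 ± 0.36 mm average error.
# The points used here are hypothetical.

def rmse(predicted, reference):
    """Root mean square error over paired 3D points."""
    sq = [sum((p - r) ** 2 for p, r in zip(pt, ref))
          for pt, ref in zip(predicted, reference)]
    return math.sqrt(sum(sq) / len(sq))

print(rmse([(0.0, 0.0, 0.0)], [(3.0, 4.0, 0.0)]))  # 5.0
```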

Keywords: automated algorithm, computed tomography, lung tumor, tumor localization

Procedia PDF Downloads 589
8057 The Use of Webquests in Developing Inquiry Based Learning: Views of Teachers and Students in Qatar

Authors: Abdullah Abu-Tineh, Carol Murphy, Nigel Calder, Nasser Mansour

Abstract:

This paper reports on an aspect of e-learning in developing inquiry-based learning (IBL). We present data on the views of teachers and students in Qatar following a professional development programme intended to help teachers implement IBL in their science and mathematics classrooms. Key to this programme was the use of WebQuests. Views of the teachers and students suggested that WebQuests helped students to develop technical skills, work collaboratively and become independent in their learning. The use of WebQuests also enabled a combination of digital and non-digital tools that helped students connect ideas and enhance their understanding of topics.

Keywords: digital technology, inquiry-based learning, mathematics and science education, professional development

Procedia PDF Downloads 126
8056 Characterization of Kopff Crater Using Remote Sensing Data

Authors: Shreekumari Patel, Prabhjot Kaur, Paras Solanki

Abstract:

Moon Mineralogy Mapper (M3), Miniature Radio Frequency (Mini-RF), Kaguya Terrain Camera images, the Lunar Orbiter Laser Altimeter (LOLA) digital elevation model (DEM), and Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) and Wide Angle Camera (WAC) images were used to study the mineralogy, surface physical properties, and age of the 42 km diameter Kopff crater. M3 indicates the low-albedo crater floor to be high-Ca pyroxene dominated, associated with floor fracture, suggesting igneous activity of the gabbroic material. A signature of anorthositic material is sampled on the eastern edge, where material excavated by a ~3 km diameter impact crater provides access to the crustal composition. Several occurrences of spinel were detected in the northwestern rugged terrain. Our observation can be explained by the exposure of spinel by this crater, which impacted onto the inner rings of the Orientale basin: the spinel was part of the pre-impact target, an intrinsic unit of the basin ring. The crater floor was dated by crater counts performed on Kaguya TC images, and the nature of the surface was studied in detail with LROC NAC and Mini-RF. Freshly exposed surfaces and the boulders or debris seen in LROC NAC images show an enhanced radar signal in comparison to the mature terrain of Kopff crater. This multidisciplinary analysis of remote sensing data helps to assess the lunar surface in detail.

Keywords: crater, mineralogy, moon, radar observations

Procedia PDF Downloads 148
8055 Solubility of Water in CO2 Mixtures at Pipeline Operation Conditions

Authors: Mohammad Ahmad, Sander Gersen, Erwin Wilbers

Abstract:

Carbon capture, transport, and underground storage have become a major solution for reducing CO2 emissions from power plants and other large CO2 sources. A large part of this captured CO2 stream is transported at high-pressure dense phase conditions and stored in offshore underground depleted oil and gas fields; CO2 is also transported in offshore pipelines for use in enhanced oil and gas recovery. The captured CO2 stream with impurities may contain water, which causes severe corrosion problems and flow assurance failures and may damage valves and instrumentation; thus, free water formation must be strictly prevented. The purpose of this work is to study the solubility of water in pure CO2 and in CO2 mixtures under real pipeline pressures (90-150 bar) and temperature operating conditions (5-35°C). A setup was constructed to generate experimental data. The results show that the solubility of water in CO2 mixtures increases with increasing temperature and/or pressure, and that a drop in water solubility is observed in the presence of impurities. The data generated were then used to assess the capabilities of two mixture models: the GERG-2008 model and the EOS-CG model. By generating these solubility data, the study contributes to determining the maximum allowable water content in CO2 pipelines.

Keywords: carbon capture and storage, water solubility, equation of states, fluids engineering

Procedia PDF Downloads 281
8054 Expression of Gro-El under Phloem-Specific Promoter Protects Transgenic Plants against Diverse Begomovirus-Beta Satellite Complex

Authors: Muhammad Yousaf Ali, Shahid Mansoor, Javeria Qazi, Imran Amin, Musarrat Shaheen

Abstract:

Cotton leaf curl disease (CLCuD) is the major threat to the cotton crop and is transmitted by the whitefly (Bemisia tabaci). Since multiple begomoviruses and associated satellites are involved in CLCuD, approaches based on the concept of broad-spectrum resistance are essential for effective disease control. Gro-El and G5 are two proteins of whitefly endosymbiont and M13 bacteriophage origin, respectively. Gro-El encapsulates the virus particle when it enters the whitefly, protecting the virus from the whitefly's immune system and preventing viral expression within it; this characteristic can be exploited to obtain resistance against viruses if Gro-El is expressed in plants. G5 is a single-stranded DNA binding protein whose expression in transgenic plants stops viral expression upon binding to ssDNA. The use of tissue-specific promoters is more efficient than that of constitutive promoters. Transgenic Nicotiana benthamiana plants were made carrying Gro-El under a constitutive promoter and Gro-El under a phloem-specific promoter. In comparison to non-transgenic plants, transgenic plants with Gro-El under the NSP promoter showed promising results when challenged with cotton leaf curl Multan virus (CLCuMuV) along with cotton leaf curl Multan beta satellite (CLCuMB), cotton leaf curl Khokhran virus (CLCuKoV) along with CLCuMB, and Pedilanthus leaf curl virus (PedLCV) along with tobacco leaf curl beta satellite (TbLCB).

Keywords: cotton leaf curl disease, whitefly, endosymbionts, transgenic, resistance

Procedia PDF Downloads 80
8053 Does Indian Intellectual Property Policy Affect the U. S. Pharmaceutical Industry? A Comparative Study of Pfizer and Ranbaxy Laboratories in Regards to Trade Related Aspects of Intellectual Property Rights

Authors: Alina Hamid Bari

Abstract:

The intellectual property (IP) policies of a country have a huge impact on the pharmaceutical industry, as this industry is built on patents. Developed countries have used IP protection to boost their economies, while developing countries are concerned about access to medicine for poor people. The U.S. company Pfizer held a 14-year monopoly on Lipitor, which came to an end when Pfizer decided to operate in India. This research focuses on the effects of Indian IP policies on the USA by comparing Pfizer and Ranbaxy with regard to Trade-Related Aspects of Intellectual Property Rights (TRIPS). An inductive approach has been used. The main sources of material are annual reports, academic books and articles, court rulings, policy statements and decisions, websites, and newspaper articles. A SWOT analysis is performed for both Pfizer and Ranbaxy. The main comparison was carried out through ratio analysis and analysis of the annual reports for the year 2011-2012 for Pfizer and Ranbaxy, to see the impact on their profitability. This research concludes that Indian intellectual property laws do affect the profitability of the U.S. pharmaceutical industry, which can in turn have an impact on the U.S. economy. India now grants patents only on products it considers deserving, so U.S. companies operating in India have to defend their inventions to obtain a patent. Thus, to operate in India and maintain a market monopoly, U.S. firms have to develop different strategies.

Keywords: atorvastatin, India, intellectual property, lipitor, Pfizer, pharmaceutical industry, Ranbaxy, TRIPs, U.S.

Procedia PDF Downloads 463
8052 Rapid Identification and Diagnosis of the Pathogenic Leptospiras through Comparison among Culture, PCR and Real Time PCR Techniques from Samples of Human and Mouse Feces

Authors: S. Rostampour Yasouri, M. Ghane, M. Doudi

Abstract:

Leptospirosis is one of the most significant infectious and zoonotic diseases, with global spread. The disease causes economic losses and human fatalities in various countries, including the northern provinces of Iran. The aim of this research is to identify and compare rapid diagnostic techniques for pathogenic leptospiras, given the multifaceted clinical manifestations of the disease and the risk of premature death of patients. In the spring and summer of 2020-2022, 25 fecal samples were collected from suspected leptospirosis patients and 25 fecal samples from mice residing in the rice fields and factories of Tonekabon city. Samples were prepared by centrifugation and passage through membrane filters. The culture technique used liquid and solid EMJH media with one month of incubation at 30°C, after which the media were examined microscopically. DNA extraction was conducted with an extraction kit, and leptospiras were identified by PCR and real-time PCR (SYBR Green) techniques using a lipL32-specific primer. Among the patients, 11 samples (44%) and 8 samples (32%) were determined to contain pathogenic Leptospira by real-time PCR and PCR, respectively; among the mice, the corresponding figures were 9 samples (36%) and 3 samples (12%). Although the culture technique is considered the gold standard, it is not fast, owing to the slow growth of pathogenic Leptospira and the lack of colony formation by some species. Real-time PCR allowed rapid diagnosis with much higher accuracy than PCR, which could not identify samples with a lower microbial load.

Keywords: culture, pathogenic leptospiras, PCR, real time PCR

Procedia PDF Downloads 70
8051 Effect of Joule Heating on Chemically Reacting Micropolar Fluid Flow over Truncated Cone with Convective Boundary Condition Using Spectral Quasilinearization Method

Authors: Pradeepa Teegala, Ramreddy Chetteti

Abstract:

This work emphasizes the effects of heat generation/absorption and Joule heating on chemically reacting micropolar fluid flow over a truncated cone with a convective boundary condition. For this complex fluid flow problem, a similarity solution does not exist; hence, using non-similarity transformations, the governing fluid flow equations and the related boundary conditions are transformed into a set of non-dimensional partial differential equations. Several authors have applied the spectral quasi-linearization method (SQLM) to ordinary differential equations; here, the resulting nonlinear partial differential equations are solved for the non-similarity solution using this recently developed method. Comparison with previously published work on special cases of the problem shows excellent agreement. The influence of the pertinent parameters, namely the Biot number, Joule heating, heat generation/absorption, chemical reaction, the micropolar parameter, and the magnetic field, on the physical quantities of the flow is displayed through graphs, and the salient features are explored in detail. Further, the results are analyzed by comparison with two special cases, namely the vertical plate and the full cone, wherever possible.

Keywords: chemical reaction, convective boundary condition, joule heating, micropolar fluid, spectral quasilinearization method

Procedia PDF Downloads 333
8050 Reliability and Probability Weighted Moment Estimation for Three Parameter Mukherjee-Islam Failure Model

Authors: Ariful Islam, Showkat Ahmad Lone

Abstract:

The Mukherjee-Islam model is commonly used as a simple lifetime distribution to assess system reliability. The model exhibits a better fit to failure data and provides more appropriate information about the hazard rate and other reliability measures, as shown by various authors. It is possible to introduce a location parameter (i.e., a time before which failure cannot occur), which makes it a more useful failure distribution than the existing ones. Even after shifting the location of the distribution, it can represent decreasing, constant and increasing failure rates. It has been shown to represent the appropriate lower tail of the distribution of random variables having a fixed lower bound. This study presents the reliability computations and probability weighted moment estimation of the three-parameter model. A comparative analysis is carried out between the three-parameter finite range model and some existing bathtub-shaped curve-fitting models. Since the probability weighted moment method is used, the results obtained can also be applied to small samples. The maximum likelihood estimation method is also applied in this study.
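The machinery behind probability weighted moment (PWM) estimation can be sketched for a finite range distribution of this type. The parameterisation below, F(x) = ((x-μ)/(θ-μ))^p on [μ, θ], is an assumption for illustration (the paper's exact form may differ); the PWM estimator rests on equating sample PWMs to their closed-form population counterparts:

```python
import numpy as np

rng = np.random.default_rng(0)

# Three-parameter finite range (Mukherjee-Islam-type) distribution --
# the exact parameterisation is an illustrative assumption:
#   F(x) = ((x - mu)/(theta - mu))**p   for mu <= x <= theta, p > 0.
mu, theta, p = 2.0, 10.0, 2.0

def cdf(x):
    return ((x - mu) / (theta - mu)) ** p

def reliability(x):
    return 1.0 - cdf(x)

def hazard(x):
    # h(x) = f(x) / R(x), with density f(x) = p (x-mu)^(p-1) / (theta-mu)^p.
    f = p * (x - mu) ** (p - 1) / (theta - mu) ** p
    return f / reliability(x)

# Population PWMs beta_r = E[X F(X)^r] follow from substituting u = F(x):
#   beta_r = mu/(r+1) + (theta - mu) * p / (p*(r+1) + 1).
beta = np.array([mu / (r + 1) + (theta - mu) * p / (p * (r + 1) + 1)
                 for r in range(3)])

# Sample PWMs via inverse-CDF sampling and plotting positions; the PWM
# estimator equates these to the closed-form expressions above and
# solves for (mu, theta, p).
x = np.sort(mu + (theta - mu) * rng.random(20000) ** (1.0 / p))
pp = (np.arange(1, len(x) + 1) - 0.35) / len(x)   # plotting positions
b = np.array([np.mean(x * pp ** r) for r in range(3)])
```

With a large sample the empirical PWMs `b` sit close to the population values `beta`, which is what makes the method usable even for small samples, where it is less biased than moment-based alternatives.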

Keywords: comparative analysis, maximum likelihood estimation, Mukherjee-Islam failure model, probability weighted moment estimation, reliability

Procedia PDF Downloads 262
8049 Parameter Estimation for Contact Tracing in Graph-Based Models

Authors: Augustine Okolie, Johannes Müller, Mirjam Kretzchmar

Abstract:

We adopt a maximum-likelihood framework to estimate the parameters of a stochastic susceptible-infected-recovered (SIR) model with contact tracing on a rooted random tree. Given the number of detectees per index case, our estimator makes it possible to determine the degree distribution of the random tree as well as the tracing probability. Since we do not discover all infectees via contact tracing, this estimation is non-trivial. To keep things simple and stable, we develop an approximation suited to realistic situations (the contact tracing probability is small, or the probability of detecting index cases is small). In this approximation, the only epidemiological parameter entering the estimator is the basic reproduction number R0. The estimator is tested in a simulation study and applied to COVID-19 contact tracing data from India. The simulation study underlines the efficiency of the method. For the empirical COVID-19 data, we are able to compare different degree distributions and perform a sensitivity analysis. We find that a power-law and a negative binomial degree distribution in particular fit the data well and that the tracing probability is rather large. The sensitivity analysis shows no strong dependency on the reproduction number.
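The flavour of the estimation problem can be shown with a deliberately simplified toy model (all numbers and the moment-based estimator below are illustrative assumptions, not the paper's actual likelihood): each index case has an unobserved number of infectees D drawn from a negative binomial degree distribution, each discovered by tracing independently with probability p, and only the count of detectees K per index case is observed. Because binomial thinning scales the mean, the mean of K identifies p once the degree distribution is fixed:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy setup (illustrative values): degrees D ~ NegBin(r, q) are never
# observed; we only see K ~ Binomial(D, p_true) detectees per index case.
r, q = 3, 0.4            # negative binomial: r successes, success prob q
p_true = 0.6
n_index_cases = 3000

D = rng.negative_binomial(r, q, size=n_index_cases)   # latent degrees
K = rng.binomial(D, p_true)                           # observed detectees

# E[D] = r(1-q)/q, and thinning gives E[K] = p * E[D], so a simple
# moment estimator of the tracing probability is:
p_hat = K.mean() * q / (r * (1.0 - q))
```

In the paper itself neither r nor q is known; the degree distribution is estimated jointly with the tracing probability inside a maximum-likelihood framework, which is what makes the problem non-trivial.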

Keywords: stochastic SIR model on graph, contact tracing, branching process, parameter inference

Procedia PDF Downloads 67
8048 Role of Power Electronics in Grid Integration of Renewable Energy Systems

Authors: M. N. Tandjaoui, C. Banoudjafar, C. Benachaiba, O. Abdelkhalek, A. Kechich

Abstract:

Advanced power electronic systems are deemed to be an integral part of renewable, green, and efficient energy systems. Wind energy, now the world's fastest-growing energy source, is one of the renewable means of electricity generation, but it can bring new challenges when connected to the power grid due to the fluctuating nature of the wind and the comparatively new types of its generators. Wind energy is part of the worldwide discussion on the future of energy generation and use and the consequent effects on the environment. This paper introduces some of the requirements and aspects of the power electronics involved in modern wind generation systems, including modern power electronic converters, and the issues of integrating wind turbines into power systems.

Keywords: power electronics, renewable energy, smart grid, green energy, power technology

Procedia PDF Downloads 633
8047 Suitability Evaluation of Human Settlements Using a Global Sensitivity Analysis Method: A Case Study of China

Authors: Feifei Wu, Pius Babuna, Xiaohua Yang

Abstract:

The suitability evaluation of human settlements over time and space is essential to track potential challenges to suitable human settlements and provide references for policy-makers. This study established a theoretical framework of human settlements based on the nature, human, economy, society and residence subsystems. Evaluation indicators were determined with consideration of the coupling effect among subsystems. Based on the extended Fourier amplitude sensitivity test algorithm, a global sensitivity analysis that considered the coupling effect among indicators was used to determine the weights of the indicators. Human settlement suitability was evaluated at both the subsystem and comprehensive system levels in 30 provinces of China between 2000 and 2016. The findings were as follows: (1) Human settlements suitability index (HSSI) values increased significantly in all 30 provinces from 2000 to 2016. Among the five subsystems, the suitability index of the residence subsystem in China exhibited the fastest growth, followed by the society and economy subsystems. (2) HSSI in eastern provinces with a developed economy was higher than that in western provinces with an underdeveloped economy. In contrast, the growth rate of HSSI in eastern provinces was significantly higher than that in western provinces. (3) The inter-provincial difference in HSSI decreased from 2000 to 2016. Among subsystems, it decreased for the residence subsystem, whereas it increased for the economy subsystem. (4) The suitability of the nature subsystem has become a limiting factor for the improvement of human settlement suitability, especially in economically developed provinces such as Beijing, Shanghai, and Guangdong. The results can help support decision-making and policy for improving the quality of human settlements in a broad nature, human, economy, society and residence context.

Keywords: human settlements, suitability evaluation, extended Fourier amplitude, human settlement suitability

Procedia PDF Downloads 62
8046 Lightweight Hybrid Convolutional and Recurrent Neural Networks for Wearable Sensor Based Human Activity Recognition

Authors: Sonia Perez-Gamboa, Qingquan Sun, Yan Zhang

Abstract:

Non-intrusive sensor-based human activity recognition (HAR) is utilized in a spectrum of applications, including fitness tracking devices, gaming, health care monitoring, and smartphone applications. Deep learning models such as convolutional neural networks (CNNs) and long short-term memory (LSTM) recurrent neural networks (RNNs) provide a way to achieve HAR accurately and effectively. In this paper, we design a multi-layer hybrid architecture with CNN and LSTM layers and explore a variety of multi-layer combinations. Based on this exploration, we present a lightweight, hybrid, multi-layer model that improves recognition performance by integrating local scale-invariant features with the temporal dependencies of activities. The experimental results demonstrate the efficacy of the proposed model, which achieves a 94.7% activity recognition rate on a benchmark human activity dataset. This model outperforms traditional machine learning and other deep learning methods. Additionally, our implementation achieves a balance between recognition rate and training time consumption.
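The CNN-then-LSTM data flow of such hybrid HAR models can be sketched as a numpy-only forward pass. The window length (128 samples of 6-channel inertial data), layer sizes and random weights below are illustrative assumptions, not the paper's trained model:

```python
import numpy as np

rng = np.random.default_rng(1)

def conv1d_relu(x, w, b):
    """Valid 1-D convolution over time. x: (T, C_in), w: (k, C_in, C_out)."""
    k = w.shape[0]
    out = np.stack([x[t:t + k].reshape(-1) @ w.reshape(k * w.shape[1], -1) + b
                    for t in range(x.shape[0] - k + 1)])
    return np.maximum(out, 0.0)          # ReLU

def lstm_last_hidden(x, Wx, Wh, b):
    """Minimal LSTM over x: (T, C); returns the final hidden state."""
    H = Wh.shape[0]
    h, c = np.zeros(H), np.zeros(H)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for t in range(x.shape[0]):
        z = x[t] @ Wx + h @ Wh + b       # (4H,) pre-activations
        i, f, o, g = np.split(z, 4)
        i, f, o, g = sig(i), sig(f), sig(o), np.tanh(g)
        c = f * c + i * g
        h = o * np.tanh(c)
    return h

# One window of 6-axis inertial data (hypothetical shapes).
T, C, K, F, H, n_classes = 128, 6, 9, 16, 32, 6
window = rng.standard_normal((T, C))

# CNN layer extracts local features; LSTM models temporal dependencies.
feat = conv1d_relu(window, 0.1 * rng.standard_normal((K, C, F)), np.zeros(F))
h = lstm_last_hidden(feat,
                     0.1 * rng.standard_normal((F, 4 * H)),
                     0.1 * rng.standard_normal((H, 4 * H)),
                     np.zeros(4 * H))

# Softmax classifier over activity classes.
logits = h @ (0.1 * rng.standard_normal((H, n_classes)))
probs = np.exp(logits - logits.max()); probs /= probs.sum()
```

The same stacking idea generalises to the multi-layer combinations explored in the paper (more conv blocks before the recurrent layers, or stacked LSTMs).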

Keywords: deep learning, LSTM, CNN, human activity recognition, inertial sensor

Procedia PDF Downloads 132
8045 Executive Stock Options, Business Ethics and Financial Reporting Quality

Authors: Philemon Rakoto

Abstract:

This paper tests the improvement of financial reporting quality when firms award stock options to their executives. The originality of this study is that we introduce the moderating effect of business ethics in the model. The sample is made up of 116 Canadian high-technology firms with available data for the fiscal year ending in 2012. We define the quality of financial reporting as the value relevance of accounting information as developed by Ohlson. Our results show that executive stock option award alone does not improve the quality of financial reporting. Rather, the quality improves when a firm awards stock options to its executives and investors perceive that the level of business ethics in that firm is high.

Keywords: business ethics, Canada, high-tech firms, stock options, value relevance

Procedia PDF Downloads 474
8044 Online Battery Equivalent Circuit Model Estimation on Continuous-Time Domain Using Linear Integral Filter Method

Authors: Cheng Zhang, James Marco, Walid Allafi, Truong Q. Dinh, W. D. Widanage

Abstract:

Equivalent circuit models (ECMs) are widely used in battery management systems in electric vehicles and other battery energy storage systems. The battery dynamics and the model parameters vary under different working conditions, such as different temperature and state of charge (SOC) levels, and therefore online parameter identification can improve the modelling accuracy. This paper presents an approach to online ECM parameter identification using a continuous-time (CT) estimation method. The CT estimation method has several advantages over discrete-time (DT) estimation methods for ECM parameter identification due to the widely separated battery dynamic modes and fast sampling. The presented method can be used for online SOC estimation. Test data are collected using a lithium-ion cell, and the experimental results show that the presented CT method achieves better modelling accuracy than the conventional DT recursive least squares method. The effectiveness of the presented method for online SOC estimation is also verified on test data.
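As a reference point for the DT baseline the paper compares against (not the CT linear-integral-filter method itself), a first-order Thevenin ECM can be identified online with recursive least squares on its ARX regression form. All component values below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy first-order Thevenin ECM (one RC pair), discretised with step dt:
#   U1[k] = a*U1[k-1] + R1*(1-a)*I[k-1],   a = exp(-dt/(R1*C1))
#   V[k]  = OCV - R0*I[k] - U1[k]
R0, R1, C1, OCV, dt = 0.010, 0.020, 2000.0, 3.7, 1.0
a = np.exp(-dt / (R1 * C1))
b1 = R1 * (1.0 - a)

N = 2000
I = rng.uniform(-5.0, 5.0, N)            # persistently exciting current
V = np.empty(N)
u1 = 0.0
for k in range(N):
    V[k] = OCV - R0 * I[k] - u1
    u1 = a * u1 + b1 * I[k]

# Eliminating U1 gives the ARX regression form
#   V[k] = (1-a)*OCV + a*V[k-1] - R0*I[k] + (a*R0 - b1)*I[k-1],
# i.e. V[k] = phi[k] . theta with phi = [1, V[k-1], I[k], I[k-1]].
theta = np.zeros(4)
P = 1e5 * np.eye(4)                      # RLS covariance
for k in range(1, N):
    phi = np.array([1.0, V[k - 1], I[k], I[k - 1]])
    g = P @ phi / (1.0 + phi @ P @ phi)  # gain (forgetting factor = 1)
    theta += g * (V[k] - phi @ theta)
    P -= np.outer(g, phi @ P)

a_hat = theta[1]
R0_hat = -theta[2]
R1_hat = (a_hat * R0_hat - theta[3]) / (1.0 - a_hat)   # from b1 = a*R0 - theta[3]
```

On this noise-free toy data RLS recovers the parameters closely; the paper's point is that on real, fast-sampled data with widely separated time constants, the CT formulation is better conditioned than this DT regression.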

Keywords: electric circuit model, continuous time domain estimation, linear integral filter method, parameter and SOC estimation, recursive least square

Procedia PDF Downloads 364
8043 Digital Transformation of Lean Production: Systematic Approach for the Determination of Digitally Pervasive Value Chains

Authors: Peter Burggräf, Matthias Dannapfel, Hanno Voet, Patrick-Benjamin Bök, Jérôme Uelpenich, Julian Hoppe

Abstract:

The increasing digitalization of value chains can help companies to handle rising complexity in their processes and thereby reduce the steadily increasing planning and control effort in order to raise performance limits. Due to technological advances, companies face the challenge of smart value chains for the purpose of improving productivity, handling the increasing time and cost pressure and meeting the need for individualized production. Therefore, companies need to ensure quick and flexible decisions to create self-optimizing processes and, consequently, to make their production more efficient. Lean production, as the most commonly used paradigm for complexity reduction, reaches its limits when it comes to variant-flexible production and constantly changing market and environmental conditions. To lift the performance limits built into current value chains, new methods and tools must be applied. Digitalization provides the potential to derive these new methods and tools. However, companies lack the experience to harmonize different digital technologies, and there is no practicable framework that guides the transformation of current value chains into digitally pervasive value chains. Current research shows that a connection between lean production and digitalization exists, based on factors such as people, technology and organization. In this paper, the introduced method for the determination of digitally pervasive value chains takes the factors people, technology and organization into account and extends existing approaches by a new dimension. It is the first systematic approach for the digital transformation of lean production and consists of four steps: The first step, 'target definition', describes the target situation and defines the depth of the analysis with regard to the inspection area and the level of detail. The second step, 'analysis of the value chain', verifies the lean-ability of processes and places a special focus on the integration capacity of digital technologies in order to raise the limits of lean production. The third step, the 'digital evaluation process', ensures the usefulness of digital adaptions regarding their practicability and their integrability into the existing production system. Finally, the method defines actions to be performed based on the evaluation process and in accordance with the target situation. The validation and optimization of the proposed method in a German company from the electronics industry shows that the digital transformation of current value chains based on lean production raises their inbuilt performance limits.

Keywords: digitalization, digital transformation, Industrie 4.0, lean production, value chain

Procedia PDF Downloads 296
8042 Biosynthesis of Healthy Secondary Metabolites in Olive Fruit in Response to Different Agronomic Treatments

Authors: Anna Perrone, Federico Martinelli

Abstract:

Olive fruit is well known for its high content of secondary metabolites, which are of great nutritional, nutraceutical, antioxidant, and health interest. The content of secondary metabolites in olive at harvest may be affected by different water regimes, with significant effects on olive oil composition and quality and, consequently, on its health and nutritional features. In this work, a summary of several research studies dealing with the biosynthesis of health-related and nutraceutical metabolites of the secondary metabolism in olive fruit is reported. The phytochemical findings have been correlated with the expression of key genes involved in polyphenol, terpenoid, and carotenoid biosynthesis and metabolism in response to different development stages and water regimes. Flavonoids were highest in immature fruits, while anthocyanins increased at ripening. In the epicarp tissue, this was clearly associated with an up-regulation of the UFGT gene. Olive fruits cultivated under different water regimes were analyzed by metabolomics. This method identified several hundred metabolites in the ripe mesocarp. Among them, 46 were differentially accumulated in the comparison between rain-fed and irrigated conditions. Well-known health-related metabolites were more abundant under higher water regimes. Increased content of polyphenols was observed in the rain-fed fruit; in particular, anthocyanin concentration was higher at ripening. Several secondary metabolites were differentially accumulated between different irrigation conditions. These results show that such metabolomic approaches can be efficiently used to determine the effects of agronomic treatments on olive fruit physiology and, consequently, on the nutritional and health properties of the obtained extra-virgin olive oil.

Keywords: Olea europaea, anthocyanins, polyphenols, water regimes

Procedia PDF Downloads 135
8041 On Dialogue Systems Based on Deep Learning

Authors: Yifan Fan, Xudong Luo, Pingping Lin

Abstract:

Nowadays, dialogue systems are increasingly becoming the way for humans to access many computer systems, allowing humans to interact with computers in natural language. A dialogue system consists of three parts: understanding what humans say in natural language, managing the dialogue, and generating responses in natural language. In this paper, we survey deep learning based methods for dialogue management, response generation and dialogue evaluation. Specifically, these methods are based on neural networks, long short-term memory networks, deep reinforcement learning, pre-training and generative adversarial networks. We compare these methods and point out further research directions.

Keywords: dialogue management, response generation, deep learning, evaluation

Procedia PDF Downloads 152
8040 6D Posture Estimation of Road Vehicles from Color Images

Authors: Yoshimoto Kurihara, Tad Gonsalves

Abstract:

Currently, in the field of object posture estimation, there is research on estimating the position and angle of an object by storing a 3D model of the object in advance in a computer and matching observations against that model. However, in this research, we have succeeded in creating a module that is much simpler, smaller in scale, and faster in operation. Our 6D pose estimation model consists of two different networks, a classification network and a regression network. From a single RGB image, the trained model estimates the class of the object in the image, the coordinates of the object, and its rotation angle in 3D space. In addition, we compared the estimation accuracy for each camera position, i.e., the angle from which the object was captured. The highest accuracy was recorded when the camera position was 75°: the accuracy of the classification was about 87.3%, and that of the regression was about 98.9%.
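The two-network layout described above (one head for the object class, one regressing the 6D pose) can be sketched as follows. The layer sizes, the plain fully connected layers, the random weights, and the 6D output split into 3 translation coordinates plus 3 rotation angles are all illustrative assumptions, since the paper builds on an AlexNet-style backbone:

```python
import numpy as np

rng = np.random.default_rng(3)

def mlp(x, sizes):
    """Tiny fully connected network with ReLU hidden layers (illustrative)."""
    for i, (d_in, d_out) in enumerate(zip(sizes[:-1], sizes[1:])):
        x = x @ (0.1 * rng.standard_normal((d_in, d_out)))
        if i < len(sizes) - 2:           # ReLU on hidden layers only
            x = np.maximum(x, 0.0)
    return x

# Stand-in for backbone features extracted from a single RGB image.
feat = rng.standard_normal(256)
n_classes = 10

# Network 1: object classification (softmax over classes).
logits = mlp(feat, [256, 64, n_classes])
probs = np.exp(logits - logits.max()); probs /= probs.sum()
pred_class = int(np.argmax(probs))

# Network 2: regression of the 6D pose -- xyz position + 3 rotation angles.
pose = mlp(feat, [256, 64, 6])
position, rotation = pose[:3], pose[3:]
```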

Keywords: 6D posture estimation, image recognition, deep learning, AlexNet

Procedia PDF Downloads 135
8039 Comparative Study of IC and Perturb and Observe Method of MPPT Algorithm for Grid Connected PV Module

Authors: Arvind Kumar, Manoj Kumar, Dattatraya H. Nagaraj, Amanpreet Singh, Jayanthi Prattapati

Abstract:

The purpose of this paper is to study and compare two maximum power point tracking (MPPT) algorithms in a photovoltaic simulation system, and to present a simulation study of MPPT for photovoltaic systems using the perturb and observe algorithm and the incremental conductance algorithm. MPPT plays an important role in photovoltaic systems because it maximizes the power output from a PV system for a given set of conditions, thereby maximizing the array efficiency and minimizing the overall system cost. Since the maximum power point (MPP) varies with irradiation and cell temperature, appropriate algorithms must be utilized to track the MPP and maintain the operation of the system at it. MATLAB/Simulink is used to establish a model of a photovoltaic system with the MPPT function. This system is developed by combining the models of the solar PV module and the DC-DC boost converter. The system is simulated under different climate conditions. Simulation results show that the photovoltaic simulation system can track the maximum power point accurately.
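The perturb and observe logic compared in the paper can be sketched outside Simulink in a few lines; the single-diode module constants below are illustrative assumptions, not the paper's model:

```python
import numpy as np

# Illustrative single-diode PV module: I(V) = Isc - I0*(exp(V/(n*Ns*Vt)) - 1).
ISC, I0, N_IDEAL, NS, VT = 8.0, 1e-9, 1.0, 60, 0.0257

def pv_power(v):
    i = ISC - I0 * (np.exp(v / (N_IDEAL * NS * VT)) - 1.0)
    return v * i

def perturb_and_observe(v0=20.0, dv=0.2, steps=200):
    """Classic P&O: keep perturbing the operating voltage in the same
    direction while power rises; reverse direction when power falls."""
    v, p, direction = v0, pv_power(v0), +1.0
    for _ in range(steps):
        v_new = v + direction * dv
        p_new = pv_power(v_new)
        if p_new < p:
            direction = -direction       # overshot the MPP: turn around
        v, p = v_new, p_new
    return v, p

v_mpp_po, p_mpp_po = perturb_and_observe()
```

In steady state the operating point oscillates within a step or two of the true MPP, the well-known trade-off between step size and tracking speed that motivates the comparison with incremental conductance.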

Keywords: incremental conductance algorithm, perturb and observe algorithm, photovoltaic system, simulation results

Procedia PDF Downloads 541