Search results for: data source
26583 Medical Ethics: Knowledge, Attitude and Practices among Young Healthcare Professionals – A Survey from Islamabad, Pakistan
Authors: Asima Mehaboob Khan, Rizwan Taj
Abstract:
Purpose: This study aims to estimate the knowledge, attitudes, and practices of medical ethics among young healthcare professionals. Method: A qualitative descriptive study was conducted among young healthcare professionals from both public and private sector medical institutions. Using the convenience sampling technique, 272 healthcare professionals participated in this study. A pre-structured modified questionnaire was used to collect the data. Descriptive analyses were executed for each variable. Result: About 76.47% of healthcare professionals considered adequate knowledge of medical ethics important, and 82.24% declared lectures, seminars, and clinical discussions as the sources of their knowledge of biomedical ethics. About 42.44% of healthcare professionals exhibited a negative attitude toward medical ethics and 57.72% showed a mildly positive attitude, whereas 1.10% and 0.74% indicated a moderately positive and a highly positive attitude towards medical ethics, respectively. Similarly, the level of practice according to medical ethics is also very poor among young healthcare professionals: 34.56% of healthcare professionals deviated from medical ethics during their clinical practice, whereas only 0.74% showed a good level of practice according to medical ethics. Conclusion: This research study concludes that young healthcare professionals have adequate theoretical knowledge of medical ethics but are not properly trained to conduct their clinical practice according to the guidelines of medical ethics. Furthermore, their professional attitude is poorly developed for maintaining medical ethics during clinical practice.
Keywords: knowledge, attitude, practices, medical ethics
Procedia PDF Downloads 105
26582 Spatial and Time Variability of Ambient Vibration H/V Frequency Peak
Authors: N. Benkaci, E. Oubaiche, J.-L. Chatelain, R. Bensalem, K. Abbes
Abstract:
The ambient vibration H/V technique is widely used nowadays in microzonation studies because of its easy field handling and its low cost compared to other geophysical methods. However, in the presence of complex geology or lateral heterogeneity, evidenced by more than one peak frequency in the H/V curve, it is difficult to interpret the results, especially when soil information is lacking. In this work, we focus on the construction site of the Baraki 40,000-place stadium, located on the north-east side of the Mitidja basin (Algeria), to identify the seismic wave amplification zones. H/V curve analysis leads to the observation of spatial and time variability of the H/V frequency peaks. The spatial variability allows dividing the studied area into three main zones: (1) one with a predominant frequency around 1.5 Hz showing an important amplification level, (2) a second exhibiting two peaks, at 1.5 Hz and in the 4 Hz – 10 Hz range, and (3) a third zone characterized by a plateau between 2 Hz and 3 Hz. These H/V curve categories reveal a substantial lateral heterogeneity dividing the stadium site roughly in the middle. Furthermore, continuous ambient vibration recording during several weeks shows that the first peak at 1.5 Hz in the second zone completely disappears between 2 am and 4 am and reaches its maximum amplitude around 12 am. Consequently, the anthropogenic noise source generating these important variations could be the Algiers Rocade Sud highway, located in the maximum amplification azimuth direction of the H/V curves. This work points out that the H/V method is an important tool for performing nano-zonation studies prior to geotechnical and geophysical investigations, and that, in some cases, the H/V technique fails to reveal the resonance frequency in the absence of a strong anthropogenic source.
Keywords: ambient vibrations, amplification, fundamental frequency, lateral heterogeneity, site effect
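The H/V curve described above is the ratio of the horizontal to vertical component amplitude spectra of ambient noise. A minimal numerical sketch of that computation (a simplified single-window version with ad-hoc spectral smoothing; the synthetic data and parameters below are illustrative assumptions, not the Baraki recordings):

```python
import numpy as np

def hv_ratio(north, east, vertical, fs, smooth=9):
    """Single-window H/V spectral ratio from three-component ambient noise."""
    n = len(vertical)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    amp = lambda x: np.abs(np.fft.rfft(x * np.hanning(n)))
    # horizontal spectrum: quadratic mean of the N and E amplitude spectra
    h = np.sqrt((amp(north) ** 2 + amp(east) ** 2) / 2.0)
    v = amp(vertical)
    k = np.ones(smooth) / smooth               # light spectral smoothing
    h, v = np.convolve(h, k, "same"), np.convolve(v, k, "same")
    return freqs[1:], h[1:] / v[1:]            # drop the DC bin

# synthetic check: a 1.5 Hz resonance present on the horizontals only
fs = 100.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
sig = 10 * np.sin(2 * np.pi * 1.5 * t)
f, hv = hv_ratio(sig + rng.normal(size=t.size),
                 sig + rng.normal(size=t.size),
                 rng.normal(size=t.size), fs)
peak_freq = f[np.argmax(hv)]                   # site resonance estimate
```

In practice, H/V processing averages many time windows and uses smoothing such as Konno-Ohmachi; the peak frequency of the averaged ratio approximates the site resonance frequency.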
Procedia PDF Downloads 237
26581 Phytochemical Analysis of Some Solanaceous Plants of Chandigarh
Authors: Nishtha, Richa, Anju Rao
Abstract:
Plants are the source of herbal medicines, and the medicinal value of plants lies in the bioactive phytochemical constituents that produce definite physiological effects on the human body. Angiospermic families are known to produce such phytochemical constituents, which are termed secondary plant metabolites. These metabolites include alkaloids, saponins, phenolic compounds, flavonoids, tannins, terpenoids and so on. Solanaceae is one of the important families of Angiosperms, known for medicinally important alkaloids such as hyoscyamine, scopolamine, solanine, nicotine, capsaicin, etc. Medicinally important species of this family mostly belong to the genera Datura, Atropa, Solanum, Withania and Nicotiana. Six species, namely Datura metel, Solanum torvum, Physalis minima, Cestrum nocturnum, Cestrum diurnum and Nicotiana plumbaginifolia, have been collected from different localities of Chandigarh and adjoining areas. Field and anatomical studies helped to identify the plants and their parts used for the study of secondary plant metabolites. Preliminary phytochemical studies have been done on various parts of the plants, such as roots, stems and leaves, by making aqueous and alcoholic extracts from their powdered forms, which showed the presence of alkaloids in almost all the species, followed by steroids, flavonoids, terpenoids, tannins etc. HPLC profiles of leaves of Datura metel showed the presence of active compounds such as scopolamine and hyoscyamine, and Solanum torvum showed the presence of solanine and solasodine. These alkaloids are an important source of drug-based medicine used in pharmacognosy. The respective compounds help in treating vomiting, nausea, respiratory disorders, dizziness, asthma and many heart problems.
Keywords: alkaloids, flavonoids, phytochemical constituents, pharmacognosy, secondary metabolites
Procedia PDF Downloads 448
26580 Effects of Climate Change and Land Use, Land Cover Change on Atmospheric Mercury
Authors: Shiliang Wu, Huanxin Zhang
Abstract:
Mercury is well known for its negative effects on wildlife, public health, and the ecosystem. Once emitted into the atmosphere, mercury can be transformed into different forms or enter the ecosystem through dry or wet deposition. Some fraction of the mercury will be re-emitted back into the atmosphere and be subject to the same cycle. In addition, the relatively long lifetime of elemental mercury in the atmosphere enables it to be transported long distances from source regions to receptor regions. Global changes such as climate change and land use/land cover change impose significant challenges for mercury pollution control, beyond the efforts to regulate anthropogenic mercury emissions. In this study, we use a global chemical transport model (GEOS-Chem) to examine the potential impacts of changes in climate and land use/land cover on the global budget of mercury as well as its atmospheric transport, chemical transformation, and deposition. We carry out a suite of sensitivity model simulations to separate the impacts on atmospheric mercury associated with changes in climate and land use/land cover. Both climate change and land use/land cover change are found to have significant impacts on the global mercury budget, but through different pathways. Land use/land cover change primarily increases mercury dry deposition in northern mid-latitudes over continental regions and central Africa. Climate change enhances the mobilization of mercury from the soil and ocean reservoirs to the atmosphere. Also, dry deposition is enhanced over most continental areas, while the change in future precipitation dominates the change in mercury wet deposition. We find that 2000-2050 climate change could increase the global atmospheric burden of mercury by 5% and mercury deposition by up to 40% in some regions. Changes in land use and land cover also increase mercury deposition over some continental regions, by up to 40%.
The change in the lifetime of atmospheric mercury has important implications for the long-range transport of mercury. Our case study shows that changes in climate and land use/land cover could significantly affect the source-receptor relationships for mercury.
Keywords: mercury, toxic pollutant, atmospheric transport, deposition, climate change
Procedia PDF Downloads 489
26579 A Machine Learning Model for Dynamic Prediction of Chronic Kidney Disease Risk Using Laboratory Data, Non-Laboratory Data, and Metabolic Indices
Authors: Amadou Wurry Jallow, Adama N. S. Bah, Karamo Bah, Shih-Ye Wang, Kuo-Chung Chu, Chien-Yeh Hsu
Abstract:
Chronic kidney disease (CKD) is a major public health challenge with high prevalence, rising incidence, and serious adverse consequences. Developing effective risk prediction models is a cost-effective approach to predicting and preventing complications of chronic kidney disease (CKD). This study aimed to develop an accurate machine learning model that can dynamically identify individuals at risk of CKD using various kinds of diagnostic data, with or without laboratory data, at different follow-up points. Creatinine is a key component used to predict CKD. These models will enable affordable and effective screening for CKD even with incomplete patient data, such as the absence of creatinine testing. This retrospective cohort study included data on 19,429 adults provided by a private research institute and screening laboratory in Taiwan, gathered between 2001 and 2015. Univariate Cox proportional hazard regression analyses were performed to determine the variables with high prognostic values for predicting CKD. We then identified interacting variables and grouped them according to diagnostic data categories. Our models used three types of data gathered at three points in time: non-laboratory, laboratory, and metabolic indices data. Next, we used subgroups of variables within each category to train two machine learning models (Random Forest and XGBoost). Our machine learning models can dynamically discriminate individuals at risk for developing CKD. All the models performed well using all three kinds of data, with or without laboratory data. Using only non-laboratory-based data (such as age, sex, body mass index (BMI), and waist circumference), both models predict chronic kidney disease as accurately as models using laboratory and metabolic indices data. Our machine learning models have demonstrated the use of different categories of diagnostic data for CKD prediction, with or without laboratory data. 
The machine learning models are simple to use and flexible because they work even with incomplete data, and they can be applied in any clinical setting, including settings where laboratory data are difficult to obtain.
Keywords: chronic kidney disease, glomerular filtration rate, creatinine, novel metabolic indices, machine learning, risk prediction
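The non-laboratory model described above can be sketched as a Random Forest classifier trained on age, sex, BMI, and waist circumference alone. The data below are synthetic stand-ins (not the Taiwan cohort), and the feature construction and risk rule are assumptions for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 2000
age = rng.uniform(20, 80, n)
sex = rng.integers(0, 2, n)
bmi = rng.normal(24, 4, n)
waist = bmi * 2.5 + rng.normal(0, 5, n)          # correlated with BMI
# synthetic risk rule: older age and higher BMI raise CKD probability
logit = 0.06 * (age - 50) + 0.12 * (bmi - 24) - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([age, sex, bmi, waist])      # non-laboratory features only
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
```

The same pipeline could be retrained with laboratory or metabolic-index feature sets swapped into `X`, mirroring the study's comparison across data categories.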
Procedia PDF Downloads 105
26578 Translating Silence: An Analysis of Dhofar University Student Translations of Elliptical Structures from English into Arabic
Authors: Ali Algryani
Abstract:
Ellipsis involves the omission of an item or items that can be recovered from the preceding clause. Ellipsis is used as a cohesion marker; it enhances the cohesiveness of a text/discourse, as a clause is interpretable only by making reference to an antecedent clause. The present study attempts to investigate the linguistic phenomenon of ellipsis from a translation perspective. It is mainly concerned with how ellipsis is translated from English into Arabic. The study covers different forms of ellipsis, such as noun phrase ellipsis, verb phrase ellipsis, gapping, pseudo-gapping, stripping, and sluicing. The primary aim of the study, apart from discussing the use and function of ellipsis, is to find out how such ellipsis phenomena are dealt with in English-Arabic translation and to determine the implications of the translations of elliptical structures into Arabic. The study is based on the analysis of Dhofar University (DU) students' translations of sentences containing different forms of ellipsis. The initial findings of the study indicate that, due to differences in syntactic structures and stylistic preferences between English and Arabic, Arabic tends to use lexical repetition in the translation of some elliptical structures, thus achieving a higher level of explicitness. This implies that Arabic tends to prefer lexical repetition to create cohesion more than English does. Furthermore, the study also reveals that improper translation of ellipsis leads to interpretations different from those understood from the source text. Such mistranslations can be attributed to student translators' lack of awareness of the use and function of ellipsis as well as of the stylistic preferences of both languages. This has pedagogical implications for the teaching and training of translation students at DU. Students' linguistic competence needs to be enhanced through teaching linguistics-related issues with reference to translation in both languages, i.e., the source and target languages, with special emphasis on their use, function, and stylistic preferences.
Keywords: cohesion, ellipsis, explicitness, lexical repetition
Procedia PDF Downloads 124
26577 Investigating Sediment-Bound Chemical Transport in an Eastern Mediterranean Perennial Stream to Identify Priority Pollution Sources on a Catchment Scale
Authors: Felicia Orah Rein Moshe
Abstract:
Soil erosion has become a priority global concern, impairing water quality and degrading ecosystem services. In Mediterranean climates, following a long dry period, the onset of rain occurs when agricultural soils are often bare and most vulnerable to erosion. Early storms transport sediments and sediment-bound pollutants into streams, along with dissolved chemicals. This results in loss of valuable topsoil, water quality degradation, and potentially expensive dredged-material disposal costs. Information on the provenance of fine sediment and priority sources of adsorbed pollutants represents a critical need for developing effective control strategies aimed at source reduction. Modifying sediment traps designed for marine systems, this study tested a cost-effective method to collect suspended sediments on a catchment scale to characterize stream water quality during first-flush storm events in a flashy Eastern Mediterranean coastal perennial stream. This study investigated the Kishon Basin, deploying sediment traps in 23 locations, including 4 in the mainstream and one downstream in each of 19 tributaries, enabling the characterization of sediment as a vehicle for transporting chemicals. Further, it enabled direct comparison of sediment-bound pollutants transported during the first-flush winter storms of 2020 from each of 19 tributaries, allowing subsequent ecotoxicity ranking. Sediment samples were successfully captured in 22 locations. Pesticides, pharmaceuticals, nutrients, and metal concentrations were quantified, identifying a total of 50 pesticides, 15 pharmaceuticals, and 22 metals, with 16 pesticides and 3 pharmaceuticals found in all 23 locations, demonstrating the importance of this transport pathway. Heavy metals were detected in only one tributary, identifying an important watershed pollution source with immediate potential influence on long-term dredging costs. 
Simultaneous sediment sampling at first-flush storms enabled clear identification of priority tributaries and their chemical contributions, advancing a new national watershed-monitoring approach, facilitating the development of strategic plans based on source reduction, and furthering the goals of improving the farm-stream interface, conserving soil resources, and protecting water quality.
Keywords: adsorbed pollution, dredged material, heavy metals, suspended sediment, water quality monitoring
Procedia PDF Downloads 108
26576 Road Accidents Bigdata Mining and Visualization Using Support Vector Machines
Authors: Usha Lokala, Srinivas Nowduri, Prabhakar K. Sharma
Abstract:
Useful information has been extracted from road accident data in the United Kingdom (UK), using data analytics methods, with the aim of avoiding possible accidents in rural and urban areas. This analysis makes use of several methodologies, such as data integration, support vector machines (SVM), correlation machines, and multinomial goodness. The entire datasets have been imported from the traffic department of the UK with due permission. The information extracted from these huge datasets forms a basis for several predictions, which in turn avoid unnecessary memory lapses. Since the data are expected to grow continuously over a period of time, this work primarily proposes a new framework model which can be trained, adapt itself to new data, and make accurate predictions. This work also throws some light on the use of the SVM methodology for text classifiers built from the obtained traffic data. Finally, it emphasizes the uniqueness and adaptability of the SVM methodology, appropriate for this kind of research work.
Keywords: support vector machines (SVM), machine learning (ML), department of transportation (DFT)
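In the spirit of the SVM approach above, a minimal classification sketch on synthetic stand-in features (the real study uses UK traffic-department records; the feature names and severity rule here are assumptions):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 1500
speed_limit = rng.choice([30, 40, 60, 70], n)          # mph, urban vs rural
rural = (speed_limit >= 60).astype(float)
dark = rng.integers(0, 2, n)                           # lighting condition
# synthetic rule: high speed limits and darkness raise severe-accident odds
p_severe = 0.15 + 0.4 * rural + 0.2 * dark
y = (rng.random(n) < p_severe).astype(int)

X = np.column_stack([speed_limit, rural, dark])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
model = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
acc = model.score(X_te, y_te)                          # held-out accuracy
```

A model like this can be refit as new yearly accident records arrive, which is the kind of adaptability to growing data the framework emphasizes.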
Procedia PDF Downloads 274
26575 The Display of Age-Period/Age-Cohort Mortality Trends Using 1-Year Intervals Reveals Period and Cohort Effects Coincident with Major Influenza A Events
Authors: Maria Ines Azambuja
Abstract:
Graphic displays of Age-Period-Cohort (APC) mortality trends generally use data aggregated within 5- or 10-year intervals. Technology allows one to increase the amount of processed data. Displaying occurrences by 1-year intervals is a logical first step in the direction of attaining higher-quality landscapes of variations in temporal occurrences. Method: 1) Comparison of UK mortality trends plotted by 10-, 5- and 1-year intervals; 2) Comparison of UK and US mortality trends (period x age and cohort x age) displayed by 1-year intervals. Source: Mortality data (period, 1x1, males, 1933-2012) uploaded from the Human Mortality Database to Excel files, where Period x Age and Cohort x Age graphics were produced. The choice of transforming age-specific trends from calendar to birth-cohort years (cohort = period – age) (instead of using the cohort 1x1 data available at the HMD resource) was taken to facilitate the comparison of age-specific trends when looking across calendar years and birth cohorts. Yearly live births, males, 1933 to 2012 (UK), were uploaded from the HFD. Influenza references are from the literature. Results: 1) The use of 1-year intervals unveiled previously unsuspected period, cohort and interacting period x cohort effects upon all-causes mortality. 2) The UK and US figures showed variations associated with particular calendar years (1936, 1940, 1951, 1957-68, 1972) and, most surprisingly, with particular birth cohorts (1889-90 in the US, and 1900, 1918-19, 1940-41 and 1946-47 in both countries). Also, the figures showed ups and downs in age-specific trends initiated at particular birth cohorts (1900, 1918-19 and 1947-48) or at a particular calendar year (1968, 1972, 1977-78 in the US), variations at times restricted to just a range of ages (cohort x period interacting effects). Importantly, most of the identified "scars" (period and cohort) correlate with the record of occurrences of Influenza A epidemics since the late 19th Century.
Conclusions: The use of 1-year intervals to describe APC mortality trends both increases the amount of information available, thus enhancing the opportunities for pattern recognition, and increases our capability of interpreting those patterns by describing trends across smaller intervals of time (period or birth-cohort). The US and UK mortality landscapes share many but not all 'scars' and distortions suggested here to be associated with influenza epidemics. Different size-effects of wars are evident, both in mortality and in fertility. But it would also be realistic to suppose that the preponderant influenza A viruses circulating in the UK and US at the beginning of the 20th Century might have been different, and that the difference had intergenerational long-term consequences. Compared with the live-births trend (UK data), birth-cohort scars clearly depend on birth-cohort sizes relative to neighboring ones, which, if causally associated with influenza, would result from influenza-related fetal outcomes/selection. Fetal selection could introduce continuing modifications to population patterns of immune-inflammatory phenotypes that might give rise to 'epidemic constitutions' favoring the occurrence of particular diseases. Comparative analysis of mortality landscapes may help us to straighten our record of the past circulation of Influenza viruses and to document associations between influenza recycling and fertility changes.
Keywords: age-period-cohort trends, epidemic constitution, fertility, influenza, mortality
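The period-to-cohort transform used for these displays (cohort = period − age) can be sketched with a tiny 1x1 table; the numbers below are invented toy rates, not HMD data:

```python
import pandas as pd

# deaths rates (mx) tabulated by single year of age and calendar year
rates = pd.DataFrame({
    "year": [1918, 1918, 1919, 1919, 1920, 1920],
    "age":  [0, 1, 0, 1, 0, 1],
    "mx":   [0.15, 0.04, 0.10, 0.05, 0.08, 0.03],
})
# cohort = period - age re-indexes each age-specific series by birth year
rates["cohort"] = rates["year"] - rates["age"]

# period x age and cohort x age surfaces, both at 1-year resolution
period_view = rates.pivot(index="year", columns="age", values="mx")
cohort_view = rates.pivot(index="cohort", columns="age", values="mx")
```

Reading `cohort_view` along a row follows one birth cohort through successive ages, which is how cohort "scars" show up as horizontal features in the 1-year landscapes.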
Procedia PDF Downloads 230
26574 A Relational Data Base for Radiation Therapy
Authors: Raffaele Danilo Esposito, Domingo Planes Meseguer, Maria Del Pilar Dorado Rodriguez
Abstract:
As far as we know, no commercial solution is yet available that would allow managing, openly and configurably according to user needs, the huge amount of data generated in a modern Radiation Oncology Department. Currently available information management systems are mainly focused on Record & Verify and clinical data, and only to a small extent on physical data. This results in a partial and limited use of the actually available information. In the present work, we describe the implementation at our department of a centralized information management system based on a web server. Our system manages both information generated during patient planning and treatment, and information of general interest for the whole department (i.e., treatment protocols, quality assurance protocols, etc.). Our objective is to be able to analyze all the available data in a simple and efficient way and thus obtain quantitative evaluations of our treatments. This would allow us to improve our workflow and protocols. To this end, we have implemented a relational database which allows us to use all the available information in a practical and efficient way. As always, we only use license-free software.
Keywords: information management system, radiation oncology, medical physics, free software
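A toy sketch of such a license-free relational design, using SQLite; the table and column names are illustrative assumptions, not the department's actual schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")          # license-free relational engine
con.executescript("""
CREATE TABLE patient (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE plan (
    id                INTEGER PRIMARY KEY,
    patient_id        INTEGER NOT NULL REFERENCES patient(id),
    protocol          TEXT,                -- department-wide treatment protocol
    prescribed_dose_gy REAL               -- physical data alongside clinical data
);
""")
con.execute("INSERT INTO patient VALUES (1, 'anonymized')")
con.execute("INSERT INTO plan VALUES (1, 1, 'prostate-VMAT', 60.0)")

# quantitative evaluation across all stored plans
mean_dose = con.execute("SELECT AVG(prescribed_dose_gy) FROM plan").fetchone()[0]
```

Keeping plans, protocols, and QA records in normalized tables is what makes department-wide aggregate queries like the `AVG` above simple and efficient.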
Procedia PDF Downloads 241
26573 A Study of Safety of Data Storage Devices of Graduate Students at Suan Sunandha Rajabhat University
Authors: Komol Phaisarn, Natcha Wattanaprapa
Abstract:
This research is a survey study with the objective of examining the safety of the data storage devices of graduate students of academic year 2013 at Suan Sunandha Rajabhat University. Data were collected by a questionnaire on the safety of data storage devices according to the CIA principle (confidentiality, integrity, availability). A sample of 81 was drawn from the population by the purposive sampling method. The results show that most of the graduate students of academic year 2013 at Suan Sunandha Rajabhat University use handy drives (USB flash drives) to store their data, and the safety level of the devices is at a good level.
Keywords: security, safety, storage devices, graduate students
Procedia PDF Downloads 353
26572 Statistical Discrimination of Blue Ballpoint Pen Inks by Diamond Attenuated Total Reflectance (ATR) FTIR
Authors: Mohamed Izzharif Abdul Halim, Niamh Nic Daeid
Abstract:
Determining the source of pen inks used on a variety of documents is important for forensic document examiners. The examination of inks is often performed to differentiate between inks in order to evaluate the authenticity of a document. A ballpoint pen ink consists of synthetic dyes (acidic and/or basic), pigments (organic and/or inorganic) and a range of additives. Inks of similar color may differ in composition and are frequently the subjects of forensic examinations. This study emphasizes blue ballpoint pen inks available in the market, because it is reported that approximately 80% of questioned document analyses involve ballpoint pen ink. Analytical techniques such as thin layer chromatography, high-performance liquid chromatography, UV-vis spectroscopy, luminescence spectroscopy and infrared spectroscopy have been used in the analysis of ink samples. In this study, the application of Diamond Attenuated Total Reflectance (ATR) FTIR is straightforward and preferable in forensic science, as it requires no sample preparation and minimal analysis time. The data obtained from these techniques were further analyzed using multivariate chemometric methods, which enable the extraction of more information based on the similarities and differences among samples in a dataset. It was indicated that some pens from the same manufacturers can be similar in composition; however, distinct types can be significantly different.
Keywords: ATR FTIR, ballpoint, multivariate chemometric, PCA
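The chemometric step (PCA, per the keywords) can be sketched on synthetic spectra: two ink "brands" sharing one absorption band but differing in a dye band separate in principal-component space. The Gaussian bands and noise level below are toy assumptions standing in for real ATR-FTIR spectra:

```python
import numpy as np

rng = np.random.default_rng(3)
wn = np.linspace(600, 1800, 300)                 # wavenumber axis, cm^-1
band = lambda c, w: np.exp(-((wn - c) / w) ** 2)
brand_a = band(1100, 40) + 0.5 * band(1580, 30)  # shared band + dye band A
brand_b = band(1100, 40) + 0.5 * band(1350, 30)  # shared band + dye band B
spectra = np.array(
    [brand_a + 0.02 * rng.normal(size=wn.size) for _ in range(10)]
    + [brand_b + 0.02 * rng.normal(size=wn.size) for _ in range(10)])

# PCA via SVD of the mean-centred data matrix
X = spectra - spectra.mean(axis=0)
_, _, vt = np.linalg.svd(X, full_matrices=False)
scores = X @ vt[:2].T                            # first two PC scores
# the two brands should separate along PC1
gap = scores[:10, 0].mean() - scores[10:, 0].mean()
```

On real data, the PC score plot is what lets visually similar blue inks of different composition be discriminated statistically.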
Procedia PDF Downloads 457
26571 Simulation of a Cost Model Response Requests for Replication in Data Grid Environment
Authors: Kaddi Mohammed, A. Benatiallah, D. Benatiallah
Abstract:
Data grid is a technology facing the emergence of new challenges, such as the heterogeneity and availability of various geographically distributed resources, fast data access, minimizing latency, and fault tolerance. Researchers interested in this technology address problems in the various related systems, such as task scheduling, load balancing and replication. The latter is an effective solution for achieving good performance in terms of data access and grid resource use, and better data availability at lower cost. In a system with duplication, a coherence protocol is used to impose some degree of synchronization between the various copies and to impose some order on updates. In this project, we present an approach for placing replicas so as to minimize the cost of responding to read or write requests, and we implement our model in a simulation environment. The placement techniques are based on a cost model which depends on several factors, such as bandwidth, data size and the storage nodes.
Keywords: response time, query, consistency, bandwidth, storage capacity, CERN
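A simplified sketch of such a placement cost model: choose the storage node minimizing the expected response cost of reads and writes as a function of bandwidth, data size, and storage capacity (the factors named above; the exact weighting, e.g. doubling write cost for coherence propagation, is an assumption here):

```python
def transfer_cost(size_mb, bandwidth_mbps):
    return size_mb / bandwidth_mbps          # seconds to ship one copy

def placement_cost(node, reads, writes, size_mb):
    read_cost = sum(transfer_cost(size_mb, node["bw"][c]) for c in reads)
    # each write must also be propagated to keep replicas coherent
    write_cost = 2 * sum(transfer_cost(size_mb, node["bw"][c]) for c in writes)
    return read_cost + write_cost

# per-node bandwidth (MB/s) to two client sites, plus free storage
nodes = {
    "site-a": {"bw": {"c1": 100.0, "c2": 10.0}, "free_mb": 5000},
    "site-b": {"bw": {"c1": 20.0, "c2": 80.0}, "free_mb": 5000},
}
reads, writes, size = ["c1", "c1", "c2"], ["c2"], 500.0
feasible = {k: v for k, v in nodes.items() if v["free_mb"] >= size}
best = min(feasible, key=lambda k: placement_cost(nodes[k], reads, writes, size))
```

In a simulator, this selection would be re-evaluated as the request mix changes, trading replication cost against response time.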
Procedia PDF Downloads 271
26570 Enhanced Growth of Microalgae Chlamydomonas reinhardtii Cultivated in Different Organic Waste and Effective Conversion of Algal Oil to Biodiesel
Authors: Ajith J. Kings, L. R. Monisha Miriam, R. Edwin Raj, S. Julyes Jaisingh, S. Gavaskar
Abstract:
Microalgae are a potential bio-source for rejuvenated solutions in various disciplines of science and technology, especially in medicine and energy. Biodiesel is replacing conventional fuels in the automobile industry, with reduced pollution and equivalent performance. Since it is a carbon-neutral fuel, recycling CO2 through photosynthesis, the global warming potential can be held in control using this fuel source. One of the ways to meet the rising demand for automotive fuel is to adopt eco-friendly, green alternative fuels, namely sustainable microalgal biodiesel. In this work, the microalga Chlamydomonas reinhardtii was cultivated and optimized in different media compositions developed from under-utilized waste materials at lab scale. Using the optimized process conditions, the cultures were then mass propagated in outdoor ponds, harvested, dried, and the oils extracted for optimization at ambient conditions. The microalgal oil was subjected to a two-step esterification process using an acid catalyst to reduce the acid value (0.52 mg KOH/g) in the initial stage, followed by transesterification to maximize the biodiesel yield. The optimized esterification process parameters are methanol/oil ratio 0.32 (v/v), sulphuric acid 10 vol.%, and duration 45 min at 65 ºC. In the transesterification process, a commercially available alkali catalyst (KOH) was used and optimized to obtain a maximum biodiesel yield of 95.4%. The optimized parameters are methanol/oil ratio 0.33 (v/v), alkali catalyst 0.1 wt.%, and duration 90 min at 65 ºC with smooth stirring. Response Surface Methodology (RSM) was employed as a tool for optimizing the process parameters. The biodiesel was then characterized with standard procedures and especially by GC-MS to confirm its compatibility for use in internal combustion engines.
Keywords: microalgae, organic media, optimization, transesterification, characterization
Procedia PDF Downloads 234
26569 Prompt Design for Code Generation in Data Analysis Using Large Language Models
Authors: Lu Song Ma Li Zhi
Abstract:
With the rapid advancement of artificial intelligence technology, large language models (LLMs) have become a milestone in the field of natural language processing, demonstrating remarkable capabilities in semantic understanding, intelligent question answering, and text generation. These models are gradually penetrating various industries, particularly showcasing significant application potential in the data analysis domain. However, retraining or fine-tuning these models requires substantial computational resources and ample downstream task datasets, which poses a significant challenge for many enterprises and research institutions. Without modifying the internal parameters of the large models, prompt engineering techniques can rapidly adapt these models to new domains. This paper proposes a prompt design strategy aimed at leveraging the capabilities of large language models to automate the generation of data analysis code. By carefully designing prompts, data analysis requirements can be described in natural language, which the large language model can then understand and convert into executable data analysis code, thereby greatly enhancing the efficiency and convenience of data analysis. This strategy not only lowers the threshold for using large models but also significantly improves the accuracy and efficiency of data analysis. Our approach includes requirements for the precision of natural language descriptions, coverage of diverse data analysis needs, and mechanisms for immediate feedback and adjustment. Experimental results show that with this prompt design strategy, large language models perform exceptionally well in multiple data analysis tasks, generating high-quality code and significantly shortening the data analysis cycle. 
This method provides an efficient and convenient tool for the data analysis field and demonstrates the enormous potential of large language models in practical applications.
Keywords: large language models, prompt design, data analysis, code generation
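The three requirements named above (a precise natural-language description, a data schema for coverage, and an immediate-feedback slot) can be sketched as a prompt builder. The template wording and structure are illustrative assumptions, not the paper's exact prompts:

```python
def build_analysis_prompt(requirement, columns, feedback=None):
    """Assemble a data-analysis code-generation prompt for an LLM."""
    prompt = (
        "You are a data analyst. Write executable Python (pandas) code only.\n"
        f"Dataset columns: {', '.join(columns)}\n"
        f"Task (described in natural language): {requirement}\n"
    )
    if feedback:  # immediate-feedback mechanism: fold runtime errors back in
        prompt += f"The previous attempt failed with: {feedback}. Fix it.\n"
    return prompt

p = build_analysis_prompt(
    "Report mean monthly revenue per region, sorted descending.",
    ["region", "month", "revenue"],
)
```

The generated code would then be executed; any traceback is passed back through the `feedback` parameter for the adjustment round, without touching the model's internal parameters.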
Procedia PDF Downloads 39
26568 Open Circuit MPPT Control Implemented for PV Water Pumping System
Authors: Rabiaa Gammoudi, Najet Rebei, Othman Hasnaoui
Abstract:
Photovoltaic systems use different Maximum Power Point Tracking (MPPT) techniques to provide the highest possible power to the load regardless of variations in climatic conditions. In this paper, the proposed method is the Open Circuit (OC) method, tested with sudden and random variations of insolation. The simulation results of the water pumping system controlled by the OC method are validated experimentally in real time using a test bench composed of a centrifugal pump powered by a PVG via a boost chopper for the adaptation between the source and the load. The output of the DC/DC converter supplies the motor pump, of LOWARA type, assembled by means of a DC/AC inverter. The control part is provided by a computer incorporating a DS1104 card running the Matlab/Simulink environment for visualization and data acquisition. These results clearly show the effectiveness of our control, with very good performance. The results obtained show the usefulness of the developed algorithm in solving the problem of PVG performance degradation under varying climatic factors, with a very good yield.
Keywords: PV water pumping system (PVWPS), maximum power point tracking (MPPT), open circuit method (OC), boost converter, DC/AC inverter
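The open-circuit MPPT rule commonly takes the fractional-Voc form: the array is briefly disconnected, Voc is sampled, and the converter regulates the array voltage near k·Voc. A minimal sketch, where the k value and the toy P-V curve are assumptions for illustration, not the paper's measured panel:

```python
def oc_mppt_reference(v_oc, k=0.76):
    """Reference voltage for the boost chopper after each Voc sample."""
    return k * v_oc

def pv_power(v, v_oc=21.0, i_sc=5.0):
    # crude illustrative P-V curve: current collapses as v approaches Voc
    i = i_sc * (1 - (v / v_oc) ** 8)
    return max(v * i, 0.0)

v_ref = oc_mppt_reference(21.0)          # operating setpoint from sampled Voc
# brute-force the true maximum of this toy curve for comparison
best_v = max((v / 100 * 21.0 for v in range(101)), key=pv_power)
```

The appeal of the method is its simplicity: one voltage measurement per sampling period, at the cost of a short power interruption while the array is open-circuited.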
Procedia PDF Downloads 454
26567 Comparison of Different Methods to Produce Fuzzy Tolerance Relations for Rainfall Data Classification in the Region of Central Greece
Authors: N. Samarinas, C. Evangelides, C. Vrekos
Abstract:
The aim of this paper is the comparison of three different methods for producing fuzzy tolerance relations for rainfall data classification. More specifically, the three methods are the correlation coefficient, cosine amplitude and max-min methods. The data were obtained from seven rainfall stations in the region of central Greece and refer to 20-year time series of average monthly rainfall depth. Each method was used to express these data as a fuzzy relation. For all three methods, the resulting fuzzy tolerance relation is transformed into an equivalence relation by max-min composition. From the equivalence relation, the rainfall stations were categorized and classified according to the degree of confidence. The classification shows the similarities among the rainfall stations. Stations with high similarity can be utilized in water resource management scenarios interchangeably or to augment data from one to another. Due to the complexity of the calculations, it is important to find out which of the methods is computationally simpler and needs fewer compositions in order to give reliable results.Keywords: classification, fuzzy logic, tolerance relations, rainfall data
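The max-min composition and λ-cut classification steps can be illustrated with a minimal sketch. The 3-station tolerance relation below is an invented toy example, not the Greek rainfall data:

```python
def max_min_composition(R):
    """One max-min composition step: (R ∘ R)[i][j] = max_k min(R[i][k], R[k][j])."""
    n = len(R)
    return [[max(min(R[i][k], R[k][j]) for k in range(n)) for j in range(n)]
            for i in range(n)]

def to_equivalence(R):
    """Compose R with itself until it stops changing, i.e. until the
    tolerance relation becomes a transitive (equivalence) relation."""
    while True:
        R2 = max_min_composition(R)
        if R2 == R:
            return R
        R = R2

def classify(R, alpha):
    """λ-cut at level alpha: stations i, j fall in one class when R[i][j] ≥ alpha."""
    n, seen, classes = len(R), set(), []
    for i in range(n):
        if i in seen:
            continue
        group = [j for j in range(n) if R[i][j] >= alpha]
        seen.update(group)
        classes.append(group)
    return classes

# toy tolerance relation for three stations (reflexive, symmetric)
R = [[1.0, 0.8, 0.4],
     [0.8, 1.0, 0.5],
     [0.4, 0.5, 1.0]]
E = to_equivalence(R)
```

Counting the iterations inside `to_equivalence` for each of the three similarity methods is exactly the "number of compositions" comparison the abstract calls for.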
Procedia PDF Downloads 314
26566 The Quasar 3C 47: Extreme Population B Jetted Source with Double-Peaked Profile
Authors: Shimeles Terefe Mengistue, Paola Marziani, Ascensión del Olmo, Jaime Perea, Mirjana Pović
Abstract:
The theory that rotating accretion disks are responsible for the broad emission-line profiles in quasars is frequently put forth; however, the presence of an accretion disk (AD) in active galactic nuclei (AGN) has had limited and indirect observational support. In order to evaluate the extent to which the AD is a source of the broad Balmer lines and high-ionization UV lines in radio-loud (RL) AGN, we focused on an extremely jetted RL quasar, 3C 47, that clearly shows a double-peaked profile. This work presents its optical spectra and UV observations from the HST/FOS covering the rest-frame spectral range from 2000 to 7000 Å. The fits of the low-ionization lines Hβ, Hα and MgII λ2800 show profiles that are in very good agreement with a relativistic Keplerian AD model. The profiles of the prototypical high-ionization lines can also be modeled by the contribution of the AD, with additional components due to outflows and emission from the innermost part of the narrow-line region (NLR). A good fit of the resulting double-peaked profiles was found, and the key disk parameters were determined using the Hβ, Hα and MgII λ2800 lines: the inner and outer radii (both in units of the gravitational radius GM_BH/c², where M_BH is the supermassive black hole mass), the inclination to the line of sight, the emissivity index and the local broadening parameter. In addition, the accretion parameters, M_BH and the Eddington ratio L/L_Edd, are also determined. This work indicates that the line profile of 3C 47 provides some of the most convincing direct evidence for a rotating AD in AGN, with the broad, double-peaked profiles originating from an AD surrounding a supermassive black hole.Keywords: active galactic nuclei, quasars, emission lines, double-peaked, supermassive black hole
Procedia PDF Downloads 75
26565 ZnO Nanoparticles as Photocatalysts: Synthesis, Characterization and Application
Authors: Pachari Chuenta, Suwat Nanan
Abstract:
ZnO nanostructures have been synthesized successfully in high yield via a catalyst-free chemical precipitation technique by varying the zinc source (either zinc nitrate or zinc acetate) and the oxygen source (either oxalic acid or urea) without using any surfactant, organic solvent or capping agent. The ZnO nanostructures were characterized by Fourier transform infrared spectroscopy (FT-IR), X-ray diffractometry (XRD), scanning electron microscopy (SEM), thermal gravimetric analysis (TGA), UV-vis diffuse reflection spectroscopy (UV-vis DRS), and photoluminescence spectroscopy (PL). The FT-IR peak in the range of 450-470 cm⁻¹ corresponds to Zn-O stretching in the ZnO structure. The synthesized ZnO samples showed a well-crystallized hexagonal wurtzite structure. SEM micrographs displayed spherical particles of about 50-100 nm. The band gap of the prepared ZnO was found to be 3.4-3.5 eV. The presence of a PL peak at 468 nm was attributed to surface defect states. The photocatalytic activity of ZnO was studied by monitoring the photodegradation of reactive red (RR141) azo dye under ultraviolet (UV) light irradiation. A blank experiment was also carried out separately by irradiating the aqueous dye solution in the absence of the photocatalyst. The initial concentration of the dye was fixed at 10 mg L⁻¹. About 50 mg of the ZnO photocatalyst was dispersed in 200 mL of dye solution. Samples were collected at regular time intervals during irradiation and analyzed after centrifugation. The concentration of the dye was determined by monitoring the absorbance at its maximum wavelength (λₘₐₓ) of 544 nm using UV-vis spectroscopy. The sources of Zn and O played an important role in the photocatalytic performance of the ZnO photocatalyst. ZnO nanoparticles prepared from zinc acetate and oxalic acid at a 1:1 molar ratio showed high photocatalytic performance, with about 97% photodegradation of the reactive red azo dye (RR141) under UV light irradiation in only 60 min.
This work demonstrates the promising potential of ZnO nanomaterials as photocatalysts for environmental remediation.Keywords: azo dye, chemical precipitation, photocatalytic, ZnO
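The photodegradation efficiency reported above is computed from the absorbance at λmax before and after irradiation. A minimal sketch follows; the pseudo-first-order kinetics function is a commonly assumed model for azo-dye photocatalysis, not a result reported in the abstract, and the concentration values are illustrative:

```python
import math

def degradation_efficiency(a0, at):
    """Photodegradation efficiency (%) from the dye absorbance at its
    λmax (544 nm for RR141) before and after irradiation; absorbance is
    proportional to concentration by the Beer-Lambert law."""
    return (a0 - at) / a0 * 100.0

def pseudo_first_order_k(c0, ct, t_min):
    """Apparent rate constant k (per minute) from ln(C0/C) = k·t, the
    kinetic model commonly assumed for azo-dye photocatalysis (an
    assumption here, not a reported result)."""
    return math.log(c0 / ct) / t_min

# illustrative numbers consistent with ~97% removal in 60 min
efficiency = degradation_efficiency(1.00, 0.03)
k_app = pseudo_first_order_k(10.0, 0.3, 60.0)
```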
Procedia PDF Downloads 144
26564 Integrating Building Information Modeling into Facilities Management Operations
Authors: Mojtaba Valinejadshoubi, Azin Shakibabarough, Ashutosh Bagchi
Abstract:
Facilities such as residential buildings, office buildings, and hospitals house a high density of occupants. Therefore, a low-cost facility management program (FMP) should be used to provide a satisfactory built environment for these occupants. Facility management (FM) has recently been treated as a critical task in building projects and has been effective in reducing the operation and maintenance costs of these facilities. Information integration and visualization capabilities are critical for reducing the complexity and cost of FM. Building information modeling (BIM) can serve as a strong visual modeling tool and database in FM. The main objective of this study is to examine the applicability of BIM in the FM process during a building’s operational phase. For this purpose, a seven-storey office building was modeled in Autodesk Revit. The authors integrated a cloud-based environment using the visual programming tool Dynamo to enable real-time cloud-based communication between facility managers and the participants involved in the project. An appropriate and effective integrated data source and visual model such as BIM can reduce a building’s operation and maintenance costs by managing the building life cycle properly.Keywords: building information modeling, facility management, operational phase, building life cycle
Procedia PDF Downloads 155
26563 A Low Cost Education Proposal Using Strain Gauges and Arduino to Develop a Balance
Authors: Thais Cavalheri Santos, Pedro Jose Gabriel Ferreira, Alexandre Daliberto Frugoli, Lucio Leonardo, Pedro Americo Frugoli
Abstract:
This paper presents a low-cost education proposal to be used in engineering courses. Engineering education in the universities of a developing country that needs a growing number of engineers, delivered with quality and affordably, poses a difficult problem to solve. In Brazil, the political and economic scenario requires academic managers able to reduce costs without compromising the quality of education. Within this context, a method for teaching physics principles through the construction of an electronic balance is proposed. First, a method to develop and construct a load cell is presented, through which students can understand the physical principles of strain gauges and bridge circuits. The load cell structure was made of aluminum 6351T6, with dimensions of 80 mm x 13 mm x 13 mm, and for its instrumentation, a complete Wheatstone bridge was assembled with strain gauges of 350 ohms. Additionally, the process involves the use of a software tool to document the prototypes (circuit design), signal conditioning, a microcontroller, C language programming, and the development of the prototype. The project uses an open-source I/O board (Arduino microcontroller). The circuit was designed with the Fritzing software, and the controller was programmed with the open-source Arduino IDE. A load cell was chosen because strain gauges are accurate and have many applications in industry. A prototype was developed for this study, and it confirmed the affordability of this educational idea. Furthermore, the goal of this proposal is to motivate students to understand the many possible high-technology applications of load cells and microcontrollers.Keywords: Arduino, load cell, low-cost education, strain gauge
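The load-cell signal chain (full Wheatstone bridge output, then strain, then mass) can be sketched numerically. The gauge factor, excitation voltage and calibration sensitivity below are illustrative assumptions, not the paper's measured values:

```python
GAUGE_FACTOR = 2.0   # assumed: typical for 350-ohm metallic foil gauges
V_EXCITATION = 5.0   # assumed bridge excitation voltage (V)

def bridge_output(strain):
    """Full Wheatstone bridge (four active gauges): the differential
    output is proportional to strain, V_out = V_exc · GF · ε."""
    return V_EXCITATION * GAUGE_FACTOR * strain

def strain_from_output(v_out):
    """Invert the bridge equation to recover strain from the measured voltage."""
    return v_out / (V_EXCITATION * GAUGE_FACTOR)

def mass_from_strain(strain, strain_per_gram=5e-7):
    """Hypothetical linear calibration for the 80 x 13 x 13 mm aluminium
    beam: strain_per_gram must be determined with reference masses."""
    return strain / strain_per_gram
```

In the classroom setup, the millivolt-level bridge output is amplified and read by the Arduino's ADC; the same arithmetic then runs in the C firmware.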
Procedia PDF Downloads 303
26562 All-Optical Gamma-Rays and Positrons Source by Ultra-Intense Laser Irradiating an Al Cone
Authors: T. P. Yu, J. J. Liu, X. L. Zhu, Y. Yin, W. Q. Wang, J. M. Ouyang, F. Q. Shao
Abstract:
A strong electromagnetic field with E > 10¹⁵ V/m can be supplied by intense lasers such as ELI and HiPER in the near future. Exposed to such a strong laser field, laser-matter interaction approaches the quantum electrodynamics (QED) regime, and highly nonlinear physics may occur. Recently, the multi-photon Breit-Wheeler (BW) process has attracted increasing attention because it is capable of producing abundant positrons and enhances the positron generation efficiency significantly. Here, we propose an all-optical scheme for bright gamma-ray and dense positron generation by irradiating a 10²² W/cm² laser pulse onto an Al cone filled with near-critical-density plasma. Two-dimensional (2D) QED particle-in-cell (PIC) simulations show that the radiation damping force becomes large enough to compensate for the Lorentz force in the cone, causing radiation-reaction trapping of a dense electron bunch in the laser field. The trapped electrons oscillate in the laser electric field and emit high-energy gamma photons in two ways: (1) nonlinear Compton scattering due to the oscillation of electrons in the laser fields, and (2) Compton backscattering resulting from the bunch colliding with the laser reflected by the cone tip. The multi-photon Breit-Wheeler process is thus initiated, and abundant electron-positron pairs are generated with a positron density of ~10²⁷ m⁻³. The scheme is finally demonstrated by full 3D PIC simulations, which indicate a positron flux of up to 10⁹. This compact gamma-ray and positron source may have promising applications in the future.Keywords: BW process, electron-positron pairs, gamma rays emission, ultra-intense laser
Procedia PDF Downloads 260
26561 Customer Satisfaction and Effective HRM Policies: Customer and Employee Satisfaction
Authors: S. Anastasiou, C. Nathanailides
Abstract:
The purpose of this study is to examine the possible link between employee and customer satisfaction. The service provided by employees helps to build a good relationship with customers and can increase their loyalty. Published data on job satisfaction and indicators of customer service were gathered from relevant published works, which included data from five different countries. The reviewed data indicate a significant correlation between indicators of customer and employee satisfaction in the banking sector (Pearson correlation, R² = 0.52, P < 0.05), providing practical evidence of a link between these two parameters.Keywords: job satisfaction, job performance, customer service, banks, human resources management
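The reported Pearson correlation can be reproduced on any paired satisfaction data. The five score pairs below are hypothetical illustrations, not the reviewed published data:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient of two equally long samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# hypothetical paired scores per bank: employee vs. customer satisfaction
employee = [3.1, 3.5, 3.8, 4.0, 4.4]
customer = [3.0, 3.4, 3.9, 4.1, 4.3]
r = pearson_r(employee, customer)
r_squared = r ** 2   # the R² statistic quoted in the abstract
```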
Procedia PDF Downloads 321
26560 End-to-End Spanish-English Sequence Learning Translation Model
Authors: Vidhu Mitha Goutham, Ruma Mukherjee
Abstract:
The low availability of well-trained, unlimited, dynamic-access models for specific languages makes it hard for corporate users to adopt quick translation techniques and incorporate them into product solutions. As translation tasks increasingly require a dynamic sequence learning curve, stable, cost-free open-source models are scarce. We survey and compare current translation techniques and propose a modified sequence-to-sequence model repurposed with attention techniques. Sequence learning with an encoder-decoder model is now paving the path toward higher precision in translation. Using a Convolutional Neural Network (CNN) encoder and a Recurrent Neural Network (RNN) decoder, we use the Fairseq tools to produce an end-to-end, bilingually trained Spanish-English machine translation model that includes source-language detection. We obtain competitive results with a model trained on a duo-lingo corpus, providing a prospective, ready-made plug-in for compound-sentence and document translation. Our model serves as a decent system for large, organizational data translation needs. While acknowledging its shortcomings and future scope, it is a well-optimized deep neural network model and solution.Keywords: attention, encoder-decoder, Fairseq, Seq2Seq, Spanish, translation
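The attention mechanism the model relies on can be illustrated with a minimal dot-product sketch (plain Python, not Fairseq code): the decoder query is scored against each encoder state, the scores are normalised, and the context vector is the weighted sum of encoder values.

```python
import math

def softmax(scores):
    # numerically stable softmax over a list of floats
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Dot-product attention: score the decoder query against every
    encoder key, normalise the scores, and return the weights together
    with the weighted sum of the encoder values (the context vector)."""
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    context = [sum(w * v[d] for w, v in zip(weights, values)) for d in range(dim)]
    return weights, context

# toy example: the query aligns with the first encoder state
weights, context = attention([1.0, 0.0],
                             [[1.0, 0.0], [0.0, 1.0]],
                             [[1.0, 0.0], [0.0, 1.0]])
```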
Procedia PDF Downloads 175
26559 Predicting Long-Term Meat Productivity for the Kingdom of Saudi Arabia
Authors: Ahsan Abdullah, Ahmed A. S. Bakshwain
Abstract:
Livestock is one of the fastest-growing sectors in agriculture and, if carefully managed, offers potential opportunities for economic growth, food sovereignty and food security. In this study we analyse and compare the long-term (year 2030) impact of climate variability on the predicted productivity of meat (beef, mutton and poultry) for the Kingdom of Saudi Arabia with respect to three factors: i) climatic-change vulnerability, ii) CO₂ fertilization and iii) water scarcity, and compare the results with two countries of the region, Iraq and Yemen. The analysis uses data from diverse sources, which were extracted, transformed and integrated before use. The collective impact of the three factors had an overall negative effect on the production of meat for all three countries, with the most adverse impact on Iraq. A higher similarity was found between CO₂ fertilization (affecting animal fodder) and water scarcity than between the production of beef and that of mutton for the three countries considered. Overall, the three factors do not seem to be favorable for the three Middle-East countries considered, which points to the possibility of a largely vegetarian year 2030 or a dependency on the indigenous livestock population.Keywords: prediction, animal-source foods, pastures, CO2 fertilization, climatic-change vulnerability, water scarcity
Procedia PDF Downloads 321
26558 Evaluation of Australian Open Banking Regulation: Balancing Customer Data Privacy and Innovation
Authors: Suman Podder
Abstract:
As Australian ‘Open Banking’ allows customers to share their financial data with accredited Third-Party Providers (‘TPPs’), it is necessary to evaluate whether the regulators have achieved the balance between protecting customer data privacy and promoting data-related innovation. Recognising the need to increase customers’ influence on their own data, and the benefits of data-related innovation, the Australian Government introduced the ‘Consumer Data Right’ (‘CDR’) to the banking sector through Open Banking regulation. Under Open Banking, TPPs can access customers’ banking data, which allows the TPPs to tailor their products and services to meet customer needs at a more competitive price. This facilitated access and use of customer data will promote innovation by providing opportunities for new products and business models to emerge and grow. However, the success of Open Banking depends on the willingness of the customers to share their data, so the regulators have augmented the protection of data by introducing new privacy safeguards to instill confidence and trust in the system. The dilemma in policymaking is that, on the one hand, lenient data privacy laws will help the flow of information but at the risk of individuals’ loss of privacy; on the other hand, stringent laws that adequately protect privacy may dissuade innovation. Using theoretical and doctrinal methods, this paper examines whether the privacy safeguards under Open Banking will add to the compliance burden of the participating financial institutions, resulting in the undesirable effect of stifling other policy objectives such as innovation. The contribution of this research is three-fold. In the emerging field of customer data sharing, this research is one of the few academic studies on the objectives and impact of Open Banking in the Australian context.
Additionally, Open Banking is still in the early stages of implementation, so this research traces the evolution of Open Banking through policy debates regarding the desirability of customer data-sharing. Finally, the research focuses not only on the customers’ data privacy and juxtaposes it with another important objective of promoting innovation, but it also highlights the critical issues facing the data-sharing regime. This paper argues that while it is challenging to develop a regulatory framework for protecting data privacy without impeding innovation and jeopardising yet unknown opportunities, data privacy and innovation promote different aspects of customer welfare. This paper concludes that if a regulation is appropriately designed and implemented, the benefits of data-sharing will outweigh the cost of compliance with the CDR.Keywords: consumer data right, innovation, open banking, privacy safeguards
Procedia PDF Downloads 140
26557 Generation of Automated Alarms for Plantwide Process Monitoring
Authors: Hyun-Woo Cho
Abstract:
Early detection of incipient abnormal operation is necessary in plant-wide process management in order to improve product quality and process safety. Generating warning signals or alarms for operating personnel plays an important role in process automation and intelligent plant health monitoring. Various methodologies have been developed and utilized in this area, such as expert systems, mathematical model-based approaches, and multivariate statistical approaches. This work presents a nonlinear empirical monitoring methodology based on the real-time analysis of massive process data. Unfortunately, such big data include measurement noise and unwanted variations unrelated to true process behavior. Thus, the elimination of such unnecessary patterns is executed in the data processing step to enhance detection speed and accuracy. The performance of the methodology was demonstrated using simulated process data. The case study showed that the detection speed and performance were improved significantly irrespective of the size and the location of abnormal events.Keywords: detection, monitoring, process data, noise
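The noise-elimination and alarm-generation steps can be sketched as a filtered control-band check. The EWMA filter and 3σ band below stand in for the paper's unspecified nonlinear method; the parameters are illustrative:

```python
def ewma_filter(xs, lam=0.2):
    """Exponentially weighted moving average: suppresses measurement
    noise before the monitoring statistic is evaluated (a simple
    stand-in for the paper's data-processing step)."""
    smoothed, s = [], xs[0]
    for x in xs:
        s = lam * x + (1 - lam) * s
        smoothed.append(s)
    return smoothed

def alarms(xs, mean, sigma, k=3.0):
    """Return the indices of samples whose filtered value leaves the
    mean ± k·sigma band established for normal operation."""
    lo, hi = mean - k * sigma, mean + k * sigma
    return [i for i, x in enumerate(ewma_filter(xs)) if not lo <= x <= hi]
```

Filtering first trades a small detection delay for far fewer false alarms on noisy plant data, which is the speed/accuracy balance the abstract discusses.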
Procedia PDF Downloads 252
26556 Meanings and Concepts of Standardization in Systems Medicine
Authors: Imme Petersen, Wiebke Sick, Regine Kollek
Abstract:
In systems medicine, high-throughput technologies produce large amounts of data on different biological and pathological processes, including (disturbed) gene expressions, metabolic pathways and signaling. The large volume of data of different types, stored in separate databases and often located at different geographical sites have posed new challenges regarding data handling and processing. Tools based on bioinformatics have been developed to resolve the upcoming problems of systematizing, standardizing and integrating the various data. However, the heterogeneity of data gathered at different levels of biological complexity is still a major challenge in data analysis. To build multilayer disease modules, large and heterogeneous data of disease-related information (e.g., genotype, phenotype, environmental factors) are correlated. Therefore, a great deal of attention in systems medicine has been put on data standardization, primarily to retrieve and combine large, heterogeneous datasets into standardized and incorporated forms and structures. However, this data-centred concept of standardization in systems medicine is contrary to the debate in science and technology studies (STS) on standardization that rather emphasizes the dynamics, contexts and negotiations of standard operating procedures. Based on empirical work on research consortia that explore the molecular profile of diseases to establish systems medical approaches in the clinic in Germany, we trace how standardized data are processed and shaped by bioinformatics tools, how scientists using such data in research perceive such standard operating procedures and which consequences for knowledge production (e.g. modeling) arise from it. Hence, different concepts and meanings of standardization are explored to get a deeper insight into standard operating procedures not only in systems medicine, but also beyond.Keywords: data, science and technology studies (STS), standardization, systems medicine
Procedia PDF Downloads 341
26555 Solar-Assisted City Bus Electrical Installation: Opportunities and Impact on the Environment in Sydney
Authors: M. J. Geca, T. Tulwin, A. Majczak
Abstract:
On-board electricity consumption during the operation of a diesel city bus represents an important energy demand. Electricity is generated by an alternator driven by the combustion engine, and the extra fuel consumed to generate on-board electricity has a negative impact on the emission of toxic components and carbon dioxide. At the same time, the bus roof surface allows placing a set of lightweight photovoltaic panels with a power of 1 to 1.5 kW. The article presents an experimental study of the electricity consumption of a diesel city bus equipped with a photovoltaic installation. The streams of electricity consumed by the bus and generated by the standard alternator and the PV system were recorded. Based on the experimental research carried out in central Europe, the article analyses the impact of an additional source of electricity, in the form of a photovoltaic installation, on the fuel consumption and toxic emissions of vehicles operating at the latitude of Sydney. In Poland, the maximum global horizontal irradiation (GHI) is 1150 kWh/m², while for Sydney it is 1652 kWh/m². In addition, the annual temperature and sunshine profiles differ between these two latitudes, as presented in the article. Electricity generated directly from the sun powers the bus's electrical receivers. The photovoltaic system is able to replace 23% of the annual electricity consumption, which will at the same time reduce fuel consumption by 4% and lower CO₂ emissions. Approximately 25% of the incident light is lost while the vehicle moves in Sydney traffic. The temperature losses of the photovoltaic panels are comparable owing to the cooling that occurs during vehicle motion. Acknowledgement: The project/research was financed in the framework of the project Lublin University of Technology - Regional Excellence Initiative, funded by the Polish Ministry of Science and Higher Education (contract no. 030/RID/2018/19).Keywords: electric energy, photovoltaic system, fuel consumption, CO₂
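The energy bookkeeping behind figures like the 23% electricity share and 4% fuel saving can be sketched as follows. The demand, conversion efficiency and performance ratio below are illustrative assumptions, not the measured data from the study:

```python
ALTERNATOR_EFF = 0.55     # assumed engine-to-electricity conversion efficiency
DIESEL_KWH_PER_L = 9.9    # approximate energy content of diesel fuel

def pv_annual_yield(p_rated_kw, ghi_kwh_m2, performance_ratio=0.7):
    """Annual PV yield (kWh): rated power × global horizontal irradiation
    (GHI, referenced to 1 kW/m² STC) × a performance ratio that lumps
    temperature, converter and in-traffic shading losses together."""
    return p_rated_kw * ghi_kwh_m2 * performance_ratio

annual_demand_kwh = 5000.0                      # assumed on-board consumption
yield_sydney = pv_annual_yield(1.5, 1652.0)     # Sydney GHI from the abstract
electric_share = min(yield_sydney / annual_demand_kwh, 1.0)
fuel_saved_l = yield_sydney / (ALTERNATOR_EFF * DIESEL_KWH_PER_L)
```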
Procedia PDF Downloads 113
26554 Integrated On-Board Diagnostic-II and Direct Controller Area Network Access for Vehicle Monitoring System
Authors: Kavian Khosravinia, Mohd Khair Hassan, Ribhan Zafira Abdul Rahman, Syed Abdul Rahman Al-Haddad
Abstract:
The CAN (controller area network) bus is a multi-master, message-broadcast system. Messages sent on the CAN bus communicate state information, referred to as signals, between different ECUs, providing data consistency at every node of the system. OBD-II dongles based on the request-and-response method are the widespread solution among researchers for extracting sensor data from cars. Unfortunately, most past research does not consider the resolution and quantity of the input data extracted through OBD-II technology. The maximum feasible scan rate is only 9 queries per second, which provides 8 data points per second when using the well-known ELM327 OBD-II dongle. This study aims to develop and design a programmable, latency-sensitive vehicle data acquisition system with the modularity and flexibility to extract exact, trustworthy, and fresh car sensor data at higher rates. Furthermore, the researcher must break apart, thoroughly inspect, and observe the internal network of the vehicle, which may cause severe damage to the expensive ECUs of the vehicle due to intrinsic vulnerabilities of the CAN bus during initial research. The desired sensor data were collected from various vehicles utilizing a Raspberry Pi 3 as the computing and processing unit, using the OBD (request-response) and direct CAN methods at the same time. Two types of data were collected for this study: first, CAN bus frame data, representing each line of hex data sent by an ECU; and second, OBD data, representing the limited data that can be requested from an ECU under standard conditions. The proposed system is a reconfigurable, human-readable, multi-task telematics device that can be fitted into any vehicle with minimum effort and minimum time lag in the data extraction process.
A standard operating procedure and an experimental vehicle network test bench were developed and can be used for future vehicle network testing experiments.Keywords: CAN bus, OBD-II, vehicle data acquisition, connected cars, telemetry, Raspberry Pi3
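Decoding the OBD responses captured on the CAN bus follows the SAE J1979 scaling formulas. A minimal sketch, assuming ISO-TP single-frame responses (the common case for mode-01 queries); only a few PIDs are handled:

```python
def decode_obd_pid(pid, data):
    """Decode a few standard mode-01 OBD-II PIDs using the SAE J1979
    scaling formulas (A = first data byte, B = second)."""
    a = data[0]
    if pid == 0x0C:                     # engine RPM = ((256·A) + B) / 4
        return (256 * a + data[1]) / 4.0
    if pid == 0x0D:                     # vehicle speed = A km/h
        return float(a)
    if pid == 0x05:                     # coolant temperature = A - 40 °C
        return a - 40.0
    raise ValueError(f"PID 0x{pid:02X} not handled in this sketch")

def parse_obd_response(payload):
    """Split a raw 8-byte OBD response (ISO-TP single frame):
    byte 0 = payload length, byte 1 = mode + 0x40, byte 2 = PID."""
    length, mode, pid = payload[0], payload[1], payload[2]
    return mode - 0x40, pid, payload[3:1 + length]
```

On the Raspberry Pi, frames like these would arrive over the SocketCAN interface; raw (non-OBD) CAN frames carry proprietary signal layouts and need the vehicle-specific DBC definitions to decode.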
Procedia PDF Downloads 204