Search results for: data-mining techniques

1205 Modeling Karachi Dengue Outbreak and Exploration of Climate Structure

Authors: Syed Afrozuddin Ahmed, Junaid Saghir Siddiqi, Sabah Quaiser

Abstract:

Various studies have reported that global warming causes an unstable climate and many serious impacts on the physical environment and public health. The increasing incidence of dengue fever is now a priority health issue and has become a health burden for Pakistan. This study investigates whether the spatial pattern of the environment contributes to the emergence or rising incidence of dengue fever and thus affects the population and its health. The climatic (environmental structure) data and the Dengue Fever (DF) data were processed by coding, editing, tabulating, recoding, and restructuring (re-tabulating), and finally by applying different statistical methods, techniques, and procedures for the evaluation. The five climatic variables studied are precipitation (P), maximum temperature (Mx), minimum temperature (Mn), humidity (H), and wind speed (W), collected from 1980 to 2012. Dengue cases in Karachi from 2010 to 2012 were reported on a weekly basis. Principal component analysis is applied to explore the climatic variables and/or the climatic structure that may influence the increase or decrease in the number of dengue fever cases in Karachi. PC1 for the whole period represents the general atmospheric condition. PC2 for the dengue period is a contrast between precipitation and wind speed. PC3 is a weighted difference between maximum temperature and wind speed. PC4 for the dengue period is a contrast between maximum temperature and wind speed. Negative binomial and Poisson regression models are used to relate dengue fever incidence to the climatic variables and principal component scores. Relative humidity is estimated to increase the chances of dengue occurrence by 1.71%, maximum temperature by 19.48%, and minimum temperature by 11.51%, while wind speed decreases the weekly occurrence of dengue fever by 7.41%.
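
As a rough, hedged illustration of the modelling workflow described above (this is not the authors' code; the file name, column names, and data layout are assumptions), principal component scores of the climate variables can be fed into Poisson and negative binomial count models as follows:

```python
# Illustrative sketch only -- not the authors' code. File name, column names and
# data layout are assumptions; the study's variables are P, Mx, Mn, H and W.
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("karachi_weekly.csv")                 # hypothetical weekly file
climate = df[["precip", "t_max", "t_min", "humidity", "wind"]]

# Principal components of the standardized climate variables (PC1-PC4)
scores = PCA(n_components=4).fit_transform(StandardScaler().fit_transform(climate))
X = sm.add_constant(pd.DataFrame(scores, columns=["PC1", "PC2", "PC3", "PC4"]))
y = df["dengue_cases"]

poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
negbin_fit = sm.GLM(y, X, family=sm.families.NegativeBinomial()).fit()

# exp(coefficient) - 1 is the fractional change in expected weekly cases
# per unit increase in the corresponding score or variable.
print(poisson_fit.summary())
print(negbin_fit.summary())
```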

Keywords: principal component analysis, dengue fever, negative binomial regression model, Poisson regression model

Procedia PDF Downloads 421
1204 The Impact of the Plagal Cadence on Nineteenth-Century Music

Authors: Jason Terry

Abstract:

Beginning in the mid-nineteenth century, hymns in the Anglo-American tradition often ended with the congregation singing ‘amen,’ most commonly set to a plagal cadence. While the popularity of this tradition is still well known today, this research presents the origins of the custom. In 1861, Hymns Ancient & Modern deepened the convention by concluding each of its hymns with a published plagal-amen cadence. Subsequently, hymnals from a variety of denominations throughout Europe and the United States widely adopted this practice. By the middle of the twentieth century, the number of participants singing this cadence had noticeably declined; however, it was not until the 1990s that the plagal-amen cadence all but disappeared from hymnals. Today, it is rare for songs to conclude with the plagal-amen cadence, although instrumentalists have continued to regularly play a plagal cadence underneath the singers’ sustained finalis. After examining a variety of music theory treatises, eighteenth-century newspaper articles, and manuscripts and hymnals from the last five centuries, and conducting interviews with a number of scholars around the world, this study presents the plagal-amen cadence in its historical context. The association of ‘amen’ with the plagal cadence was already being discussed during the late eighteenth century, and the plagal-amen cadence only grew in attractiveness from that time forward, most notably in the nineteenth and twentieth centuries. Throughout this research, the music of Thomas Tallis, primarily through his Preces and Responses, is reasonably shown to be the basis for the high status of the plagal-amen cadence in nineteenth- and twentieth-century society. Tallis’s immediate influence was felt among his contemporary English composers as well as posterity, all of whom were well aware of his compositional styles and techniques. More important, however, was the revival of his music in nineteenth-century England, which had a greater impact on the plagal-amen tradition. With his historical title as the father of English cathedral music, Tallis was favored by the supporters of the Oxford Movement. Thus, with society’s view of Tallis, the simple IV–I cadence he chose to pair with ‘amen’ attained a much greater worth in the history of Western music. A musical device such as the once-revered plagal-amen cadence deserves to be studied and understood in a more factual light than has thus far been available to contemporary scholars.

Keywords: amen cadence, Plagal-amen cadence, singing hymns with amen, Thomas Tallis

Procedia PDF Downloads 203
1203 Discriminating Between Energy Drinks and Sports Drinks Based on Their Chemical Properties Using Chemometric Methods

Authors: Robert Cazar, Nathaly Maza

Abstract:

Energy drinks and sports drinks are quite popular among young adults and teenagers worldwide. Some concerns regarding their health effects – particularly those of the energy drinks – have been raised based on scientific findings. Differentiating between these two types of drinks by means of their chemical properties is therefore an instructive task, and chemometrics provides the most appropriate strategy to do so. In this study, a discrimination analysis of energy and sports drinks was carried out applying chemometric methods. A set of eleven samples of commercially available brands – seven energy drinks and four sports drinks – was collected. Each sample was characterized by eight chemical variables (carbohydrates, energy, sugar, sodium, pH, degrees Brix, density, and citric acid). The data set was standardized and examined by exploratory chemometric techniques such as clustering and principal component analysis. As a preliminary step, variable selection was carried out by inspecting the variable correlation matrix. Some variables were found to be redundant, so they could be safely removed, leaving five variables that are sufficient for this analysis: sugar, sodium, pH, density, and citric acid. Then, hierarchical clustering employing the average-linkage criterion and the Euclidean distance metric was performed. It perfectly separates the two types of drinks: the resulting dendrogram, cut at the 25% similarity level, sorts the samples into two well-defined groups, one containing the energy drinks and the other the sports drinks. Further assurance of the complete discrimination is provided by principal component analysis. The projection of the data set onto the first two principal components – which retain 71% of the data information – makes it possible to visualize the distribution of the samples in the two groups identified in the clustering stage. Since the first principal component is the discriminating one, inspection of its loadings allows these groups to be characterized. The energy drinks group possesses medium to high values of density, citric acid, and sugar, whereas the sports drinks group exhibits low values of those variables. In conclusion, the application of chemometric methods to a data set of chemical properties of a number of energy and sports drinks provides an accurate, dependable way to discriminate between these two types of beverages.
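
A minimal sketch of the two exploratory steps described above (average-linkage hierarchical clustering on Euclidean distances, followed by PCA) is given below, assuming standardized data; the numeric values are placeholders rather than the study's measurements:

```python
# Minimal sketch of the chemometric workflow; data are placeholders.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# rows = 11 drinks, columns = sugar, sodium, pH, density, citric acid
X = np.random.rand(11, 5)                      # placeholder for measured values
Xs = StandardScaler().fit_transform(X)         # standardization, as in the abstract

Z = linkage(pdist(Xs, metric="euclidean"), method="average")   # average linkage
groups = fcluster(Z, t=2, criterion="maxclust")                 # cut into 2 groups
print("cluster membership:", groups)

pca = PCA(n_components=2)
scores = pca.fit_transform(Xs)
print("variance retained:", pca.explained_variance_ratio_.sum())
print("PC1 loadings:", pca.components_[0])      # characterizes the discriminating PC
```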

Keywords: chemometrics, clustering, energy drinks, principal component analysis, sports drinks

Procedia PDF Downloads 87
1202 The Impact of WhatsApp Groups as Supportive Technology in Teaching

Authors: Pinn Tsin Isabel Yee

Abstract:

With the advent of internet technologies, students are increasingly turning toward social media and cross-platform messaging apps such as WhatsApp, Line, and WeChat to support their teaching and learning processes. Although each messaging app has varying features, WhatsApp remains one of the most popular cross-platform apps, allowing fast, simple, secure messaging and free calls anytime, anywhere. With such a plethora of advantages, students can easily assimilate WhatsApp as a supportive technology in their learning process. There can be peer-to-peer learning, and a teacher is able to share knowledge digitally via the creation of WhatsApp groups. Content analysis techniques were utilized to analyze data collected by closed-ended question forms. The study showed that 98.8% of college students (n=80) from the Monash University foundation year agreed that the use of WhatsApp groups was helpful as a learning tool. Approximately 71.3% disagreed that notifications and alerts from the WhatsApp group were disruptions to their studies; students commented that they could silence the notifications so that they would not disturb their flow of thought. In fact, an overwhelming majority of students (95.0%) found it enjoyable to participate in WhatsApp groups for educational purposes. It was a common perception that some students felt pressured to post a reply in such groups, but data analysis showed that 72.5% of students did not feel pressured to comment or reply. In addition, 93.8% of students found it satisfactory if their posts were not responded to speedily but were eventually attended to. Generally, 97.5% of students found it useful if their teachers provided their handphone numbers to be added to a WhatsApp group. If a teacher posts an explanation or a mathematical working in the group, all students are able to view the post together, as opposed to individual students asking the teacher similar questions. On whether students preferred using Facebook as a learning tool, there was a near 50-50 divide in the replies, as 51.3% of students liked WhatsApp while 48.8% preferred Facebook as a supportive technology in teaching and learning. Taken altogether, the utilization of WhatsApp groups as a supportive technology in teaching and learning should be implemented in all classes to continuously engage generation Y students in the ever-changing digital landscape.

Keywords: education, learning, messaging app, technology, WhatsApp groups

Procedia PDF Downloads 140
1201 Spatio-Temporal Analysis of Land Use Change and Green Cover Index

Authors: Poonam Sharma, Ankur Srivastav

Abstract:

Cities are complex and dynamic systems that constitute a significant challenge to urban planning. The increasing size of built-up areas, owing to growing population pressure and economic growth, has led to massive land use/land cover change, resulting in the loss of natural habitat and thus reducing green cover in urban areas. Urban environmental quality is influenced by several aspects, including a city's geographical configuration, the scale and nature of human activities occurring there, and the environmental impacts generated. Cities and their sustainability are often discussed together, as cities stand confronted with numerous environmental concerns while the world becomes increasingly urbanized and cities sit in the mesh of global networks in multiple senses. A rapidly transforming urban setting plays a crucial role in changing the green area of natural habitats. This paper examines the pattern of urban growth and measures land use/land cover change in Gurgaon, Haryana, India, through the integration of geospatial techniques. Satellite images are used to measure the spatiotemporal changes that have occurred in land use and land cover, resulting in a new cityscape. The analysis shows that drastic changes in land use have occurred, with a massive rise in built-up areas and a decrease in green cover, making the sustainability of the city an important area of concern. The massive increase in built-up area has influenced localised temperatures and heat concentration. To enhance the decision-making process in urban planning, a detailed and real-world depiction of these urban spaces is the need of the hour. Monitoring indicators of key processes in land use and economic development is essential for evaluating policy measures.
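
One common way such a green cover index can be derived from satellite imagery is NDVI thresholding; the paper does not state its exact index or threshold, so the sketch below is purely illustrative:

```python
# Hedged illustration: green-cover fraction from red/NIR bands via NDVI.
# Band arrays and the vegetation threshold are assumptions, not the study's values.
import numpy as np

red = np.random.rand(512, 512)     # placeholder red-band reflectance
nir = np.random.rand(512, 512)     # placeholder near-infrared reflectance

ndvi = (nir - red) / (nir + red + 1e-9)
green_mask = ndvi > 0.3                         # assumed vegetation threshold
green_cover_index = green_mask.mean() * 100     # % of pixels classed as green

print(f"Green cover: {green_cover_index:.1f}% of the study area")
```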

Keywords: cityscape, geospatial techniques, green cover index, urban environmental quality, urban planning

Procedia PDF Downloads 248
1200 A Case Study on an Integrated Analysis of Well Control and Blow out Accident

Authors: Yasir Memon

Abstract:

The complexity and challenges in the offshore industry are greater than in the past, and the oil and gas industry expands every day by meeting these challenges. More challenging wells, longer and deeper, are being drilled in today's environment. Blowout prevention is of great importance in the oil and gas world. In the industry's early years, drilling operations were extremely dangerous: there was no technology to determine reservoir pressure, so drilling was a blind operation. A blowout arises when uncontrolled reservoir pressure enters the wellbore. The potential for a blowout is a danger to both the environment and human life, and it causes losses through environmental damage, penalties from state/country regulators, and lost capital investment. There are many cases of blowouts in the oil and gas industry that have damaged both people and the environment, and huge capital investment is made all over the world to prevent blowouts and keep damage to the lowest possible level. The objective of this study is to promote safety and good resource management so as to assure safety and environmental integrity in all drilling operations. This study shows that human error and management failure are the main causes of blowouts; therefore proper management, with the wise use of precautions, prevention methods, and controlling techniques, can reduce the probability of a blowout to a minimum level. It also discusses the basic procedures, concepts, and equipment involved in well control methods and the various steps used under various conditions. A further aim of this work is to highlight the role of management in oil and gas operations. Moreover, this study analyzes the causes of the blowout of the Macondo well, which occurred in the Gulf of Mexico on April 20, 2010, delivers recommendations and analysis of various aspects of well control methods, and lists the mistakes and compromises that British Petroleum and its partners made during drilling and well completion, as well as the safety and development rule violations that led to the Macondo well disaster. The case study concludes that the Macondo well blowout could have been avoided with proper management of personnel and communication between them, and that by following safety rules and laws the environmental damage could have been kept to a minimum.
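
As an aside on the basic well-control calculations mentioned above, the sketch below shows the standard industry formula for kill mud weight from the shut-in drill pipe pressure; it is not taken from this case study, and the input values are illustrative only:

```python
# Standard well-control formula (not from this case study); inputs are illustrative.
def kill_mud_weight(original_mud_ppg: float, sidpp_psi: float, tvd_ft: float) -> float:
    """Kill mud weight (ppg) = original mud weight + SIDPP / (0.052 * TVD)."""
    return original_mud_ppg + sidpp_psi / (0.052 * tvd_ft)

# 10.0 ppg mud, 500 psi shut-in drill pipe pressure, 10,000 ft true vertical depth
print(kill_mud_weight(original_mud_ppg=10.0, sidpp_psi=500.0, tvd_ft=10_000.0))  # ~10.96 ppg
```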

Keywords: energy, environment, oil and gas industry, Macondo well accident

Procedia PDF Downloads 163
1199 Assessment of Acute Oral Toxicity Studies and Anti Diabetic Activity of Herbal Mediated Nanomedicine

Authors: Shanker Kalakotla, Krishna Mohan Gottumukkala

Abstract:

Diabetes is a metabolic disorder characterized by hyperglycemia and altered carbohydrate, lipid, and protein metabolism. Nanotechnology is currently a rapidly growing field of research, and lately there has been great interest in the nanomedicine and nanopharmacology area in the synthesis of silver nanoparticles using natural products. Biological methods have been used to synthesize silver nanoparticles in the presence of medicinally active antidiabetic plants, and this motivated us to assess silver nanoparticles biologically synthesized from the seed extract of Psoralea corylifolia using 1 mM silver nitrate solution. The synthesized herbal mediated silver nanoparticles (HMSNPs) were then subjected to various characterization techniques, namely XRD, SEM, EDX, TEM, DLS, UV-vis, and FT-IR. In the current study, the silver nanoparticles were tested for in-vitro anti-diabetic activity and possible toxic effects in healthy female albino mice following OECD guideline 425. Herbal mediated silver nanoparticles were successfully obtained from the bioreduction of silver nitrate using Psoralea corylifolia seed extract. The silver nanoparticles were appropriately characterized and confirmed using UV-vis spectroscopy, XRD, FTIR, DLS, SEM, and EDX analysis. In the behavioral observations of the study, the female albino mice did not show sedation, respiratory arrest, or convulsions. The test compounds did not cause any mortality at the dose level tested (2000 mg/kg body weight) up to the end of the 14-day observation period and were considered safe. It may be concluded that the LD50 of the HMSNPs was 2000 mg/kg body weight; accordingly, the preferred dose range for HMSNPs falls between 200 and 400 mg/kg. Further in-vivo pharmacological models and biochemical investigations will more clearly elucidate the mechanism of action and will be helpful in projecting the synthesized silver nanoparticles as a therapeutic candidate for treating chronic ailments.

Keywords: herbal mediated silver nanoparticles, HMSNPs, toxicity of silver nanoparticles, PTP1B in-vitro anti-diabetic assay, female albino mice, OECD guidelines-425

Procedia PDF Downloads 256
1198 A Systematic Review of the Psychometric Properties of Augmentative and Alternative Communication Assessment Tools in Adolescents with Complex Communication Needs

Authors: Nadwah Onwi, Puspa Maniam, Azmawanie A. Aziz, Fairus Mukhtar, Nor Azrita Mohamed Zin, Nurul Haslina Mohd Zin, Nurul Fatehah Ismail, Mohamad Safwan Yusoff, Susilidianamanalu Abd Rahman, Siti Munirah Harris, Maryam Aizuddin

Abstract:

Objective: Malaysia has a growing number of individuals with complex communication needs (CCN). The initiation of augmentative and alternative communication (AAC) intervention may help individuals with CCN to understand and express themselves optimally and to participate actively in their daily life activities. AAC is defined as the multimodal use of communication ability, allowing individuals to use every mode possible to communicate with others through a set of symbols or systems that may include symbols, aids, techniques, and strategies. It is consequently critical to evaluate the deficits that inform treatment in AAC intervention. However, no measurement tools for evaluating users with CCN are currently available locally. Design: A systematic review (SR) was designed to analyze the psychometric properties of AAC assessments for adolescents with CCN published in peer-reviewed journals. Tools were rated by the methodological quality of the studies and by the psychometric measurement qualities of each tool. Method: A literature search identifying AAC assessment tools with psychometrically robust properties and a conceptual framework was conducted. Two independent reviewers screened the abstracts and full-text articles and reviewed bibliographies for further references. Data were extracted using standardized forms, and study risk of bias was assessed. Result: The review highlights the psychometric properties of AAC assessment tools that can be used by speech-language therapists in the Malaysian context. The work outlines how systematic review methods may be applied to published material that provides valuable data for initiating the development of Malay-language AAC assessment tools. Conclusion: The synthesis of evidence provides a framework for Malaysian speech-language therapists in making informed decisions on AAC intervention within the standard operating procedure of the Ministry of Health, Malaysia.

Keywords: augmentative and alternative communication, assessment, adolescents, complex communication needs

Procedia PDF Downloads 132
1197 Lead Chalcogenide Quantum Dots for Use in Radiation Detectors

Authors: Tom Nakotte, Hongmei Luo

Abstract:

Lead chalcogenide-based (PbS, PbSe, and PbTe) quantum dots (QDs) were synthesized for implementation in radiation detectors. Pb-based materials have long been of interest for gamma and x-ray detection due to their high absorption cross section and high Z number. The emphasis of the studies was on exploring how to control charge carrier transport within thin films containing the QDs. The properties of the QDs themselves can be altered by changing the size, shape, composition, and surface chemistry of the dots, while carrier transport within QD films is affected by post-deposition treatment of the films. The QDs were synthesized using colloidal synthesis methods, and films were grown using multiple film coating techniques, such as spin coating and doctor blading. Current QD radiation detectors are based on the QDs acting as fluorophores in a scintillation detector. Here, the viability of using QDs in solid-state radiation detectors, for which the incident radiation causes a direct electronic response within the QD film, is explored. Achieving high sensitivity and accurate energy quantification in QD radiation detectors requires large carrier mobility and diffusion lengths in the QD films. Pb chalcogenide-based QDs were synthesized with both traditional oleic acid ligands and more weakly binding oleylamine ligands, allowing for in-solution ligand exchange and making the deposition of thick films in a single step possible. The PbS and PbSe QDs showed better air stability than PbTe. After precipitation, the QDs passivated with the shorter ligand are dispersed in 2,6-difluoropyridine, resulting in colloidal solutions with concentrations anywhere from 10-100 mg/mL for film processing applications. More concentrated colloidal solutions produce thicker films during spin coating, while an extremely concentrated solution (100 mg/mL) can be used to produce films several micrometers thick using doctor blading. Film thicknesses of micrometers or even millimeters are needed in radiation detectors for high-energy gamma rays, which are of interest for astrophysics and nuclear security, in order to provide sufficient stopping power.

Keywords: colloidal synthesis, lead chalcogenide, radiation detectors, quantum dots

Procedia PDF Downloads 111
1196 A Risk Assessment Tool for the Contamination of Aflatoxins on Dried Figs Based on Machine Learning Algorithms

Authors: Kottaridi Klimentia, Demopoulos Vasilis, Sidiropoulos Anastasios, Ihara Diego, Nikolaidis Vasileios, Antonopoulos Dimitrios

Abstract:

Aflatoxins are highly poisonous and carcinogenic compounds produced by species of the genus Aspergillus that can infect a variety of agricultural foods, including dried figs. Biological and environmental factors, such as the population, pathogenicity, and aflatoxigenic capacity of the strains, and the topography, soil, and climate parameters of the fig orchards, are believed to have a strong effect on aflatoxin levels. Existing methods for aflatoxin detection and measurement, such as high-performance liquid chromatography (HPLC) and enzyme-linked immunosorbent assay (ELISA), can provide accurate results, but the procedures are usually time-consuming, sample-destructive, and expensive. Predicting aflatoxin levels prior to crop harvest is useful for minimizing the health and financial impact of a contaminated crop. Consequently, there is interest in developing a tool that predicts aflatoxin levels based on topography and soil analysis data of fig orchards. This paper describes the development of a risk assessment tool for aflatoxin contamination of dried figs, based on the location and altitude of the fig orchards, the population of the fungus Aspergillus spp. in the soil, and soil parameters such as pH, saturation percentage (SP), electrical conductivity (EC), organic matter, particle size analysis (sand, silt, clay), the concentration of the exchangeable cations (Ca, Mg, K, Na), extractable P, and trace elements (B, Fe, Mn, Zn, and Cu), by employing machine learning methods. In particular, our proposed method integrates three machine learning techniques, i.e., dimensionality reduction on the original dataset (principal component analysis), metric learning (Mahalanobis metric for clustering), and the k-nearest neighbors learning algorithm (KNN), into an enhanced model with mean performance equal to 85% in terms of the Pearson correlation coefficient (PCC) between observed and predicted values.
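
A rough sketch of the three-stage pipeline (PCA, a Mahalanobis metric, and KNN) is given below; note that it substitutes the inverse covariance of the reduced data for the learned MMC metric used in the paper, and the feature and target arrays are placeholders:

```python
# Sketch only: inverse covariance stands in for the paper's learned MMC metric;
# features/targets are random placeholders, not orchard data.
import numpy as np
from scipy.stats import pearsonr
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsRegressor
from sklearn.preprocessing import StandardScaler

X = np.random.rand(120, 18)        # soil / topography features per orchard
y = np.random.rand(120)            # measured aflatoxin level

X_train, X_test = X[:90], X[90:]
y_train, y_test = y[:90], y[90:]

scaler = StandardScaler().fit(X_train)
pca = PCA(n_components=5).fit(scaler.transform(X_train))
Z_train = pca.transform(scaler.transform(X_train))
Z_test = pca.transform(scaler.transform(X_test))

VI = np.linalg.inv(np.cov(Z_train, rowvar=False))   # Mahalanobis precision matrix
knn = KNeighborsRegressor(n_neighbors=5, algorithm="brute",
                          metric="mahalanobis", metric_params={"VI": VI})
knn.fit(Z_train, y_train)

pcc, _ = pearsonr(y_test, knn.predict(Z_test))       # evaluation metric from the paper
print("Pearson correlation:", pcc)
```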

Keywords: aflatoxins, Aspergillus spp., dried figs, k-nearest neighbors, machine learning, prediction

Procedia PDF Downloads 159
1195 Atomic Scale Storage Mechanism Study of the Advanced Anode Materials for Lithium-Ion Batteries

Authors: Xi Wang, Yoshio Bando

Abstract:

Lithium-ion batteries (LIBs) can deliver high energy storage density and offer long operating lifetimes, but their power density is too low for many important applications. Therefore, we developed new strategies and fabricated novel electrodes for fast Li transport with facile synthesis, including N-doped graphene-SnO2 sandwich papers, a bicontinuous nanoporous Cu/Li4Ti5O12 electrode, and binder-free N-doped graphene papers. In addition, using advanced in-situ TEM and STEM techniques together with theoretical simulations, we systematically studied their storage mechanisms at the atomic scale, which sheds new light on the reasons for the ultrafast lithium storage and high capacity of these advanced anodes. For example, using advanced in-situ TEM, we directly investigated these processes with an individual CuO nanowire anode and constructed a LIB prototype within a TEM. Although they are promising candidates for anodes in LIBs, transition metal oxide anodes utilizing the so-called conversion mechanism typically suffer from severe capacity fading during the first lithiation–delithiation cycle. We also report atomistic insights into GN energy storage as revealed by in-situ TEM. The lithiation process on edges and basal planes is directly visualized, the pyrrolic N "hole" defect and the perturbed solid-electrolyte-interface (SEI) configurations are observed, and the charge transfer states for three N-containing forms are also investigated. In-situ HRTEM experiments together with theoretical calculations provide solid evidence that enlarged edge {0001} spacings and surface "hole" defects result in improved surface capacitive effects and thus high rate capability, and that the high capacity is owing to short-distance ordering at the edges during discharging and to numerous surface defects; these phenomena could not previously be understood by standard electron or X-ray diffraction analyses.

Keywords: in-situ TEM, STEM, advanced anode, lithium-ion batteries, storage mechanism

Procedia PDF Downloads 334
1194 External Business Environment and Sustainability of Micro, Small and Medium Enterprises in Jigawa State, Nigeria

Authors: Shehu Isyaku

Abstract:

The general objective of the study was to investigate the relationship between the external business environment and the sustainability of micro, small and medium enterprises (MSMEs) in Jigawa State, Nigeria. Specifically, the study examined the relationship between 1) the economic environment, 2) the social environment, 3) the technological environment, and 4) the political environment and the sustainability of MSMEs in Jigawa State. The study drew on Resource-Based View (RBV) theory and the Knowledge-Based View (KBV). It employed a descriptive cross-sectional survey design. A researcher-made questionnaire was used to collect data from 350 managers/owners who were selected using stratified, purposive, and simple random sampling techniques. Data analysis was done using means and standard deviations, factor analysis, correlation coefficients, and Pearson linear regression analysis. The findings revealed that the sustainability potential of the managers/owners was rated as high (economic, environmental, and social sustainability, measured on a 5-point Likert scale), and the mean ratings of the effectiveness of the external business environment were likewise high. The results of the Pearson linear regression analysis rejected the hypothesized non-significant effect of the external business environment on the sustainability of MSMEs. Specifically, there is a positive significant relationship between 1) the economic environment and sustainability, 2) the social environment and sustainability, 3) the technological environment and sustainability, and 4) the political environment and sustainability. The researcher concluded that MSME managers/owners have high potential for economic, social, and environmental sustainability and that all the constructs of the external business environment (economic, social, technological, and political) have a positive significant relationship with the sustainability of MSMEs. Finally, the researcher recommended that 1) MSME managers/owners develop marketing strategies and intelligence systems to accumulate information about competitors and customers' demands, and 2) managers/owners treat customers' cultural and religious beliefs as an opportunity to be utilized while formulating business strategies.

Keywords: business environment, sustainability, small and medium enterprises, external business environment

Procedia PDF Downloads 25
1193 Factors Militating the Organization of Intramural Sport Programs in Secondary Schools: A Case Study of the Ekiti West Local Government Area of Ekiti State, Nigeria

Authors: Adewole Taiwo Adelabu

Abstract:

The study investigated the factors militating against the organization of intramural sports programs in secondary schools in Ekiti State, Nigeria. The purpose of the study was to identify the factors affecting the organization of sports in secondary schools and to proffer possible solutions. The study employed the inferential statistic of chi-square (χ²). Five research hypotheses were formulated. The population for the study was all students in the government-owned secondary schools in the Ekiti West Local Government Area of Ekiti State, Nigeria. The sample for the study was 60 students in three schools within the local government area, selected through simple random sampling techniques. The instrument used for the study was a questionnaire developed by the researcher for data collection. The instrument was presented to experts and academics in the field of Human Kinetics and Health Education for construct and content validation. A reliability test was conducted involving 10 students who were not part of the study; a test-retest coefficient of 0.74 was obtained, attesting that the instrument was reliable enough for the study. The validated questionnaire was administered to the students in their various schools by the researcher with the help of two research assistants, and the questionnaires were filled in and returned to the researcher immediately. The data collected were analyzed using the descriptive statistics of frequency counts, percentages, and means for the demographic data in section A of the questionnaire, while the inferential statistic of chi-square was used to test the hypotheses at the 0.05 alpha level. The results revealed that personnel, funding, and scheduling (time) were significant factors affecting the organization of intramural sports programs among students in secondary schools in the Ekiti West Local Government Area of the state. The study also revealed that organizing intramural sports programs among secondary school students would improve and motivate students' participation in sports beyond the local level. However, facilities and equipment were not a significant factor affecting the organization of intramural sports among secondary school students in the Ekiti West Local Government Area.
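
For reference, the chi-square test of association used in the study can be run as follows; the observed counts below are invented for illustration:

```python
# Hedged example of a chi-square test of association; counts are illustrative only.
import numpy as np
from scipy.stats import chi2_contingency

# rows: factor present / absent; columns: intramural sport organized well / poorly
observed = np.array([[22, 8],
                     [10, 20]])

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject the null hypothesis at the 0.05 alpha level")
```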

Keywords: challenge, intramural sport, militating, programmes

Procedia PDF Downloads 125
1192 Creating and Questioning Research-Oriented Digital Outputs to Manuscript Metadata: A Case-Based Methodological Investigation

Authors: Diandra Cristache

Abstract:

The transition of traditional manuscript studies into the digital framework closely affects the methodological premises upon which manuscript descriptions are modeled, created, and questioned for the purpose of research. This paper explores the issue by presenting a methodological investigation into the process of modeling, creating, and questioning manuscript metadata. The investigation is founded on close observation of the Polonsky Greek Manuscripts Project, a collaboration between the Universities of Cambridge and Heidelberg. More than just providing a realistic ground for methodological exploration, along with a complete metadata set for computational demonstration, the case study also contributes to a broader purpose: outlining general methodological principles for making the most of manuscript metadata by means of research-oriented digital outputs. The analysis mainly focuses on the scholarly approach to manuscript descriptions, in the specific instance where the act of metadata recording does not have a programmatic research purpose. Close attention is paid to the encounter of 'traditional' practices in manuscript studies with the formal constraints of the digital framework: does the shift in practices (especially from the straight narrative of free writing towards the hierarchical constraints of the TEI encoding model) impact the structure of metadata and its capability to respond to specific research questions? It is argued that the flexible structure of TEI and traditional approaches to manuscript description lead to a proliferation of markup: does an 'encyclopedic' descriptive approach ensure the epistemological relevance of the digital outputs to metadata? To provide further insight into the computational approach to manuscript metadata, the metadata of the Polonsky project are processed with techniques of distant reading and data networking, resulting in a new group of digital outputs (relational graphs, geographic maps). The computational process and the digital outputs are thoroughly illustrated and discussed. Eventually, a retrospective analysis evaluates how the digital outputs respond to the scientific expectations of research and, the other way round, how the requirements of research questions feed back into the creation and enrichment of metadata in an iterative loop.
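
A hedged sketch of the data-networking step, turning TEI manuscript descriptions into a relational graph, might look like the following; the folder name is hypothetical and only two standard TEI elements (msDesc, origPlace) are queried, so real metadata would need richer queries:

```python
# Illustrative sketch: build a manuscript-place graph from TEI records.
# Folder name and the minimal XPath selection are assumptions.
from pathlib import Path
import networkx as nx
from lxml import etree

TEI_NS = "http://www.tei-c.org/ns/1.0"
XML_NS = "http://www.w3.org/XML/1998/namespace"
graph = nx.Graph()

for path in Path("tei_records").glob("*.xml"):          # hypothetical folder
    root = etree.parse(str(path)).getroot()
    for ms in root.iter(f"{{{TEI_NS}}}msDesc"):
        ms_id = ms.get(f"{{{XML_NS}}}id") or path.stem
        graph.add_node(ms_id, kind="manuscript")
        for place in ms.findall(".//tei:origPlace", namespaces={"tei": TEI_NS}):
            if place.text:
                graph.add_edge(ms_id, place.text.strip())   # manuscript-place link

print(graph.number_of_nodes(), "nodes,", graph.number_of_edges(), "edges")
```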

Keywords: digital manuscript studies, digital outputs to manuscripts metadata, metadata interoperability, methodological issues

Procedia PDF Downloads 120
1191 Uncertainty Quantification of Corrosion Anomaly Length of Oil and Gas Steel Pipelines Based on Inline Inspection and Field Data

Authors: Tammeen Siraj, Wenxing Zhou, Terry Huang, Mohammad Al-Amin

Abstract:

The high-resolution inline inspection (ILI) tool is used extensively in the pipeline industry to identify, locate, and measure metal-loss corrosion anomalies on buried oil and gas steel pipelines. Corrosion anomalies may occur singly (i.e., individual anomalies) or as clusters (i.e., a colony of corrosion anomalies). Although ILI technology has advanced immensely, there are measurement errors associated with the sizes of corrosion anomalies reported by ILI tools due to limitations of the tools and associated sizing algorithms, and the detection threshold of the tools (i.e., the minimum detectable feature dimension). Quantifying the measurement error in the ILI data is crucial for corrosion management and for developing maintenance strategies that satisfy safety and economic constraints. The measurement error associated with the length of corrosion anomalies (in the longitudinal direction of the pipeline) has been scarcely reported in the literature and is investigated in the present study. Limitations in the ILI tool and clustering process can sometimes cause clustering error, which is defined as the error introduced during the clustering process by including or excluding a single anomaly or group of anomalies in or from a cluster. Clustering error has been found to be one of the biggest contributory factors to the relatively high uncertainties associated with ILI-reported anomaly length. As such, this study focuses on developing a consistent and comprehensive framework to quantify the measurement errors in the ILI-reported anomaly length by comparing the ILI data and corresponding field measurements for individual and clustered corrosion anomalies. The analysis carried out in this study is based on the ILI and field measurement data for a set of anomalies collected from two segments of a buried natural gas pipeline currently in service in Alberta, Canada. Data analyses showed that the measurement error associated with the ILI-reported length of anomalies without clustering error, denoted as Type I anomalies, is markedly less than that for anomalies with clustering error, denoted as Type II anomalies. A methodology employing data mining techniques is further proposed to classify Type I and Type II anomalies based on the ILI-reported corrosion anomaly information.
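
A conceptual sketch of such a classification of Type I versus Type II anomalies from ILI-reported attributes is shown below; the feature names, data, and model choice are assumptions, since the paper does not publish its exact feature set or algorithm:

```python
# Conceptual sketch only: random placeholder data, hypothetical feature names.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
ili = pd.DataFrame({
    "ili_length_mm": rng.uniform(5, 300, 400),
    "ili_depth_pct": rng.uniform(5, 60, 400),
    "n_anomalies_in_cluster": rng.integers(1, 12, 400),
    "spacing_to_neighbour_mm": rng.uniform(0, 150, 400),
})
is_type_ii = rng.integers(0, 2, 400)      # 1 = clustering error present (Type II)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, ili, is_type_ii, cv=5)
print("cross-validated accuracy:", scores.mean())
```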

Keywords: clustered corrosion anomaly, corrosion anomaly assessment, corrosion anomaly length, individual corrosion anomaly, metal-loss corrosion, oil and gas steel pipeline

Procedia PDF Downloads 292
1190 Adaptive Process Monitoring for Time-Varying Situations Using Statistical Learning Algorithms

Authors: Seulki Lee, Seoung Bum Kim

Abstract:

Statistical process control (SPC) is a practical and effective method for quality control. The most important and widely used technique in SPC is the control chart. The main goal of a control chart is to detect any assignable changes that affect the quality output. Most conventional control charts, such as Hotelling's T2 chart, are based on the assumption that the quality characteristics follow a multivariate normal distribution. However, in modern, complicated manufacturing systems, appropriate control chart techniques that can efficiently handle nonnormal processes are required. To overcome the shortcomings of conventional control charts for nonnormal processes, several methods have been proposed that combine statistical learning algorithms and multivariate control charts. Statistical learning-based control charts, such as support vector data description (SVDD)-based charts and k-nearest neighbors-based charts, have proven their improved performance in nonnormal situations compared with the T2 chart. Besides the nonnormal property, time-varying operations are also quite common in real manufacturing fields because of factors such as product and set-point changes, seasonal variations, catalyst degradation, and sensor drifting. However, traditional control charts cannot accommodate future condition changes of the process because they are formulated based on the data recorded in the early stage of the process. In the present paper, we propose an SVDD-based control chart that is capable of adaptively monitoring time-varying and nonnormal processes. We reformulated the SVDD algorithm into a time-adaptive SVDD algorithm by adding a weighting factor that reflects time-varying situations. Moreover, we defined an updating region for an efficient model-updating structure of the control chart. The proposed control chart simultaneously allows efficient model updates and timely detection of out-of-control signals. The effectiveness and applicability of the proposed chart were demonstrated through experiments with simulated data and real data from the metal frame process in mobile device manufacturing.
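
A simplified sketch of the idea is given below: SVDD is approximated by scikit-learn's one-class SVM with an RBF kernel, and the time-varying weighting factor is mimicked with exponentially decaying sample weights over a sliding window; the data and parameters are illustrative rather than the authors':

```python
# Simplified proxy for a time-adaptive SVDD boundary; data/parameters are illustrative.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)
window = rng.normal(size=(200, 4))          # most recent in-control observations

age = np.arange(len(window))[::-1]          # 0 = newest observation
weights = np.exp(-0.02 * age)               # newer points weighted more heavily

model = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale")
model.fit(window, sample_weight=weights)    # refit periodically as the window slides

new_obs = rng.normal(size=(1, 4))
score = model.decision_function(new_obs)[0]  # negative -> outside the boundary
print("out-of-control signal" if score < 0 else "in control")
```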

Keywords: multivariate control chart, nonparametric method, support vector data description, time-varying process

Procedia PDF Downloads 278
1189 Mutations in rpoB, katG and inhA Genes: The Association with Resistance to Rifampicin and Isoniazid in Egyptian Mycobacterium tuberculosis Clinical Isolates

Authors: Ayman K. El Essawy, Amal M. Hosny, Hala M. Abu Shady

Abstract:

The rapid detection of TB and drug resistance both optimizes treatment and improves outcomes. In the current study, respiratory specimens were collected from 155 patients. Conventional susceptibility testing and MIC determination were performed for rifampicin (RIF) and isoniazid (INH). The Genotype MTBDRplus assay, a molecular genetic assay based on DNA-STRIP technology, and specific gene sequencing with primers for the rpoB, katG, and mab-inhA genes were used to detect mutations associated with resistance to rifampicin and isoniazid. In comparison to other categories, most rifampicin-resistant (61.5%) and isoniazid-resistant (47.1%) isolates were from patients who had relapsed during treatment. The genotypic profile (using the Genotype MTBDRplus assay) of multi-drug resistant (MDR) isolates showed absence of the katG wild-type 1 (WT1) band and appearance of the katG MUT2 mutation band. For isoniazid mono-resistant isolates, 80% showed katG MUT1, 20% showed katG MUT1 and inhA MUT1, and 20% showed only inhA MUT1. Accordingly, 100% of isoniazid-resistant strains were detected by this assay. Out of 17 resistant strains, 16 had katG mutation bands, indicating high-level resistance to isoniazid. The assay could clearly detect rifampicin resistance among the 66.7% of MDR isolates that showed the rpoB MUT3 mutation band, while 33.3% of them were considered unknown. One rifampicin mono-resistant isolate did not show rifampicin mutation bands by the Genotype MTBDRplus assay, but it showed an unexpected mutation in codon 531 of rpoB by DNA sequence analysis. Rifampicin resistance in this strain could be associated with a mutation in codon 531 of rpoB (based on molecular sequencing), a mutation that the Genotype MTBDRplus assay could not detect. When the results of the Genotype MTBDRplus assay and sequencing are combined, this strain shows a hetero-resistance pattern. Gene sequencing of eight selected isolates, previously tested by the Genotype MTBDRplus assay, detected resistance mutations mainly in codon 315 (katG gene) and position -15 in the inhA promoter gene for isoniazid resistance, and in codon 531 (rpoB gene) for rifampicin resistance. Genotyping techniques make it possible to distinguish between recurrent cases due to reinfection or reactivation and support epidemiological studies.

Keywords: M. tuberculosis, rpoB, KatG, inhA, genotype MTBDRplus

Procedia PDF Downloads 138
1188 Finite Element Analysis of Mechanical Properties of Additively Manufactured 17-4 PH Stainless Steel

Authors: Bijit Kalita, R. Jayaganthan

Abstract:

Additive manufacturing (AM) is a novel manufacturing method that provides more freedom in design, manufacturing of near-net-shaped parts on demand, lower cost of production, and faster delivery to market. Among the various metal AM techniques, laser powder bed fusion (L-PBF) is the most prominent, providing higher accuracy and powder efficiency in comparison to other methods. In particular, 17-4 PH alloy is a martensitic precipitation-hardened (PH) stainless steel characterized by resistance to corrosion up to 300°C and tailorable strengthening by copper precipitates. Additively manufactured 17-4 PH stainless steel exhibits a dendritic/cellular solidification microstructure in the as-built condition. It is widely used as a structural material in marine environments, power plants, aerospace, and chemical industries. The excellent weldability of 17-4 PH stainless steel and its ability to be heat treated to improve mechanical properties make it a good material choice for L-PBF. In this study, the microstructures of martensitic stainless steels in the as-built state, as well as the effects of process parameters, building atmosphere, and heat treatments on the microstructures, are reviewed. The mechanical properties of fabricated parts are studied through micro-hardness and tensile tests; tensile tests are carried out at different strain rates at room temperature. In addition, the effect of process parameters and heat treatment conditions on mechanical properties is critically reviewed. These studies revealed the performance of L-PBF-fabricated 17-4 PH stainless-steel parts under cyclic loading, and the results indicated that fatigue properties were sensitive to the defects generated by L-PBF (e.g., porosity, microcracks), leading to low fracture strains and stresses under cyclic loading. Rapid melting, solidification, and re-melting of powders during the process, together with the different combinations of processing parameters, result in a complex thermal history and a heterogeneous microstructure; it is therefore necessary to better control the microstructures and properties of L-PBF PH stainless steels through high-efficiency and low-cost heat treatments.

Keywords: 17–4 PH stainless steel, laser powder bed fusion, selective laser melting, microstructure, additive manufacturing

Procedia PDF Downloads 103
1187 Airborne Particulate Matter Passive Samplers for Indoor and Outdoor Exposure Monitoring: Development and Evaluation

Authors: Kholoud Abdulaziz, Kholoud Al-Najdi, Abdullah Kadri, Konstantinos E. Kakosimos

Abstract:

The Middle East is highly affected by air pollution induced by anthropogenic and natural phenomena. There is evidence that air pollution, especially particulates, greatly affects population health. Many studies have warned of the high concentration of particulates and their effect not just around industrial and construction areas but also in the immediate working and living environment. One of the methods to study air quality is continuous and periodic monitoring using active or passive samplers. Active monitoring and sampling are the default procedures in the European and US standards. However, in many cases they are insufficient to accurately capture the spatial variability of air pollution because of the small number of installations, which in turn is attributed to the high cost of the equipment and the limited availability of users with the necessary expertise and scientific background. Passive sampling is an alternative that accounts for the limitations of the active methods: it is inexpensive, requires no continuous power supply, and is easy to assemble, which makes it a more flexible option, though less accurate. This study aims to investigate and evaluate the use of passive sampling for particulate matter monitoring in dry tropical climates, like the Middle East. More specifically, a number of field measurements have been conducted, both indoors and outdoors, in Qatar, and the results have been compared with active sampling equipment and the reference methods. The samples have been analyzed to obtain particle size distributions by applying existing laboratory techniques (optical microscopy) and by exploring new approaches such as white light interferometry. The new parameters of the well-established model have then been calculated in order to estimate the atmospheric concentration of particulates. Additionally, an extended literature review will search for new and better models. The outcome of this project is expected to have an impact on the public as well, as it will raise awareness about the quality of life and the importance of implementing a research culture in the community.
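
As a back-of-envelope illustration of how a passive sampler's deposited mass can be converted into an airborne concentration through a deposition-velocity model (one plausible form of the 'well-established model' mentioned above, not necessarily the one used in this study), consider:

```python
# Illustrative deposition-velocity conversion; all inputs are assumed values,
# not parameters from this study.
def airborne_concentration_ug_m3(deposited_mass_ug: float,
                                 collection_area_m2: float,
                                 exposure_time_s: float,
                                 deposition_velocity_m_s: float) -> float:
    flux = deposited_mass_ug / (collection_area_m2 * exposure_time_s)  # ug m^-2 s^-1
    return flux / deposition_velocity_m_s                               # ug m^-3

# e.g. 50 ug collected on 10 cm^2 over 7 days with an assumed v_d = 0.01 m/s
print(airborne_concentration_ug_m3(50.0, 10e-4, 7 * 24 * 3600, 0.01))   # ~8.3 ug/m^3
```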

Keywords: air pollution, passive samplers, interferometry, indoor, outdoor

Procedia PDF Downloads 378
1186 Effectiveness of Research Promotion Organizations in Higher Education and Research (ESR)

Authors: Jonas Sanon

Abstract:

The valorization of research is becoming a transversal instrument linking different sectors (academic, public, and industrial). The practice of valorization seems to shape innovation techniques within companies, where one often finds industrial agreements for training through research (CIFRE), continuous training programs for employees, collaborations and partnerships around joint research, and R&D laboratories focused on the needs of companies in order to improve or develop more efficient innovations. Furthermore, many public initiatives to support innovation and technology transfer have been developed at the international, European, and national levels, with significant budget allocations. In this context, this work analyzes the way in which research transfer structures are evaluated within the Saclay ecosystem. The University of Paris-Saclay is one of the leading French universities; it is made up of 10 university components and more than 275 laboratories and is in partnership with the largest French research centers. This work mainly focused on how evaluations affect research transfer structures, how evaluations are conducted, and what the managers of research transfer structures think about assessments. The interviews conducted suggest that the evaluations do not have a significant impact on the qualitative aspect of research and innovation but rather play a directive role, determining whether structures receive the financial resources to develop certain lines of research. These lines are sometimes directed and influenced by the market, and some researchers may accentuate their research and experimentation on themes that are not necessarily their areas of interest simply to comply with thematic calls for projects. The field studies also outline the primary indicators used to assess the effectiveness of valorization structures, such as the number of start-ups generated, the license agreements signed, the structure's patent portfolio, and the innovations developed from public research. Finally, after mapping the actors, it became clear that the ecosystem of the University of Paris-Saclay benefits from a richness that allows it to better valorize its research across its three categories of actors (internal, external, and transversal), which are united and linked by a relationship of proximity and sharing and endowed with a real opportunity to innovate openly.

Keywords: research valorization, technology transfer, innovation, evaluation, impacts and performances, innovation policy

Procedia PDF Downloads 47
1185 Effect of Dose-Dependent Gamma Irradiation on the Fatty Acid Profile of Mud Crab, Scylla Serrata: A GC-FID Study

Authors: Keethadath Arshad, Kappalli Sudha

Abstract:

The mud crab, Scylla serrata, a commercially important shellfish in high global demand, appears to be a rich source of dietary fatty acids. Its increased production through aquaculture and its highly perishable nature necessitate improved techniques for proper preservation. Optimized irradiation has been identified as an effective method for ensuring safety and extending shelf life for a broad range of perishable food items, including finfishes and shellfishes. The present study analyzed the effects of dose-dependent gamma irradiation on the fatty acid profile of muscle derived from the candidate species (S. serrata) at both qualitative and quantitative levels. Wild-grown, average-sized, intermolt male S. serrata were gamma irradiated (⁶⁰Co source, 3.8 kGy/hour) at doses of 0.5 kGy, 1.0 kGy, and 2.0 kGy using a gamma chamber. Total lipids extracted by the Folch method were, after methylation, analyzed for fatty acids using a gas chromatograph equipped with a flame ionization detector, by comparison with authentic FAME reference standards. The tissue from non-irradiated S. serrata showed the presence of 12 SFA, 6 MUFA, 8 PUFA, and 2 TF; the PUFA include the medicinally important ω-3 fatty acids C18:3, C20:5, and C22:6 and the ω-6 fatty acids γ-C18:3 and C20:2. Dose-dependent gamma irradiation reduced the number of detectable fatty acids (10, 8, and 8 SFA; 6, 6, and 5 MUFA; 7, 7, and 6 PUFA; and 1, 1, and 0 TF in the 0.5 kGy, 1.0 kGy, and 2 kGy irradiated samples, respectively). The major fatty acids detected in both irradiated and non-irradiated samples were as follows: SFA: C16:0, C18:0, C22:0, and C14:0; MUFA: C18:1 and C16:1; and PUFA: C18:2, C20:5, C20:2, and C22:6. Irradiation doses of 1-2 kGy substantially reduced the ω-6 C18:3 and ω-3 C18:3. However, the omega fatty acids C20:5, C22:6, and C20:2 survived even 2 kGy irradiation. Significantly, trans fats such as C18:2T and C18:1T completely disappeared upon 2 kGy irradiation. From the overall observations of the present study, it is suggested that an irradiation dose of up to 1 kGy is optimal to maintain the fatty acid profile and eradicate the trans fats of muscle derived from S. serrata.

Keywords: fatty acid profile, food preservation, gamma irradiation, scylla serrata

Procedia PDF Downloads 252
1184 Electrochemical Impedance Spectroscopy Based Label-Free Detection of TSG101 by Electric Field Lysis of Immobilized Exosomes from Human Serum

Authors: Nusrat Praween, Krishna Thej Pammi Guru, Palash Kumar Basu

Abstract:

Designing non-invasive biosensors for cancer diagnosis is essential for developing an affordable and specific tool to measure cancer-related exosome biomarkers. Exosomes, released by healthy as well as cancer cells, contain valuable information about the biomarkers of various diseases, including cancer. Despite the availability of various isolation techniques, ultracentrifugation is the standard technique employed. After isolation, exosomes are traditionally exposed to detergents to extract their proteins, which can often lead to protein degradation. Furthermore, it is essential to develop a sensing platform for the quantification of clinically relevant proteins across a wide range to ensure practicality. In this study, exosomes were immobilized on a gold screen-printed electrode (SPE) using EDC/NHS chemistry to facilitate binding. After immobilizing the exosomes on the SPE, we investigated the impact of the electric field by applying various voltages to induce exosome lysis and release their contents. The lysed solution was used for sensing TSG101, a crucial biomarker associated with various cancers, using both faradaic and non-faradaic electrochemical impedance spectroscopy (EIS) methods. The results of non-faradaic and faradaic EIS were comparable and showed good consistency, indicating that non-faradaic sensing can be a reliable alternative. Hence, the non-faradaic sensing technique was used for label-free quantification of the TSG101 biomarker, and the results were validated using ELISA. Our electrochemical immunosensor demonstrated a consistent response to TSG101 from 125 pg/mL to 8000 pg/mL, with a detection limit of 0.125 pg/mL at room temperature. Additionally, since non-faradaic sensing is label-free, the ease of use can be improved and the cost of the final sensor reduced. The proposed immunosensor is capable of detecting TSG101 protein at low levels in healthy serum with good sensitivity and specificity, making it a promising platform for biomarker detection.

Keywords: biosensor, exosomes isolation on SPE, electric field lysis of exosome, EIS sensing of TSG101

Procedia PDF Downloads 19
1183 Genotyping and Phylogeny of Phaeomoniella Genus Associated with Grapevine Trunk Diseases in Algeria

Authors: A. Berraf-Tebbal, Z. Bouznad, A.J.L. Phillips

Abstract:

Phaeomoniella is a genus of mitosporic ascomycetes that includes the species Phaeomoniella chlamydospora, associated with two decline diseases of grapevine (Vitis vinifera), namely Petri disease and esca. Recent studies have shown that several Phaeomoniella species also cause disease on many other woody crops, such as forest trees and woody ornamentals. Two new species, Phaeomoniella zymoides and Phaeomoniella pinifoliorum H.B. Lee, J.Y. Park, R.C. Summerbell et H.S. Jung, were isolated from the needle surface of Pinus densiflora Sieb. et Zucc. in Korea. The identification of species in the genus Phaeomoniella can be a difficult task if based solely on morphological and cultural characters. In this respect, the application of molecular methods, particularly PCR-based techniques, may provide an important contribution. MSP-PCR (microsatellite-primed PCR) fingerprinting has proven useful in the molecular typing of fungal strains; the high discriminatory potential of this method is particularly useful when dealing with closely related or cryptic species. In the present study, PCR fingerprinting was performed using the microsatellite primer M13 for the purpose of species identification and strain typing of 84 Phaeomoniella-like isolates collected from grapevines with typical symptoms of dieback. The bands produced by the MSP-PCR profiles divided the strains into 3 clusters and 5 singletons at a reproducibility level of 80%. Representative isolates from each group and, when possible, isolates from Eutypa dieback and esca symptoms were selected for sequencing of the ITS region. The ITS sequences of the 16 isolates selected from the MSP-PCR profiles were combined and aligned with sequences of 18 isolates retrieved from GenBank, representing a selection of all known Phaeomoniella species. DNA sequences were compared with those available in GenBank using neighbor-joining (NJ) and maximum-parsimony (MP) analyses. The phylogenetic trees of the ITS region revealed that the Phaeomoniella isolates clustered with Phaeomoniella chlamydospora reference sequences with a bootstrap support of 100%. The complexity of the vine trunk disease pathosystems clearly shows the need to identify the fungal component unambiguously in order to allow a better understanding of the etiology of these diseases and to justify the establishment of control strategies against these fungal agents.
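
The distance-based part of such an analysis (a neighbor-joining tree from an ITS alignment) can be sketched with Biopython as follows; the alignment file name is hypothetical and bootstrap support is omitted for brevity:

```python
# Hedged sketch of an NJ tree from a pre-aligned ITS data set; file name is assumed.
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

alignment = AlignIO.read("its_alignment.fasta", "fasta")   # pre-aligned ITS sequences
distances = DistanceCalculator("identity").get_distance(alignment)
nj_tree = DistanceTreeConstructor().nj(distances)

Phylo.draw_ascii(nj_tree)     # quick text rendering of the tree topology
```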

Keywords: genotyping, MSP-PCR, ITS, phylogeny, trunk diseases

Procedia PDF Downloads 464
1182 Predictors of Clinical Failure After Endoscopic Lumbar Spine Surgery During the Initial Learning Curve

Authors: Daniel Scherman, Daniel Madani, Shanu Gambhir, Marcus Ling Zhixing, Yingda Li

Abstract:

Objective: This study aims to identify clinical factors that may predict failed endoscopic lumbar spine surgery, to guide surgeons in patient selection during the initial learning curve. Methods: This is an Australasian prospective analysis of the first 105 patients to undergo lumbar endoscopic spine decompression by 3 surgeons. Modified MacNab outcomes, the Oswestry Disability Index (ODI), and Visual Analogue Scale (VAS) scores were used to evaluate clinical outcomes at 6 months postoperatively. Descriptive statistics, ANOVA, and t-tests were performed in GraphPad Prism v10 to identify statistically significant (p < 0.05) associations between variables. Results: Patients undergoing endoscopic lumbar surgery via an interlaminar or transforaminal approach had overall good/excellent modified MacNab outcomes and a significant reduction in post-operative VAS and ODI scores. Regardless of the anatomical location of the disc herniation, good/excellent modified MacNab outcomes and significant reductions in VAS and ODI were reported post-operatively, except in patients with calcified disc herniations. Patients with central or foraminal stenosis reported overall poor/fair modified MacNab outcomes, although VAS and ODI scores were still significantly reduced post-operatively. Patients with subarticular stenosis or an associated spondylolisthesis reported good/excellent modified MacNab outcomes and significant post-operative reductions in VAS and ODI scores. Patients with a disc herniation and concurrent degenerative stenosis had generally poor/fair modified MacNab outcomes. Conclusion: The outcomes of endoscopic spine surgery are encouraging, with low complication and reoperation rates. However, patients with calcified disc herniations, central canal stenosis, or a disc herniation with concurrent degenerative stenosis present challenges during the initial learning curve and may benefit from traditional open or other minimally invasive techniques.
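
A minimal sketch of the kind of comparisons reported here (a paired pre/post test, a one-way ANOVA across pathology groups) is shown below using SciPy. All scores are hypothetical placeholders; the study itself performed its analyses in GraphPad Prism.

```python
# Illustrative pre/post and between-group comparisons only; numbers are invented.
import numpy as np
from scipy import stats

# Hypothetical pre- and 6-month post-operative ODI scores for the same patients.
odi_pre = np.array([52, 44, 60, 38, 47, 55, 41])
odi_post = np.array([22, 18, 30, 12, 20, 35, 15])
t, p = stats.ttest_rel(odi_pre, odi_post)
print(f"paired t-test on ODI: t = {t:.2f}, p = {p:.4f}")

# Hypothetical post-operative VAS improvement by pathology group.
soft_disc = [5.1, 4.8, 5.6, 4.2]
calcified_disc = [1.9, 2.3, 1.4, 2.0]
central_stenosis = [2.8, 3.1, 2.5, 3.4]
f, p = stats.f_oneway(soft_disc, calcified_disc, central_stenosis)
print(f"one-way ANOVA on VAS improvement: F = {f:.2f}, p = {p:.4f}")
```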

Keywords: complications, lumbar disc herniation, lumbar endoscopic spine surgery, predictors of failed endoscopic spine surgery

Procedia PDF Downloads 116
1181 Disentangling the Sources and Context of Daily Work Stress: Study Protocol of a Comprehensive Real-Time Modelling Study Using Portable Devices

Authors: Larissa Bolliger, Junoš Lukan, Mitja Lustrek, Dirk De Bacquer, Els Clays

Abstract:

Introduction and Aim: Chronic workplace stress and its health-related consequences, such as mental and cardiovascular diseases, have been widely investigated. This project focuses on the sources and context of psychosocial daily workplace stress in a real-world setting. The main objective is to analyze and model real-time relationships between (1) psychosocial stress experiences within the natural work environment, (2) micro-level work activities and events, and (3) physiological signals and behaviors in office workers. Methods: An Ecological Momentary Assessment (EMA) protocol has been developed, partly building on machine learning techniques. Empatica® wristbands will be used for real-life detection of stress from physiological signals; micro-level activities and events at work will be captured from smartphone registrations and further processed by an automated algorithm. A field study including 100 office-based workers with high-level problem-solving tasks, such as managers and researchers, will be implemented in Slovenia and Belgium (50 in each country). Data mining and state-of-the-art statistical methods – mainly multilevel statistical modelling for repeated data – will be used. Expected Results and Impact: The project findings will provide novel contributions to the field of occupational health research. While traditional assessments provide information about a globally perceived state of chronic stress exposure, the EMA approach is expected to bring new insights into daily fluctuating work stress experiences, especially the micro-level events and activities at work that induce acute physiological stress responses. The project is therefore likely to generate further evidence on relevant stressors in a real-time working environment and hence make it possible to advise on workplace procedures and policies for reducing stress.
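
For readers unfamiliar with multilevel modelling of repeated EMA data, the following sketch fits a random-intercept mixed model with statsmodels on synthetic prompt-level data. The variable names, the simulated data, and the random-intercept-only structure are assumptions for illustration, not the project's actual model.

```python
# Illustrative multilevel model: repeated EMA prompts nested within workers.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_workers, n_prompts = 20, 30
rows = []
for w in range(n_workers):
    baseline = rng.normal(3.0, 0.5)              # person-specific stress baseline
    for _ in range(n_prompts):
        meetings = rng.integers(0, 3)            # hypothetical micro-level work event count
        hr = rng.normal(75 + 5 * meetings, 6)    # hypothetical wearable physiology signal
        stress = baseline + 0.4 * meetings + 0.02 * (hr - 75) + rng.normal(0, 0.4)
        rows.append({"worker": f"w{w}", "meetings": meetings,
                     "heart_rate": hr, "stress": stress})
ema = pd.DataFrame(rows)

# Random intercept per worker; fixed effects for events and physiology.
model = smf.mixedlm("stress ~ meetings + heart_rate", ema, groups=ema["worker"])
print(model.fit().summary())
```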

Keywords: ecological momentary assessment, real-time, stress, work

Procedia PDF Downloads 136
1180 The Challenges of Digital Crime Nowadays

Authors: Bendes Ákos

Abstract:

Digital evidence will be the most widely used type of evidence in the future. With the development of the modern world, more and more new types of crime have evolved and transformed. For this reason, it is extremely important to examine these types of crime in order to obtain a comprehensive picture of them and thereby support the work of the authorities. As early as 1865, with primitive technology, people were able to forge a photograph of a quality that would go unrecognized even today. With the help of today's technology, authorities receive a great deal of false evidence. Officials are not able to process such a large amount of data, nor do they have the technical knowledge needed to form a reliable picture of the authenticity of a given piece of evidence. The digital world holds many dangers. Unfortunately, we live in an age where we must protect everything digitally: our phones, our computers, our cars, and all the smart devices present in our personal lives. This burden does not fall on individuals alone, since companies, state institutions, and public utilities are also forced to do so. The training of specialists and experts is essential so that the authorities can handle the incoming digital evidence at an adequate level. When analyzing evidence, it is important to be able to examine it from the moment it was created, and establishing authenticity is a very important issue during official procedures. After proper acquisition, the evidence must be stored safely and handled professionally; otherwise, it will not have sufficient probative value, and in case of doubt, the court will always decide in favor of the defendant. One of the most common problems in the world of digital data and evidence is doubt, which is why it is extremely important to examine the problems mentioned above. The most effective way to counter digital crime is to prevent it, for which proper education and knowledge are essential. The aim is to present the dangers inherent in the digital world and the new types of digital crime. After a comparison of Hungarian investigative techniques with international practice, proposals for modernization will be given. Sufficiently stable yet flexible legislation is needed, legislation that can track the rapid changes in the world and provide an appropriate framework rather than regulating after the fact. It is also important to be able to distinguish between digital and digitized evidence, as their probative force differs greatly. The aim of the research is to promote effective international cooperation and uniform legal regulation in the world of digital crime.

Keywords: digital crime, digital law, cyber crime, international cooperation, new crimes, skepticism

Procedia PDF Downloads 44
1179 Expectation for Professionalism Affects Reality Shock: A Qualitative and Quantitative Study of Reality Shock among New Human Service Professionals

Authors: Hiromi Takafuji

Abstract:

It is a well-known fact that health care and welfare are the foundation of human activities, and human service professionals such as nurses and child care workers support these activities. The COVID-19 pandemic has made the severity of the working environment in these fields even more apparent, so it is high time to discuss the work of human service workers for the sustainable development of the human environment. Early turnover has long been recognized as an issue in these fields: in Japan, the attrition rate within three years of graduation for these occupations has remained at about 40% for more than 20 years. One of the reasons for this is reality shock (RS), the stress caused by the gap between pre-employment expectations and the post-employment reality experienced by new workers. The purpose of this study was to elucidate the mechanism of RS among human service professionals and to contribute to countermeasures against it. First, to explore the structure of the relationship between professionalism and workers' RS, an exploratory interview survey was conducted and analyzed by text mining and content analysis. The results showed that the expectation of professionalism influences RS as a pre-employment job expectation. Next, the expectations of professionalism were quantified and categorized, and the responses of 282 human service professionals (nurses, child care workers, and caregivers) were finalized for data analysis. The data were analyzed using exploratory factor analysis, confirmatory factor analysis, multiple regression analysis, and structural equation modeling (SEM). The results revealed that self-control orientation and authority orientation based on qualification had a direct, positive, significant impact on RS, whereas interpersonal helping orientation and altruistic orientation had a direct negative significant impact and an indirect positive significant impact on RS. We were thus able to clarify the structure of work expectations that affect the RS of welfare professionals, which had not been clarified in previous studies. We also discuss the limitations, practical implications, and directions for future research.
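
A minimal sketch of the measurement-plus-structural model implied above is given below, written for the semopy package with hypothetical item and factor names on simulated data. It illustrates the SEM technique in general, not the study's actual instrument or specification.

```python
# Illustrative SEM sketch, assuming the semopy package and synthetic item data.
import numpy as np
import pandas as pd
from semopy import Model

rng = np.random.default_rng(0)
n = 300
# Simulate three expectation factors and a reality-shock factor that depends on them.
sc, au, he = rng.normal(size=(3, n))
rs = 0.4 * sc + 0.3 * au - 0.3 * he + rng.normal(scale=0.7, size=n)

def indicators(latent, prefix):
    # three observed questionnaire items per latent factor, with measurement noise
    return {f"{prefix}{i}": latent + rng.normal(scale=0.5, size=n) for i in (1, 2, 3)}

data = pd.DataFrame({**indicators(sc, "sc"), **indicators(au, "au"),
                     **indicators(he, "he"), **indicators(rs, "rs")})

model_desc = """
SelfControl  =~ sc1 + sc2 + sc3
Authority    =~ au1 + au2 + au3
Helping      =~ he1 + he2 + he3
RealityShock =~ rs1 + rs2 + rs3
RealityShock ~ SelfControl + Authority + Helping
"""

sem = Model(model_desc)
sem.fit(data)
print(sem.inspect())   # parameter estimates for loadings and structural paths
```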

Keywords: human service professional, new hire turnover, SEM, reality shock

Procedia PDF Downloads 84
1178 Revealing Single Crystal Quality by Insight Diffraction Imaging Technique

Authors: Thu Nhi Tran Caliste

Abstract:

X-ray Bragg diffraction imaging (“topography”) entered practical use when Lang designed an “easy” technical setup to characterize defects and distortions in the high-perfection crystals produced for the microelectronics industry. The use of this technique extended to all kinds of high-quality crystals and deposited layers, and a series of publications explained, starting from the dynamical theory of diffraction, the contrast of the images of the defects. A quantitative version of “monochromatic topography”, known as “Rocking Curve Imaging” (RCI), was implemented using synchrotron light, taking advantage of the dramatic improvement of 2D detectors and computerized image processing. The raw data consist of a number (~300) of images recorded along the diffraction (“rocking”) curve. If the quality of the crystal is such that a one-to-one relation between a pixel of the detector and a voxel within the crystal can be established (an approximation that is very well fulfilled if the local mosaic spread of the voxel is < 1 mrad), software we developed provides, from the rocking curve recorded on each pixel of the detector, not only the “voxel” integrated intensity (the only data provided by previous techniques) but also its “mosaic spread” (FWHM) and peak position. We will show, based on many examples, that these new data, never recorded before, open the field to a greatly enhanced characterization of crystals and deposited layers. The examples include the characterization of dislocations and twins occurring during silicon growth, various growth features in Al2O3, GaN, and CdTe (where the diffraction displays Borrmann anomalous absorption, which leads to a new type of image), and the characterization of defects within deposited layers, or their effect on the substrate. We could also observe (thanks to the very high sensitivity of the setup installed on BM05, which allows these faint effects to be revealed) that, when dealing with very perfect crystals, the Kato interference fringes predicted by dynamical theory are also associated with very small modifications of the local FWHM and peak position (of the order of a microradian). This rather unexpected (at least for us) result appears to be in keeping with preliminary dynamical theory calculations.
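
The per-pixel analysis described above can be sketched in a few lines of NumPy: for every detector pixel, the rocking curve is reduced to an integrated intensity, a centroid peak position, and an FWHM estimate. Array names, the centroid estimator, and the half-maximum width estimate are illustrative assumptions, not the authors' actual software.

```python
# Minimal sketch of per-pixel rocking-curve analysis, assuming a stack of
# monochromatic topographs recorded at successive rocking angles.
import numpy as np

def analyse_rocking_curves(stack, angles):
    """stack: (n_angles, ny, nx) intensities; angles: (n_angles,) in radians."""
    step = angles[1] - angles[0]

    # Integrated intensity: area under each pixel's rocking curve.
    integrated = np.trapz(stack, angles, axis=0)

    # Peak position: intensity-weighted centroid of the curve (robust to noise).
    weights = stack.sum(axis=0)
    weights[weights == 0] = 1.0
    peak_pos = np.tensordot(angles, stack, axes=(0, 0)) / weights

    # FWHM: number of samples above half of the per-pixel maximum, times the step.
    half_max = 0.5 * stack.max(axis=0)
    fwhm = (stack > half_max[None, :, :]).sum(axis=0) * step

    return integrated, peak_pos, fwhm

# Synthetic example: 300 angular steps over ~1 mrad, 64x64-pixel detector.
angles = np.linspace(-5e-4, 5e-4, 300)
curve = np.exp(-((angles[:, None, None] - 1e-5) / 8e-5) ** 2) + 0.01
stack = np.broadcast_to(curve, (300, 64, 64)).copy()
ii, pos, width = analyse_rocking_curves(stack, angles)
print(pos[0, 0], width[0, 0])
```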

Keywords: rocking curve imaging, X-ray diffraction, defect, distortion

Procedia PDF Downloads 113
1177 Community-Based Assessment Approach to Empower Children with Disabilities: An Institutional Study of the Deaf Art Community in Yogyakarta, Indonesia

Authors: Mukhamad Fatkhullah, Arfan Fadli, Marini Kristina Situmeang, Siti Hazar Sitorus

Abstract:

The emergence of communities of people with disabilities, together with the works they have produced, has gone a long way toward opening the public's eyes to their existence in society. This study focuses on a community considered one of the pioneers of this movement: the Deaf Art Community (DAC), a community of persons with disabilities based in Yogyakarta whose deaf and speech-impaired members use sign language in everyday communication. While knowing that such disability-community movements exist is valuable, the factors behind them are important to understand as a basis for initiating similar movements elsewhere. This research therefore asks how communities of people with disabilities begin to take shape in different regions and interact through collaborative events. A qualitative method with in-depth interviews as the data collection technique was used to describe the process of formation and the emergence of the community. The unit of analysis initially focused on the subjects within the community but developed, in the course of the study, into an institutional analysis. Informants were selected purposively and expanded using the snowball technique. The theoretical framework is Alfred Schutz's phenomenology, which makes it possible to see reality from both the subjects' and the institutional point of view. The study found that the community formed because existing educational institutions (both SLB special schools and inclusive schools) are not well able to empower children with disabilities or make them equal members of society. Through the SLB, children with disabilities become isolated from society, especially from children of their own age, so discrimination and labeling are never absent from society's view. Meanwhile, facilities for the basic needs of children with disabilities cannot be fully provided, and there is no protection against discrimination, stares, and unpleasant behavior from children without disabilities, which indicates that existing inclusive schools offer only symbolic acceptance. Thus, neither SLB nor inclusive schools are able to empower children with disabilities. Community-based assistance, in this case, has become an alternative that genuinely empowers children with disabilities: beyond giving them a place to interact, the community guides children with disabilities to discover their talents and develop their potential so that they can be self-reliant in the future.

Keywords: children with disabilities, community-based assessment, community empowerment, social equity

Procedia PDF Downloads 244
1176 Speckle-Based Phase Contrast Micro-Computed Tomography with Neural Network Reconstruction

Authors: Y. Zheng, M. Busi, A. F. Pedersen, M. A. Beltran, C. Gundlach

Abstract:

X-ray phase contrast imaging has been shown to yield better contrast than conventional attenuation-based X-ray imaging, especially for soft tissues in the medical imaging energy range, which can potentially lead to better diagnoses for patients. However, phase contrast imaging has mainly been performed with highly brilliant synchrotron radiation, as it requires highly coherent X-rays. Many research teams have demonstrated that it is also feasible with a laboratory source, bringing it one step closer to clinical use. Nevertheless, the requirement for fine gratings and high-precision stepping motors when using a laboratory source has prevented it from being widely used. Recently, a random phase object has been proposed as the analyzer, a method that requires a much less demanding experimental setup. However, previous studies used either a particular X-ray source (a liquid-metal-jet micro-focus source) or high-precision motors for stepping. We have been working on a much simpler setup requiring only a small modification of a commercial bench-top micro-CT (computed tomography) scanner: a piece of sandpaper is introduced as the phase analyzer in front of the X-ray source. Such a setup, however, needs suitable algorithms for speckle tracking and 3D reconstruction. The precision and sensitivity of the speckle tracking algorithm determine the spatial resolution of the system, while the 3D reconstruction algorithm affects the minimum number of projections required and thus limits the temporal resolution. Because phase contrast imaging methods usually require much longer exposure times than traditional absorption-based X-ray imaging, a dynamic phase contrast micro-CT with high temporal resolution is particularly challenging. Different reconstruction methods, including neural-network-based techniques, will be evaluated in this project to increase the temporal resolution of the phase contrast micro-CT. A Monte Carlo ray-tracing simulation (McXtrace) was used to generate a large dataset for training the neural network, addressing the fact that neural networks require large amounts of training data to produce high-quality reconstructions.
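
A minimal sketch of window-based speckle tracking is given below, assuming scikit-image's phase_cross_correlation for sub-pixel shift estimation between reference and sample speckle images. The window size, upsampling factor, and synthetic demonstration are assumptions for illustration and do not represent the algorithm under development in this project.

```python
# Illustrative speckle tracking: local shifts of the sandpaper speckle pattern
# between a reference image (no sample) and a sample image are estimated by
# sub-pixel cross-correlation on a grid of windows.
import numpy as np
from skimage.registration import phase_cross_correlation

def track_speckles(reference, sample, win=32, step=16):
    """Return per-window (dy, dx) speckle displacements, which are proportional
    to the transverse phase gradient introduced by the sample."""
    ny, nx = reference.shape
    ys = np.arange(0, ny - win + 1, step)
    xs = np.arange(0, nx - win + 1, step)
    disp = np.zeros((len(ys), len(xs), 2))
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            ref_win = reference[y:y + win, x:x + win]
            sam_win = sample[y:y + win, x:x + win]
            shift, _, _ = phase_cross_correlation(ref_win, sam_win,
                                                  upsample_factor=20)
            disp[i, j] = shift   # (dy, dx) in pixels, sub-pixel accuracy
    return disp

# Synthetic demonstration: a random speckle pattern rigidly shifted by a known
# amount stands in for refraction by the sample; the mean recovered displacement
# should match the applied shift (up to the library's sign convention).
rng = np.random.default_rng(1)
ref = rng.random((128, 128))
sam = np.roll(ref, shift=(2, -3), axis=(0, 1))
print(track_speckles(ref, sam).mean(axis=(0, 1)))
```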

Keywords: micro-ct, neural networks, reconstruction, speckle-based x-ray phase contrast

Procedia PDF Downloads 234