Search results for: nuclear decay data evaluation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29498


28328 Risk Factors and Biomarkers for the Recurrence of Ovarian Endometrioma: About the Immunoreactivity of Progesterone Receptor Isoform B and Nuclear Factor Kappa B.

Authors: Ae Ra Han, Taek Hoo Lee, Sun Zoo Kim, Hwa Young Lee

Abstract:

Introduction: Ovarian endometrioma is one of the important causes of poor ovarian reserve, and up to half of cases recur. However, treatment for recurrence prevention has limited efficacy, and repeated surgical management further worsens ovarian reserve. To find better management for recurrence prevention, we investigated risk factors and biomarkers for the recurrence of ovarian endometrioma. Methods: The medical records of women with a history of surgical dissection of ovarian endometrioma were collected. After exclusion of cases with concurrent hysterectomy, menopause during follow-up, incomplete medical records, or loss to follow-up, a total of 134 women were enrolled. Immunohistochemical staining for progesterone receptor isoform B (PR-B) and nuclear factor kappa B (NFκB) was performed on the fixed tissue blocks of their endometriomas, which were collected at the time of surgery. Results: Severity of dysmenorrhea and co-existence of adenomyosis had a significant correlation with recurrence of endometrioma. Increased PR-B (P = .041) and decreased NFκB (P = .036) immunoreactivity were found in the recurrence group. The serum CA-125 level at the time of recurrence was higher than the highest CA-125 level during follow-up in the non-recurrence group (55.6 vs. 21.3 U/mL, P = .014). Conclusion: We found that the severity of dysmenorrhea and co-existence of adenomyosis are risk factors for recurrence of ovarian endometrioma, and that serial follow-up of CA-125 is effective for detecting and preventing recurrence. However, to determine whether the immunoreactivity of PR-B and NFκB can serve as biomarkers for ovarian endometrioma, further prospective studies with larger numbers across various ethnic groups are needed.

Keywords: endometriosis, recurrence, biomarker, risk factor

Procedia PDF Downloads 540
28327 A Study on the Quantitative Evaluation Method of Asphalt Pavement Condition through the Visual Investigation

Authors: Sungho Kim, Jaechoul Shin, Yujin Baek

Abstract:

In recent years, due to environmental impacts, aging, and other factors, various types of pavement deterioration, such as cracking, potholes, rutting, and roughness degradation, have been increasing rapidly. In Korea, the Ministry of Land, Infrastructure and Transport regularly maintains the pavement condition of highways and national highways using pavement condition survey equipment and structural survey equipment. Local governments that maintain local roads, farm roads, etc., find it difficult to survey pavement condition with such equipment, owing to economic conditions, skills shortages, and local conditions such as narrow roads. This study presents a quantitative evaluation method of pavement condition through visual inspection to overcome these problems on roads managed by local governments. Rutting and roughness are difficult to evaluate with the naked eye, but the condition of cracks can be evaluated this way. Linear cracks (m), area cracks (m²), and potholes (number, m²) were investigated with the naked eye every 100 meters to survey the cracks. In this paper, the crack ratio was calculated using the results of the crack survey, and the pavement condition was evaluated from the calculated crack ratio. Pavement condition survey equipment also investigated the pavement condition in the same sections in order to evaluate the reliability of the pavement condition evaluation based on the calculated crack ratio. The pavement condition was evaluated through the SPI (Seoul Pavement Index) and the calculated crack ratio using the results of the field survey. A comparison between 'the SPI considering only the crack ratio' and 'the SPI considering rutting and roughness as well', using the equipment survey data, showed a margin of error below 5% when the SPI is less than 5. SPI 5 is considered the base point for deciding whether to maintain the pavement. This showed that the pavement condition can be evaluated using only the crack ratio. According to the analysis of the crack ratio between the visual inspection and the equipment survey, there is an average error of 1.86% (minimum 0.03%, maximum 9.58%). Economically, the visual inspection costs only 10% of the equipment survey and will also help the economy by creating new jobs. This paper recommends that local governments maintain pavement condition through visual investigations; however, more research is needed to improve reliability. Acknowledgment: The authors would like to thank the MOLIT (Ministry of Land, Infrastructure, and Transport). This work was carried out through a project funded by the MOLIT, 'development of 20mm grade for road surface detecting roadway condition and rapid detection automation system for removal of pothole'.
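A crack-ratio calculation of the kind described above can be sketched as follows. The 0.3 m equivalent width assigned to linear cracks and the 3.5 m lane width are illustrative assumptions, not values taken from the study:

```python
# Sketch of a crack-ratio calculation per 100 m survey section.
# ASSUMPTIONS: the 0.3 m equivalent width for linear cracks and the
# 3.5 m lane width are illustrative, not taken from the paper.

LINEAR_CRACK_WIDTH_M = 0.3   # assumed equivalent width of a linear crack
SECTION_LENGTH_M = 100.0     # survey interval used in the paper
LANE_WIDTH_M = 3.5           # assumed lane width

def crack_ratio(linear_cracks_m, area_cracks_m2, potholes_m2):
    """Cracked area as a percentage of the surveyed pavement area."""
    cracked_area = (linear_cracks_m * LINEAR_CRACK_WIDTH_M
                    + area_cracks_m2 + potholes_m2)
    section_area = SECTION_LENGTH_M * LANE_WIDTH_M
    return 100.0 * cracked_area / section_area

# Example: 20 m of linear cracks, 5 m2 of area cracks, a 0.5 m2 pothole
print(round(crack_ratio(20.0, 5.0, 0.5), 2))  # -> 3.29
```

The computed percentage would then be mapped onto an index such as the SPI to decide whether the section needs maintenance.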

Keywords: asphalt pavement maintenance, crack ratio, evaluation of asphalt pavement condition, SPI (Seoul Pavement Index), visual investigation

Procedia PDF Downloads 153
28326 Formal Asymptotic Stability Guarantees, Analysis, and Evaluation of Nonlinear Controlled Unmanned Aerial Vehicle for Trajectory Tracking

Authors: Soheib Fergani

Abstract:

This paper concerns with the formal asymptotic stability guarantees, analysis and evaluation of a nonlinear controlled unmanned aerial vehicles (uav) for trajectory tracking purpose. As the system has been recognised as an under-actuated non linear system, the control strategy has been oriented towards a hierarchical control. The dynamics of the system and the mission purpose make it mandatory to provide an absolute proof of the vehicle stability during the maneuvers. For this sake, this work establishes the complete theoretical proof for an implementable control oriented strategy that asymptotically stabilizes (GAS and LISS) the system and has never been provided in previous works. The considered model is reorganized into two partly decoupled sub-systems. The concidered control strategy is presented into two stages: the first sub-system is controlled by a nonlinear backstepping controller that generates the desired control inputs to stabilize the second sub-system. This methodology is then applied to a harware in the loop uav simulator (SiMoDrones) that reproduces the realistic behaviour of the uav in an indoor environment has been performed to show the efficiency of the proposed strategy.
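As a minimal illustration of the backstepping idea behind such hierarchical designs, consider a double integrator x1' = x2, x2' = u. This toy system and the gains below are illustrative assumptions; it is not the paper's UAV model or controller:

```python
# Minimal backstepping sketch on a double integrator x1' = x2, x2' = u.
# This is an illustrative toy system, NOT the paper's UAV model.
# Step 1: treat x2 as a virtual input and pick alpha = -k1*x1.
# Step 2: define the error z2 = x2 - alpha and choose u so that the
# Lyapunov function V = (x1^2 + z2^2)/2 decreases:
#   u = alpha_dot - x1 - k2*z2 = -(1 + k1*k2)*x1 - (k1 + k2)*x2

k1, k2 = 1.0, 1.0  # assumed positive gains

def backstepping_control(x1, x2):
    return -(1.0 + k1 * k2) * x1 - (k1 + k2) * x2

def simulate(x1=1.0, x2=0.0, dt=0.01, steps=2000):
    """Forward-Euler simulation; returns the final state."""
    for _ in range(steps):
        u = backstepping_control(x1, x2)
        x1, x2 = x1 + dt * x2, x2 + dt * u
    return x1, x2

x1f, x2f = simulate()
print(abs(x1f) < 0.05, abs(x2f) < 0.05)  # state driven near the origin
```

In the UAV case the same two-step construction is applied to the cascaded translational and rotational sub-systems rather than to a scalar chain.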

Keywords: UAV application, trajectory tracking, backstepping, sliding mode control, input to state stability, stability evaluation

Procedia PDF Downloads 37
28325 Evaluation of the Urban Regeneration Project: Land Use Transformation and SNS Big Data Analysis

Authors: Ju-Young Kim, Tae-Heon Moon, Jung-Hun Cho

Abstract:

Urban regeneration projects have been actively promoted in Korea. In particular, Jeonju Hanok Village is evaluated as one of the representative cases of utilizing local cultural heritage sites in an urban regeneration project. Recently, however, there has been growing concern in this area about 'gentrification' caused by excessive commercialization and surging tourist numbers. This trend has been changing land and building use and has resulted in a loss of the region's identity. In this regard, this study analyzed the land use transformation between 2010 and 2016 to identify the commercialization trend in Jeonju Hanok Village. In addition, it conducted SNS big data analysis on Jeonju Hanok Village from February 14th, 2016 to March 31st, 2016 to identify visitors' awareness of the village. The study results demonstrate that rapid commercialization was underway, unlike the initial intention, so planners and officials in city government should reconsider the project direction and rebuild deliberate management strategies. This study is meaningful in that it analyzed land use transformation and SNS big data to identify the current situation in an urban regeneration area. Furthermore, it is expected that the study results will contribute to the vitalization of regeneration areas.
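The keyword-frequency step of an SNS text-mining analysis like this one can be sketched as follows. The sample posts and stop-word list are invented for demonstration; they are not the study's dataset or pipeline:

```python
# Illustrative keyword-frequency step of SNS text mining.
# The sample posts and stop words are invented for demonstration.
from collections import Counter
import re

posts = [
    "Beautiful hanok cafes everywhere in Jeonju Hanok Village",
    "Too many tourists and souvenir shops in Jeonju today",
    "Jeonju street food and cafes, but fewer traditional houses",
]
stop_words = {"in", "and", "the", "but", "too", "many"}

tokens = []
for post in posts:
    # lowercase, keep alphabetic tokens, drop stop words
    tokens += [w for w in re.findall(r"[a-z]+", post.lower())
               if w not in stop_words]

freq = Counter(tokens)
print(freq.most_common(3))  # most frequent terms hint at visitor awareness
```

In the actual study, term frequencies over the crawled posts would be compared against the land-use data to characterize commercialization.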

Keywords: land use, SNS, text mining, urban regeneration

Procedia PDF Downloads 280
28324 An Application for Risk of Crime Prediction Using Machine Learning

Authors: Luis Fonseca, Filipe Cabral Pinto, Susana Sargento

Abstract:

The increase of the world population, especially in large urban centers, has resulted in new challenges, particularly in the control and optimization of public safety. Thus, in the present work, a solution is proposed for the prediction of criminal occurrences in a city based on historical incident data and demographic information. The entire research and implementation is presented, starting with the data collection from its original source, the treatment and transformations applied to the data, the choice, evaluation, and implementation of the Machine Learning model, up to the application layer. Classification models are implemented to predict criminal risk for a given time interval and location. Machine Learning algorithms such as Random Forest, Neural Networks, K-Nearest Neighbors, and Logistic Regression are used to predict occurrences, and their performance is compared according to the data processing and transformation used. The results show that the use of Machine Learning techniques helps to anticipate criminal occurrences, contributing to the reinforcement of public security. Finally, the models were implemented on a platform that provides an API enabling other entities to make prediction requests in real time. An application is also presented in which criminal predictions can be shown visually.
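To make the classification setup concrete, here is a from-scratch sketch of one of the algorithms mentioned, K-Nearest Neighbors, on invented toy data; the features (hour of day, district id) and labels are assumptions for illustration, not the authors' dataset:

```python
# Toy k-nearest-neighbours risk classifier, implemented from scratch.
# Features (hour, district id) and labels are invented for this sketch.
import math
from collections import Counter

train = [  # ((hour, district), risk label)
    ((23, 1), "high"), ((22, 1), "high"), ((1, 1), "high"),
    ((10, 3), "low"),  ((11, 3), "low"),  ((14, 3), "low"),
]

def predict(x, k=3):
    """Majority vote among the k training points nearest to x."""
    nearest = sorted(train, key=lambda p: math.dist(x, p[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Late night in district 1 lands among the "high" training points:
print(predict((22, 1)))
```

A real pipeline would scale the features, cross-validate k, and compare this baseline against Random Forest, Neural Networks, and Logistic Regression as the abstract describes.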

Keywords: crime prediction, machine learning, public safety, smart city

Procedia PDF Downloads 94
28323 Evaluation of the MCFLIRT Correction Algorithm in Head Motion from Resting State fMRI Data

Authors: V. Sacca, A. Sarica, F. Novellino, S. Barone, T. Tallarico, E. Filippelli, A. Granata, P. Valentino, A. Quattrone

Abstract:

In the last few years, resting-state functional MRI (rs-fMRI) has been widely used to investigate the architecture of brain networks through the Blood Oxygenation Level Dependent response. This technique represents an interesting, robust, and reliable approach to comparing pathological and healthy subjects in order to investigate the evolution of neurodegenerative diseases. On the other hand, the processing of rs-fMRI data is very prone to noise due to confounding factors, especially head motion. Head motion has long been known to be a source of artefacts in task-based functional MRI studies, but it has become a particularly challenging problem in recent studies using rs-fMRI. The aim of this work was to evaluate, in MS patients, a well-known motion correction algorithm from the FMRIB Software Library, MCFLIRT, which can be applied to minimize head motion distortions, allowing rs-fMRI results to be correctly interpreted.

Keywords: head motion correction, MCFLIRT algorithm, multiple sclerosis, resting state fMRI

Procedia PDF Downloads 195
28322 Introducing a Video-Based E-Learning Module to Improve Disaster Preparedness at a Tertiary Hospital in Oman

Authors: Ahmed Al Khamisi

Abstract:

The Disaster Preparedness Standard (DPS) is one of the elements evaluated by Accreditation Canada International (ACI). ACI requires that all staff, including service providers and senior leaders, be trained and educated on emergency and disaster preparedness at orientation and annually thereafter. A lack of awareness and a deficit of knowledge about the DPS among healthcare providers have been noticed in a tertiary hospital where ACI standards were implemented. Therefore, this paper aims to introduce a video-based e-learning (VB-EL) module that explains the hospital's disaster plan in simple language and will be easily accessible to all healthcare providers through the hospital's website. The healthcare disaster preparedness coordinator in the targeted hospital will be responsible for ensuring that the VB-EL is ready by 25 April 2019. The module will be developed based on the Kirkpatrick evaluation method. In fact, VB-EL combines different data forms such as images, motion, sound, and text in a complementary fashion, which suits diverse learning styles and the individual learning pace of healthcare providers. Moreover, the module can be adjusted more easily than other tools to control the information that healthcare providers receive. It will enable healthcare providers to stop, rewind, fast-forward, and replay content as many times as needed. Some anticipated limitations in the development of this module include the challenge of preparing VB-EL content and resistance from healthcare providers.

Keywords: Accreditation Canada International, Disaster Preparedness Standard, Kirkpatrick evaluation method, video-based e-learning

Procedia PDF Downloads 137
28321 Opening up Government Datasets for Big Data Analysis to Support Policy Decisions

Authors: K. Hardy, A. Maurushat

Abstract:

Policy makers are increasingly looking to make evidence-based decisions. Evidence-based decisions have historically relied on the rigorous methodologies of empirical studies by research institutes, as well as on less reliable immediate surveys/polls, often with limited sample sizes. As we move into the era of Big Data analytics, policy makers are looking to different methodologies to deliver reliable empirics in real time. The question is no longer why people did something over the last 10 years, but why they are doing it now, whether it is undesirable, and how we can have an impact to promote change immediately. Big data analytics rely heavily on government data that has been released into the public domain. The open data movement promises greater productivity and more efficient delivery of services; however, Australian government agencies remain reluctant to release their data to the general public. This paper considers the barriers to releasing government data as open data, and how these barriers might be overcome.

Keywords: big data, open data, productivity, data governance

Procedia PDF Downloads 352
28320 A Randomized Control Trial Intervention to Combat Childhood Obesity in Negeri Sembilan: The Hebat! Program

Authors: Siti Sabariah Buhari, Ruzita Abdul Talib, Poh Bee Koon

Abstract:

This study aims to develop and evaluate an intervention to improve the eating habits, active lifestyle, and weight status of overweight and obese children in Negeri Sembilan. The H.E.B.A.T! Program involved children, parents, and schools, and focused on behaviour and environment modification to achieve its goal. The intervention consists of the H.E.B.A.T! Camp, a parents' workshop, and school-based activities. A total of 21 children from the intervention school and 22 children from the control school who had a BMI-for-age z-score ≥ +1 SD participated in the study. The mean age of subjects was 10.8 ± 0.3 years. Four phases were included in the development of the intervention. The evaluation of the intervention was conducted through process, impact, and outcome evaluation. The process evaluation found that the intervention program was implemented successfully, with minimal modification and without any technical problems. The impact and outcome evaluations were assessed based on dietary intake, average step counts, BMI-for-age z-score, body fat percentage, and waist circumference at pre-intervention (T0), post-intervention 1 (T1), and post-intervention 2 (T2). There were significant reductions in energy (14.8%) and fat (21.9%) intakes (p < 0.05) at post-intervention 1 (T1) in the intervention group. Controlling for sex as a covariate, there was a significant intervention effect on average step counts, BMI-for-age z-score, and waist circumference (p < 0.05). In conclusion, the intervention made an impact on positive behavioural intentions and improved the weight status of the children. It is expected that the H.E.B.A.T! Program could be adopted and implemented by the government and private sector, as well as by policy-makers, in formulating childhood obesity interventions.

Keywords: childhood obesity, diet, obesity intervention, physical activity

Procedia PDF Downloads 277
28319 Evaluation of Free Technologies as Tools for Business Process Management

Authors: Julio Sotomayor, Daniel Yucra, Jorge Mayhuasca

Abstract:

The article presents an evaluation of free technologies for business process automation, with emphasis only on tools compatible with the General Public License (GPL). The compendium of technologies was based on promoting a service-oriented enterprise architecture (SOA) and the establishment of a business process management system (BPMS). The methodology used for the selection of tools was Agile UP. This proposal allows businesses to achieve technological sovereignty and independence, in addition to promoting service orientation and the development of free software based on components.

Keywords: BPM, BPMS suite, open-source software, SOA, enterprise architecture, business process management

Procedia PDF Downloads 270
28318 Application of WHO's Guideline to Evaluating Apps for Smoking Cessation

Authors: Suin Seo, Sung-Il Cho

Abstract:

Background: The use of mobile apps for smoking cessation has grown exponentially in recent years. Yet, to our knowledge, there has been limited research evaluating the quality of smoking cessation apps. In most cases, a clinical practice guideline focused on clinical physicians was used as the evaluation tool. Objective: The objective of this study was to develop a user-centered measure of the quality of mobile smoking cessation apps. Methods: A literature search was conducted to identify articles containing explicit smoking cessation guidelines for smokers published until January 2018. The WHO's guide for tobacco users to quit was adopted as the evaluation tool, which assesses the smoker-oriented content of smoking cessation apps. Compared to a clinical practice guideline, the WHO guideline is designed for smokers (non-specialists). On the basis of existing criteria, which were developed from the 2008 clinical practice guideline for Treating Tobacco Use and Dependence, the evaluation tool was modified and developed by an expert panel. Results: Five broad categories of criteria were identified, comprising five objective quality scales: enhancing motivation, assistance with planning and making quit attempts, preparation for relapse, self-efficacy, and connection to smoking. Enhancing motivation and assistance with planning and making quit attempts were similar to the contents of the clinical practice guideline, but preparation for relapse, self-efficacy, and connection to smoking (environments or habits which remind one of smoking) existed only in the WHO guideline. The WHO guideline had more user-centered elements than the clinical guideline. In particular, self-efficacy is the most important determinant of behavior change according to many health behavior change models. With the WHO guideline, it is now possible to analyze the content of an app from the perspective of a health participant, not a provider. Conclusion: The WHO guideline evaluation tool is a simple, reliable, and smoker-centered tool for assessing the quality of mobile smoking cessation apps. It can also be used as a checklist for the development of new high-quality smoking cessation apps.
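One way such a guideline-derived checklist might be operationalized is as a simple coverage score over the five identified categories. The category names come from the abstract, but the scoring scheme and the example app's content flags are invented for illustration:

```python
# Hypothetical checklist scoring against the five identified categories.
# Category names come from the abstract; the example app's content
# flags and the scoring scheme itself are invented for illustration.
CATEGORIES = [
    "enhancing motivation",
    "planning and making quit attempts",
    "preparation for relapse",
    "self-efficacy",
    "connection to smoking",
]

def coverage_score(app_content):
    """Percentage of guideline categories the app's content covers."""
    covered = sum(1 for c in CATEGORIES if app_content.get(c, False))
    return 100.0 * covered / len(CATEGORIES)

example_app = {
    "enhancing motivation": True,
    "planning and making quit attempts": True,
    "preparation for relapse": True,
    "self-efficacy": False,
    "connection to smoking": False,
}
print(coverage_score(example_app))  # -> 60.0
```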

Keywords: smoking cessation, evaluation, mobile application, WHO, guideline

Procedia PDF Downloads 172
28317 Decision-Making in Higher Education: Case Studies Demonstrating the Value of Institutional Effectiveness Tools

Authors: Carolinda Douglass

Abstract:

Institutional Effectiveness (IE) is the purposeful integration of functions that foster student success and support institutional performance. IE is growing rapidly within higher education, as it is increasingly viewed by administrators as a beneficial approach to promoting data-informed decision-making in campus-wide strategic planning and the execution of strategic initiatives. Specific IE tools, including, but not limited to, project management; impactful collaboration and communication; commitment to continuous quality improvement; and accountability through rigorous evaluation, are gaining momentum under the auspices of IE. This research uses a case study approach to examine the use of these IE tools, highlight successes of this use, and identify areas for improvement in the implementation of IE tools within higher education. The research includes three case studies: (1) improving academic program review processes, including the assessment of student learning outcomes as a core component of program quality; (2) revising an institutional vision, mission, and core values; and (3) successfully navigating an institution-wide re-accreditation process. Several methods of data collection are embedded within the case studies, including surveys, focus groups, interviews, and document analyses; the subjects of these methods are higher education administrators, faculty, and staff. Key findings include areas of success and areas for improvement in the use of IE tools associated with specific case studies, as well as aggregated results across case studies. For example, the use of project management proved useful in all of the case studies, while rigorous evaluation did not uniformly provide the added value expected by higher education decision-makers. The use of multiple IE tools was shown to be consistently useful in decision-making when applied with appropriate awareness of, and sensitivity to, core institutional culture (for example, institutional mission, local environments and communities, disciplinary distinctions, and labor relations). As IE gains a stronger foothold in higher education, leaders can make judicious use of IE tools to promote better decision-making and secure improved outcomes of strategic planning and the execution of strategic initiatives.

Keywords: accreditation, data-informed decision-making, higher education management, institutional effectiveness tools, institutional mission, program review, strategic planning

Procedia PDF Downloads 99
28316 Development and Performance Evaluation of a Gladiolus Planter in Field for Planting Corms

Authors: T. P. Singh, Vijay Gautam

Abstract:

Gladiolus is an important cash crop, grown mainly for its elegant spikes. Traditionally, gladiolus corms are planted manually, which is a very tedious, time-consuming, and labor-intensive operation, and so far no planter has been available for planting gladiolus corms. With a view to mechanizing the planting of this horticultural crop, a prototype 4-row gladiolus planter was developed and its performance evaluated under in-situ conditions. A cup-chain type metering device was used to singulate the gladiolus corms while planting. Three levels of corm spacing, viz. 15, 20, and 25 cm, and four levels of forward speed, viz. 1.0, 1.5, 2.0, and 2.5 km/h, were taken as evaluation parameters for the planter. The performance indicators, namely corm spacing in each row, coefficient of uniformity, missing index, multiple index, quality of feed index, number of corms per meter length, mechanical damage to the corms, etc., were determined during the field test. The data were statistically analyzed using a Completely Randomized Design (CRD) to test the significance of the parameters. The results indicated that the planter was able to drop the corms at the required nominal spacing with minor variations. The highest deviation from the mean corm spacing was 3.53 cm, with a maximum coefficient of variation of 13.88%. The highest missing and quality of feed indexes were 6.33% and 97.45%, respectively, with no multiples. The performance of the planter was better at lower forward speed and wider corm spacing. The field capacity of the planter was 0.103 ha/h, with an observed field efficiency of 76.57%.
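The missing, multiple, and quality of feed indices mentioned above are commonly defined from the measured spacings relative to the nominal spacing (spacings above 1.5 times nominal count as misses, at or below 0.5 times as multiples, the remainder as quality of feed). A sketch under that assumption, with invented sample measurements:

```python
# Spacing uniformity indices as commonly defined in planter evaluation
# (relative to nominal spacing). The thresholds (0.5x and 1.5x nominal)
# and the sample measurements below are assumptions for illustration.
def spacing_indices(spacings_cm, nominal_cm):
    n = len(spacings_cm)
    multiples = sum(1 for s in spacings_cm if s <= 0.5 * nominal_cm)
    misses = sum(1 for s in spacings_cm if s > 1.5 * nominal_cm)
    quality = n - multiples - misses
    return {
        "multiple_index_%": 100.0 * multiples / n,
        "missing_index_%": 100.0 * misses / n,
        "quality_of_feed_%": 100.0 * quality / n,
    }

# Ten invented spacing measurements around a 20 cm nominal spacing;
# the 41.0 cm gap represents one skipped (missed) corm.
measured = [19.2, 20.5, 21.1, 41.0, 20.0, 19.8, 20.3, 20.9, 18.7, 20.1]
print(spacing_indices(measured, nominal_cm=20))
```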

Keywords: coefficient of uniformity, corm spacing, gladiolus planter, mechanization

Procedia PDF Downloads 221
28315 Evaluation of Video Quality Metrics and Performance Comparison on Contents Taken from Most Commonly Used Devices

Authors: Pratik Dhabal Deo, Manoj P.

Abstract:

With the increasing number of social media users, the amount of video content available has also significantly increased. The number of smartphone users is currently at its peak, and many increasingly use their smartphones as their main photography and recording devices. There have been many developments in the field of Video Quality Assessment (VQA), and metrics like VMAF and SSIM are said to be among the best performing, but the evaluation of these metrics is predominantly done on professionally shot video content using professional tools, lighting conditions, etc. No study has specifically pinpointed the performance of the metrics on content taken by users on commonly available devices. Datasets containing huge numbers of videos from different high-end devices make it difficult to analyze the performance of the metrics on content from the most used devices, even when they include content taken in poor lighting conditions using lower-end devices. These devices suffer many distortions due to various factors, since the spectrum of content recorded on them is huge. In this paper, we present an analysis of objective VQA metrics on content taken only from the most used devices and their performance on it, focusing on full-reference metrics. To carry out this research, we created a custom dataset containing a total of 90 videos taken from the three most commonly used devices: an Android smartphone, an iOS smartphone, and a DSLR. To the videos taken on each of these devices, the six most common types of distortion that users face were applied, in addition to the already existing H.264 compression, based on four reference videos. Each of these six applied distortions has three levels of degradation. The five most popular VQA metrics were evaluated on this dataset, and the highest and lowest values of each metric on the distortions were recorded. Finally, it was found that blur is the artifact on which most of the metrics did not perform well. Thus, in order to understand the results better, the amount of blur in the dataset was calculated, and an additional evaluation of the metrics was done using the HEVC codec, the successor to H.264 compression, on the camera that proved to be the sharpest among the devices. The results show that as the resolution increases, the performance of the metrics tends to become more accurate; the best performing metric among them is VQM, with very few inconsistencies and inaccurate results when the compression applied is H.264, but when the compression applied is HEVC, SSIM and VMAF perform significantly better.
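For reference, the SSIM metric discussed above compares local means, variances, and covariance of two images. The sketch below applies the standard SSIM formula globally to a single pair of 1-D signals; real evaluations compute it over local windows and average, so this is only a simplified illustration with invented pixel values:

```python
# Global (single-window) SSIM between two equal-sized grayscale signals,
# following the standard SSIM formula with the usual C1/C2 constants.
# Real SSIM is computed over local windows; this is a simplified sketch.
def ssim_global(x, y, data_range=255.0):
    n = len(x)
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mx = sum(x) / n
    my = sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

img = [10.0, 50.0, 90.0, 130.0, 170.0, 210.0]      # invented pixel row
blurred = [30.0, 50.0, 90.0, 130.0, 170.0, 190.0]  # softened extremes

print(round(ssim_global(img, img), 4))  # identical signals -> 1.0
print(ssim_global(img, blurred) < 1.0)  # blur lowers the SSIM score
```

This also makes the paper's finding plausible: blur reduces variance and covariance together, so SSIM can penalize it less sharply than other artifacts.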

Keywords: distortion, metrics, performance, resolution, video quality assessment

Procedia PDF Downloads 189
28314 A Review on Existing Challenges of Data Mining and Future Research Perspectives

Authors: Hema Bhardwaj, D. Srinivasa Rao

Abstract:

Technology for analysing, processing, and extracting meaningful information from enormous and complicated datasets can be termed 'big data'. The techniques of big data mining and big data analysis are extremely helpful for business activities such as making decisions, building organisational plans, researching the market efficiently, and improving sales, because typical management tools cannot handle such complicated datasets. Big data brings special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations. These unique problems call for new computational and statistical paradigms. This research paper offers an overview of the literature on big data mining and its process, along with its problems and difficulties, with a focus on the unique characteristics of big data. Organizations face several difficulties when undertaking data mining, which has an impact on their decision-making. Every day, terabytes of data are produced, yet only around 1% of that data is actually analyzed. This study presents the ideas of data mining and analysis and the knowledge discovery techniques that have recently been developed, together with practical application systems. The article's conclusion also includes a list of issues and difficulties for further research in the area, and the report discusses management's main big data and data mining challenges.

Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges

Procedia PDF Downloads 92
28313 The Production of Biofertilizer from Naturally Occurring Microorganisms by Using Nuclear Technologies

Authors: K. S. Al-Mugren, A. Yahya, S. Alodah, R. Alharbi, S. H. Almsaid , A. Alqahtani, H. Jaber, A. Basaqer, N. Alajra, N. Almoghati, A. Alsalman, Khalid Alharbi

Abstract:

Context: The production of biofertilizers from naturally occurring microorganisms is an area of research that aims to enhance agricultural practices by utilizing local resources. This research project focuses on isolating and screening indigenous microorganisms with PK-fixing and phosphate-solubilizing characteristics from local sources. Research Aim: The aim of this project is to develop a biofertilizer product using indigenous microorganisms and composted agro-waste as a carrier. The objective is to enhance crop productivity and soil fertility through the application of biofertilizers. Methodology: The research methodology includes several key steps. Firstly, indigenous microorganisms will be isolated from local resources using the ten-fold serial dilutions technique. Screening assays will be conducted to identify microorganisms with phosphate-solubilizing and PK-fixing activities. Agro-waste materials will be collected from local agricultural sources, and composting experiments will be conducted to convert them into organic matter-rich compost. Physicochemical analysis will be performed to assess the composition of the composted agro-waste. Gamma and X-ray irradiation will be used to sterilize the carrier material, and the sterilized carrier will be tested for sterility using the ten-fold serial dilutions technique. Finally, selected indigenous microorganisms will be developed into biofertilizer products. Findings: The research aims to find suitable indigenous microorganisms with phosphate-solubilizing and PK-fixing characteristics for biofertilizer production, and to assess the suitability of composted agro-waste as a carrier for biofertilizers. The impact of gamma irradiation sterilization on pathogen elimination will also be investigated. Theoretical Importance: This research contributes to the understanding of utilizing indigenous microorganisms and composted agro-waste for biofertilizer production, and expands knowledge of the potential benefits of biofertilizers in enhancing crop productivity and soil fertility. Data Collection and Analysis Procedures: The data collection process involves isolating indigenous microorganisms, conducting screening assays, collecting and composting agro-waste, analyzing the physicochemical composition of the composted agro-waste, and testing carrier sterilization. The analysis procedures include assessing the abilities of the indigenous microorganisms, evaluating the composition of the composted agro-waste, and determining the sterility of the carrier material. Conclusion: The research project aims to develop biofertilizer products using indigenous microorganisms and composted agro-waste as a carrier. Through the isolation and screening of indigenous microorganisms, the project aims to enhance crop productivity and soil fertility by utilizing local resources. The research findings will contribute to the understanding of the suitability of composted agro-waste as a carrier and the efficacy of gamma irradiation sterilization, and will have theoretical importance in the field of biofertilizer production and agricultural practices.
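The ten-fold serial dilution plate counts used above convert colony counts back to a concentration via CFU/mL = colonies × dilution factor / volume plated. A sketch with invented example numbers (not data from the project):

```python
# Ten-fold serial dilution plate-count estimate (CFU per mL).
# The colony count, dilution level, and plated volume below are
# invented example numbers, not data from the project.
def cfu_per_ml(colonies, dilution_exponent, plated_volume_ml):
    """Colonies counted on a plate from a 10^-dilution_exponent dilution."""
    return colonies * (10 ** dilution_exponent) / plated_volume_ml

# Example: 42 colonies on the 10^-5 plate, 0.1 mL spread-plated:
print(cfu_per_ml(42, 5, 0.1))  # -> 42000000.0 CFU/mL
```

The same count on a sterilized carrier should yield zero colonies at every dilution, which is how the carrier's sterility is confirmed.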

Keywords: biofertilizer, microorganisms, agro waste, nuclear technologies

Procedia PDF Downloads 95
28312 Evaluation of Academic Research Projects Using the AHP and TOPSIS Methods

Authors: Murat Arıbaş, Uğur Özcan

Abstract:

Due to the increasing number of universities and academics, university research funds and the grants/supports given by government institutions have increased the number and quality of academic research projects. Although every academic research project has a specific purpose and importance, limited resources (money, time, manpower, etc.) require choosing the best ones from among them (Amiri, 2010). Comparing projects that serve different purposes and determining which is better is a difficult process. In addition, the evaluation process becomes complicated when there is more than one evaluator and multiple criteria for the evaluation (Dodangeh, Mojahed and Yusuff, 2009). Mehrez and Sinuany-Stern (1983) defined the project selection problem as a Multi Criteria Decision Making (MCDM) problem. If a decision problem involves multiple criteria and objectives, it is called a Multi Attribute Decision Making problem (Ömürbek & Kınay, 2013). There are many MCDM methods in the literature for the solution of such problems, including AHP (Analytic Hierarchy Process), ANP (Analytic Network Process), TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), PROMETHEE (Preference Ranking Organization Method for Enrichment Evaluation), UTADIS (Utilités Additives Discriminantes), ELECTRE (Élimination et Choix Traduisant la Réalité), MAUT (Multiattribute Utility Theory), and GRA (Grey Relational Analysis). Each method has some advantages compared with the others (Ömürbek, Blacksmith & Akalın, 2013). Hence, to decide which MCDM method will be used for the solution of the problem, factors like the nature of the problem, types of choices, measurement scales, type of uncertainty, dependency among the attributes, expectations of the decision maker, and the quantity and quality of the data should be considered (Tavana & Hatami-Marbini, 2011).
This study aims to develop a systematic decision process for grant support applications that are expected to be evaluated on their scientific adequacy by multiple evaluators under certain criteria. In this context, the project evaluation process applied by The Scientific and Technological Research Council of Turkey (TÜBİTAK), one of the leading institutions in the country, was investigated. First, the criteria to be used in the project evaluation were decided. The main criteria were selected from among the TÜBİTAK evaluation criteria: originality of the project, methodology, project management/team and research opportunities, and the broader impact of the project. Moreover, for each main criterion, 2-4 sub-criteria were defined, so projects were evaluated over 13 sub-criteria in total. Because the AHP method is superior for determining criteria weights and the TOPSIS method makes it possible to rank a large number of alternatives, the two methods were used together. The AHP method, developed by Saaty (1977), is based on selection by pairwise comparisons. Because of its simple structure and ease of understanding, AHP is a very popular method in the literature for determining criteria weights in MCDM problems. The TOPSIS method, developed by Hwang and Yoon (1981) as an MCDM technique, is an alternative to the ELECTRE method and is used in many areas. In this method, the distance from each decision point to the ideal and to the negative-ideal solution point is calculated using the Euclidean distance approach. In the study, main criteria and sub-criteria were compared on their own merits using questionnaires developed on an importance scale by four groups of people (i.e., TÜBİTAK specialists, TÜBİTAK managers, academics, and individuals from the business world). After these pairwise comparisons, the weight of each main criterion and sub-criterion was calculated using the AHP method.
These calculated criteria weights were then used as input to the TOPSIS method, and a sample of 200 projects was ranked on its merits. This new system made it possible to incorporate the views of the people who take part in the project process, including preparation, evaluation, and implementation, into the evaluation of academic research projects. Moreover, instead of evaluating projects with four equally weighted main criteria, a systematic decision-making process was developed using 13 weighted sub-criteria and each decision point's distance from the ideal solution. This evaluation process creates a new approach to determining the importance of academic research projects.
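The AHP-then-TOPSIS pipeline described above can be sketched in a few lines. This is only an illustration: it uses a 3-criterion pairwise matrix and three invented project score rows (the study itself used 4 main and 13 sub-criteria), and it approximates the AHP priority vector by column normalisation rather than the full eigenvector computation:

```python
import math

def ahp_weights(pairwise):
    """Approximate the AHP priority vector by the column-normalisation method."""
    n = len(pairwise)
    col_sums = [sum(row[j] for row in pairwise) for j in range(n)]
    normalised = [[pairwise[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(normalised[i]) / n for i in range(n)]   # row averages

def topsis_rank(decision, weights):
    """Closeness of each alternative to the ideal solution (benefit criteria)."""
    m, n = len(decision), len(weights)
    norms = [math.sqrt(sum(decision[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * decision[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(v[i][j] for i in range(m)) for j in range(n)]
    anti  = [min(v[i][j] for i in range(m)) for j in range(n)]
    # Euclidean distances to ideal and negative-ideal points:
    return [math.dist(row, anti) / (math.dist(row, ideal) + math.dist(row, anti))
            for row in v]

# Invented pairwise comparison matrix over 3 criteria (Saaty 1-9 scale):
pairwise = [[1,   3,   5],
            [1/3, 1,   2],
            [1/5, 1/2, 1]]
weights = ahp_weights(pairwise)

# Invented scores of 3 projects on the 3 criteria:
projects = [[7, 6, 8],
            [9, 5, 6],
            [6, 8, 7]]
scores = topsis_rank(projects, weights)   # higher = closer to ideal
```

Ranking the sample of 200 projects amounts to sorting them by this closeness score.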

Keywords: academic projects, AHP method, research projects evaluation, TOPSIS method

Procedia PDF Downloads 577
28311 Remote Sensing through Deep Neural Networks for Satellite Image Classification

Authors: Teja Sai Puligadda

Abstract:

Detailed satellite images can serve an important role in geographic study. The quantitative and qualitative information provided by satellite and remote sensing images minimizes the complexity of work and time. Data/images are captured at regular intervals by satellite remote sensing systems; the amount of data collected is often enormous, and it expands rapidly as technology develops. Interpreting remote sensing images, geographic data mining, and researching distinct vegetation types such as agricultural land and forests are all part of satellite image categorization. One of the biggest challenges data scientists face while classifying satellite images is finding the classification algorithm best suited to the available data, one able to classify images with the utmost accuracy. To categorize satellite images, which is difficult due to the sheer volume of data, many researchers are turning to deep learning algorithms. As the CNN algorithm gives high accuracy in image recognition problems and automatically detects important features without any human supervision, and the ANN algorithm stores information across the entire network (Abhishek Gupta, 2020), these two deep learning algorithms have been used for satellite image classification. This project focuses on remote sensing through deep neural networks, i.e., ANN and CNN, with the DeepSat (SAT-4) airborne dataset for classifying images. The ANN and CNN algorithms are implemented, evaluated, and compared, and their performance is analyzed through evaluation metrics such as accuracy and loss. Additionally, the neural network algorithm that gives the lowest bias and lowest variance in solving multi-class satellite image classification is identified.
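The two evaluation metrics named above, accuracy and loss, can be illustrated without any deep learning framework. A dependency-free sketch on invented predictions (the four classes stand in for SAT-4's land-cover categories; the probabilities are not from the experiment):

```python
import math

def accuracy(y_true, y_pred):
    """Fraction of predictions matching the true labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def cross_entropy(y_true_onehot, y_prob, eps=1e-12):
    """Mean categorical cross-entropy over a batch of class probabilities."""
    total = 0.0
    for truth, probs in zip(y_true_onehot, y_prob):
        total -= sum(t * math.log(max(p, eps)) for t, p in zip(truth, probs))
    return total / len(y_true_onehot)

# Four samples, four classes; one misclassification in the hard predictions:
y_true = [0, 1, 2, 3]
y_pred = [0, 1, 2, 1]
probs  = [[0.9, 0.05, 0.03, 0.02],
          [0.1, 0.8,  0.05, 0.05],
          [0.2, 0.1,  0.6,  0.1],
          [0.1, 0.5,  0.2,  0.2]]
onehot = [[1 if c == t else 0 for c in range(4)] for t in y_true]

acc  = accuracy(y_true, y_pred)        # 0.75
loss = cross_entropy(onehot, probs)
```

Frameworks compute the same quantities per epoch; comparing ANN and CNN then reduces to comparing these curves.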

Keywords: artificial neural network, convolutional neural network, remote sensing, accuracy, loss

Procedia PDF Downloads 139
28310 Changing Behaviour in the Digital Era: A Concrete Use Case from the Domain of Health

Authors: Francesca Spagnoli, Shenja van der Graaf, Pieter Ballon

Abstract:

Humans do not behave rationally. We are emotional and easily influenced by others as well as by our context. The study of human behaviour has become a central endeavour within many academic disciplines, including economics, sociology, and clinical and social psychology. Understanding what motivates humans and triggers them to perform certain activities, and what it takes to change their behaviour, is central for researchers and companies as well as for policy makers implementing efficient public policies. While numerous theoretical approaches have been developed for diverse domains such as health, retail, and the environment, the methodological models guiding the evaluation of such research have long since reached their limits. Within this context, digitisation, information and communication technologies (ICT), wearables, the Internet of Things (IoT) connecting networks of devices, and new possibilities for collecting and analysing massive amounts of data have made it possible to study behaviour from a realistic perspective as never before. Digital technologies make it possible to (1) capture data in real-life settings, (2) regain control over data by capturing the context of behaviour, and (3) analyse huge sets of information through continuous measurement. Within this complex context, this paper describes a new framework for initiating behavioural change, capitalising on the digital developments in applied research projects and applicable to academia, enterprises, and policy makers alike. By applying this model, behavioural research can be conducted to address the issues of different domains, such as mobility, environment, health, or media. The Modular Behavioural Analysis Approach (MBAA) is described here and first validated through a concrete use case within the domain of health.
The results gathered have shown that disclosing information about health in connection with the use of digital health apps can be a lever for changing behaviour, but it is only a first component requiring further follow-up actions. To this end, a clear definition of different 'behavioural profiles', toward which several typologies of intervention can be addressed, is essential to effectively enable behavioural change. The refined version of the MBAA will focus strongly on defining a methodology for shaping 'behavioural profiles' and related interventions, as well as on evaluating side effects on the creation of new business models and sustainability plans.

Keywords: behavioural change, framework, health, nudging, sustainability

Procedia PDF Downloads 207
28309 A Systematic Review on Challenges in Big Data Environment

Authors: Rimmy Yadav, Anmol Preet Kaur

Abstract:

Big Data has demonstrated vast potential for streamlining decision making and spotting business trends in different fields, for example, manufacturing, finance, and information technology. This paper gives a multi-disciplinary overview of the research issues in big data and of its procedures, tools, and systems related to privacy, data storage management, network and energy utilization, fault tolerance, and information representation. Beyond this, the remaining challenges and the opportunities available in the Big Data platform are outlined.

Keywords: big data, privacy, data management, network and energy consumption

Procedia PDF Downloads 286
28308 Web Application for Evaluating Tests in Distance Learning Systems

Authors: Bogdan Walek, Vladimir Bradac, Radim Farana

Abstract:

Distance learning systems offer useful methods of learning and usually contain a final course test or another form of testing. This paper proposes a web application for evaluating tests in distance learning systems using an expert system. The proposed web application is appropriate for didactic tests or for tests whose results feed into subsequent follow-up courses. The web application works with test questions and uses an expert system and the LFLC tool for test evaluation. After test evaluation, the results are visualized and shown to the student.

Keywords: distance learning, test, uncertainty, fuzzy, expert system, student

Procedia PDF Downloads 467
28307 Monitoring of Quantitative and Qualitative Changes in Combustible Material in the Białowieża Forest

Authors: Damian Czubak

Abstract:

The Białowieża Forest is a very valuable natural area, included in the UNESCO World Natural Heritage list, where Norway spruce (Picea abies) stands have deteriorated due to infestation by the bark beetle (Ips typographus). This catastrophic scenario led to an increase in fire danger, owing to the occurrence of large amounts of dead wood and of grass cover as light penetrated to the bottom of the stands. In a dry state, these materials favour ignition and the rapid spread of fire. One of the objectives of the study was to monitor the quantitative and qualitative changes of combustible material on the permanent decay plots of spruce stands from 2012-2022. In addition, the size of the area with highly flammable vegetation was monitored, and a classification of the stands of the Białowieża Forest by flammability classes was made. The key factor that determines the potential fire hazard of a forest is combustible material: primarily its type, quantity, moisture content, size, and spatial structure. Based on the inventory data for the forest districts in the Białowieża Forest, the average fire load and its changes over the years were calculated. The analysis was carried out taking into account the changes in the health status of the stands and sanitary operations. The quantitative and qualitative assessment of fallen timber and the fire load of ground cover used the results of the 2019 and 2021 inventories; approximately 9,000 circular plots were used for the study. An assessment was made of the amount of potential fuel, understood as ground cover vegetation and dead wood debris. In addition, monitoring of areas with vegetation that poses a high fire risk was conducted using data from 2019 and 2021. All sub-areas were inventoried where vegetation posing a specific fire hazard represented at least 10% of the area with species characteristic of that cover.
In addition to the size of the area with fire-prone vegetation, a very important element is the size of the fire load on the indicated plots. On representative plots, the biomass of the ground cover was measured over an area of 10 m², and the amount of biomass of each component was then determined. Building on this measured variability of ground cover in stands, a flammability classification was developed, which made it possible to track changes in the flammability classes of stands over the measurement period.

Keywords: classification, combustible material, flammable vegetation, Norway spruce

Procedia PDF Downloads 76
28306 Survey on Big Data Stream Classification by Decision Tree

Authors: Mansoureh Ghiasabadi Farahani, Samira Kalantary, Sara Taghi-Pour, Mahboubeh Shamsi

Abstract:

Nowadays, the development of computer technology and its recent applications provides access to new types of data that had not been considered by traditional data analysts. Two particularly interesting characteristics of such data sets are their huge size and streaming nature. Incremental learning techniques have been used extensively to address the data stream classification problem. This paper presents a concise survey of the obstacles and requirements involved in classifying data streams with decision trees. The most important issue is to maintain a balance between accuracy and efficiency: the algorithm should provide good classification performance with a reasonable response time.
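The accuracy/efficiency balance highlighted above is typically managed in streaming decision trees (e.g. the Hoeffding tree / VFDT family, which the abstract does not name, so treat this as an assumed example) by a statistical bound that says how many stream examples suffice before committing to a split. A minimal sketch of that bound, with illustrative values of the gain range R and confidence parameter delta:

```python
import math

def hoeffding_bound(value_range, delta, n):
    """With probability 1 - delta, the observed mean of n examples is
    within this epsilon of the true mean (Hoeffding inequality)."""
    return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

# Information gain lies in [0, 1]; delta = 1e-6 is an illustrative choice.
R, delta = 1.0, 1e-6
for n in (100, 1_000, 10_000):
    eps = hoeffding_bound(R, delta, n)
    # A tree node splits once gain(best) - gain(second best) > eps.
    print(f"n={n:>6}: epsilon={eps:.4f}")
```

The shrinking epsilon captures the trade-off: waiting for more examples gives higher-confidence splits (accuracy) at the cost of slower tree growth (efficiency).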

Keywords: big data, data streams, classification, decision tree

Procedia PDF Downloads 501
28305 Nuclear Mitochondrial Pseudogenes in Anastrepha fraterculus Complex

Authors: Pratibha Srivastava, Ayyamperumal Jeyaprakash, Gary Steck, Jason Stanley, Leroy Whilby

Abstract:

Exotic, invasive tephritid fruit flies (Diptera: Tephritidae) are a major threat to fruit and vegetable industries in the United States. The establishment of pest fruit flies in agricultural industries produces severe ecological and economic impacts on agricultural diversification and trade. Detection and identification of these agricultural pests in a timely manner will facilitate the possibility of eradication from newly invaded areas. Identification of larval stages to species level is difficult, but is required to determine pest loads and their pathways into the United States. The focus of this study is the New World genus Anastrepha, which includes pests of major economic importance. Mitochondrial cytochrome c oxidase I (COI) gene sequences were amplified from Anastrepha fraterculus specimens collected in South America (Ecuador and Peru). Phylogenetic analysis was performed to characterize the Anastrepha fraterculus complex at a molecular level. During the phylogenetic analysis, numerous nuclear mitochondrial pseudogenes (numts) were discovered in different specimens. Numts are nonfunctional copies of mtDNA present in the nucleus and are easily coamplified with the mitochondrial COI gene copy when conserved universal primers are used. This is problematic for DNA barcoding, which attempts to characterize all living organisms by using the COI gene. This study is significant for national quarantine use, as morphological diagnostics to separate larvae of the various members remain poorly developed.

Keywords: tephritid, Anastrepha fraterculus, COI, numts

Procedia PDF Downloads 216
28304 Robust and Dedicated Hybrid Cloud Approach for Secure Authorized Deduplication

Authors: Aishwarya Shekhar, Himanshu Sharma

Abstract:

Data deduplication is one of the important data compression techniques for eliminating duplicate copies of repeating data, and it has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. In this process, duplicate data are expunged, leaving only one copy, i.e., a single instance of the data to be stored; however, an index entry for each piece of data is still maintained. Data deduplication is an approach for minimizing the storage space an organization requires to retain its data. In most companies, the storage systems carry identical copies of numerous pieces of data. Deduplication eliminates these additional copies by saving just one copy of the data and replacing the other copies with pointers that lead back to the primary copy. To avoid this duplication of data and to preserve confidentiality in the cloud, we apply the concept of a hybrid cloud, a fusion of at least one public and one private cloud. As a proof of concept, we implement Java code that provides security as well as removing all types of duplicated data from the cloud.
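The single-instance-plus-pointers scheme described above can be sketched in a few lines. This is a language-agnostic illustration in Python (the authors' proof of concept was in Java), with invented file names and contents; each logical file keeps only a content digest that points into a store holding one physical copy per unique content:

```python
import hashlib

class DedupStore:
    """Toy single-instance store: content-addressed blocks + a name index."""

    def __init__(self):
        self.blocks = {}   # digest -> the single stored copy of that content
        self.index = {}    # filename -> digest (the "pointer")

    def put(self, name, data: bytes):
        digest = hashlib.sha256(data).hexdigest()
        self.blocks.setdefault(digest, data)   # store only the first copy
        self.index[name] = digest              # every name keeps an index entry

    def get(self, name) -> bytes:
        return self.blocks[self.index[name]]

store = DedupStore()
store.put("report_v1.txt", b"quarterly numbers")
store.put("report_copy.txt", b"quarterly numbers")   # duplicate content
store.put("notes.txt", b"meeting notes")

# Three logical files, but only two physical blocks retained:
print(len(store.index), len(store.blocks))
```

A real deduplicating cloud store would additionally chunk files, encrypt blocks (e.g. convergent encryption for the authorized-deduplication setting), and enforce ownership proofs before returning data.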

Keywords: confidentiality, deduplication, data compression, hybridity of cloud

Procedia PDF Downloads 367
28303 The Nursing Experience in a Stroke Patient after Lumbar Surgery at Surgical Intensive Care Unit

Authors: Yu-Chieh Chen, Kuei-Feng Shen, Chia-Ling Chao

Abstract:

The purpose of this report was to present the nursing experience of a patient who suffered an unexpected cerebellar hemorrhagic stroke with acute hydrocephalus after lumbar spine surgery. The patient underwent emergent external ventricular drainage and stayed in the Surgical Intensive Care Unit from July 8, 2016, to July 22, 2016. During this period, data were collected through attendance, evaluation, observation, interviews, review of the medical record, etc. An integral evaluation of the patient's physiological, psychological, social, and spiritual states was also made. The author identified the following major nursing problems: ineffective cerebral perfusion, physical activity dysfunction, and family resource preparation for disability. The author provided nursing care to maintain normal intracranial pressure and, within a well-established therapeutic relationship, worked with an interdisciplinary medical/nursing team to draft an individualized and appropriate nursing plan to help the patient and family face the psychosocial impact of the patient's disabilities. We also actively participated in the rehabilitation treatments to improve daily activity and confidence. This was deemed necessary to empower them toward a more positive attitude in the future.

Keywords: family resource preparation inability, hemorrhagic stroke, ineffective cerebral tissue perfusion, lumbar spine surgery

Procedia PDF Downloads 109
28302 A Review of Machine Learning for Big Data

Authors: Devatha Kalyan Kumar, Aravindraj D., Sadathulla A.

Abstract:

Big data are now rapidly expanding in all engineering and science domains, and in many others. The potential of large or massive data is undoubtedly significant, which makes it necessary to seek new ways of thinking and new learning techniques to address the various big data challenges. Machine learning is continuously unleashing its power in a wide range of applications. This paper reviews the latest advances in research on machine learning for big data processing. First, it surveys the machine learning techniques used in recent studies, such as deep learning, representation learning, transfer learning, active learning, and distributed and parallel learning. It then focuses on the challenges of machine learning for big data and their possible solutions.

Keywords: active learning, big data, deep learning, machine learning

Procedia PDF Downloads 417
28301 Modification of Electrical and Switching Characteristics of a Non Punch-Through Insulated Gate Bipolar Transistor by Gamma Irradiation

Authors: Hani Baek, Gwang Min Sun, Chansun Shin, Sung Ho Ahn

Abstract:

Fast neutron irradiation using nuclear reactors is an effective method to improve the switching loss and short-circuit durability of power semiconductors (insulated gate bipolar transistors (IGBT), insulated gate transistors (IGT), etc.). However, not only fast neutrons but also thermal neutrons, epithermal neutrons, and gamma rays exist in a nuclear reactor, and the electrical properties of the IGBT may be degraded by gamma irradiation. Gamma irradiation damage is known to be caused by the Total Ionizing Dose (TID) effect, Single Event Effects (SEE), and displacement damage. In particular, the TID effect degrades electrical properties such as the leakage current and threshold voltage of a power semiconductor. This work confirms the effect of gamma irradiation on the electrical properties of a 600 V NPT-IGBT. Gamma irradiation forms lattice defects in the gate oxide and at the Si-SiO₂ interface of the IGBT. It was confirmed that these lattice defects act as trap centers and affect the threshold voltage, which shifted negatively with increasing TID. In addition to the change in carrier mobility, the conductivity modulation decreases in the n-drift region, adversely affecting the forward voltage drop. The turn-off delay time of the device before irradiation was 212 ns; after 2.5, 10, 30, 70, and 100 kRad(Si), it was 225, 258, 311, 328, and 350 ns, respectively. Gamma irradiation thus increased the turn-off delay time of the IGBT by approximately 65%, and the switching characteristics deteriorated.
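The quoted 65% figure follows directly from the measured delay times. A quick arithmetic check using the abstract's own numbers:

```python
# Turn-off delay grew from 212 ns (pre-irradiation) to 350 ns at 100 kRad(Si).
before_ns, after_ns = 212, 350
increase_pct = (after_ns - before_ns) / before_ns * 100
print(f"{increase_pct:.0f}% increase in turn-off delay")
```

(350 - 212) / 212 ≈ 0.651, i.e. the ~65% degradation reported above.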

Keywords: NPT-IGBT, gamma irradiation, switching, turn-off delay time, recombination, trap center

Procedia PDF Downloads 141
28300 Implementation of Synthesis and Quality Control Procedures of ¹⁸F-Fluoromisonidazole Radiopharmaceutical

Authors: Natalia C. E. S. Nascimento, Mercia L. Oliveira, Fernando R. A. Lima, Leonardo T. C. do Nascimento, Marina B. Silveira, Brigida G. A. Schirmer, Andrea V. Ferreira, Carlos Malamut, Juliana B. da Silva

Abstract:

Tissue hypoxia is a common characteristic of solid tumors, leading to decreased sensitivity to radiotherapy and chemotherapy. In the clinical context, tumor hypoxia assessment employing the positron emission tomography (PET) tracer ¹⁸F-fluoromisonidazole ([¹⁸F]FMISO) helps physicians plan and adjust therapy. The aim of this work was to implement the synthesis of [¹⁸F]FMISO in a TRACERlab® MXFDG module and to establish the quality control procedure. [¹⁸F]FMISO was synthesized at Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN/Brazil) using an automated synthesizer (TRACERlab® MXFDG, GE) adapted for the production of [¹⁸F]FMISO. The FMISO chemical standard was purchased from ABX. ¹⁸O-enriched water was acquired from the Center of Molecular Research. Reagent kits containing eluent solution, acetonitrile, ethanol, 2.0 M HCl solution, buffer solution, water for injections, and [¹⁸F]FMISO precursor (dissolved in 2 mL acetonitrile) were purchased from ABX. The [¹⁸F]FMISO samples were purified by the solid-phase extraction method. The quality requirements of [¹⁸F]FMISO are established in the European Pharmacopoeia. According to that reference, quality control of [¹⁸F]FMISO should include appearance, pH, radionuclidic identity and purity, radiochemical identity and purity, chemical purity, residual solvents, bacterial endotoxins, and sterility. The duration of the synthesis process was 53 min, with a radiochemical yield of (37.00 ± 0.01)% and a specific activity of more than 70 GBq/µmol. The syntheses were reproducible and showed satisfactory results. Regarding the quality control analysis, the samples were clear and colorless at pH 6.0. The emission spectrum, measured using a High-Purity Germanium (HPGe) detector, presented a single peak at 511 keV, and the half-life, determined by the decay method in an activimeter, was (111.0 ± 0.5) min, indicating no presence of radioactive contaminants besides the desired radionuclide (¹⁸F).
The samples showed a tetrabutylammonium (TBA) concentration < 50 μg/mL, assessed by visual comparison to a TBA standard applied on the same thin-layer chromatographic plate. Radiochemical purity was determined by high-performance liquid chromatography (HPLC), and the results were 100%. Regarding the residual solvents tested, ethanol and acetonitrile presented concentrations lower than 10% and 0.04%, respectively. Healthy female mice were injected via the lateral tail vein with [¹⁸F]FMISO; microPET imaging studies (15 min) were performed 2 h post injection (p.i.), and the biodistribution was analyzed at five time points (30, 60, 90, 120, and 180 min) after injection. Subsequently, organs/tissues were assayed for radioactivity with a gamma counter. All parameters of the quality control tests agreed with the quality criteria, confirming that [¹⁸F]FMISO is suitable for use in non-clinical and clinical trials, following the legal requirements for the production of new radiopharmaceuticals in Brazil.
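The decay method used above to confirm the radionuclide identity reduces to simple arithmetic: two activity readings a known time apart give the half-life via T½ = t·ln 2 / ln(A₀/A). A minimal sketch (the two activity readings are invented; they are chosen merely to reproduce a half-life near the ~110 min value of fluorine-18):

```python
import math

def half_life(a0, a, elapsed_min):
    """Half-life from initial activity a0 and activity a after elapsed_min:
    A = A0 * 2^(-t / T_half)  =>  T_half = t * ln(2) / ln(A0 / A)."""
    return elapsed_min * math.log(2) / math.log(a0 / a)

# Illustrative readings: activity falls from 1000 to 500 units in 110 min.
t_half = half_life(a0=1000.0, a=500.0, elapsed_min=110.0)
print(f"T1/2 = {t_half:.1f} min")
```

A measured half-life far from the expected value would flag a radionuclidic contaminant, which is exactly the check reported for the activimeter data.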

Keywords: automatic radiosynthesis, hypoxic tumors, pharmacopeia, positron emitters, quality requirements

Procedia PDF Downloads 179
28299 Understanding the Qualitative Nature of Product Reviews by Integrating Text Processing Algorithm and Usability Feature Extraction

Authors: Cherry Yieng Siang Ling, Joong Hee Lee, Myung Hwan Yun

Abstract:

The quality of a product being usable has become a basic requirement from the consumer's perspective, and failing that requirement keeps the customer from using the product. Identifying usability issues by analyzing the quantitative and qualitative data collected from usability testing and evaluation activities aids the process of product design; yet the lack of studies and research on analysis methodologies for qualitative text data in the usability field inhibits the potential of these data for more useful applications. Analyzing qualitative text data has become possible with the rapid development of data analysis fields such as natural language processing, which lets computers understand human language, and machine learning, which provides predictive models and clustering tools. Therefore, this research aims to study the applicability of text processing algorithms to the analysis of qualitative text data collected from usability activities. This research utilized datasets collected from an LG neckband headset usability experiment, consisting of headset survey text data, subject data, and product physical data. The analysis procedure, integrated with the text-processing algorithm, includes training comments into a vector space, labeling them with the subject and product physical feature data, and clustering to validate the result of comment-vector clustering. The result shows 'volume and music control button' as the usability feature that matches best with the clusters of comment vectors: centroid comments of one cluster emphasized button positions, while centroid comments of the other cluster emphasized button interface issues. When the volume and music control buttons were designed separately, participants experienced less confusion, and thus the comments mentioned only the buttons' positions.
When the volume and music control buttons were designed as a single button, participants experienced interface issues regarding the buttons, such as the operating methods of functions and confusion over the functions' buttons. The relevance of the cluster centroid comments to the extracted feature demonstrates the capability of text processing algorithms in analyzing qualitative text data from usability testing and evaluations.
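The comment-to-vector step described above can be sketched with plain bag-of-words counts, cosine similarity, and a cluster centroid. The three comments below are invented stand-ins for the headset survey data, and term counts stand in for whatever embedding the study actually trained:

```python
import math
from collections import Counter

def vectorize(comments):
    """Bag-of-words term-count vectors over a shared vocabulary."""
    vocab = sorted({w for c in comments for w in c.split()})
    return [[Counter(c.split())[w] for w in vocab] for c in comments], vocab

def cosine(u, v):
    """Cosine similarity between two term-count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

comments = ["volume button position awkward",
            "music button position too low",
            "function confusing on single button"]
vectors, vocab = vectorize(comments)

# Centroid of the whole comment set (per-dimension mean):
centroid = [sum(col) / len(vectors) for col in zip(*vectors)]

# The two position-themed comments sit closer together than either does
# to the interface-themed one:
sim_pos = cosine(vectors[0], vectors[1])
sim_mix = cosine(vectors[0], vectors[2])
```

Clustering these vectors and inspecting the comments nearest each centroid is the same inspection step the study uses to link clusters to usability features.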

Keywords: usability, qualitative data, text-processing algorithm, natural language processing

Procedia PDF Downloads 269