Search results for: manual data inquiry
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25273

25003 Finding Elves in Play Based Learning

Authors: Chloe L. Southern

Abstract:

If play is deemed to fulfill children’s social, emotional, and physical needs, as well as satisfy their natural curiosity and promote self-reflexivity, it is difficult to understand why play is not prioritized to the same extent for older children. This paper explores and discusses the importance of play-based learning, as well as its preliminary implications beyond the realm of kindergarten. To further extend the inquiry, discussions pertaining to play-based learning are viewed through the lens of relevant methodologies and theories. Education systems in different parts of the world are examined, with attention not only to their play-based practices and curricula but also to the ideologies that set them apart.

Keywords: 21st century learning, play-based learning, student-centered learning, transformative learning

Procedia PDF Downloads 67
25002 The First Japanese-Japanese Dictionary for Non-Japanese Using the Defining Vocabulary

Authors: Minoru Moriguchi

Abstract:

This research introduces the concept of a monolingual Japanese dictionary for non-native speakers of Japanese, whose tentative title is Dictionary of Contemporary Japanese for Advanced Learners (DCJAL). As the market for Japanese-language learning is very small compared with that for English, no monolingual Japanese dictionary for non-native speakers containing sufficient entries has been published yet. In this dictionary environment, Japanese-language learners use bilingual dictionaries or monolingual Japanese dictionaries intended for native speakers. This research started in 2017 with a project team consisting of four native and two non-native speakers of Japanese, all of whom are linguists of the Japanese language. The team has been working to propose the concept of a monolingual dictionary for non-native speakers of Japanese and to provide the entry list, definition samples, the list of defining vocabulary, and the writing manual. As a result of seven years of research, DCJAL now has 28,060 headwords, 539 sample entries, a defining vocabulary of 4,598 words, and a writing manual. First, the number of entries was set at about 30,000, based on an experimental method using six existing dictionaries. To build an entry list of this size, words suitable for DCJAL were extracted from the Tsukuba corpus of the Japanese language, and the list was later adjusted according to the authors' experience as Japanese instructors. Among the headwords in the entry list, 539 words were selected and annotated with lexicographical information such as proficiency level, pronunciation, writing system (hiragana, katakana, kanji, or alphabet), definition, example sentences, idiomatic expressions, synonyms, antonyms, grammatical information, sociolinguistic information, and etymology. While the definitions of these 539 words were being written, the list of defining vocabulary was constructed, based on frequent vocabulary used in a Japanese monolingual dictionary. Although the concept of DCJAL has largely been finalized, it may need some further adjustment, and the research is ongoing.

Keywords: monolingual dictionary, the Japanese language, non-native speaker of Japanese, defining vocabulary

Procedia PDF Downloads 33
25001 A Fact-Finding Analysis on the Expulsions Made under Title 42 in the US

Authors: Avi Shrivastava

Abstract:

Title 42, an emergency health decree, has forced federal authorities to turn away asylum seekers and all other border crossers since last year. When Title 42 was first deployed in immigration detention centers, where many migrants are held when they arrive at the U.S.-Mexico border, the Trump administration embraced it as a strategy. The expulsions policy and new border challenges are examined with regard to Title 42 concerns. Humanitarian measures for refugees arriving at the US-Mexico border are the focus of this article. To a large extent, this article addresses the implications of the United States' use of Title 42 in expelling refugees and the possible ramifications of doing away with it. A secondary data collection strategy was used to gather the information for this study, allowing the researchers to examine a large number of previously collected data sets. Information about Title 42 may be found in a variety of sources, such as scholarly publications, newspapers, books, and the internet. The inquiry employed qualitative and explanatory research approaches. It is claimed that 1.7 million individuals were expelled from the country under the policy. Since CBP and ICE were limited in their ability to process deportees, a largely random, patchwork approach was used in selecting the individuals to be expelled. As a consequence, repeat crossers, particularly single adults, faced reduced consequences. If expulsions are halted, the government will be compelled to focus on long-overdue but vital border improvements. Title 42 provisions may help expedite the processing of asylum and other types of humanitarian relief. The government is prepared for an increase in arrivals, but ending the program would lead to a return to the arrival levels seen during the Title 42 period.

Keywords: migrants, refugees, title 42, medical, trump administration

Procedia PDF Downloads 83
25000 Quantitative, Preservative Methodology for Review of Interview Transcripts Using Natural Language Processing

Authors: Rowan P. Martnishn

Abstract:

During the execution of a National Endowment for the Arts grant, approximately 55 interviews were collected from professionals across various fields. These interviews were used to create deliverables: historical connections for creations that began as art and evolved entirely into computing technology. With dozens of hours’ worth of transcripts to be analyzed by qualitative coders, a quantitative methodology was created to sift through the documents. The initial step was to clean and format all the data. First, a basic spelling and grammar check was applied, along with a Python script for normalized formatting which used an open-source grammatical formatter to make the data as coherent as possible. Ten documents were randomly selected for manual review, in which words frequently mis-rendered during transcription were recorded and replaced throughout all other documents. Then, to remove all banter and side comments, the transcripts were spliced into paragraphs (separated by a change in speaker) and all paragraphs with fewer than 300 characters were removed. Second, a keyword extractor, a form of natural language processing in which significant words in a document are selected, was run on each paragraph of every interview. Every proper noun was stored in a data structure corresponding to its respective interview. From there, a Bidirectional and Auto-Regressive Transformer (B.A.R.T.) summary model was applied to each paragraph that included any of the proper nouns selected from the interview. At this stage the information to review had been reduced from about 60 hours’ worth of data to 20. The data was further processed through light manual observation: any summaries that fit the criteria of the proposed deliverable were selected, along with their locations within the documents. This narrowed the data down to about 5 hours’ worth of material.
The qualitative researchers were then able to find eight more connections in addition to the previous four, exceeding the minimum quota of three required by the grant. Major findings of the study, and the subsequent curation of this methodology, raised a conceptual insight crucial to working with qualitative data of this magnitude. In the use of artificial intelligence there is a general trade-off in a model between breadth of knowledge and specificity. If the model's knowledge is too broad, the user risks leaving out important data (too general); if the tool is too specific, it has not seen enough data to be useful. This methodology proposes a solution to this trade-off. The data is never altered beyond grammatical and spelling checks. Instead, the important information is marked, creating an indicator of where the significant data is without compromising its purity. Second, the data is chunked into smaller paragraphs, giving specificity, and then cross-referenced with the keywords (allowing generalization over the whole document). This way, no data is harmed, and qualitative experts can work with the raw data instead of highly manipulated results. Given the success in deliverable creation, as well as the circumvention of this trade-off, this methodology should stand as a model for synthesizing qualitative data while maintaining its original form.
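The paragraph-splicing and keyword-extraction steps described above can be sketched in a few lines. This is an illustrative sketch, not the authors' code: the 300-character cutoff comes from the abstract, while the regex-based proper-noun heuristic is an assumed stand-in for the keyword extractor (a production pipeline would use an NER model, and the B.A.R.T. summarization step is omitted).

```python
import re

def clean_paragraphs(transcript, min_chars=300):
    """Split a transcript into speaker turns and drop short banter.

    The 300-character cutoff mirrors the threshold described above.
    """
    paragraphs = [p.strip() for p in transcript.split("\n\n")]
    return [p for p in paragraphs if len(p) >= min_chars]

def proper_nouns(paragraph):
    """Naive proper-noun heuristic: capitalized words not at sentence start.

    A stand-in for the keyword-extraction step, kept dependency-free.
    """
    nouns = set()
    for sentence in re.split(r"[.!?]\s+", paragraph):
        words = sentence.split()
        for w in words[1:]:  # skip sentence-initial capitals
            if w[:1].isupper() and w[1:].islower():
                nouns.add(w.strip(",;:"))
    return nouns
```

Paragraphs surviving the filter would then be summarized only if they contain one of the extracted proper nouns, which is what concentrates the reviewers' attention on the relevant 20 hours of material.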

Keywords: B.A.R.T.model, keyword extractor, natural language processing, qualitative coding

Procedia PDF Downloads 18
24999 Analysis of Big Data

Authors: Sandeep Sharma, Sarabjit Singh

Abstract:

As user demand and the volume of freely generated data continue to grow, storage solutions face increasing challenges in protecting, storing, and retrieving data. The day may not be far off when storage companies and organizations start saying 'no' to storing our valuable data, or begin charging heavily for its storage and protection. On the other hand, environmental conditions make it increasingly difficult to establish and maintain new data warehouses and data centers in the face of global-warming threats. The challenge of small data is over; the big challenge now is how to manage the exponential growth of data. In this paper we analyze the growth trend of big data and its future implications. We also focus on the impact of unstructured data on various concerns and suggest some possible remedies to streamline big data.

Keywords: big data, unstructured data, volume, variety, velocity

Procedia PDF Downloads 540
24998 Investigations into the in situ Enterococcus faecalis Biofilm Removal Efficacies of Passive and Active Sodium Hypochlorite Irrigant Delivered into Lateral Canal of a Simulated Root Canal Model

Authors: Saifalarab A. Mohmmed, Morgana E. Vianna, Jonathan C. Knowles

Abstract:

The issue of apical periodontitis has received considerable critical attention. Bacteria integrate into communities, attach to surfaces, and consequently form biofilms. The biofilm structure provides bacteria with a range of protective mechanisms against antimicrobial agents and enhances pathogenicity (e.g., apical periodontitis). Sodium hypochlorite (NaOCl) has become the irrigant of choice for the elimination of bacteria from the root canal system, based on its antimicrobial properties. The aim of the study was to investigate the effect of different agitation techniques on the efficacy of 2.5% NaOCl in removing biofilm from the surface of the lateral canal, using residual biofilm and the rate of biofilm removal as outcome measures. The effect of canal complexity (a lateral canal) on the efficacy of the irrigation procedure was also assessed. Forty root canal models (n = 10 per group) were manufactured using 3D printing and resin materials. Each model consisted of two halves of an 18 mm long root canal with apical size 30 and taper 0.06, and a lateral canal 3 mm long and 0.3 mm in diameter located 3 mm from the apical terminus. E. faecalis biofilms were grown on the apical 3 mm and the lateral canal of the models for 10 days in Brain Heart Infusion broth. Biofilms were stained with crystal violet for visualisation. The model halves were reassembled, attached to an apparatus, and tested under a fluorescence microscope. A syringe-and-needle irrigation protocol was performed using 9 mL of 2.5% NaOCl for 60 seconds. The irrigant was either left stagnant in the canal or activated for 30 seconds using manual (gutta-percha), sonic, or ultrasonic methods. Images were then captured every second using an external camera. The percentages of residual biofilm were measured using image analysis software. The data were analysed using generalised linear mixed models.
The greatest removal was associated with the ultrasonic group (66.76%), followed by the sonic (45.49%), manual (43.97%), and passive irrigation (control) (38.67%) groups. No marked difference in the efficiency of NaOCl in removing biofilm was found between the simple and complex anatomy models (p = 0.098). The removal efficacy of NaOCl on the biofilm was limited to the first 1 mm of the lateral canal. Agitation of NaOCl results in better penetration of the irrigant into the lateral canals, and ultrasonic agitation of NaOCl improved the removal of bacterial biofilm.

Keywords: 3D printing, biofilm, root canal irrigation, sodium hypochlorite

Procedia PDF Downloads 220
24997 Lifelong Learning in Applied Fields (LLAF) Tempus Funded Project: A Case Study of Problem-Based Learning

Authors: Nirit Raichel, Dorit Alt

Abstract:

Although university teaching is claimed to have a special task of supporting students in adopting ways of thinking and producing new knowledge anchored in scientific inquiry practices, it is argued that students' learning habits are still overwhelmingly skewed toward passive acquisition of knowledge from authoritative sources rather than from collaborative inquiry activities. In order to overcome this critical gap between current educational goals and instructional methods, the LLAF consortium aims to develop updated instructional practices that put a premium on adaptability to the emerging requirements of present-day society. LLAF has created a practical guide for teachers containing updated pedagogical strategies based on the constructivist approach to learning, arranged along Delors’ four theoretical ‘pillars’ of education: learning to know, learning to do, learning to live together, and learning to be. This presentation is limited to problem-based learning (PBL), a strategy introduced under the second pillar. PBL leads not only to the acquisition of technical skills but also allows the development of skills such as problem analysis and solving, critical thinking, cooperation and teamwork, decision-making, and self-regulation that can be transferred to other contexts. This educational strategy is exemplified by a case study conducted in the pre-piloting stage of the project. The case describes a three-fold process implemented in a postgraduate course for in-service teachers, including: (1) learning about PBL, (2) implementing PBL in the participants' classes, and (3) qualitatively assessing the contributions of PBL to students' outcomes. An example is given of the ways in which PBL was applied and assessed in civic education for high-school students. Two 9th-grade classes participated in the study; both included several students with learning disabilities.
PBL was applied in only one class, whereas traditional instruction was used in the other. Results showed a robust contribution of PBL to students' affective and cognitive outcomes, as reflected in their motivation to engage in learning activities and to further explore the subject. However, students with learning disabilities were less receptive to this "active" and "annoying" environment. Implications of these findings for the LLAF project will be discussed.

Keywords: problem-based learning, higher education, pedagogical strategies

Procedia PDF Downloads 327
24996 Coordination of Traffic Signals on Arterial Streets in Duhok City

Authors: Dilshad Ali Mohammed, Ziyad Nayef Shamsulddin Aldoski, Millet Salim Mohammed

Abstract:

The increase in traffic congestion along urban signalized arterials calls for efficient traffic management. The application of traffic signal coordination can improve traffic operation and safety for a series of signalized intersections along an arterial. The objective of this study is to evaluate the benefits achievable through actuated traffic signal coordination and to compare control delay against the same signalized intersections operating in isolation. To accomplish this purpose, a series of eight signalized intersections located on two major arterials in Duhok City was chosen for the study. Traffic data (traffic volumes, link and approach speeds, and passenger car equivalents) were collected at peak hours. Various methods were used for collecting the data, such as video recording, the moving-vehicle method, and manual counts. Geometric and signalization data were also collected for the purposes of the study. The coupling index was calculated to check coordination attainability, and time-space diagrams were then constructed, some representing one-way coordination for the intersections on Barzani and Zakho Streets and others representing two-way coordination for the intersections on Zakho Street, with acceptable progression bandwidth efficiency. The results of this study show a large progression bandwidth of 54 seconds for the eastbound coordination and 17 seconds for the westbound coordination on Barzani Street under a suggested controlled speed of 60 kph, consistent with the present data. For Zakho Street, the progression bandwidth is 19 seconds eastbound and 18 seconds westbound under a suggested controlled speed of 40 kph. The results show that traffic signal coordination led to a large reduction in intersection control delays on both arterials.
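As a hedged illustration of the time-space reasoning behind such coordination, ideal one-way progression offsets equal the travel time from the first signal at the progression speed. This is not the authors' calculation; the distances below are invented for demonstration, since the abstract does not give the actual intersection spacings.

```python
def progression_offsets(distances_m, speed_kph):
    """Ideal signal offsets (seconds) for one-way progression.

    distances_m: distance from the first intersection to each downstream
    intersection, in metres. Each signal's offset equals the travel time
    from the first intersection at the progression speed.
    """
    speed_ms = speed_kph / 3.6  # km/h -> m/s
    return [round(d / speed_ms, 1) for d in distances_m]
```

For example, hypothetical spacings of 0, 400, and 850 m at the study's 60 kph controlled speed give offsets of 0, 24, and 51 seconds.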

Keywords: bandwidth, congestion, coordination, traffic, signals, streets

Procedia PDF Downloads 292
24995 Computer-Aided Detection of Liver and Spleen from CT Scans using Watershed Algorithm

Authors: Belgherbi Aicha, Bessaid Abdelhafid

Abstract:

In recent years a great deal of research work has been devoted to the development of semi-automatic and automatic techniques for the analysis of abdominal CT images. The first and fundamental step in all these studies is semi-automatic liver and spleen segmentation, which is still an open problem. In this paper, a semi-automatic liver and spleen segmentation method based on mathematical morphology and the watershed algorithm is proposed. Our algorithm proceeds in two parts. In the first, we determine the region of interest by applying morphological operations to extract the liver and spleen. The second step consists of improving the quality of the image gradient: we propose a method for improving the image gradient to reduce the over-segmentation problem, applying spatial filters followed by morphological filters. Thereafter we proceed to the segmentation of the liver and spleen. The aim of this work is to develop a method for semi-automatic liver and spleen segmentation based on the watershed algorithm, to improve the accuracy and robustness of the segmentation, and to evaluate the new semi-automatic approach against manual liver segmentation. To validate the proposed segmentation technique, we tested it on several images. Our segmentation approach is evaluated by comparing our results with manual segmentation performed by an expert; the experimental results are described in the last part of this work. The system has been evaluated by computing the sensitivity and specificity between the semi-automatically segmented (liver and spleen) contours and the contours traced manually by radiological experts. Liver segmentation achieved a sensitivity of 96% and a specificity of 99%; spleen segmentation achieved similarly promising results, with a sensitivity of 95% and a specificity of 99%.
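The watershed step at the heart of this method can be illustrated with a minimal marker-based flooding sketch. This pure-Python toy is not the authors' implementation: a real pipeline operates on a smoothed 2D gradient image with morphologically derived markers, as the abstract describes, whereas here the gradient and markers are tiny hand-made arrays.

```python
import heapq

def watershed(gradient, markers):
    """Minimal marker-based watershed by priority flooding.

    gradient: 2D list of gradient magnitudes. markers: 2D list with 0 for
    unlabelled pixels and positive integers for seed labels (e.g. 1 for a
    liver seed, 2 for a spleen seed). Pixels flood outward from the seeds
    in order of increasing gradient, so basins meet along gradient ridges.
    """
    rows, cols = len(gradient), len(gradient[0])
    labels = [row[:] for row in markers]
    heap = []
    for r in range(rows):
        for c in range(cols):
            if markers[r][c]:
                heapq.heappush(heap, (gradient[r][c], r, c))
    while heap:
        g, r, c = heapq.heappop(heap)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and not labels[nr][nc]:
                labels[nr][nc] = labels[r][c]
                heapq.heappush(heap, (gradient[nr][nc], nr, nc))
    return labels
```

On a one-row image whose gradient peaks in the middle, seeds placed at each end flood toward the centre and the two labels meet at the ridge, which is why reducing spurious gradient maxima (the paper's filtering step) directly reduces over-segmentation.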

Keywords: CT images, liver and spleen segmentation, anisotropic diffusion filter, morphological filters, watershed algorithm

Procedia PDF Downloads 317
24994 Automatic Identification and Monitoring of Wildlife via Computer Vision and IoT

Authors: Bilal Arshad, Johan Barthelemy, Elliott Pilton, Pascal Perez

Abstract:

Getting reliable, informative, and up-to-date information about the location, mobility, and behavioural patterns of animals will enhance our ability to research and preserve biodiversity. The fusion of infra-red sensors and camera traps offers an inexpensive way to collect wildlife data in the form of images. However, extracting useful data from these images, such as the identification and counting of animals, remains a manual, time-consuming, and costly process. In this paper, we demonstrate that such information can be automatically retrieved using state-of-the-art deep learning methods. Another major challenge ecologists face is counting a single animal multiple times because it reappears in other images taken by the same or other camera traps. Nonetheless, such information can be extremely useful for tracking wildlife and understanding its behaviour. To tackle the multiple-count problem, we have designed a meshed network of camera traps, so they can share the captured images along with timestamps, cumulative counts, and the dimensions of the animal. The proposed method leverages edge computing to support real-time tracking and monitoring of wildlife. The method has been validated in the field and can easily be extended to other applications focusing on wildlife monitoring and management, where traditional monitoring is expensive and time-consuming.
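The multiple-count correction can be illustrated with a simplified sketch that pools detections shared across the camera mesh and merges re-sightings within a time window. The 30-minute window and the tuple format are assumptions for illustration; the authors' method also matches on animal dimensions, which is omitted here.

```python
from datetime import datetime, timedelta

def unique_animal_count(detections, window_minutes=30):
    """Estimate unique animals from detections pooled across a camera mesh.

    detections: list of (timestamp, species) tuples from all traps.
    Detections of the same species within `window_minutes` of the previous
    sighting are treated as one animal reappearing.
    """
    window = timedelta(minutes=window_minutes)
    counts = {}
    last_seen = {}
    for ts, species in sorted(detections):
        if species not in last_seen or ts - last_seen[species] > window:
            counts[species] = counts.get(species, 0) + 1
        last_seen[species] = ts
    return counts
```

Because the traps share timestamps over the mesh, this deduplication can run at the edge rather than in post-processing.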

Keywords: computer vision, ecology, internet of things, invasive species management, wildlife management

Procedia PDF Downloads 132
24993 Implementation Principles and Strategies of Bilingual Teaching in Taiwan

Authors: Chinfen Chen

Abstract:

This paper focuses on the challenges and doubts encountered in the implementation of 'bilingual teaching in selected subject areas' and proposes implementation principles and strategies in four areas: curriculum design, teaching strategies, use of the teaching language, and the implementation and operation of bilingual teaching, for school administrative teams to consider when planning bilingual teaching. It also clarifies teachers' doubts about the implementation of bilingual teaching, so as to enhance their willingness and confidence to participate.

Keywords: bilingual education policy, language immersion, partial bilingual education, content knowledge and target language acquisition, inquiry-based teaching

Procedia PDF Downloads 42
24992 Research of Data Cleaning Methods Based on Dependency Rules

Authors: Yang Bao, Shi Wei Deng, WangQun Lin

Abstract:

This paper introduces the concept and principles of data cleaning, analyzes the types and causes of dirty data, and proposes the key steps of a typical cleaning process. It puts forward a scalable and versatile data cleaning framework. For data with attribute dependency relations, it designs several violation-data discovery algorithms expressed as formal formulas, which can identify data inconsistent with the target columns under a conditional attribute dependency, whether the data is structured (SQL) or unstructured (NoSQL), and presents six data cleaning methods based on these algorithms.
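A minimal sketch of violation discovery for one attribute-dependency rule, a plain functional dependency, might look like the following. The framework in the paper is more general (conditional dependencies, six repair methods), and the row-of-dicts format here is an assumption chosen because it covers both relational rows and flattened NoSQL documents.

```python
from collections import defaultdict

def fd_violations(rows, lhs, rhs):
    """Find rows violating the functional dependency lhs -> rhs.

    rows: list of dicts. Rows sharing the same lhs values but differing
    on rhs are returned, grouped by the offending lhs key.
    """
    groups = defaultdict(list)
    for row in rows:
        key = tuple(row[a] for a in lhs)
        groups[key].append(row)
    return {
        key: group
        for key, group in groups.items()
        if len({tuple(r[a] for a in rhs) for r in group}) > 1
    }
```

A cleaning method would then take each violating group and repair it, for instance by keeping the majority rhs value.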

Keywords: data cleaning, dependency rules, violation data discovery, data repair

Procedia PDF Downloads 558
24991 Challenge Response-Based Authentication for a Mobile Voting System

Authors: Tohari Ahmad, Hudan Studiawan, Iwang Aryadinata, Royyana M. Ijtihadie, Waskitho Wibisono

Abstract:

Manual voting systems are implemented worldwide. They have weaknesses that may decrease the legitimacy of the voting result. Electronic voting systems have been introduced to minimize these weaknesses, and they have been able to provide better results in terms of the total time taken in the voting process and accuracy. Nevertheless, people may be reluctant to travel to the polling location for reasons such as distance and time. In order to solve this problem, mobile voting is implemented by utilizing mobile devices. Many mobile voting architectures are available. Overall, the authenticity of users is the common problem of all voting systems: there must be a mechanism that can verify users' authenticity such that each verified user can vote exactly once, and others cannot vote at all. In this paper, a challenge response-based authentication is proposed that utilizes properties of the users, for example, something they have and something they know. In terms of speed, the proposed system provides good results, in addition to the other capabilities it offers.
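The abstract does not specify the exact protocol, but a generic challenge-response exchange built on a shared secret can be sketched with Python's standard hmac module. This is an illustrative sketch, not the authors' scheme: the server sends a fresh random challenge, and the voter's device proves possession of the secret without ever transmitting it.

```python
import hmac
import hashlib
import secrets

def issue_challenge():
    """Server side: generate a fresh random nonce for this login attempt."""
    return secrets.token_hex(16)

def compute_response(shared_secret, challenge):
    """Client side: prove possession of the secret without sending it."""
    return hmac.new(shared_secret.encode(), challenge.encode(),
                    hashlib.sha256).hexdigest()

def verify(shared_secret, challenge, response):
    """Server side: constant-time comparison of the expected response."""
    expected = compute_response(shared_secret, challenge)
    return hmac.compare_digest(expected, response)
```

Because each challenge is single-use, a captured response cannot be replayed to cast a second vote, which addresses the "vote once" requirement above.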

Keywords: authentication, data protection, mobile voting, security

Procedia PDF Downloads 408
24990 Hydroinformatics of Smart Cities: Real-Time Water Quality Prediction Model Using a Hybrid Approach

Authors: Elisa Coraggio, Dawei Han, Weiru Liu, Theo Tryfonas

Abstract:

Water is one of the most important resources for human society. The world is currently undergoing a wave of urban growth, and pollution problems have a great impact. Monitoring water quality is a key task for the future of the environment and the human species. In recent times, researchers using Smart City technologies have been trying to mitigate the problems generated by population growth in urban areas. The availability of huge amounts of data collected by a pervasive urban IoT can increase the transparency of decision making. Several services have already been implemented in Smart Cities, and more and more services will be involved in the future. Water quality monitoring can successfully be implemented in the urban IoT: the combination of water quality sensors, cloud computing, smart city infrastructure, and IoT technology can lead to a bright future for environmental monitoring. In past decades, much effort has been put into monitoring and predicting water quality using traditional approaches based on manual collection and laboratory-based analysis, which are slow and laborious. The present study proposes a methodology for implementing a water quality prediction model using artificial intelligence techniques and compares the results obtained with different algorithms. Furthermore, a 3D numerical model will be created using the software D-Water Quality, and simulation results will be used as a training dataset for the artificial intelligence algorithm. The study derives the methodology and demonstrates its implementation based on information and data collected at the floating harbour in the city of Bristol (UK). The city of Bristol benefits from the Bristol-Is-Open infrastructure, which includes a Wi-Fi network and virtual machines, and it was named the UK's smartest city in 2017.
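As a minimal sketch of the pipeline's final step, training a predictor on simulation output, the following fits a least-squares model to synthetic data. The variable names and the linear form are assumptions for illustration only; they are not the study's model, and real D-Water Quality output would replace the synthetic arrays.

```python
import numpy as np

def fit_linear_model(X, y):
    """Least-squares fit: a minimal stand-in for the AI step described above."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # add intercept column
    coef, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return coef

# Hypothetical simulation output: two predictors (say, temperature and
# flow) driving a water-quality indicator. Names are illustrative.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))
y = 3.0 * X[:, 0] - 1.5 * X[:, 1] + 8.0  # noise-free for the sketch

coef = fit_linear_model(X, y)
```

In the study itself this baseline would be compared against other algorithms, which is the stated purpose of the methodology.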

Keywords: artificial intelligence, hydroinformatics, numerical modelling, smart cities, water quality

Procedia PDF Downloads 174
24989 Dialogic Approaches to Writing Pedagogy

Authors: Yael Leibovitch

Abstract:

Teaching academic writing is a source of concern for secondary schools. Many students struggle to meet basic standards of literacy, while teacher confidence in this arena remains low. These issues are compounded by the conventionally prescriptive character of writing instruction, which fails to engage student writers. At the same time, a growing body of research on dialogic teaching has highlighted the powerful role of talk in student learning. With the intent of enhancing pedagogical capability, this paper shares findings from a co-inquiry case study that investigated how teachers think about and negotiate classroom discourse to position students as effective academic writers and thinkers. Using a range of qualitative methods, the project closely documents the iterative collaboration of educators as they sought to create more opportunities for dialogic engagement. More specifically, it triangulates both teacher and student data regarding the efficacy of interdependent thinking and collaborative reasoning as organizing principles for literacy learning. Findings indicate that a dialogic teaching repertoire helps to develop the cognitive and metacognitive skills of adolescent writers. In addition, they underscore the importance of sustained professional collaboration to the uptake of new writing pedagogies.

Keywords: dialogic teaching, writing, teacher professional development, student literacy

Procedia PDF Downloads 209
24988 Smashed Mirror: Immigrant Students’ Constructions of South Africa

Authors: Vandeyar Saloshna, Vandeyar Hirusellvan

Abstract:

The image of post-apartheid South African society that is reflected in the social mirror of the world is largely one of hope, faith, and aspiration. But is this the reality? Utilizing social constructivism, a case study approach, and narrative inquiry, this chapter sets out to explore the reflection of South African society through the lens of immigrant students. The picture that unfolds is troubling in its negativity. In this chapter, we establish in detail what this picture is about and what implications it holds for South African society.

Keywords: immigrant students, social mirror, xenophobia, identity formation, makwerekwere, expectations

Procedia PDF Downloads 443
24987 Skills and Abilities Expected from Professionals Conducting Serious Crimes Investigations: A Descriptive Study from Turkey

Authors: Burak M. Gonultas

Abstract:

Criminal investigation provides a practical contribution to the apprehension of criminals and the clarification of crimes, while criminology provides the theoretical background. However, studies on criminal investigation, the practical side of this process, are not sufficient. Every crime involves different dynamics in terms of investigation, but investigations of serious crimes are especially versatile and involve complex processes because of the cases in which they are conducted. Therefore, professionals who conduct serious crime investigations differ in some respects from others in the field. The most fundamental element of this differentiation is the skills and abilities of these professionals. According to Eurostat data, Turkey occupies an important position in terms of homicide rates; accordingly, the practice of serious crime investigation in Turkey is specialized. The present study aims to identify the skills and abilities expected of professionals conducting an effective serious criminal investigation in Turkey, and so to offer a number of suggestions. Twenty-five skills and abilities drawn from the literature were put to professionals (n = 289) using a semi-structured form, across the 5 provinces with the highest and the 2 provinces with the lowest numbers of serious crime cases. Three data categories were collected: (1) the five most important skills and abilities, (2) the most important skills for knowledge and inquiry management, and (3) the abilities and skills that stand out in each of the five stages of a serious criminal investigation. The most highly rated skills and abilities, out of 1,010 responses, were investigative skill (13%, n = 134), planning/designing (9.2%, n = 95), and interpersonal relations/communication (8.8%, n = 91). While the first and second categories inform the selection of these professionals, the third also suggests how, and what type of, training should be given to them. This study differs from other work in the area in addressing separately the skills and abilities expected at each stage of the investigation, and in its choice of methodology.

Keywords: ability, criminal investigation, criminology, homicide, serious crimes, skill, Turkey

Procedia PDF Downloads 271
24986 Behavior of Engineering Students in Kuwait University

Authors: Mohammed A. Al-Ajmi, Reem S. Al-Kandari

Abstract:

This study is concerned with the behavior of engineering students at Kuwait University, which has become a concern amid global issues in education at all levels. A survey was conducted to identify academic and societal issues that affect engineering students' performance. The study draws major conclusions regarding private tutoring and the online availability of textbooks' solution manuals.

Keywords: solution manual, engineering, textbook, ethics

Procedia PDF Downloads 484
24985 Number Variation of the Personal Pronoun We in American Spoken English

Authors: Qiong Hu, Ming Yue

Abstract:

Language variation signals the newest usage of a language community, which may become the developmental trend of that language. The personal pronoun we is prescribed as a plural pronoun in grammar, but its number value is more flexible in actual use. Based on a self-compiled Friends corpus, the present research explores the number value of the first person pronoun we in present-day American spoken English. In view of the subjectivity of we, this paper uses 'we + PCU (perception-cognition-utterance) verb' collocations and 'we + plural category' collocations as parameters. Results from corpus data and manual annotation show that: 1) the overall frequency of we has been increasing; 2) we has been increasingly used with other plural categories, indicating a weakening of its plural reference; and 3) we has been increasingly used with PCU verbs of strong subjectivity, indicating a strengthening of its singular reference. All of this seems to support our hypothesis that we is undergoing further grammaticalization towards a singular reference, though future evidence is needed to attest this bold prediction.
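The 'we + PCU verb' parameter described above can be sketched as a simple bigram count over a tokenized transcript. This is an illustrative sketch only: the PCU verb list and the token sample are assumptions for demonstration, not taken from the actual Friends corpus.

```python
from collections import Counter

# Illustrative subset of perception-cognition-utterance (PCU) verbs;
# the study's actual verb list is not reproduced here.
PCU_VERBS = {"think", "know", "believe", "see", "feel", "say", "guess", "mean"}

def we_pcu_collocations(tokens):
    # Count "we + PCU verb" bigrams, the collocation parameter used
    # to gauge the subjective (singular-leaning) reading of "we".
    counts = Counter()
    for w1, w2 in zip(tokens, tokens[1:]):
        if w1 == "we" and w2 in PCU_VERBS:
            counts[w2] += 1
    return counts

# Toy transcript fragment (lower-cased, pre-tokenized).
tokens = "we know that we think we are late and we see it".split()
counts = we_pcu_collocations(tokens)
```

Relative frequencies of such collocations across corpus time slices would then show whether the subjective use of "we" is rising.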

Keywords: number, PCU verbs, personal pronoun we

Procedia PDF Downloads 226
24984 A Dataset of Program Educational Objectives Mapped to ABET Outcomes: Data Cleansing, Exploratory Data Analysis and Modeling

Authors: Addin Osman, Anwar Ali Yahya, Mohammed Basit Kamal

Abstract:

Datasets or collections are becoming important assets in themselves and can now be accepted as a primary intellectual output of research. The quality and usage of a dataset depend mainly on the context in which it has been collected, processed, analyzed, validated, and interpreted. This paper presents a collection of program educational objectives mapped to student outcomes, gathered from self-study reports prepared by 32 engineering programs accredited by ABET. The manual mapping (classification) of these data is a notoriously tedious, time-consuming process and, in addition, requires experts in the area, who are mostly unavailable. The operational settings under which the collection was produced are described. The collection has been cleansed and preprocessed, some features have been selected, and preliminary exploratory data analysis has been performed to illustrate the properties and usefulness of the collection. Finally, the collection has been benchmarked using nine of the most widely used supervised multi-label classification techniques (Binary Relevance, Label Powerset, Classifier Chains, Pruned Sets, Random k-label sets, Ensemble of Classifier Chains, Ensemble of Pruned Sets, Multi-Label k-Nearest Neighbors, and Back-Propagation Multi-Label Learning). The techniques have been compared using well-known measures (accuracy, Hamming loss, micro-F, and macro-F). The Ensemble of Classifier Chains and Ensemble of Pruned Sets achieved encouraging performance compared to the other multi-label classification methods, while the Classifier Chains method showed the worst performance. To recap, the benchmark has achieved promising results, building on the preliminary exploratory data analysis performed on the collection, proposing new trends for research, and providing a baseline for future studies.
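Two of the benchmark measures named above, Hamming loss and micro-averaged F1, can be sketched in a few lines for binary label-indicator matrices. The toy matrices below are illustrative; this is not the paper's evaluation code.

```python
def hamming_loss(y_true, y_pred):
    # Fraction of individual (sample, label) slots predicted incorrectly.
    n = sum(len(row) for row in y_true)
    wrong = sum(t != p for rt, rp in zip(y_true, y_pred)
                for t, p in zip(rt, rp))
    return wrong / n

def micro_f1(y_true, y_pred):
    # Pool true positives, false positives, and false negatives over
    # all labels before computing F1 (micro averaging).
    pairs = [(t, p) for rt, rp in zip(y_true, y_pred)
             for t, p in zip(rt, rp)]
    tp = sum(t == p == 1 for t, p in pairs)
    fp = sum(t == 0 and p == 1 for t, p in pairs)
    fn = sum(t == 1 and p == 0 for t, p in pairs)
    return 2 * tp / (2 * tp + fp + fn)

# Two samples, three labels each (toy data).
Y_true = [[1, 0, 1], [0, 1, 0]]
Y_pred = [[1, 1, 1], [0, 1, 0]]
hl = hamming_loss(Y_true, Y_pred)   # one wrong slot out of six
f1 = micro_f1(Y_true, Y_pred)
```

Macro-F differs only in averaging per-label F1 scores instead of pooling counts, which weights rare labels equally with frequent ones.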

Keywords: ABET, accreditation, benchmark collection, machine learning, program educational objectives, student outcomes, supervised multi-class classification, text mining

Procedia PDF Downloads 163
24983 Development of an Interactive and Robust Image Analysis and Diagnostic Tool in R for Early Detection of Cervical Cancer

Authors: Kumar Dron Shrivastav, Ankan Mukherjee Das, Arti Taneja, Harpreet Singh, Priya Ranjan, Rajiv Janardhanan

Abstract:

Cervical cancer is one of the most common cancers among women worldwide, and it can be cured if detected early. The manual pathology typically used at present has many limitations. The current gold standard for cervical cancer diagnosis is exhaustive and time-consuming because it relies heavily on the subjective judgment of oncopathologists, which leads to misdiagnosis and missed diagnoses, resulting in false negatives and false positives. To reduce the time and complexity associated with early diagnosis, an interactive diagnostic tool for early detection is required, particularly in developing countries where cervical cancer incidence and related mortality are high. Replacing manual pathology with digital pathology for cervical cancer screening and diagnosis can increase precision and strongly reduce the chance of error in a time-specific manner. We therefore propose a robust and interactive cervical cancer image analysis and diagnostic tool that can process both histopathological and cytopathological images to identify abnormal cells in the least amount of time and in settings with minimal resources. Furthermore, one of the major highlights of the tool is the incorporation, with the help of the open-source software R, of a set of specific parameters typically referred to for the identification of abnormal cells. The software can automatically identify and quantify morphological features, color intensity, sensitivity, and other parameters digitally to differentiate abnormal from normal cells, which may improve and accelerate screening and early diagnosis, ultimately leading to timely treatment of cervical cancer.
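The kind of per-cell quantification described (area, mean color intensity per segmented region) can be sketched in a language-agnostic way as follows; the actual tool is written in R, and the labeled mask and intensity values here are illustrative toy data.

```python
def region_stats(labels, intensity):
    # Per-label pixel area and mean intensity for a labeled mask,
    # a stand-in for the morphological and color-intensity parameters
    # the tool quantifies per cell (label 0 = background).
    acc = {}
    for row_l, row_i in zip(labels, intensity):
        for lab, val in zip(row_l, row_i):
            if lab == 0:
                continue
            area, total = acc.get(lab, (0, 0))
            acc[lab] = (area + 1, total + val)
    return {lab: {"area": a, "mean_intensity": t / a}
            for lab, (a, t) in acc.items()}

# Toy 2x3 image: two labeled regions over a grayscale intensity grid.
labels = [[1, 1, 0],
          [0, 2, 2]]
intensity = [[10, 20, 0],
             [0, 30, 50]]
stats = region_stats(labels, intensity)
```

Thresholding such per-region statistics is one simple way to flag candidate abnormal cells for a pathologist's review.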

Keywords: cervical cancer, early detection, digital pathology, screening

Procedia PDF Downloads 169
24982 Fecal Immunochemical Testing to Deter Colon Cancer

Authors: Valerie A. Conrade

Abstract:

Introduction: A large body of literature suggests that patients who complete fecal immunochemical testing (FIT) kits are likely to identify colorectal cancer sooner than those who do not. Background: Patients who do not participate in preventative measures such as the FIT kit are at higher risk of colorectal cancer growing unnoticed. The objective was to determine whether the method the principal investigator (PI) used to educate clinical staff on the importance of FIT kit administration increased FIT kit dissemination to patients after the education. Methodologies: Data were collected via manual tallies before and after the clinical staff were educated on the importance of FIT kits. Results: The results showed an increase in FIT kit dissemination after the clinical staff education. Enhanced instruction of the clinical staff on the importance of FIT kits, expanding their knowledge of preventative measures to detect colorectal cancer, positively impacted nurses and, in turn, their patients.

Keywords: colon cancer, education, fecal immunochemical testing, nursing

Procedia PDF Downloads 127
24981 Computer-Aided Detection of Simultaneous Abdominal Organ CT Images by Iterative Watershed Transform

Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid

Abstract:

Interpretation of medical images benefits from anatomical and physiological priors to optimize computer-aided diagnosis applications. Segmentation of the liver, spleen, and kidneys is regarded as a major primary step in the computer-aided diagnosis of abdominal organ diseases. In this paper, a semi-automated method for abdominal organ segmentation in medical image data using mathematical morphology is presented. Our proposed method is based on hierarchical segmentation and the watershed algorithm, and incorporates a powerful technique designed to suppress over-segmentation, based on the mosaic image and on the computation of the watershed transform. Our algorithm proceeds in two parts. In the first, we seek to improve the quality of the gradient-mosaic image; here we propose applying an anisotropic diffusion filter followed by morphological filters. Thereafter, we proceed to the hierarchical segmentation of the liver, spleen, and kidneys. To validate the proposed segmentation technique, we have tested it on several images, evaluating our approach by comparing its results with manual segmentation performed by an expert. The experimental results are described in the last part of this work.
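The marker-based watershed step can be sketched as a priority flood: labeled marker pixels are expanded outward in order of increasing gradient value, so region boundaries settle on gradient ridges. This is a minimal illustration of the transform on a toy gradient image, not the authors' implementation (which additionally builds a mosaic image and applies anisotropic diffusion and morphological filtering first).

```python
import heapq

def watershed(gradient, markers):
    # Marker-based watershed by priority flooding: pixels are visited
    # in order of increasing gradient value, and each unlabeled pixel
    # inherits the label of the region that reaches it first.
    rows, cols = len(gradient), len(gradient[0])
    labels = [row[:] for row in markers]  # 0 = unlabeled
    heap = []
    for r in range(rows):
        for c in range(cols):
            if labels[r][c]:
                heapq.heappush(heap, (gradient[r][c], r, c))
    while heap:
        _, r, c = heapq.heappop(heap)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and not labels[nr][nc]:
                labels[nr][nc] = labels[r][c]
                heapq.heappush(heap, (gradient[nr][nc], nr, nc))
    return labels

# Toy 1x5 gradient with a ridge (value 5) separating two basins,
# and one marker seed per basin.
gradient = [[0, 1, 5, 1, 0]]
markers = [[1, 0, 0, 0, 2]]
result = watershed(gradient, markers)
```

On real CT data the markers would come from the hierarchical segmentation stage, one seed region per organ.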

Keywords: anisotropic diffusion filter, CT images, morphological filter, mosaic image, simultaneous organ segmentation, the watershed algorithm

Procedia PDF Downloads 429
24980 Mining Big Data in Telecommunications Industry: Challenges, Techniques, and Revenue Opportunity

Authors: Hoda A. Abdel Hafez

Abstract:

Mining big data represents a big challenge nowadays. Much research is concerned with mining massive amounts of data and big data streams, and mining big data faces many challenges, including scalability, speed, heterogeneity, accuracy, provenance, and privacy. In the telecommunication industry, mining big data is like mining for gold: it represents a big opportunity for maximizing revenue streams. This paper discusses the characteristics of big data (volume, variety, velocity, and veracity), data mining techniques and tools for handling very large data sets, mining big data in telecommunication, and the benefits and opportunities gained from them.

Keywords: mining big data, big data, machine learning, telecommunication

Procedia PDF Downloads 395
24979 Narrative Research in Secondary Teacher Education: Examining the Self-Efficacy of Content Area Teacher Candidates

Authors: Tiffany Karalis Noel

Abstract:

The purpose of this study was to examine the factors attributed to the self-efficacy of beginning secondary content area teachers as they moved through their student teaching experiences. The study used a narrative inquiry methodology to understand the variables attributed to teacher self-efficacy among a group of secondary content area teacher candidates; the primary purpose of using this methodology was to share the stories of the candidates' student teaching experiences. Focused research questions included: (1) To what extent does teacher education preparation affect the self-efficacy of beginning content area teachers? (2) Which recurrent elements of teacher education affect the self-efficacy of beginning teachers, regardless of content area? (3) How do the findings from research questions 1 and 2 inform teacher educators? The findings suggest that teacher education preparation affects the self-efficacy of beginning secondary teacher candidates across the content areas; accordingly, they provide insight for teacher educators into the areas where teacher education programs are failing to provide adequate preparation. These teacher candidates emphasized the value of adequate preparation throughout their teacher education programs in informing their student teaching experiences. In order to feel effective and successful as beginning teachers, they required additional opportunities to apply their teaching skills in practice prior to the student teaching experience, the incorporation of classroom management coursework into their curriculum, and opportunities to explore the extensive demands of the teaching profession, ranging from time management to dealing with difficult parents, to name a few.
The teacher candidates experienced self-doubt about their effectiveness as teachers when they were unable to employ successful classroom management strategies or pedagogical techniques, or to feel confident navigating challenging conversations with students, parents, and/or administrators. To help future teacher candidates, and beginning teachers in general, overcome these barriers, additional coursework, fieldwork, and practical application experiences should be provided in teacher education programs to boost the self-efficacy of student teachers.

Keywords: self-efficacy, teacher efficacy, secondary preservice teacher education, teacher candidacy, student teaching

Procedia PDF Downloads 147
24978 Computer Science and Mathematics Collaborating to Create New Educational Opportunities While Developing Interactive Calculus Apps

Authors: R. Pargas, M. Reba

Abstract:

Since 2006, the School of Computing and the Department of Mathematical Sciences have collaborated on several industry and NSF grants to develop new uses of technology in teaching and learning. Clemson University’s Creative Inquiry Program allowed computer science and mathematics students to earn credit each semester for participating in seminars which introduced them to new areas for independent research. We will discuss how the development of three interactive instructional apps for Calculus resulted not only in a useful product, but also in unique educational benefits for both the computer science students and the mathematics students, graduate and undergraduate, involved in the development process.

Keywords: calculus, apps, programming, mathematics

Procedia PDF Downloads 397
24977 The “Bright Side” of COVID-19: Effects of Livestream Affordances on Consumer Purchase Willingness: Explicit IT Affordances Perspective

Authors: Isaac Owusu Asante, Yushi Jiang, Hailin Tao

Abstract:

Live streaming marketing, a new element of electronic commerce, became an additional marketing channel following the COVID-19 pandemic, and many sellers have leveraged the features it presents to increase sales. Studies of live streaming have focused on gaming and on consumers' loyalty to brands, using interview questionnaires. This study, by contrast, was conducted to measure real-time observable interactions between consumers and sellers. Based on affordance theory, this study conceptualized constructs representing the interactive features and examined how they drive consumers' purchase willingness during live streaming sessions, using 1,238 observations from Amazon Live obtained through manual observation of transaction records. Within a structural equation modeling framework, ordinary least squares regression suggests that live viewers, new followers, live chats, and likes positively affect purchase willingness, and the Sobel and Monte Carlo tests show that new followers, live chats, and likes significantly mediate the relationship between live viewers and purchase willingness. The study introduces a new way of measuring interactions in live streaming commerce and proposes a way to gather data on consumer behavior manually on live streaming platforms whose application programming interfaces (APIs) do not support data mining algorithms.
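The Sobel test mentioned above assesses whether an indirect (mediated) effect a*b is significant, where a is the regression path from predictor (e.g., live viewers) to mediator (e.g., live chats) and b the path from mediator to outcome (purchase willingness). A minimal sketch, with illustrative coefficients rather than the paper's estimates:

```python
import math

def sobel_z(a, sa, b, sb):
    # Sobel test statistic for the indirect effect a*b:
    #   a, sa: path coefficient predictor -> mediator and its std. error
    #   b, sb: path coefficient mediator -> outcome and its std. error
    return (a * b) / math.sqrt(b**2 * sa**2 + a**2 * sb**2)

# Illustrative values, not results from the study.
z = sobel_z(a=0.5, sa=0.1, b=0.4, sb=0.1)
significant = abs(z) > 1.96  # two-tailed test at the 5% level
```

The Monte Carlo alternative instead resamples a and b from their estimated sampling distributions and checks whether the confidence interval of the simulated products excludes zero, which avoids the Sobel test's normality assumption.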

Keywords: livestreaming marketing, live chats, live viewers, likes, new followers, purchase willingness

Procedia PDF Downloads 68
24976 PaSA: A Dataset for Patent Sentiment Analysis to Highlight Patent Paragraphs

Authors: Renukswamy Chikkamath, Vishvapalsinhji Ramsinh Parmar, Christoph Hewel, Markus Endres

Abstract:

Given a patent document, identifying distinct semantic annotations is an interesting research problem. Text annotation helps patent practitioners, such as examiners and patent attorneys, to quickly identify the key arguments of an invention, in turn providing a timely marking-up of the patent text. In manual patent analysis, it is common practice to mark paragraphs in order to recognise semantic information and attain better readability, but this annotation process is laborious and time-consuming. To alleviate this problem, we propose a dataset for training machine learning algorithms to automate the highlighting process. The contributions of this work are: i) a multi-class dataset of 150k samples, developed by traversing USPTO patents over a decade; ii) statistics and distributions of the data, articulated through exploratory data analysis; iii) baseline machine learning models that utilize the dataset to address the patent paragraph highlighting task; and iv) a path for extending this work with deep learning and domain-specific pre-trained language models to develop a highlighting tool. This work assists patent practitioners in highlighting semantic information automatically and aids in creating sustainable and efficient patent analysis using machine learning.
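A baseline of the kind described can be sketched as a tiny multinomial Naive Bayes text classifier over paragraph labels. The two training paragraphs and their labels below are illustrative toy data, not drawn from the actual PaSA dataset.

```python
import math
from collections import Counter

class NaiveBayes:
    # Multinomial Naive Bayes with add-one (Laplace) smoothing,
    # a classic baseline for multi-class text classification.
    def fit(self, docs, labels):
        self.vocab = {w for d in docs for w in d.split()}
        self.priors, self.word_logp = {}, {}
        for lab in set(labels):
            docs_l = [d for d, l in zip(docs, labels) if l == lab]
            counts = Counter(w for d in docs_l for w in d.split())
            total = sum(counts.values())
            self.priors[lab] = math.log(len(docs_l) / len(docs))
            self.word_logp[lab] = {
                w: math.log((counts[w] + 1) / (total + len(self.vocab)))
                for w in self.vocab}
        return self

    def predict(self, doc):
        scores = {}
        for lab in self.priors:
            s = self.priors[lab]
            for w in doc.split():
                if w in self.vocab:  # ignore out-of-vocabulary words
                    s += self.word_logp[lab][w]
            scores[lab] = s
        return max(scores, key=scores.get)

# Toy paragraphs with hypothetical highlight labels.
docs = ["the claim is broad", "the figure shows a device"]
labels = ["claim", "figure"]
pred = NaiveBayes().fit(docs, labels).predict("claim broad")
```

On the real dataset one would replace the raw word counts with TF-IDF features or a pre-trained language model, as the paper's future-work path suggests.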

Keywords: machine learning, patents, patent sentiment analysis, patent information retrieval

Procedia PDF Downloads 83
24975 Hyper Parameter Optimization of Deep Convolutional Neural Networks for Pavement Distress Classification

Authors: Oumaima Khlifati, Khadija Baba

Abstract:

Pavement distress is the main factor responsible for the deterioration of road structure durability, damage to vehicles, and reduced driver comfort, and transportation agencies spend a high proportion of their funds on pavement monitoring and maintenance. The auscultation of pavement distress has traditionally been based on manual surveys, which are extremely time-consuming, labor-intensive, and require domain expertise. Automatic distress detection is therefore needed to reduce the cost of manual inspection and to avoid more serious damage by implementing the appropriate remediation actions at the right time. Inspired by recent deep learning applications, this paper proposes an algorithm for automatic road distress detection and classification using a deep convolutional neural network (DCNN). In this study, the types of pavement distress are classified as transverse or longitudinal cracking, alligator cracking, pothole, or intact pavement, and the dataset used is composed of public asphalt pavement images. In order to learn the structure of the different types of distress, the DCNN models are trained and tested on a multi-label classification task. In addition, to obtain the highest accuracy for our model, we adjust the structural hyperparameters, such as the number of convolution and max-pooling layers, the number and size of filters, the loss function, the activation functions, and the optimizer, as well as the fine-tuning hyperparameters, namely the batch size and learning rate. The optimization of the model is executed by checking all feasible combinations and selecting the best-performing one. After optimization, performance metrics are calculated, describing the training and validation accuracies, precision, recall, and F1 score.
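The exhaustive search over hyperparameter combinations can be sketched as follows. The grid values are illustrative, and the scoring function is a deterministic stand-in for actually training and validating a DCNN on the pavement images.

```python
from itertools import product

def grid_search(grid, score_fn):
    # Enumerate every combination in the grid and keep the one
    # with the highest validation score (first winner kept on ties).
    names = sorted(grid)
    best_cfg, best_score = None, float("-inf")
    for values in product(*(grid[n] for n in names)):
        cfg = dict(zip(names, values))
        s = score_fn(cfg)
        if s > best_score:
            best_cfg, best_score = cfg, s
    return best_cfg, best_score

# Hypothetical search space over the hyperparameters named above.
grid = {"filters": [16, 32], "kernel": [3, 5],
        "lr": [1e-2, 1e-3], "batch": [16, 32]}

def mock_score(cfg):
    # Stand-in for train-and-validate: returns a fake accuracy that
    # favours more filters, smaller kernels, and the smaller lr.
    return (cfg["filters"] / 32 - cfg["kernel"] / 10
            + (0.5 if cfg["lr"] == 1e-3 else 0.0))

best_cfg, best_score = grid_search(grid, mock_score)
```

Exhaustive search is tractable only for small grids; for larger DCNN search spaces, random search or Bayesian optimization is the usual substitute.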

Keywords: pavement distress, hyperparameters, automatic classification, deep learning

Procedia PDF Downloads 79
24974 The Problem of Access to Water, Sanitation and Hygiene in Small Island Towns: The Case of Foundiougne in Senegal

Authors: El Hadji Mamadou Sonko, Ndiogou Sankhare, Maïmouna Lo, Jean Birane Gning, Cheikh Diop

Abstract:

In Senegal, access to water, hygiene, and sanitation in small island towns is a particular problem that is still poorly understood by the public authorities and development aid actors. The main objective of this study, carried out in the Municipality of Foundiougne, is to contribute to knowledge of the problems related to the supply of drinking water and access to sanitation and hygiene in small island towns in Senegal. The methodology adopted consisted of a literature review and quantitative surveys of a sample of 100 households in the Municipality. Semi-structured interviews using interview guides, as well as informal interviews, were also conducted with mechanical and manual emptiers, municipal authorities, public toilet managers, and neighbourhood leaders. Direct observation with photography was also used. The results show that, with regard to access to drinking water, 35% of households have unimproved water services, 46% have a limited level of service, and 19% have a basic level of service. Regarding sanitation, 77% of households are considered to have access to basic sanitation services, compared to 23% with limited sanitation services. However, these figures hide the dysfunctions of the sanitation system. Indeed, manual emptying is practiced exclusively by 4% of households, while 17% of households combine it with mechanical emptying. In addition, domestic wastewater is mainly evacuated outside the sanitation facilities, and all the sludge extracted from the pits is discharged directly into the environment without treatment. The surveys also showed that 52% of households do not have access to a basic level of hygiene related to handwashing after using the toilet. These results show that there is real work to be done in small urban centres if SDG 6 is to be achieved.

Keywords: Foundiougne, Senegal, small island, small town, water-sanitation, hygiene

Procedia PDF Downloads 84