Search results for: offensive language detection
1247 Submarine Topography and Beach Survey of Gang-Neung Port in South Korea, Using Multi-Beam Echo Sounder and Shipborne Mobile Light Detection and Ranging System
Authors: Won Hyuck Kim, Chang Hwan Kim, Hyun Wook Kim, Myoung Hoon Lee, Chan Hong Park, Hyeon Yeong Park
Abstract:
We conducted a submarine topography and beach survey between December 2015 and January 2016 using a multi-beam echo sounder, the EM3001 (Kongsberg), and a shipborne mobile LiDAR system. The survey area was Anmok beach in Gangneung, South Korea. We built the shipborne mobile LiDAR system for this survey. It comprises a LiDAR (RIEGL LMS-420i), an IMU (Inertial Measurement Unit, MAGUS Inertial+) and an RTK-GNSS (Real Time Kinematic Global Navigation Satellite System, LEIAC GS 15 GS25) for beach measurement, motion compensation of the LiDAR, and precise positioning. The system scans the beach with the laser from the moving vessel and was mounted on top of the vessel. Before the beach survey, we conducted an eight-circle IMU calibration run to stabilize the IMU heading. The survey should be carried out as close as possible to the beach, but our vessel could not approach any closer because of obstacles in the water. At the same time, we conducted the submarine topography survey using the EM3001 multi-beam echo sounder, a device that observes and records the submarine topography using sound waves. The echo sounder was mounted on the left side of the vessel, and the vessel was equipped with a motion sensor, a DGNSS (Differential Global Navigation Satellite System) and an SV (sound velocity) sensor for motion compensation, positioning and measurement of the sound velocity of seawater. Compared with conventional beach survey methods, the shipborne mobile LiDAR system reduced the time required for the beach survey.
Keywords: Anmok, beach survey, Shipborne Mobile LiDAR System, submarine topography
Procedia PDF Downloads 427
1246 Developing English L2 Critical Reading and Thinking Skills through the PISA Reading Literacy Assessment Framework: A Case Study of EFL Learners in a Thai University
Authors: Surasak Khamkhong
Abstract:
This study aimed to investigate the use of the PISA reading literacy assessment framework (PRF) to improve EFL learners’ critical reading and thinking skills. The sample group, selected by purposive sampling, comprised 36 EFL learners from a university in Northeastern Thailand. The instruments consisted of 8 PRF-based reading lessons, a 27-item PRF-based reading test used as a pre-test and post-test, and a questionnaire on attitudes toward the designed lessons. The statistics used for data analysis were percentage, mean, standard deviation, and the Wilcoxon signed-rank test. The results revealed that before the intervention the students’ English reading proficiency was low, as is evident from their low pre-test scores (M=14.00). They did fairly well on the access-and-retrieve questions (M=6.11) but poorly on the integrate-and-interpret questions (M=4.89) and the reflect-and-evaluate questions (M=3.00). This means that the students could comprehend the texts but could hardly interpret or evaluate them. After the intervention, however, they did better, as their post-test scores were higher (M=18.01): they could comprehend (M=6.78), interpret (M=6.00) and evaluate (M=5.25) well. This means that after the intervention their critical reading skills had improved. In terms of their attitude towards the designed lessons and instruction, most students were satisfied with both. It may thus be concluded that the designed lessons can help improve students’ English critical reading proficiency and may be used as a teaching model for improving EFL learners’ critical reading skills.
Keywords: second language reading, critical reading and thinking skills, PISA reading literacy framework, English L2 reading development
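The pre/post comparison described above relies on the Wilcoxon signed-rank test. Below is a minimal sketch of that analysis in Python; the paired scores are randomly generated placeholders, not the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical paired scores for 36 learners (placeholders, not the study's data)
rng = np.random.default_rng(0)
pre_test = rng.normal(14.0, 3.0, 36).round()
post_test = rng.normal(18.0, 3.0, 36).round()

# Wilcoxon signed-rank test on the paired differences
stat, p_value = stats.wilcoxon(post_test, pre_test)
print(f"mean pre={pre_test.mean():.2f}, mean post={post_test.mean():.2f}")
print(f"Wilcoxon W={stat:.1f}, p={p_value:.4f}")
```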
Procedia PDF Downloads 188
1245 Seashore Debris Detection System Using Deep Learning and Histogram of Gradients-Extractor Based Instance Segmentation Model
Authors: Anshika Kankane, Dongshik Kang
Abstract:
Marine debris has a significant influence on coastal environments, damaging biodiversity and causing losses to the marine and ocean sector. A functional, cost-effective and automatic approach is needed to address this problem. Computer vision combined with a deep-learning model is proposed to identify and categorize marine debris of seven kinds at different beach locations in Japan. This research compares state-of-the-art deep-learning models with a suggested architecture that is used as a feature extractor for debris categorization. The model detects seven categories of litter in a manually constructed debris dataset, using Mask R-CNN for instance segmentation together with a shape-matching network called HOGShape, so that detected debris can be cleaned up in time by clean-up organizations alerted through the system's warning notifications. The dataset was created by annotating images taken by a fixed KaKaXi camera with the CVAT annotation tool using seven category labels. A HOG feature extractor trained with LIBSVM is used, and multiple-template matching between HOG maps of images and HOG maps of templates improves the masks predicted by the trained Mask R-CNN. The system is intended to alert clean-up organizations in a timely manner using live-recorded beach debris data. The suggested network improves the misclassified masks of debris objects with different illuminations, shapes and viewpoints, as well as occluded litter with low visibility.
Keywords: computer vision, debris, deep learning, fixed live camera images, histogram of gradients feature extractor, instance segmentation, manually annotated dataset, multiple template matching
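The HOG-plus-template-matching step can be illustrated with a small sketch. The code below is a minimal, hypothetical example (not the authors' implementation) that compares the HOG descriptor of a candidate patch against debris templates:

```python
import numpy as np
from skimage.feature import hog
from skimage.transform import resize

def hog_descriptor(patch, size=(128, 128)):
    """HOG descriptor of a grayscale patch, resized to a fixed size first."""
    patch = resize(patch, size, anti_aliasing=True)
    return hog(patch, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

def template_similarity(patch, templates):
    """Cosine similarity between a candidate patch and each debris template."""
    d = hog_descriptor(patch)
    scores = []
    for t in templates:
        td = hog_descriptor(t)
        scores.append(float(np.dot(d, td) /
                            (np.linalg.norm(d) * np.linalg.norm(td) + 1e-9)))
    return scores  # the highest-scoring template suggests the debris category

# Usage idea: pass the region cropped from a Mask R-CNN mask as `patch`
# and one representative image per debris class as `templates`.
```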
Procedia PDF Downloads 104
1244 Forensic Medical Capacities of Research of Saliva Stains on Physical Evidence after Washing
Authors: Saule Mussabekova
Abstract:
Recent advances in genetics have sharply increased the capacity to produce reliable evidence in forensic examinations. Traces of biological origin are therefore important sources of information about a crime. Sexual offenses have increased around the world, and among them are cases in which the criminals use various detergents to remove traces of their crime. A feature of modern synthetic detergents is the presence of biological additives, enzymes, which purposefully destroy stains of biological origin. To study the nature and extent of the impact of modern washing powders on saliva stains on physical evidence, specially prepared test specimens of different types of fabric to which saliva had been applied were examined. Materials and Methods: Washing machines from well-known household appliance manufacturers, with different technical characteristics, and widely advertised brands of washing powder were used for the test washes. Over 3,500 experimental samples were tested. After washing, the traces of saliva were identified using modern forensic research methods. Results: The influence of different washing programs, types of washing machine and washing powders on the detection and identification of saliva stains on physical evidence was examined, and the relevant dependencies were revealed. The results of experimental and practical expert studies showed that in most cases it is not possible to draw conclusions when identifying saliva traces on physical evidence after washing. This is a consequence of the effect of biological additives and other additional factors on traces of saliva during washing. Conclusions: On the basis of the results of the study, the feasibility of examining saliva stains on physical evidence after washing is established. The use of modern molecular genetic methods makes it possible to partially solve the problems arising in the study of such washed evidence. Additional study of physical evidence after washing facilitates the detection and investigation of sexual offenses against women and children.
Keywords: saliva research, modern synthetic detergents, laundry detergents, forensic medicine
Procedia PDF Downloads 215
1243 Highly Responsive p-NiO/n-rGO Heterojunction Based Self-Powered UV Photodetectors
Authors: P. Joshna, Souvik Kundu
Abstract:
Detection of ultraviolet (UV) radiation is very important as it has exhibited a profound influence on humankind and other existences, including military equipment. In this work, a self-powered UV photodetector was reported based on oxides heterojunctions. The thin films of p-type nickel oxide (NiO) and n-type reduced graphene oxide (rGO) were used for the formation of p-n heterojunction. Low-Cost and low-temperature chemical synthesis was utilized to prepare the oxides, and the spin coating technique was employed to deposit those onto indium doped tin oxide (ITO) coated glass substrates. The top electrode platinum was deposited utilizing physical vapor evaporation technique. NiO offers strong UV absorption with high hole mobility, and rGO prevents the recombination rate by separating electrons out from the photogenerated carriers. Several structural characterizations such as x-ray diffraction, atomic force microscope, scanning electron microscope were used to study the materials crystallinity, microstructures, and surface roughness. On one side, the oxides were found to be polycrystalline in nature, and no secondary phases were present. On the other side, surface roughness was found to be low with no pit holes, which depicts the formation of high-quality oxides thin films. Whereas, x-ray photoelectron spectroscopy was employed to study the chemical compositions and oxidation structures. The electrical characterizations such as current-voltage and current response were also performed on the device to determine the responsivity, detectivity, and external quantum efficiency under dark and UV illumination. This p-n heterojunction device offered faster photoresponse and high on-off ratio under 365 nm UV light illumination of zero bias. The device based on the proposed architecture shows the efficacy of the oxides heterojunction for efficient UV photodetection under zero bias, which opens up a new path towards the development of self-powered photodetector for environment and health monitoring sector.Keywords: chemical synthesis, oxides, photodetectors, spin coating
Procedia PDF Downloads 122
1242 Coupling Static Multiple Light Scattering Technique With the Hansen Approach to Optimize Dispersibility and Stability of Particle Dispersions
Authors: Guillaume Lemahieu, Matthias Sentis, Giovanni Brambilla, Gérard Meunier
Abstract:
Static Multiple Light Scattering (SMLS) has been shown to be a straightforward technique for the characterization of colloidal dispersions without dilution, as multiply scattered light in backscattered and transmitted mode is directly related to the concentration and size of scatterers present in the sample. In this view, the use of SMLS for stability measurement of various dispersion types has already been widely described in the literature. Indeed, starting from a homogeneous dispersion, the variation of backscattered or transmitted light can be attributed to destabilization phenomena, such as migration (sedimentation, creaming) or particle size variation (flocculation, aggregation). In a view to investigating more on the dispersibility of colloidal suspensions, an experimental set-up for “at the line” SMLS experiment has been developed to understand the impact of the formulation parameters on particle size and dispersibility. The SMLS experiment is performed with a high acquisition rate (up to 10 measurements per second), without dilution, and under direct agitation. Using such experimental device, SMLS detection can be combined with the Hansen approach to optimize the dispersing and stabilizing properties of TiO₂ particles. It appears that the dispersibility and the stability spheres generated are clearly separated, arguing that lower stability is not necessarily a consequence of poor dispersibility. Beyond this clarification, this combined SMLS-Hansen approach is a major step toward the optimization of dispersibility and stability of colloidal formulations by finding solvents having the best compromise between dispersing and stabilizing properties. Such study can be intended to find better dispersion media, greener and cheaper solvents to optimize particles suspensions, reduce the content of costly stabilizing additives or satisfy product regulatory requirements evolution in various industrial fields using suspensions (paints & inks, coatings, cosmetics, energy).Keywords: dispersibility, stability, Hansen parameters, particles, solvents
Procedia PDF Downloads 107
1241 Resting-State Functional Connectivity Analysis Using an Independent Component Approach
Authors: Eric Jacob Bacon, Chaoyang Jin, Dianning He, Shuaishuai Hu, Lanbo Wang, Han Li, Shouliang Qi
Abstract:
Objective: Refractory epilepsy is a complicated type of epilepsy that can be difficult to diagnose. Recent technological advancements have made resting-state functional magnetic resonance imaging (rsfMRI) a vital technique for studying brain activity, but there is still much to learn about it. Investigating rsfMRI connectivity may aid in the detection of abnormal activity. In this paper, we propose studying the functional connectivity of rsfMRI candidates to diagnose epilepsy. Methods: 45 rsfMRI candidates, comprising 26 with refractory epilepsy and 19 healthy controls, were enrolled in this study. A data-driven approach known as independent component analysis (ICA) was used to achieve our goal. First, rsfMRI data from both patients and healthy controls were analyzed using group ICA. The resulting components were then spatially sorted to find and select meaningful ones. A two-sample t-test was used to identify abnormal networks in patients relative to healthy controls. Finally, based on the fractional amplitude of low-frequency fluctuations (fALFF), a chi-square test was used to distinguish the network properties of the patient and healthy control groups. Results: The two-sample t-test analysis yielded abnormal clusters in the default mode network, including the left superior temporal lobe and the left supramarginal gyrus. The right precuneus was found to be abnormal in the dorsal attention network. In addition, the frontal cortex showed an abnormal cluster in the medial temporal gyrus, whereas the temporal cortex showed abnormal clusters in the right middle temporal gyrus and the right fronto-operculum gyrus. Finally, the chi-square test was significant, producing a p-value of 0.001. Conclusion: This study offers evidence that investigating rsfMRI connectivity provides an excellent diagnostic option for refractory epilepsy.
Keywords: ICA, RSN, refractory epilepsy, rsfMRI
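A minimal, hypothetical sketch of the ICA-plus-group-comparison workflow described above, using scikit-learn's FastICA on synthetic data (not the study's group-ICA pipeline; subject counts and array sizes are placeholders):

```python
import numpy as np
from sklearn.decomposition import FastICA
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic stand-in for preprocessed rsfMRI data: time points x voxels
patients = [rng.normal(size=(120, 500)) for _ in range(26)]
controls = [rng.normal(size=(120, 500)) for _ in range(19)]

def subject_component_maps(data, n_components=10):
    """Decompose one subject's data with FastICA; rows of components_ give
    per-voxel weights for each component (used here as spatial maps)."""
    ica = FastICA(n_components=n_components, random_state=0, max_iter=500)
    ica.fit(data)
    return ica.components_              # n_components x voxels

# Summarize each subject by the first component map (illustrative only)
patient_maps = np.array([subject_component_maps(d)[0] for d in patients])
control_maps = np.array([subject_component_maps(d)[0] for d in controls])

# Voxel-wise two-sample t-test between groups
t_vals, p_vals = stats.ttest_ind(patient_maps, control_maps, axis=0)
print("voxels with p < 0.001:", int(np.sum(p_vals < 0.001)))
```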
Procedia PDF Downloads 75
1240 Attitude and Knowledge of Primary Health Care Physicians and Local Inhabitants about Leishmaniasis and Sandfly in West Alexandria, Egypt
Authors: Randa M. Ali, Naguiba F. Loutfy, Osama M. Awad
Abstract:
Background: Leishmaniasis is a worldwide disease affecting 88 countries; it is estimated that about 350 million people are at risk of leishmaniasis. Overall prevalence is 12 million people, with annual mortality of about 60,000. Annual incidence is 1,500,000 cases of cutaneous leishmaniasis (CL) worldwide and half a million cases of visceral leishmaniasis (VL). Objectives: The objective of this study was to assess primary health care physicians' (PHPs') knowledge of and attitude towards leishmaniasis and to assess the awareness of local inhabitants about the disease and its vector in four areas of west Alexandria, Egypt. Methods: This study was a cross-sectional survey conducted in four PHC units in west Alexandria. All physicians currently working in these units during the study period were invited to participate; only 20 PHPs completed the questionnaire. 60 local inhabitants were selected randomly from the four study areas, 15 from each area. Data were collected through two specially designed questionnaires. Results: 11 (55%) of the physicians had satisfactory knowledge, answering more than 9 (60%) of the 14 questions about leishmaniasis and the sandfly. The second part of the questionnaire concerned the attitude of the primary health care physicians towards leishmaniasis: 17 (85%) had a good attitude and 3 (15%) a poor attitude. The second questionnaire showed that the awareness of local inhabitants about leishmaniasis and the sandfly as the vector of the disease is poor and needs to be corrected. Most of the respondents (90%) had not heard about leishmaniasis, and only 3 (5%) of the interviewed inhabitants said they knew the sandfly and its role in the transmission of leishmaniasis. Conclusions: The knowledge and attitudes of the physicians are acceptable; however, there is room for improvement, which could be achieved through formal training courses and the distribution of guidelines, in addition to raising the awareness of primary health care physicians about the importance of early detection and notification of cases of leishmaniasis. Moreover, health education to raise public awareness of the vector and the disease is necessary, because related studies have demonstrated that if inhabitants do not perceive mosquitoes to be responsible for diseases such as malaria, they do not take enough measures to protect themselves against the vector.
Keywords: leishmaniasis, PHP, knowledge, attitude, local inhabitants
Procedia PDF Downloads 446
1239 A Collective Intelligence Approach to Safe Artificial General Intelligence
Authors: Craig A. Kaplan
Abstract:
If AGI proves to be a “winner-take-all” scenario where the first company or country to develop AGI dominates, then the first AGI must also be the safest. The safest, and fastest, path to Artificial General Intelligence (AGI) may be to harness the collective intelligence of multiple AI and human agents in an AGI network. This approach has roots in seminal ideas from four of the scientists who founded the field of Artificial Intelligence: Allen Newell, Marvin Minsky, Claude Shannon, and Herbert Simon. Extrapolating key insights from these founders of AI, and combining them with the work of modern researchers, results in a fast and safe path to AGI. The seminal ideas discussed are: 1) Society of Mind (Minsky), 2) Information Theory (Shannon), 3) Problem Solving Theory (Newell & Simon), and 4) Bounded Rationality (Simon). Society of Mind describes a collective intelligence approach that can be used with AI and human agents to create an AGI network. Information theory helps address the critical issue of how an AGI system will increase its intelligence over time. Problem Solving Theory provides a universal framework that AI and human agents can use to communicate efficiently, effectively, and safely. Bounded Rationality helps us better understand not only the capabilities of SuperIntelligent AGI but also how humans can remain relevant in a world where the intelligence of AGI vastly exceeds that of its human creators. Each key idea can be combined with recent work in the fields of Artificial Intelligence, Machine Learning, and Large Language Models to accelerate the development of a working, safe, AGI system.Keywords: AI Agents, Collective Intelligence, Minsky, Newell, Shannon, Simon, AGI, AGI Safety
Procedia PDF Downloads 87
1238 Kernel-Based Double Nearest Proportion Feature Extraction for Hyperspectral Image Classification
Authors: Hung-Sheng Lin, Cheng-Hsuan Li
Abstract:
Over the past few years, kernel-based algorithms have been widely used to extend some linear feature extraction methods, such as principal component analysis (PCA), linear discriminant analysis (LDA), and nonparametric weighted feature extraction (NWFE), to their nonlinear versions: kernel principal component analysis (KPCA), generalized discriminant analysis (GDA), and kernel nonparametric weighted feature extraction (KNWFE), respectively. These nonlinear feature extraction methods can detect nonlinear directions with the largest nonlinear variance or the largest class separability based on the given kernel function, and they have been applied to improve target detection and image classification of hyperspectral images. Double nearest proportion feature extraction (DNP) can effectively reduce the overlap effect and performs well in hyperspectral image classification. The DNP structure is an extension of the k-nearest-neighbor technique: for each sample, there are two corresponding nearest proportions of samples, the self-class nearest proportion and the other-class nearest proportion. The term “nearest proportion” used here considers both local information and more global information. With these settings, the effect of the overlap between the sample distributions can be reduced. Usually, the maximum likelihood estimator and the related unbiased estimator are not ideal estimators in high-dimensional inference problems, particularly in small-sample situations; hence, an improved estimator based on shrinkage estimation (regularization) is proposed. Based on the DNP structure, LDA is included as a special case. In this paper, the kernel method is applied to extend DNP to kernel-based DNP (KDNP). In addition to retaining the advantages of DNP, KDNP surpasses DNP in the experimental results. According to the experiments on real hyperspectral image data sets, the classification performance of KDNP is better than that of PCA, LDA, NWFE, and their kernel versions, KPCA, GDA, and KNWFE.
Keywords: feature extraction, kernel method, double nearest proportion feature extraction, kernel double nearest feature extraction
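The kernel-extension idea can be illustrated with off-the-shelf tools. The following is a minimal, hypothetical sketch using KPCA from scikit-learn as a stand-in (not the authors' KDNP implementation), applying nonlinear feature extraction followed by a simple classifier to synthetic "hyperspectral-like" data:

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in: 600 pixels x 100 spectral bands, 3 classes
X = rng.normal(size=(600, 100))
y = np.repeat([0, 1, 2], 200)
X += y[:, None] * 0.5                      # weak class-dependent shift

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Nonlinear feature extraction (RBF kernel) followed by a simple classifier
model = make_pipeline(
    StandardScaler(),
    KernelPCA(n_components=10, kernel="rbf", gamma=0.01),
    KNeighborsClassifier(n_neighbors=5),
)
model.fit(X_tr, y_tr)
print("test accuracy:", round(model.score(X_te, y_te), 3))
```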
Procedia PDF Downloads 342
1237 Controlled Growth of Au Hierarchically Ordered Crystals Architectures for Electrochemical Detection of Traces of Molecules
Authors: P. Bauer, K. Mougin, V. Vignal, A. Buch, P. Ponthiaux, D. Faye
Abstract:
Nowadays, noble metallic nanostructures with unique morphology are widely used as new sensors due to their fascinating optical, electronic and catalytic properties. Among various shapes, dendritic nanostructures have attracted much attention because of their large surface-to-volume ratio, high sensitivity and special texture with sharp tips and nanoscale junctions. Several methods have been developed to fabricate those specific structures such as electrodeposition, photochemical way, seed-mediated growth or wet chemical method. The present study deals with a novel approach for a controlled growth pattern-directed organisation of Au flower-like crystals (NFs) deposited onto stainless steel plates to achieve large-scale functional surfaces. This technique consists in the deposition of a soft nanoporous template on which Au NFs are grown by electroplating and seed-mediated method. Size, morphology, and interstructure distance have been controlled by a site selective nucleation process. Dendritic Au nanostructures have appeared as excellent Raman-active candidates due to the presence of very sharp tips of multi-branched Au nanoparticles that leads to a large local field enhancement and a good SERS sensitivity. In addition, these structures have also been used as electrochemical sensors to detect traces of molecules present in a solution. A correlation of the number of active sites on the surface and the current charge by both colorimetric method and cyclic voltammetry of gold structures have allowed a calibration of the system. This device represents a first step for the fabrication of MEMs platform that could ultimately be integrated into a lab-on-chip system. It also opens pathways to several technologically large-scale nanomaterials fabrication such as hierarchically ordered crystal architectures for sensor applications.Keywords: dendritic, electroplating, gold, template
Procedia PDF Downloads 185
1236 Vehicles Analysis, Assessment and Redesign Related to Ergonomics and Human Factors
Authors: Susana Aragoneses Garrido
Abstract:
Every day, roads are the scene of numerous accidents involving vehicles, producing thousands of deaths and serious injuries all over the world. Investigations have revealed that human factors (HF) are one of the main causes of road accidents in modern societies. Distracted driving (involving external or internal aspects of the vehicle), which is considered a human factor, is a serious and emerging risk to road safety; consequently, further analysis of this issue is essential given its importance to today's society. The objectives of this investigation are the detection and assessment of HF in order to provide solutions, including better vehicle design, that might mitigate road accidents. The methodology of the project is divided into different phases. First, a statistical analysis of public databases from Spain and the UK is provided. Second, the data are classified in order to analyse the major causes involved in road accidents. Third, a simulation of different paths and vehicles is presented, and the causes related to HF are assessed by Failure Mode and Effects Analysis (FMEA). Fourth, different car models are evaluated using the Rapid Upper Limb Assessment (RULA); additionally, the JACK Siemens PLM tool is used to evaluate the human-factor causes and to support the redesign of the vehicles. Finally, improvements in the car design are proposed with the intention of reducing the implication of HF in traffic accidents. The results from the statistical analysis, the simulations and the evaluations confirm that accidents are an important issue in today's society, especially accidents caused by HF such as distractions. The results explore the reduction of external and internal HF through a global risk analysis of vehicle accidents. Moreover, the evaluation of the different car models using the RULA method and JACK Siemens PLM proves the importance of proper adjustment of the driver's seat in order to avoid harmful postures and therefore distractions. For this reason, a car redesign is proposed so that the driver can acquire the optimum position, consequently reducing the human factors in road accidents.
Keywords: analysis vehicles, assessment, ergonomics, car redesign
Procedia PDF Downloads 335
1235 Development and Validation of a Carbon Dioxide TDLAS Sensor for Studies on Fermented Dairy Products
Authors: Lorenzo Cocola, Massimo Fedel, Dragiša Savić, Bojana Danilović, Luca Poletto
Abstract:
An instrument for the detection and evaluation of gaseous carbon dioxide in the headspace of closed containers has been developed in the context of Packsensor Italian-Serbian joint project. The device is based on Tunable Diode Laser Absorption Spectroscopy (TDLAS) with a Wavelength Modulation Spectroscopy (WMS) technique in order to accomplish a non-invasive measurement inside closed containers of fermented dairy products (yogurts and fermented cheese in cups and bottles). The purpose of this instrument is the continuous monitoring of carbon dioxide concentration during incubation and storage of products over a time span of the whole shelf life of the product, in the presence of different microorganisms. The instrument’s optical front end has been designed to be integrated in a thermally stabilized incubator. An embedded computer provides processing of spectral artifacts and storage of an arbitrary set of calibration data allowing a properly calibrated measurement on many samples (cups and bottles) of different shapes and sizes commonly found in the retail distribution. A calibration protocol has been developed in order to be able to calibrate the instrument on the field also on containers which are notoriously difficult to seal properly. This calibration protocol is described and evaluated against reference measurements obtained through an industry standard (sampling) carbon dioxide metering technique. Some sets of validation test measurements on different containers are reported. Two test recordings of carbon dioxide concentration evolution are shown as an example of instrument operation. The first demonstrates the ability to monitor a rapid yeast growth in a contaminated sample through the increase of headspace carbon dioxide. Another experiment shows the dissolution transient with a non-saturated liquid medium in presence of a carbon dioxide rich headspace atmosphere.Keywords: TDLAS, carbon dioxide, cups, headspace, measurement
Procedia PDF Downloads 322
1234 Breaking Barriers: Utilizing Innovation to Improve Educational Outcomes for Students with Disabilities
Authors: Emily Purdom, Rachel Robinson
Abstract:
As the number of students worldwide requiring speech-language therapy, occupational therapy and mental health services during their school day increases, innovation is becoming progressively more important to meet the demand. Telepractice can be used to reach a greater number of students requiring specialized therapy while maintaining the highest quality of care. It can be provided in a way that is not only effective but ultimately more convenient for student, teacher and therapist without the added burden of travel. Teletherapy eradicates many hurdles to traditional on-site service delivery and helps to solve the pervasive shortage of certified professionals. Because location is no longer a barrier to specialized education plans for students with disabilities when teletherapy is conducted, there are many advantages that can be deployed. Increased frequency of engagement is possible along with students receiving specialized care from a clinician that may not be in their direct area. Educational teams, including parents, can work together more easily and engage in face-to-face, student-centered collaboration through videoconference. Practical strategies will be provided for connecting students with qualified therapists without the typical in-person dynamic. In most cases, better therapy outcomes are going to be achieved when treatment is most convenient for the student and educator. This workshop will promote discussion in the field of education to increase advocacy for remote service delivery. It will serve as a resource for those wanting to expand their knowledge of options for students with special needs afforded through innovation.Keywords: education technology, innovation, student support services, telepractice
Procedia PDF Downloads 245
1233 The Factors Constitute the Interaction between Teachers and Students: An Empirical Study at the Notion of Framing
Authors: Tien-Hui Chiang
Abstract:
The code theory proposed by Basil Bernstein indicates that framing can be viewed as the core element constituting the phenomenon of cultural reproduction, because it regulates the transmission of pedagogical information. Strong framing widens the social-relation boundary between a teacher and pupils, which obstructs information transmission, so that in order to improve underachieving students' academic performance, teachers need to reduce the strength of framing. Weak framing enables them to transform academic knowledge into commonsense knowledge expressed in the language of daily life. This study posits that most teachers deliver strong framing because their beliefs are mainly confined within instrumental rationality, which deprives them of critical minds. This situation can lead them to view the bell curve of normally distributed student academic performance as a natural outcome. In order to examine the interplay between framing, instrumental rationality and pedagogical action, questionnaires were completed by over 5,000 primary school teachers in Henan province, China, selected by stratified sampling. The statistical results show that most teachers employed psychological concepts to measure students' academic performance and that, in turn, educational inequity was legitimized as a natural outcome of the efficiency-led approach. Such efficiency-led mindsets made them act as agents practicing the mechanism of social control, in turn sustaining the phenomenon of cultural reproduction.
Keywords: code, cultural reproduction, framing, instrumental rationality, social relation and interaction
Procedia PDF Downloads 150
1232 A Systematic Snapshot of Software Outsourcing Challenges
Authors: Issam Jebreen, Eman Al-Qbelat
Abstract:
Outsourcing software development projects can be challenging, and there are several common challenges that organizations face. A study was conducted with a sample of 46 papers on outsourcing challenges, and the results show that there are several common challenges faced by organizations when outsourcing software development projects. Poor outsourcing relationship was identified as the most significant challenge, with 35% of the papers referencing it. Lack of quality was the second most significant challenge, with 33% of the papers referencing it. Language and cultural differences were the third most significant challenge, with 24% of the papers referencing it. Non-competitive price was another challenge faced by organizations, with 21% of the papers referencing it. Poor coordination and communication were also identified as a challenge, with 21% of the papers referencing it. Opportunistic behavior, lack of contract negotiation, inadequate user involvement, and constraints due to time zone were also challenges faced by organizations. Other challenges faced by organizations included poor project management, lack of technical capabilities, vendor employee high turnover, poor requirement specification, IPR issues, poor management of budget, schedule, and delay, geopolitical and country instability, the difference in development methodologies, failure to manage end-user expectations, and poor monitoring and control. In conclusion, outsourcing software development projects can be challenging, but organizations can mitigate these challenges by selecting the right outsourcing partner, having a well-defined contract and clear communication, having a clear understanding of the requirements, and implementing effective project management practices.Keywords: software outsourcing, vendor, outsourcing challenges, quality model, continent, country, global outsourcing, IT workforce outsourcing.
Procedia PDF Downloads 88
1231 Ideology and Lexicogrammar: Discourse Against the Power in Lyrical Texts (XIII, XVII and XX Centuries)
Authors: Ulisses Tadeu Vaz de Oliveira
Abstract:
The development of multifunctional studies in the theoretical-methodological perspective of the Systemic-Functional Grammar (SFG) and the increasing number of critical literary studies have introduced new opportunities for the study of ideologies and societies, but also brought up new challenges across and within many areas. In this regard, the Critical Linguistics researches allow a form of pairing a textual linguistic analysis method (micro level) with a social language theory in political and ideological processes (macro level), presented in the literature. This presentation will report on strategies to criticize power holders in literary productions from three distinct eras, namely: (a) Satirical Galego-Portuguese chants of Gil Pérez Conde (thirteenth century), (b) Poems of Gregorio de Matos Guerra (seventeenth century), and (c) Songs of Chico Buarque de Holanda (twentieth century). The analysis of these productions is based on the SFG proposals, which considers the clause as a social event. Therefore, the structure serves to realize three concurrent meanings (metafunctions): Ideational, Interpersonal and Textual. The presenter aims to shed light on the core issues relevant to the successes of the authors to criticize authorities in repressive times while caring about face-threatening and politeness. The effective and meaningful critical discourse was a way of moving the society`s chains towards new ideologies reflected in the lexicogrammatical choices made and the rhetorical functions of the persuasive structures used by the authors.Keywords: ideology, literature, persuasion, systemic-functional grammar
Procedia PDF Downloads 416
1230 Sentiment Analysis of Chinese Microblog Comments: Comparison between Support Vector Machine and Long Short-Term Memory
Authors: Xu Jiaqiao
Abstract:
Text sentiment analysis is an important branch of natural language processing. The technology is widely used in public opinion analysis and web-surfing recommendations. At present, mainstream sentiment analysis methods fall into three categories: methods based on a sentiment dictionary, methods based on traditional machine learning, and methods based on deep learning. This paper analyzes and compares the advantages and disadvantages of the SVM method from traditional machine learning and the Long Short-Term Memory (LSTM) method from deep learning for Chinese sentiment analysis, using Chinese comments on Sina Microblog as the dataset. Firstly, this paper classifies and adds labels to the original comment dataset obtained by a web crawler, and then uses Jieba word segmentation to segment the dataset and remove stop words. After that, text feature vectors are extracted and document word vectors are built to facilitate training of the models. Finally, the SVM and LSTM models are trained. The accuracy of the LSTM model is 85.80%, while the accuracy of the SVM is 91.07%; at the same time, the LSTM needs only 2.57 seconds of processing time, whereas the SVM model needs 6.06 seconds. This paper therefore concludes that, compared with the SVM model, the LSTM model is less accurate but faster in processing.
Keywords: sentiment analysis, support vector machine, long short-term memory, Chinese microblog comments
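A minimal, hypothetical sketch of the SVM branch of this pipeline (Jieba segmentation, TF-IDF document vectors, linear SVM); the example comments and labels are placeholders, not the paper's Weibo dataset:

```python
import jieba
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Placeholder comments and sentiment labels (1 = positive, 0 = negative)
comments = ["这部电影真的很好看", "服务太差了，再也不来了",
            "质量不错，值得推荐", "非常失望，完全不值这个价"]
labels = [1, 0, 1, 0]

def segment(text):
    """Tokenize a Chinese comment with Jieba word segmentation."""
    return " ".join(jieba.cut(text))

segmented = [segment(c) for c in comments]

# TF-IDF features followed by a linear SVM classifier
model = make_pipeline(TfidfVectorizer(), LinearSVC())
model.fit(segmented, labels)

print(model.predict([segment("电影很精彩")]))   # typically predicts [1]
```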
Procedia PDF Downloads 93
1229 Normalized Enterprises Architectures: Portugal's Public Procurement System Application
Authors: Tiago Sampaio, André Vasconcelos, Bruno Fragoso
Abstract:
The Normalized Systems Theory, which is designed to be applied to software architectures, provides a set of theorems, elements and rules, with the purpose of enabling evolution in Information Systems, as well as ensuring that they are ready for change. In order to make that possible, this work’s solution is to apply the Normalized Systems Theory to the domain of enterprise architectures, using Archimate. This application is achieved through the adaptation of the elements of this theory, making them artifacts of the modeling language. The theorems are applied through the identification of the viewpoints to be used in the architectures, as well as the transformation of the theory’s encapsulation rules into architectural rules. This way, it is possible to create normalized enterprise architectures, thus fulfilling the needs and requirements of the business. This solution was demonstrated using the Portuguese Public Procurement System. The Portuguese government aims to make this system as fair as possible, allowing every organization to have the same business opportunities. The aim is for every economic operator to have access to all public tenders, which are published in any of the 6 existing platforms, independently of where they are registered. In order to make this possible, we applied our solution to the construction of two different architectures, which are able of fulfilling the requirements of the Portuguese government. One of those architectures, TO-BE A, has a Message Broker that performs the communication between the platforms. The other, TO-BE B, represents the scenario in which the platforms communicate with each other directly. Apart from these 2 architectures, we also represent the AS-IS architecture that demonstrates the current behavior of the Public Procurement Systems. Our evaluation is based on a comparison between the AS-IS and the TO-BE architectures, regarding the fulfillment of the rules and theorems of the Normalized Systems Theory and some quality metrics.Keywords: archimate, architecture, broker, enterprise, evolvable systems, interoperability, normalized architectures, normalized systems, normalized systems theory, platforms
Procedia PDF Downloads 356
1228 The Environmental Influence on Slow Learners' Learning Achievement
Authors: Niphattha Hannapha
Abstract:
This paper examines how the classroom environment influences slow learners’ learning achievement; it focuses on how seating patterns affect students’ behaviours and which patterns best contribute to students’ learning performance. The researcher studied how slow learners’ characteristics and seating patterns influenced their behaviours and performance at Ban Hin Lad School. As a nonparticipant observation, the target groups included 15 slow learners from Prathomsueksa (Grades) 4 and 5. Students’ behaviours were recorded during their learning activities in order to minimize their reading and written expression disorder in Thai language tutorials. The result showed four seating patterns and two behaviors which obstructed students’ learning. The average of both behaviours mostly occurred when students were seated with patterns 1 (the seat facing the door, with the corridor alongside) and 3 (the seat alongside the door, facing the aisle) respectively. Seating patterns 1 and 3 demonstrated visibility (the front and side) of a walking path with two-way movement. However, seating patterns 2 (seating with the door alongside and the aisle at the back) and 4 (sitting with the door at the back and the aisle alongside) demonstrated visibility (the side) of a walking path with one-way movement. In Summary, environmental design is important to enhance concentration in slow learners who have reading and writing disabilities. This study suggests that students should be seated where they can have the least visibility of movement to help them increase continuous learning. That means they can have a better chance of developing reading and writing abilities in comparison with other patterns of seating.Keywords: slow learning, interior design, interior environment, classroom
Procedia PDF Downloads 212
1227 ‘Internationalize Yourself’: Mobility in Academia as a Form of Continuing Professional Training
Authors: Sonja Goegele, Petra Kletzenbauer
Abstract:
The FH JOANNEUM- a university of applied sciences based in Austria - cooperates in teaching and research with well-known international universities and thus aims to foster so-called strategic partnerships. The exchange of university lecturers and other faculty members is a way to achieve and secure strategic company goals, in which excellent research and teaching play a central role in order to improve both the development of academics and administration. Thanks to mobility not only the university but also the involved people truly benefit in their professional development which can be seen on several levels: increased foreign language proficiency, excellent networking possibilities within the scientific community as well as reinforced didactic competencies in the form of different teaching and learning methodologies. The paper discusses mobility in the light of the university’s strategic paper entitled ‘Hands on 2022’ by presenting results from an empirical research study among faculty members who participate in exchange programmes on a regular basis. In the form of an online questionnaire, mobility was discussed from different angles such as networking, collaborative research, professional training for academics and the overall impact of the exchange within and outside the organization. From the findings, it can be concluded that mobility is an asset for any university. However, keeping in constant dialogue with partner universities requires more than the purpose of the exchange itself. Building rapport and keeping a relationship of trust are challenges that need to be addressed more closely in order to run successful mobility programmes. Best Practice examples should highlight the importance of mobility as a vital initiative to transfer disciplines.Keywords: higher education, internationalization, mobility, strategic partnerships
Procedia PDF Downloads 136
1226 Off-Line Text-Independent Arabic Writer Identification Using Optimum Codebooks
Authors: Ahmed Abdullah Ahmed
Abstract:
The task of recognizing the writer of a handwritten text has been an attractive research problem in the document analysis and recognition community, with applications in handwriting forensics, paleography, document examination and handwriting recognition. This research presents an automatic method for writer recognition from digitized images of unconstrained writings. Although previous studies have made great efforts to develop various methods, their performance, especially in terms of accuracy, has fallen short, and there is still ample room for improvement. The proposed technique employs optimal codebook-based writer characterization, where each writing sample is represented by a set of features computed from two codebooks, beginning and ending. Unlike most classical codebook-based approaches, which segment the writing into graphemes, this study fragments particular areas of the writing, namely the beginning and ending strokes. The proposed method starts with contour detection to extract significant information from the handwriting; curve fragmentation is then employed to divide the beginning and ending zones of the handwriting into small fragments. Similar fragments of beginning strokes are grouped together to create the beginning clusters, and similarly the ending strokes are grouped to create the ending clusters. These two sets of clusters lead to the development of two codebooks (beginning and ending) by choosing the center of every group of similar fragments. The writings under study are then represented by computing the probability of occurrence of the codebook patterns, and this probability distribution is used to characterize each writer. Two writings are compared by computing distances between their respective probability distributions. The evaluation was carried out on the standard ICFHR dataset of 206 writers, using the beginning and ending codebooks separately. The ending codebook achieved the highest identification rate of 98.23%, which is the best result so far on the ICFHR dataset.
Keywords: off-line text-independent writer identification, feature extraction, codebook, fragments
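A minimal, hypothetical sketch of the codebook idea described above (clustering stroke fragments, representing each writing as a histogram of codebook patterns, and comparing histograms); the fragment descriptors are synthetic placeholders, not features from the ICFHR dataset:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Placeholder fragment descriptors (e.g., resampled contour points) for training writings
training_fragments = rng.normal(size=(2000, 16))

# Build a codebook: each cluster center stands for one codebook pattern
codebook = KMeans(n_clusters=64, random_state=0, n_init=10).fit(training_fragments)

def writer_signature(fragments):
    """Probability of occurrence of each codebook pattern in one writing sample."""
    ids = codebook.predict(fragments)
    hist = np.bincount(ids, minlength=codebook.n_clusters).astype(float)
    return hist / hist.sum()

def chi2_distance(p, q, eps=1e-9):
    """Chi-square distance between two writer signatures (smaller = more similar)."""
    return 0.5 * np.sum((p - q) ** 2 / (p + q + eps))

# Compare two (synthetic) writing samples
sig_a = writer_signature(rng.normal(size=(300, 16)))
sig_b = writer_signature(rng.normal(size=(300, 16)))
print("distance:", round(chi2_distance(sig_a, sig_b), 4))
```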
Procedia PDF Downloads 510
1225 Elevated Creatinine Clearance and Normal Glomerular Filtration Rate in Patients with Systemic Lupus erythematosus
Authors: Stoyanka Vladeva, Elena Kirilova, Nikola Kirilov
Abstract:
Background: Creatinine clearance is a widely used value for estimating the GFR. Increased creatinine clearance is often called hyperfiltration and is usually seen during pregnancy and in patients with diabetes mellitus before the onset of diabetic nephropathy. It may also occur with large dietary protein intake or with plasma volume expansion. Renal injury in lupus nephritis is known to affect the glomerular, tubulointerstitial, and vascular compartments. However, high creatinine clearance has not been reported in patients with SLE. Target: Follow-up of creatinine clearance values in patients with systemic lupus erythematosus without a history of kidney injury. Material and methods: We observed the creatinine, creatinine clearance, GFR and dipstick protein values of 7 women (mean age 42.71 years) with systemic lupus erythematosus. Patients with active lupus were tested monthly over a period of 13 months. Creatinine clearance was estimated with the Cockcroft-Gault equation in ml/sec. GFR was estimated with the MDRD (Modification of Diet in Renal Disease) formula in ml/min/1.73 m². Proteinuria was defined as present when dipstick protein was > 1+. Results: In all patients without a history of kidney injury we found elevated creatinine clearance levels, but GFR remained within the reference range. Two of the patients were in remission, while the other five had clinically and immunologically active lupus. Three of the patients had persistently high creatinine clearance levels and proteinuria, and two had periodically elevated creatinine clearance without proteinuria. These results show that kidney disturbances may be caused by the vascular changes typical of SLE. Glomerular hyperfiltration can be the result of focal segmental glomerulosclerosis caused by a reduction in renal mass. Lupus nephropathy is probably preceded not only by glomerular vascular changes but also by tubular vascular changes. Using the GFR alone is not sufficient to detect these primary functional disturbances. Conclusion: For early detection of kidney injury in patients with SLE, we determined that follow-up of creatinine clearance values could be helpful.
Keywords: systemic Lupus erythematosus, kidney injury, elevated creatinine clearance level, normal glomerular filtration rate
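For reference, the two estimating equations named above can be sketched as follows. This minimal illustration uses the commonly cited forms of the Cockcroft-Gault and 4-variable MDRD equations; the exact coefficients and unit conversions used by the authors are not stated in the abstract, so treat these as assumptions:

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Creatinine clearance in mL/min (commonly cited Cockcroft-Gault form)."""
    crcl = ((140 - age_years) * weight_kg) / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

def mdrd_gfr(age_years, serum_creatinine_mg_dl, female, black=False):
    """Estimated GFR in mL/min/1.73 m^2 (4-variable MDRD, 175 coefficient)."""
    gfr = 175.0 * serum_creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        gfr *= 0.742
    if black:
        gfr *= 1.212
    return gfr

# Example: 42-year-old woman, 65 kg, serum creatinine 0.9 mg/dL
crcl = cockcroft_gault(42, 65, 0.9, female=True)
print(f"CrCl ~ {crcl:.0f} mL/min ({crcl / 60:.2f} mL/s)")
print(f"MDRD eGFR ~ {mdrd_gfr(42, 0.9, female=True):.0f} mL/min/1.73 m^2")
```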
Procedia PDF Downloads 269
1224 Design of SAE J2716 Single Edge Nibble Transmission Digital Sensor Interface for Automotive Applications
Authors: Jongbae Lee, Seongsoo Lee
Abstract:
Modern sensors often embed a small digital controller for sensor control, value calibration, and signal processing. These sensors require digital data communication with host microprocessors, but conventional digital communication protocols are too heavy for price reduction. The SAE J2716 SENT (single edge nibble transmission) protocol transmits direct digital waveforms instead of complicated analog modulated signals. In this paper, a SENT interface is designed in Verilog HDL (hardware description language) and implemented on an FPGA (field-programmable gate array) evaluation board. The designed SENT interface consists of a frame encoder/decoder, a configuration register, a tick period generator, a CRC (cyclic redundancy code) generator/checker, and TX/RX (transmission/reception) buffers. The frame encoder/decoder is implemented as a finite state machine and controls the whole SENT interface. The configuration register contains various parameters such as operation mode, tick length, CRC option, pause pulse option, and number of data nibbles. The tick period generator generates tick signals from the input clock. The CRC generator/checker generates or checks the CRC in the SENT data frame. The TX/RX buffers store transmitted and received data. The designed SENT interface can send or receive digital data at 25-65 kbps with a 3 us tick. Synthesized in a 0.18 um fabrication technology, it is implemented in about 2,500 gates.
Keywords: digital sensor interface, SAE J2716, SENT, verilog HDL
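To illustrate the CRC generator/checker mentioned above, here is a minimal Python sketch of a nibble-wise CRC-4 using the polynomial x^4+x^3+x^2+1 and seed 0101 commonly cited for SAE J2716. The standard's recommended algorithm and the authors' Verilog implementation may differ, so treat this as an assumption-laden reference model rather than a conformant implementation:

```python
def _shift4(crc, poly=0b11101):
    """Advance a 4-bit CRC register by four zero bits modulo the generator polynomial."""
    reg = crc << 4
    for bit in range(7, 3, -1):
        if reg & (1 << bit):
            reg ^= poly << (bit - 4)
    return reg & 0xF

def sent_crc4(data_nibbles, seed=0b0101):
    """CRC-4 over a list of 4-bit data nibbles (assumed J2716-style parameters)."""
    crc = seed
    for nib in data_nibbles:
        crc = _shift4(crc) ^ (nib & 0xF)
    return _shift4(crc)  # trailing zero nibble folded in at the end

# Example: checksum of a 6-nibble fast-channel payload
print(hex(sent_crc4([0x3, 0xA, 0x7, 0x1, 0xC, 0x5])))
```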
Procedia PDF Downloads 298
1223 Modified Model-Based Systems Engineering Driven Approach for Defining Complex Energy Systems
Authors: Akshay S. Dalvi, Hazim El-Mounayri
Abstract:
The internal and the external interactions between the complex structural and behavioral characteristics of the complex energy system result in unpredictable emergent behaviors. These emergent behaviors are not well understood, especially when modeled using the traditional top-down systems engineering approach. The intrinsic nature of current complex energy systems has called for an elegant solution that provides an integrated framework in Model-Based Systems Engineering (MBSE). This paper mainly presents a MBSE driven approach to define and handle the complexity that arises due to emergent behaviors. The approach provides guidelines for developing system architecture that leverages in predicting the complexity index of the system at different levels of abstraction. A framework that integrates indefinite and definite modeling aspects is developed to determine the complexity that arises during the development phase of the system. This framework provides a workflow for modeling complex systems using Systems Modeling Language (SysML) that captures the system’s requirements, behavior, structure, and analytical aspects at both problem definition and solution levels. A system architecture for a district cooling plant is presented, which demonstrates the ability to predict the complexity index. The result suggests that complex energy systems like district cooling plant can be defined in an elegant manner using the unconventional modified MBSE driven approach that helps in estimating development time and cost.Keywords: district cooling plant, energy systems, framework, MBSE
Procedia PDF Downloads 128
1222 Algerian EFL Students' Perceptions towards the Development of Writing through Weblog Storytelling
Authors: Nawel Mansouri
Abstract:
Weblog as a form of internet-based resources has become popular as an authentic and constructive learning tool, especially in the language classroom. This research explores the use of weblog storytelling as a pedagogical tool to develop Algerian EFL students’ creative writing. This study aims to investigate the effectiveness of weblog- writing and the attitudes of both Algerian EFL students and teachers towards weblog storytelling. It also seeks to explore the potential benefits and problems that may affect the use of weblog and investigate the possible solutions to overcome the problems encountered. The research work relies on a mixed-method approach which combines both qualitative and quantitative methods. A questionnaire will be applied to both EFL teachers and students as a means to obtain preliminary data. Interviews will be integrated in accordance with the primary data that will be gathered from the questionnaire with the aim of validating its accuracy or as a strategy to follow up any unexpected results. An intervention will take place on the integration of weblog- writing among 15 Algerian EFL students for a period of two months where students are required to write five narrative essays about their personal experiences, give feedback through the use of a rubric to two or three of their peers, and edit their work based on the feedback. After completion, questionnaires and interviews will also take place as a medium to obtain both the students’ perspectives towards the use of weblog as an innovative teaching approach. This study is interesting because weblog storytelling has recently been emerged as a new form of digital communication and it is a new concept within Algerian context. Furthermore, the students will not just develop their writing skill through weblog storytelling but it can also serve as a tool to develop students’ critical thinking, creativity, and autonomy.Keywords: Weblog writing, EFL writing, EFL learners' attitudes, EFL teachers' views
Procedia PDF Downloads 173
1221 Augmented Reality for Children Vocabulary Learning: Case Study in a Macau Kindergarten
Authors: R. W. Chan, Kan Kan Chan
Abstract:
Augmented Reality (AR), with its affordance of bridging the real and virtual worlds, brings users an immersive experience. It has gradually been applied in education and has even come into practice in students' daily learning. However, a systematic review shows that there is limited research in the area of vocabulary acquisition in early childhood education. Since kindergarten is a key stage at which children acquire language, and AR is an emerging technology with the potential to support vocabulary acquisition, this study aims to explore its value in a real classroom together with the teacher's view. Participants were a class of 5- to 6-year-old children studying in a Macau school that follows the Cambridge curriculum and emphasizes a multicultural ethos. There were 11 boys and 13 girls, 24 children in total. They learnt animal vocabulary using mobile devices and AR flashcards, scanning the flashcards with an iPad and interacting with the pop-up virtual objects. In order to estimate the effectiveness of using Augmented Reality, the children completed a vocabulary pre-test and post-test. In addition, a teacher interview was administered after the learning activity to seek the practitioner's opinion of this technology. For data analysis, a paired-samples t-test was used to measure the instructional effect based on the pre-test and post-test data. The results show that Augmented Reality could significantly enhance children's vocabulary learning, with a large effect size. Teachers indicated that the children enjoyed the AR learning activity but that clear instruction is needed. Suggestions for the future implementation of AR-based vocabulary learning are provided.
Keywords: augmented reality, kindergarten children, vocabulary learning, Macau
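A minimal sketch of the paired-samples t-test used for the pre/post comparison; the scores below are illustrative placeholders, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical pre/post vocabulary scores for 24 children (placeholders)
pre = rng.integers(2, 8, size=24).astype(float)
post = pre + rng.integers(0, 5, size=24)

t_stat, p_value = stats.ttest_rel(post, pre)
diff = post - pre
cohen_d = diff.mean() / diff.std(ddof=1)     # effect size for paired data

print(f"t({len(pre) - 1}) = {t_stat:.2f}, p = {p_value:.4f}, d = {cohen_d:.2f}")
```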
Procedia PDF Downloads 147
1220 Modelling Conceptual Quantities Using Support Vector Machines
Authors: Ka C. Lam, Oluwafunmibi S. Idowu
Abstract:
Uncertainty in cost is a major factor affecting performance of construction projects. To our knowledge, several conceptual cost models have been developed with varying degrees of accuracy. Incorporating conceptual quantities into conceptual cost models could improve the accuracy of early predesign cost estimates. Hence, the development of quantity models for estimating conceptual quantities of framed reinforced concrete structures using supervised machine learning is the aim of the current research. Using measured quantities of structural elements and design variables such as live loads and soil bearing pressures, response and predictor variables were defined and used for constructing conceptual quantities models. Twenty-four models were developed for comparison using a combination of non-parametric support vector regression, linear regression, and bootstrap resampling techniques. R programming language was used for data analysis and model implementation. Gross soil bearing pressure and gross floor loading were discovered to have a major influence on the quantities of concrete and reinforcement used for foundations. Building footprint and gross floor loading had a similar influence on beams and slabs. Future research could explore the modelling of other conceptual quantities for walls, finishes, and services using machine learning techniques. Estimation of conceptual quantities would assist construction planners in early resource planning and enable detailed performance evaluation of early cost predictions.Keywords: bootstrapping, conceptual quantities, modelling, reinforced concrete, support vector regression
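A minimal, hypothetical sketch of the modelling approach described above (support vector regression with bootstrap resampling). The study used the R language, but for consistency with the other sketches this illustration is in Python, and the predictor names, coefficients and data are invented placeholders rather than the study's measured quantities:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)

# Invented predictors: building footprint (m^2), gross floor loading (kN/m^2),
# gross soil bearing pressure (kN/m^2) -- placeholders, not the study's data
X = np.column_stack([
    rng.uniform(200, 2000, 100),      # footprint
    rng.uniform(3, 12, 100),          # floor loading
    rng.uniform(100, 400, 100),       # bearing pressure
])
y = 0.05 * X[:, 0] + 4.0 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 5, 100)  # concrete (m^3)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=1.0))

# Bootstrap resampling to estimate prediction uncertainty for a new building
new_building = np.array([[800.0, 6.0, 250.0]])
preds = []
for _ in range(200):
    idx = rng.integers(0, len(y), len(y))         # resample with replacement
    model.fit(X[idx], y[idx])
    preds.append(model.predict(new_building)[0])

print(f"predicted concrete quantity: {np.mean(preds):.1f} m^3 "
      f"(95% bootstrap interval {np.percentile(preds, 2.5):.1f}-{np.percentile(preds, 97.5):.1f})")
```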
Procedia PDF Downloads 204
1219 Effect of Phthalates on Male Infertility: Myth or Truth?
Authors: Rashmi Tomar, A. Srinivasan, Nayan K. Mohanty, Arun K. Jain
Abstract:
Phthalates have been used as additives in industrial products since the 1930s and are universally considered ubiquitous environmental contaminants. The general population is exposed to phthalates through consumer products, as well as diet and medical treatments. Animal studies showing an association between some phthalates and testicular toxicity have generated public and scientific concern about the potential adverse effects of environmental exposures on male reproductive health. Unprecedented declines in fertility rates and semen quality have been reported during the last half of the 20th century in developed countries, and there is increasing interest in the potential relationship between exposure to environmental contaminants, including phthalates, and human male reproductive health. The aim of the present study was the detection of phthalate compounds and the estimation of their metabolites in infertile and fertile males. Blood and urine samples were collected from 150 infertile patients and 75 fertile volunteers recruited through the Department of Urology, Safdarjung Hospital, New Delhi. Blood was collected in separate glass tubes from the antecubital vein of the patients; serum was separated, and phthalate levels in the serum samples were estimated by gas chromatography/mass spectrometry following the detailed NIOSH/OSHA protocol. Urine from infertile and fertile subjects was collected and extracted using a solid-phase extraction method and analysed by HPLC. In conclusion, to the best of our knowledge, the present study is the first human study to show the presence of phthalates in human serum samples and of their metabolites in urine samples. Significant differences in several phthalates were observed between infertile and fertile healthy individuals.
Keywords: Gas Chromatography, HPLC, male infertility, phthalates, serum, toxicity, urine
Procedia PDF Downloads 363
1218 Impacts of Extremism and Terrorism on Modern Urdu Poetry: A Case Study of Khyber Pakhtunkhwa
Authors: Naqeeb Ahmad Jan, Rukhsana Bibi
Abstract:
Extremism is once again pushing the globe towards ignorance and darkness. In the present day, the wave of extremist tendencies has affected people across the globe, leading them to believe in the manifestation of various ideologies. Likewise, Pakistan's north-western province, Khyber Pakhtunkhwa, has been a prime target, yet it has also been an equal partner in halting and controlling extremist activities. This wave of extremism has affected the poets of the region as well; they have used the pen as a sword, depicting the havoc and the nature of the extremism they witnessed, and calling for and supporting a positive and durable solution to the menace of extremism and terrorism. Their poetic works portray various examples of extremism and advocate solutions that ensure peace and harmony. The researcher argues that balanced behaviour and attitudes play a key role in the fulfilment of desired actions, whereas the imposition of any set of beliefs, values and attitudes multiplies extremism, which is so poisonous that it destroys whole human societies. This study has found that present-day extremism has led to the emergence of new words, similes, metaphors and other figures of speech that have entered the language and literature. These words are analyzed and discussed in their new forms and meanings, together with the similes and metaphors describing extremism used by the poets and writers of this era. The methodology is based on quantitative, analytical and comparative research. Moreover, this research discusses the new words and figures of speech introduced by the poets and now in use, and the impact of extremism on the modern Urdu poetry of Khyber Pakhtunkhwa.
Keywords: extremism, modern Urdu poetry, subcontinent, terrorism
Procedia PDF Downloads 260