Search results for: VSS (Vector Space Similarity)
2542 Improving Cell Type Identification of Single Cell Data by Iterative Graph-Based Noise Filtering
Authors: Annika Stechemesser, Rachel Pounds, Emma Lucas, Chris Dawson, Julia Lipecki, Pavle Vrljicak, Jan Brosens, Sean Kehoe, Jason Yap, Lawrence Young, Sascha Ott
Abstract:
Advances in technology now make it possible to retrieve the genetic information of thousands of single cancerous cells. One of the key challenges in single cell analysis of cancerous tissue is to determine the number of different cell types and their characteristic genes within the sample, in order to better understand the tumors and their reaction to different treatments. For this analysis to be possible, it is crucial to filter out background noise, as it can severely blur the downstream analysis and give misleading results. In-depth analysis of state-of-the-art filtering methods for single cell data showed that, in some cases, they do not separate noisy and normal cells sufficiently. We introduce an algorithm that filters and clusters single cell data simultaneously without relying on certain genes or on thresholds chosen by eye. It detects communities in a Shared Nearest Neighbor similarity network, which captures the similarities and dissimilarities of the cells, by optimizing the modularity, and then identifies and removes vertices with a weak clustering belonging. This strategy is based on the fact that noisy data instances are very likely to be similar to true cell types but do not match any of them well. Once the clustering is complete, we apply a set of evaluation metrics on the cluster level and accept or reject clusters based on the outcome. The performance of our algorithm was tested on three datasets and led to convincing results. We were able to replicate the results on a Peripheral Blood Mononuclear Cells dataset. Furthermore, we applied the algorithm to two samples of ovarian cancer from the same patient, taken before and after chemotherapy. Comparing the standard approach to our algorithm, we found a hidden cell type in the ovarian post-chemotherapy data with interesting marker genes that are potentially relevant for medical research.
Keywords: cancer research, graph theory, machine learning, single cell analysis
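As an illustrative sketch of the core idea in this abstract (noise points resemble real clusters but attach to them only weakly), the toy below builds a mutual-kNN Shared Nearest Neighbor graph over 2-D points and drops points with no sufficiently strong SNN edge. The data, the parameter values, and the simplified filtering rule are hypothetical stand-ins, not the authors' algorithm (which additionally optimizes modularity and applies cluster-level metrics).

```python
def knn(points, k):
    """Indices of the k nearest neighbours of each 2-D point (brute force)."""
    nbrs = []
    for p in points:
        order = sorted(range(len(points)),
                       key=lambda j: (p[0]-points[j][0])**2 + (p[1]-points[j][1])**2)
        nbrs.append(set(order[1:k+1]))   # order[0] is the point itself
    return nbrs

def snn_filter(points, k=3, min_shared=1):
    """Keep points that have at least one mutual-kNN edge whose shared-neighbour
    count reaches min_shared; isolated 'noise' points fail this test."""
    nbrs = knn(points, k)
    keep = []
    for i in range(len(points)):
        shared = [len(nbrs[i] & nbrs[j]) for j in nbrs[i] if i in nbrs[j]]
        if max(shared, default=0) >= min_shared:
            keep.append(i)
    return keep

# Two tight groups of "cells" plus one far-away "noise" point (index 6).
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10), (5, 50)]
print(snn_filter(pts))
```

In this toy the noise point's nearest neighbours are cluster members, but no cluster member reciprocates, so it has no mutual SNN edge and is removed.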
Procedia PDF Downloads 112
2541 Local Interpretable Model-agnostic Explanations (LIME) Approach to Email Spam Detection
Authors: Rohini Hariharan, Yazhini R., Blessy Maria Mathew
Abstract:
The task of detecting email spam is an important one in the era of digital technology and requires effective ways of curbing unwanted messages. This paper presents an approach aimed at making email spam categorization algorithms transparent, reliable, and more trustworthy by incorporating Local Interpretable Model-agnostic Explanations (LIME). Our technique provides interpretable explanations for specific classifications of emails to help users understand the model's decision-making process. In this study, we developed a complete pipeline that incorporates LIME into the spam classification framework and allows the creation of simplified, interpretable models tailored to individual emails. LIME identifies influential terms, pointing out the key elements that drive classification results, thus reducing the opacity inherent in conventional machine learning models. Additionally, we suggest a visualization scheme for displaying keywords that improves users' understanding of categorization decisions. We test our method on a diverse email dataset and compare its performance with various baseline models, such as Gaussian Naive Bayes, Multinomial Naive Bayes, Bernoulli Naive Bayes, Support Vector Classifier, K-Nearest Neighbors, Decision Tree, and Logistic Regression. Our testing results show that our model surpasses all other models, achieving an accuracy of 96.59% and a precision of 99.12%.
Keywords: text classification, LIME (local interpretable model-agnostic explanations), stemming, tokenization, logistic regression
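The LIME idea described here can be sketched very simply: perturb an email by dropping words and measure how each word moves the classifier's score. The classifier, its word weights, and the averaging scheme below are hypothetical illustrations (for a linear scorer, the average effect recovers the weight exactly); the real LIME library fits a weighted local linear model over random perturbations.

```python
from itertools import combinations

def toy_spam_score(words):
    """Stand-in classifier: a linear bag-of-words score (hypothetical weights,
    not a model from the paper)."""
    weights = {"free": 2.0, "winner": 1.5, "money": 1.0, "meeting": -1.0}
    return sum(weights.get(w, 0.0) for w in words)

def explain(words, score_fn):
    """LIME-style explanation: drop words to form perturbed emails and record
    each word's average effect on the score across all perturbations."""
    effects = {}
    for w in set(words):
        rest = [x for x in words if x != w]
        diffs = [score_fn(list(s) + [w]) - score_fn(list(s))
                 for r in range(len(rest) + 1) for s in combinations(rest, r)]
        effects[w] = sum(diffs) / len(diffs)
    # Most influential terms first, as in a LIME keyword visualization.
    return sorted(effects.items(), key=lambda kv: -abs(kv[1]))

email = ["free", "money", "meeting"]
print(explain(email, toy_spam_score))
```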
Procedia PDF Downloads 47
2540 Contextual SenSe Model: Word Sense Disambiguation using Sense and Sense Value of Context Surrounding the Target
Authors: Vishal Raj, Noorhan Abbas
Abstract:
Ambiguity in NLP (Natural Language Processing) refers to the ability of a word, phrase, sentence, or text to have multiple meanings. This results in various kinds of ambiguity, such as lexical, syntactic, semantic, anaphoric, and referential ambiguities. This study focuses mainly on solving the issue of lexical ambiguity. Word Sense Disambiguation (WSD) is an NLP technique that aims to resolve lexical ambiguity by determining the correct meaning of a word within a given context. Most WSD solutions rely on words for training and testing, but we have used the lemma and Part of Speech (POS) tokens of words instead. The lemma adds generality, and the POS adds the word's properties to the token. We have designed a novel method to create an affinity matrix that calculates the affinity between any pair of lemma_POS tokens (a token where the lemma and POS of a word are joined by an underscore) in a given training set. Additionally, we have devised an algorithm to create sense clusters of tokens using the affinity matrix under the hierarchy of the POS of the lemma. Furthermore, three different mechanisms to predict the sense of a target word using the affinity/similarity values are devised. Each contextual token contributes some value to the sense of the target word, and whichever sense receives the highest value becomes the sense of the target word. Contextual tokens thus play a key role in creating sense clusters and predicting the sense of the target word; hence, the model is named the Contextual SenSe Model (CSM). CSM exhibits noteworthy simplicity and lucidity of explanation, in contrast to contemporary deep learning models characterized by intricacy, time-intensive processes, and challenging explication. CSM is trained on SemCor training data and evaluated on the SemEval test dataset. The results indicate that despite the naivety of the method, it achieves promising results when compared to the Most Frequent Sense (MFS) model.
Keywords: word sense disambiguation (WSD), contextual sense model (CSM), most frequent sense (MFS), part of speech (POS), natural language processing (NLP), OOV (out of vocabulary), lemma_POS (a token where lemma and POS of word are joined by underscore), information retrieval (IR), machine translation (MT)
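The affinity-matrix-and-voting mechanism described above can be sketched on a toy scale. The corpus, the sense labels, and the use of raw co-occurrence counts as affinities are all hypothetical simplifications (the paper derives affinities from a training set under a POS hierarchy and uses three prediction mechanisms); the sketch only shows contextual tokens voting for sense clusters.

```python
from collections import defaultdict
from itertools import combinations

# Tiny sense-tagged corpus of lemma_POS tokens: an illustrative stand-in
# for SemCor, with "bank_NOUN" as the ambiguous target.
train = [
    (["money_NOUN", "deposit_VERB", "bank_NOUN"], "bank.finance"),
    (["loan_NOUN", "bank_NOUN", "interest_NOUN"], "bank.finance"),
    (["river_NOUN", "bank_NOUN", "water_NOUN"], "bank.river"),
    (["fish_VERB", "river_NOUN", "bank_NOUN"], "bank.river"),
]

# Affinity matrix: within-sentence co-occurrence counts between token pairs.
affinity = defaultdict(int)
# Sense clusters: context tokens observed with each sense of the target.
clusters = defaultdict(set)
for tokens, sense in train:
    for a, b in combinations(sorted(set(tokens)), 2):
        affinity[(a, b)] += 1
        affinity[(b, a)] += 1
    clusters[sense].update(t for t in tokens if t != "bank_NOUN")

def predict(context):
    """Each context token contributes its affinity to every sense cluster;
    the sense with the highest total wins."""
    scores = {s: sum(affinity[(c, t)] for c in context for t in members)
              for s, members in clusters.items()}
    return max(scores, key=scores.get)

print(predict(["water_NOUN", "fish_VERB"]))
```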
Procedia PDF Downloads 107
2539 Modeling and Simulation of Underwater Flexible Manipulator as Rayleigh Beam Using Bond Graph
Authors: Sumit Kumar, Sunil Kumar, Chandan Deep Singh
Abstract:
This paper presents the modeling and simulation of a flexible robot in an underwater environment. The underwater environment contrasts completely with ground or space environments: a robot in an underwater situation is subjected to various dynamic forces such as buoyancy, hydrostatic, and hydrodynamic forces. The underwater robot is modeled as a Rayleigh beam. The developed model further allows estimating the deflection of the tip in two directions. The complete dynamics of the underwater robot is analyzed, which is the main focus of this investigation. The control of the robot's trajectory is not discussed in this paper. Simulation is performed using the Symbol Shakti software.
Keywords: bond graph modeling, dynamics modeling, Rayleigh beam, underwater robot
Procedia PDF Downloads 587
2538 A Method to Saturation Modeling of Synchronous Machines in d-q Axes
Authors: Mohamed Arbi Khlifi, Badr M. Alshammari
Abstract:
This paper discusses general methods for modeling saturation in the steady-state, two-axis (d and q) frame models of synchronous machines. In particular, the important role of the magnetic coupling between the d-q axes (the cross-magnetizing phenomenon) is demonstrated. For that purpose, distinct methods of saturation modeling of the damper synchronous machine with cross-saturation are identified, and detailed model syntheses in d-q axes are presented. A number of models are given in final developed form. The procedure and the novel models are verified by a critical application to prove the validity of the method, and the equivalence between all developed models is reported. Advantages of some of the models over existing ones and their applicability are discussed.
Keywords: cross-magnetizing, model synthesis, synchronous machine, saturated modeling, state-space vectors
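As a hedged sketch of the cross-magnetizing coupling this abstract refers to (assuming, for illustration, a single isotropic magnetizing characteristic $L_m(|i_m|)$; salient-pole machines need separate d- and q-axis characteristics, and the paper's own models may differ), differentiating $\psi_{md} = L_m(|i_m|)\, i_{md}$ and $\psi_{mq} = L_m(|i_m|)\, i_{mq}$ gives incremental flux-current relations of the form

$$
\frac{d}{dt}\begin{bmatrix}\psi_{md}\\ \psi_{mq}\end{bmatrix}
=
\begin{bmatrix} L_{dd} & L_{dq}\\ L_{dq} & L_{qq}\end{bmatrix}
\frac{d}{dt}\begin{bmatrix} i_{md}\\ i_{mq}\end{bmatrix},
$$

with

$$
L_{dd} = L_m + \frac{i_{md}^{2}}{|i_m|}\frac{dL_m}{d|i_m|},\qquad
L_{qq} = L_m + \frac{i_{mq}^{2}}{|i_m|}\frac{dL_m}{d|i_m|},\qquad
L_{dq} = \frac{i_{md}\, i_{mq}}{|i_m|}\frac{dL_m}{d|i_m|},
$$

where $|i_m| = \sqrt{i_{md}^{2} + i_{mq}^{2}}$. The off-diagonal term $L_{dq}$ is the cross-magnetizing coupling between the axes: it vanishes only when the machine is unsaturated, i.e., when $dL_m/d|i_m| = 0$.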
Procedia PDF Downloads 454
2537 Design and Radio Frequency Characterization of Radial Reentrant Narrow Gap Cavity for the Inductive Output Tube
Authors: Meenu Kaushik, Ayon K. Bandhoyadhayay, Lalit M. Joshi
Abstract:
Inductive output tubes (IOTs) are widely used as microwave power amplifiers for broadcast and scientific applications. The IOT is capable of amplifying radio frequency (RF) power with very good efficiency, and its compactness, reliability, high efficiency, high linearity, and low operating cost make it suitable for various applications. The device consists of an integrated electron gun and RF cavity structure, a collector, and a focusing structure. The working principle of the IOT is a combination of the triode and the klystron. The cathode in the electron gun produces a stream of electrons, and a control grid is placed in close proximity to the cathode. The input part of the IOT is the integrated gridded electron gun structure, which acts as an input cavity and provides the interaction gap where the input RF signal interacts with the produced electron beam to support amplification. This paper presents the design, fabrication, and testing of a radial re-entrant cavity for the input structure of an IOT operating at 350 MHz. The model's suitability is discussed, and a generalized mathematical relation is introduced for obtaining the proper transverse magnetic (TM) resonating mode in radial narrow-gap RF cavities. The structural modeling was carried out with the CST and SUPERFISH codes. The cavity was fabricated from aluminum, and its RF characterization was done using a vector network analyzer (VNA); results are presented for the resonant frequency peaks obtained with the VNA.
Keywords: inductive output tubes, IOT, radial cavity, coaxial cavity, particle accelerators
Procedia PDF Downloads 124
2536 Simulation of an Active Controlled Vibration Isolation System for Astronaut’s Exercise Platform
Authors: Shield B. Lin, Sameer Abdali
Abstract:
Computer simulations were performed using MATLAB/Simulink for a vibration isolation system for an astronaut exercise platform. Simulation parameters were initially based on an ongoing experiment in a laboratory at NASA Johnson Space Center; the authors later expanded the simulations to include other parameters. A discrete proportional-integral-derivative controller with a low-pass filter, commanding a linear actuator, served as the active control unit, pushing and pulling a counterweight to balance the disturbance forces. A spring-damper device is used as an optional passive control unit. Simulation results indicated that such a design could achieve near-complete vibration isolation with small displacements of the exercise platform.
Keywords: control, counterweight, isolation, vibration
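The control loop described above (a discrete PID whose derivative term is low-pass filtered) can be sketched in a few lines. The plant below is a hypothetical unit mass with light damping, not the NASA rig, and all gains are illustrative; the sketch only shows the structure of the controller.

```python
def simulate(kp, ki, kd, alpha=0.5, dt=0.01, steps=5000):
    """Discrete PID with a first-order low-pass filter on the derivative term,
    driving a unit-mass 'platform' back to zero displacement."""
    x, v = 1.0, 0.0                      # platform displacement and velocity
    integral, prev_err, d_filt = 0.0, 0.0, 0.0
    for _ in range(steps):
        err = 0.0 - x                    # setpoint is zero displacement
        integral += err * dt
        d_raw = (err - prev_err) / dt
        d_filt += alpha * (d_raw - d_filt)   # low-pass filter on the derivative
        prev_err = err
        u = kp * err + ki * integral + kd * d_filt   # actuator command
        v += (u - 0.5 * v) * dt          # unit mass with light viscous damping
        x += v * dt
    return abs(x)

print(simulate(kp=10.0, ki=1.0, kd=2.0))   # residual displacement after 50 s
```

The filter coefficient `alpha` trades noise rejection against derivative lag, which is why a low-pass filter accompanies the derivative term in practice.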
Procedia PDF Downloads 149
2535 Trust in Virtual Groups: An Exploratory Study Applied to University Students in Kuwait
Authors: Bashaiar Alsanaa
Abstract:
Emerging technologies present human interaction with new challenges. Individuals are required to interact and collaborate to achieve mutual gain, and accomplishing shared goals requires all parties involved to trust the others' commitment to fulfilling their specified obligations. Trust is harder to establish when groups work virtually and members transcend time, space, and culture. This paper identifies the importance of trust in virtual groups of students at Kuwait University by exposing them to electronic projects on which they collaborate. Students respond to a survey assessing the range of trust within their teams and how the outcome is affected. Gender differences and other demographic factors are analyzed to interpret the results and rates of trust. The paper concludes by summarizing the factors influencing trust development and possible implications.
Keywords: groups, students, trust, virtual
Procedia PDF Downloads 292
2534 Value Engineering Change Proposal Application in Construction of Road-Building Projects
Authors: Mohammad Mahdi Hajiali
Abstract:
Many construction projects in Iran are constrained by limited financial resources. In a developing country where many projects are launched each year, reducing project costs through a systematic method would greatly cut the cost of major construction projects, allowing them to finish faster and more efficiently. Roads are a component of transportation infrastructure and account for a considerable share of the country's budget; in addition, a major part of the related ministry's budget is spent on repairing, improving, and maintaining roads. Value Engineering is a simple and powerful methodology that, over the past six decades, has succeeded in reducing the cost of many projects. The specific approach for applying value engineering at the project implementation stage is called a value engineering change proposal (VECP). In this research, VECP was applied to a road-building project in Iran in order to enhance the value of this kind of project and reduce its cost. In this case study, applying VECP produced the idea of using concrete pavement instead of hot mixed asphalt (HMA), with added fiber to improve the concrete pavement's performance. To choose the best alternative, the VE team decided to gather expert opinions on pavement systems and rank them using Fuzzy TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution). Finally, Jointed Plain Concrete Pavement (JPCP) was selected. The team also tested concrete samples with fibers available in Iran, and the experiments showed a significant increase in concrete properties such as flexural strength. In the end, it was shown that using fiber-reinforced concrete pavement instead of asphalt pavement achieves significant savings in cost and time, as well as improvements in quality, durability, and longevity.
Keywords: road-building projects, value engineering change proposal (VECP), Jointed Plain Concrete Pavement (JPCP), Fuzzy TOPSIS, fiber-reinforced concrete
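The ranking step described in this abstract can be sketched with the crisp core of TOPSIS (the fuzzy variant first aggregates fuzzy expert scores into crisp numbers; that step is omitted here). The alternatives, criteria, scores, and weights below are hypothetical illustrations, not the study's data.

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives by closeness to the ideal solution (crisp TOPSIS).
    matrix[i][j] is the score of alternative i on criterion j; benefit[j]
    says whether higher is better for criterion j."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalise each criterion column, then apply the weights.
    norms = [math.sqrt(sum(matrix[i][j]**2 for i in range(m))) for j in range(n)]
    v = [[weights[j]*matrix[i][j]/norms[j] for j in range(n)] for i in range(m)]
    # Ideal and anti-ideal points, per column, by criterion direction.
    best = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    def dist(row, ref):
        return math.sqrt(sum((a-b)**2 for a, b in zip(row, ref)))
    # Closeness coefficient: 1 = ideal, 0 = anti-ideal.
    return [dist(r, worst) / (dist(r, worst) + dist(r, best)) for r in v]

# Hypothetical pavement alternatives scored on cost (lower is better),
# durability, and flexural strength (higher is better).
alts = [[100, 6, 5],    # hot mixed asphalt
        [120, 9, 8]]    # fiber-reinforced JPCP
cc = topsis(alts, weights=[0.3, 0.4, 0.3], benefit=[False, True, True])
print(cc)
```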
Procedia PDF Downloads 196
2533 Identification Algorithm of Critical Interface, Modelling Perils on Critical Infrastructure Subjects
Authors: Jiří J. Urbánek, Hana Malachová, Josef Krahulec, Jitka Johanidisová
Abstract:
The paper deals with the investigation and modelling of crisis situations within organizations of critical infrastructure. Every crisis situation originates in the occurrence of an emergency event, especially in organizations of the energy critical infrastructure. Emergency events can be expected, in which case crisis scenarios can be pre-prepared by the pertinent organizational crisis management authorities for coping with them, or unexpected (the Black Swan effect), without a pre-prepared scenario, requiring operational handling of the crisis situation. The forms, characteristics, behaviour, and use of crisis scenarios vary in quality, depending on the prevention and training processes of the real critical infrastructure organization; the aim is always better organizational security and continuity. The objective of this paper is to find and investigate critical/crisis zones and functions in models of critical situations in a critical infrastructure organization. The DYVELOP (Dynamic Vector Logistics of Processes) method can identify problematic critical zones and functions, displaying critical interfaces among the actors of crisis situations on DYVELOP maps called Blazons. First, to realize this ability, it is necessary to derive and create an identification algorithm for critical interfaces. The locations of critical interfaces are the flags of a crisis situation in a real critical infrastructure organization. In conclusion, the model of a critical interface is demonstrated for a real organization of the Czech energy critical infrastructure in a blackout peril environment. The Blazons require a live PowerPoint presentation for better comprehension of this paper's mission.
Keywords: algorithm, crisis, DYVELOP, infrastructure
Procedia PDF Downloads 409
2532 Joint Path and Push Planning among Moveable Obstacles
Authors: Victor Emeli, Akansel Cosgun
Abstract:
This paper explores the navigation among movable obstacles (NAMO) problem and proposes joint path and push planning: which path to take and in what direction the obstacles should be pushed, given a start and a goal position. We present a planning algorithm for selecting a path and the obstacles to be pushed, in which a rapidly-exploring random tree (RRT)-based heuristic is employed to calculate a minimal-collision path. When it is necessary to apply a pushing force to slide an obstacle out of the way, the planner leverages means-end analysis through a dynamic physics simulation to determine the sequence of linear pushes needed to clear the necessary space. Simulation experiments show that our approach finds solutions at higher clutter percentages (up to 49%) than the straight-line push planner (37%) and RRT without pushing (18%).
Keywords: motion planning, path planning, push planning, robot navigation
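The RRT component underlying the planner can be sketched in isolation (the push-planning layer, which the paper adds on top, is omitted). The world geometry, obstacle, step size, and goal-bias schedule below are hypothetical choices for illustration only.

```python
import math, random

def rrt(start, goal, obstacle, radius, step=0.5, iters=2000, seed=1):
    """Minimal RRT in a 10x10 world with one circular obstacle; returns a
    start-to-goal path as a list of points, or None if no path is found."""
    random.seed(seed)
    nodes, parent = [start], {0: None}
    def free(p, q):
        # Sample along segment p-q and check clearance from the obstacle.
        return all(math.dist(((1-t)*p[0]+t*q[0], (1-t)*p[1]+t*q[1]), obstacle)
                   > radius for t in [i/10 for i in range(11)])
    for it in range(iters):
        # Goal-biased sampling: every 5th sample is the goal itself.
        sample = goal if it % 5 == 0 else (random.uniform(0, 10), random.uniform(0, 10))
        near = min(range(len(nodes)), key=lambda i: math.dist(nodes[i], sample))
        d = math.dist(nodes[near], sample)
        t = min(1.0, step / d) if d > 0 else 0.0
        new = ((1-t)*nodes[near][0] + t*sample[0], (1-t)*nodes[near][1] + t*sample[1])
        if not free(nodes[near], new):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = near
        if math.dist(new, goal) < 1e-9:          # goal reached exactly
            path, i = [], len(nodes) - 1
            while i is not None:
                path.append(nodes[i])
                i = parent[i]
            return path[::-1]
    return None

path = rrt(start=(1, 1), goal=(9, 9), obstacle=(5, 5), radius=1.5)
```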
Procedia PDF Downloads 165
2531 Automatic Staging and Subtype Determination for Non-Small Cell Lung Carcinoma Using PET Image Texture Analysis
Authors: Seyhan Karaçavuş, Bülent Yılmaz, Ömer Kayaaltı, Semra İçer, Arzu Taşdemir, Oğuzhan Ayyıldız, Kübra Eset, Eser Kaya
Abstract:
In this study, our goal was to perform tumor staging and subtype determination automatically using different texture analysis approaches for a very common cancer type, i.e., non-small cell lung carcinoma (NSCLC). In particular, we introduce a texture analysis approach, Laws' texture filters, to be used in this context for the first time. The 18F-FDG PET images of 42 patients with NSCLC were evaluated; the number of patients in each tumor stage (I-II, III, or IV) was 14. About 45% of the patients had adenocarcinoma (ADC) and about 55% squamous cell carcinoma (SqCC). The MATLAB technical computing language was employed to extract 51 features using first order statistics (FOS), the gray-level co-occurrence matrix (GLCM), the gray-level run-length matrix (GLRLM), and Laws' texture filters. The feature selection method employed was sequential forward selection (SFS). Selected textural features were used for automatic classification with k-nearest neighbors (k-NN) and support vector machines (SVM). In the automatic classification of tumor stage, using 5 features, the accuracy was approximately 59.5% with the k-NN classifier (k=3) and 69% with SVM (one-versus-one paradigm). In the automatic classification of tumor subtype, the accuracy was around 92.7% with SVM (one vs. one). Texture analysis of FDG-PET images might be used, in addition to metabolic parameters, as an objective tool to assess tumor histopathological characteristics and to classify tumor stage and subtype automatically.
Keywords: cancer stage, cancer cell type, non-small cell lung carcinoma, PET, texture analysis
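One of the feature families named above, the gray-level co-occurrence matrix, can be sketched directly. The tiny images, the 4-level quantization, and the two Haralick-style features below are illustrative stand-ins for the 51-feature pipeline the study describes.

```python
def glcm(img, levels=4, dx=1, dy=0):
    """Normalised gray-level co-occurrence matrix for one pixel offset."""
    m = [[0.0]*levels for _ in range(levels)]
    rows, cols = len(img), len(img[0])
    total = 0
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[img[r][c]][img[r2][c2]] += 1
                total += 1
    return [[v/total for v in row] for row in m]

def features(p):
    """Two classic GLCM texture features: contrast and energy."""
    contrast = sum(p[i][j]*(i-j)**2 for i in range(len(p)) for j in range(len(p)))
    energy = sum(v*v for row in p for v in row)
    return contrast, energy

flat = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]      # homogeneous "tissue"
noisy = [[0, 3, 0], [3, 0, 3], [0, 3, 0]]     # high-contrast texture
print(features(glcm(flat)), features(glcm(noisy)))
```

Feature vectors like these, computed per tumor region, are what a k-NN or SVM classifier would consume downstream.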
Procedia PDF Downloads 326
2530 Reemergence of Behaviorism in Language Teaching
Authors: Hamid Gholami
Abstract:
Over the years, language teaching methods have been the offshoots of schools of thought in psychology. The methods were mainly influenced by their contemporary psychological approaches: Audiolingualism was based on behaviorism and Communicative Language Teaching on constructivism. In the 1950s, textbooks were full of repetition exercises, as encouraged by behaviorism; in the 1980s, they were filled with communicative exercises, as suggested by constructivism. The trend has continued to the present day, which sees no specific method as prevalent, since none of the schools of thought seems to capture the complexity of human learning. But some changes are notable: some textbooks are giving more and more space to repetition exercises, at least to enhance certain aspects of language proficiency, namely collocations, rhythm and intonation, and conversation models. These changes may mark the reemergence of one of the once widely accepted schools of thought in psychology: behaviorism.
Keywords: language teaching methods, psychology, schools of thought, behaviorism
Procedia PDF Downloads 560
2529 Database Management System for Orphanages to Help Track of Orphans
Authors: Srivatsav Sanjay Sridhar, Asvitha Raja, Prathit Kalra, Soni Gupta
Abstract:
A database management system keeps track of details about the people in an organisation. Not many orphanages these days have shifted to a computer- and program-based system; unfortunately, most have only pen-and-paper records, which not only consume space but are also not eco-friendly. Viewing a person's record is a hassle, as one has to search through multiple records, which takes time. This program organises all the data and can pull out any information about anyone whose data has been entered. It is also a safe form of storage, as physical records degrade over time or, worse, are destroyed by natural disasters. In this developing world, it is only sensible to shift all data to an electronic storage system. The program comes with all features, including creating, inserting, searching, and deleting data, as well as printing it.
Keywords: database, orphans, programming, C++
Procedia PDF Downloads 156
2528 The System Dynamics Research of China-Africa Trade, Investment and Economic Growth
Authors: Emma Serwaa Obobisaa, Haibo Chen
Abstract:
International trade and outward foreign direct investment are factors generally recognized as important to economic growth and development. Though several scholars have attempted to reveal the influence of trade and outward foreign direct investment (FDI) on economic growth, most studies utilized common econometric models such as vector autoregression and aggregated the variables, which for the most part yields contradictory and mixed results. Thus, there is an exigent need for a precise study of the effect of trade and FDI on economic growth that applies strong econometric models and disaggregates the variables into separate individual variables to explicate their respective effects on economic growth. This will guarantee policies and strategies geared towards individual variables that ensure sustainable development and growth. This study therefore examines the causal effect of China-Africa trade and outward foreign direct investment on the economic growth of Africa using a robust and recent econometric approach, the system dynamics model. Our study tests an ensemble of vital variables predominant in recent studies on trade-FDI-economic growth causality: foreign direct investment, international trade, and economic growth. Our results showed that the system dynamics method provides more accurate statistical inference regarding the direction of causality among the variables than conventional methods such as OLS and Granger causality, which predominate in the literature, as it is more robust and provides accurate critical values.
Keywords: economic growth, outward foreign direct investment, system dynamics model, international trade
Procedia PDF Downloads 107
2527 Quintic Spline Solution of Fourth-Order Parabolic Equations Arising in Beam Theory
Authors: Reza Mohammadi, Mahdieh Sahebi
Abstract:
We develop a method based on polynomial quintic splines for the numerical solution of fourth-order non-homogeneous parabolic partial differential equations with variable coefficients. By using polynomial quintic splines at off-step points in the space direction and finite differences in the time direction, we obtain two three-level implicit methods. A stability analysis of the presented method has been carried out. We solve four test problems numerically to validate the derived method; numerical comparison with other methods shows the superiority of the presented scheme.
Keywords: fourth-order parabolic equation, variable coefficient, polynomial quintic spline, off-step points
Procedia PDF Downloads 352
2526 A Study on Spatial Morphological Cognitive Features of Lidukou Village Based on Space Syntax
Authors: Man Guo, Wenyong Tan
Abstract:
By combining space syntax with data obtained from field visits, this paper interprets the internal relationship between spatial morphology and spatial cognition in Lidukou Village. By comparing the obtained data, we recognize that the spatial integration degree of Lidukou Village is positively correlated with the spatial cognitive intention of local villagers. The parts of the village with a higher degree of spatial cognition are distributed along the axis formed mainly by Shuxiang Road. The accessibility of historical relics is weak, and there is no systematic relationship among them. To address the morphological problems of Lidukou Village, optimization strategies are proposed from multiple perspectives, such as optimizing spatial mechanisms and shaping spatial nodes.
Keywords: traditional villages, space syntax, spatial integration degree, morphological problem
Procedia PDF Downloads 52
2525 Telepsychiatry for Asian Americans
Authors: Jami Wang, Brian Kao, Davin Agustines
Abstract:
COVID-19 highlighted the active discrimination against the Asian American population, easily seen through media coverage, social tension, and increased crimes against this population. It is well known that long-term racism can have a large impact on both emotional and psychological well-being. The healthcare disparity during this time also revealed how the Asian American community lacks research data, political support, and medical infrastructure. At a time when Asian Americans fear for their safety amid declining mental health, telepsychiatry is particularly promising. COVID-19 demonstrated how well psychiatry could integrate with telemedicine, with psychiatry being the second most utilized specialty for telemedicine visits. However, the Asian American community did not utilize telepsychiatry resources as much as other groups. Because of this, we wanted to understand why the patient population affected the most by COVID-19 mentally did not seek out care. To do this, we studied the top telepsychiatry platforms. The current top telepsychiatry companies in the United States include Teladoc and BetterHelp. In the Teladoc mental health sector, only 4 languages are available (English, Spanish, French, and Danish), none of them an Asian language. Similarly, Teladoc's top competitor in the telepsychiatry space, BetterHelp, lists a total of only 3 Asian languages (Mandarin, Japanese, and Malaysian), a short list considering they have over 20 languages available. The shortage of available physicians who speak multiple languages is concerning, as it could be difficult for the Asian American community to relate to them. There are limited mental health resources that cater to their likely cultural needs, further exacerbating the structural racism and institutional barriers to appropriate care.
It is important to note that these companies do provide interpreters to comply with federal nondiscrimination and language assistance law. However, interactions through an interpreter are not only more time-consuming but also less personal than talking directly with a physician. Psychiatry is a field that emphasizes interpersonal relationships: the trust between physician and patient is critical in developing rapport, understanding the clinical picture, and treating the patient appropriately. A language barrier adds a further barrier between physician and patient. Because Asian Americans are one of the fastest-growing patient populations, these telehealth companies have much to gain by catering to the Asian American market. Without providing adequate access to bilingual and bicultural physicians, the current system will only further exacerbate the growing disparity. The healthcare community and telehealth companies need to recognize that the Asian American population is severely underserved in mental health and has much to gain from telepsychiatry. The lack of language support is one of many reasons why there is a disparity for Asian Americans in the mental health space.
Keywords: telemedicine, psychiatry, Asian American, disparity
Procedia PDF Downloads 105
2524 6D Posture Estimation of Road Vehicles from Color Images
Authors: Yoshimoto Kurihara, Tad Gonsalves
Abstract:
Currently, in the field of object posture estimation, there is research on estimating the position and angle of an object by storing a 3D model of the object on a computer in advance and matching observations against the model. In this research, however, we have succeeded in creating a module that is much simpler, smaller in scale, and faster in operation. Our 6D pose estimation model consists of two different networks: a classification network and a regression network. From a single RGB image, the trained model estimates the class of the object in the image, the coordinates of the object, and its rotation angle in 3D space. In addition, we compared the estimation accuracy for each camera position, i.e., the angle from which the object was captured. The highest accuracy was recorded when the camera position was 75°: the accuracy of classification was about 87.3%, and that of regression was about 98.9%.
Keywords: 6D posture estimation, image recognition, deep learning, AlexNet
Procedia PDF Downloads 155
2523 Contourlet Transform and Local Binary Pattern Based Feature Extraction for Bleeding Detection in Endoscopic Images
Authors: Mekha Mathew, Varun P Gopi
Abstract:
Wireless Capsule Endoscopy (WCE) has become a valuable device in gastrointestinal (GI) tract diagnosis, as it can examine the entire GI tract, especially the small intestine, without invasiveness or sedation. Bleeding in the digestive tract is a symptom of a disease rather than a disease itself; hence, the detection of bleeding is important in diagnosing many diseases. In this paper, we propose a novel method for distinguishing bleeding regions from normal regions based on the contourlet transform and the Local Binary Pattern (LBP). Experiments show that this method provides a high accuracy rate of 96.38% in the CIE XYZ colour space with a k-Nearest Neighbour (k-NN) classifier.
Keywords: wireless capsule endoscopy, local binary pattern, k-NN classifier, contourlet transform
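The LBP descriptor named in this abstract can be sketched directly: each pixel is encoded by comparing it with its 8 neighbours, and the histogram of codes describes the local texture. The tiny images and the neighbour ordering below are illustrative choices; the paper combines LBP with contourlet subbands, which is omitted here.

```python
def lbp_code(img, r, c):
    """8-neighbour local binary pattern code of pixel (r, c)."""
    center = img[r][c]
    # Neighbour offsets, clockwise from the top-left.
    offsets = [(-1,-1), (-1,0), (-1,1), (0,1), (1,1), (1,0), (1,-1), (0,-1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if img[r+dr][c+dc] >= center:
            code |= 1 << bit
    return code

def lbp_histogram(img):
    """Histogram of LBP codes over all interior pixels: a texture descriptor
    that could feed a k-NN classifier as in the abstract."""
    hist = [0]*256
    for r in range(1, len(img)-1):
        for c in range(1, len(img[0])-1):
            hist[lbp_code(img, r, c)] += 1
    return hist

img = [[10, 10, 10],
       [10, 20, 10],
       [10, 10, 10]]   # a bright spot: all neighbours below the center
print(lbp_code(img, 1, 1))
```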
Procedia PDF Downloads 485
2522 The Role of Validity and Reliability in the Development of Online Testing
Authors: Ani Demetrashvili
Abstract:
The purpose of this paper is to show how much students trust online tests and to determine validity and reliability in the development of online testing. The pandemic changed every field in the world, including education. Educational institutions moved into the online space, which was the only decision they could make at that time. Online assessment through online proctoring was a completely new challenge for educational institutions, and they needed to handle it successfully. Participants were chosen from an English language center. The validity of the questionnaire was established using a Likert scale and Cronbach's alpha; data from the participants were then analyzed. The article summarizes the available literature on online assessment and will interest readers concerned with this kind of assessment. Based on the research findings, students favor in-person testing over online assessment due to their lack of experience and skills in the latter.
Keywords: online assessment, online proctoring
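The reliability statistic mentioned above, Cronbach's alpha, is a short computation: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The Likert responses below are hypothetical, used only to show the formula in action.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score rows (one list per item,
    one entry per respondent)."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m)**2 for x in xs) / len(xs)
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]   # each respondent's total score
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# Hypothetical 5-point Likert responses: three items, four respondents.
items = [[4, 5, 3, 4],
         [4, 4, 3, 5],
         [5, 5, 2, 4]]
print(round(cronbach_alpha(items), 3))   # -> 0.818
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency for a questionnaire.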
Procedia PDF Downloads 40
2521 Prediction Modeling of Alzheimer’s Disease and Its Prodromal Stages from Multimodal Data with Missing Values
Authors: M. Aghili, S. Tabarestani, C. Freytes, M. Shojaie, M. Cabrerizo, A. Barreto, N. Rishe, R. E. Curiel, D. Loewenstein, R. Duara, M. Adjouadi
Abstract:
A major challenge in medical studies, especially longitudinal ones, is the problem of missing measurements, which hinders the effective application of many machine learning algorithms. Furthermore, recent Alzheimer's disease studies have focused on the delineation of Early Mild Cognitive Impairment (EMCI) and Late Mild Cognitive Impairment (LMCI) from cognitively normal controls (CN), which is essential for developing effective and early treatment methods. To address these challenges, this paper explores the potential of the eXtreme Gradient Boosting (XGBoost) algorithm for handling missing values in multiclass classification. We seek a generalized classification scheme where all prodromal stages of the disease are considered simultaneously in the classification and decision-making processes. Given the large number of subjects (1631) included in this study, and in the presence of almost 28% missing values, we investigated the performance of XGBoost on the classification of the four classes AD, CN, EMCI, and LMCI. Using 10-fold cross-validation, XGBoost is shown to outperform other state-of-the-art classification algorithms by 3% in terms of accuracy and F-score. Our model achieved an accuracy of 80.52%, a precision of 80.62% and a recall of 80.51%, supporting the more natural and promising multiclass classification. Keywords: eXtreme gradient boosting, missing data, Alzheimer's disease, early mild cognitive impairment, late mild cognitive impairment, multiclass classification, ADNI, support vector machine, random forest
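A key reason XGBoost copes with missing values natively is its sparsity-aware split finding: at each tree split, rows whose feature is missing are routed in whichever default direction gives the better objective. The sketch below illustrates that idea with a plain squared-error criterion; it is a simplified illustration of the mechanism, not the library's actual implementation.

```python
def best_default_direction(xs, ys, threshold):
    """Pick the default direction for missing values at one split.

    xs: feature values, with None marking a missing entry.
    ys: regression targets. Non-missing rows with x < threshold go left.
    Returns 'left' or 'right', whichever yields the lower total sum of
    squared errors when the missing rows are routed to that side.
    """
    def sse(group):
        if not group:
            return 0.0
        m = sum(group) / len(group)
        return sum((y - m) ** 2 for y in group)

    left = [y for x, y in zip(xs, ys) if x is not None and x < threshold]
    right = [y for x, y in zip(xs, ys) if x is not None and x >= threshold]
    missing = [y for x, y in zip(xs, ys) if x is None]
    cost_left = sse(left + missing) + sse(right)    # missing rows go left
    cost_right = sse(left) + sse(right + missing)   # missing rows go right
    return 'left' if cost_left <= cost_right else 'right'

# The row with the missing feature has a target close to the low group,
# so routing it left is cheaper.
xs = [1, 2, None, 8, 9]
ys = [1.0, 1.1, 1.05, 5.0, 5.2]
print(best_default_direction(xs, ys, threshold=5))  # left
```

The learned default direction is stored per split, so at prediction time a row with a missing feature follows the same branch that minimized the training objective.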
Procedia PDF Downloads 188
2520 Fragment Domination for Many-Objective Decision-Making Problems
Authors: Boris Djartov, Sanaz Mostaghim
Abstract:
This paper presents a number-based dominance method. The main idea is to fragment the many attributes of the problem into subsets suitable for the well-established concept of Pareto dominance. Although similar methods can be found in the literature, they focus on comparing the solutions one objective at a time, while this method compares entire subsets of the objective vector. Given its nature, the method is computationally costlier than other methods, and it is thus geared more towards selecting an option from a finite set of alternatives, where each solution is defined by multiple objectives. The need for this method was motivated by dynamic alternate airport selection (DAAS). In DAAS, pilots, while en route to their destination, can find themselves in a situation where they need to select a new landing airport. In such a predicament, they need to consider multiple alternatives with many different characteristics, such as wind conditions, available landing distance, the fuel needed to reach them, etc. Hence, this method is primarily aimed at human decision-makers. Many methods within the field of multi-objective and many-objective decision-making rely on the decision-maker to initially provide the algorithm with preference points and weight vectors; however, this method aims to omit this very difficult step, especially when the number of objectives is large. The proposed method will be compared to the Favour (1 − k)-Dom and L-dominance (LD) methods. The test will be conducted using well-established test problems from the literature, such as the DTLZ problems. The proposed method is expected to outperform the currently available methods in the literature and, hopefully, to provide future decision-makers and pilots with support when dealing with many-objective optimization problems. Keywords: multi-objective decision-making, many-objective decision-making, multi-objective optimization, many-objective optimization
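One plausible reading of the fragment-based comparison described above, sketched under our own assumptions rather than the authors' exact definition: partition the objective vector into index subsets, apply ordinary Pareto dominance within each subset, and compare two options by how many fragments each one wins.

```python
def pareto_dominates(a, b):
    """Standard Pareto dominance for minimisation: a is no worse in
    every objective and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def fragment_wins(a, b, fragments):
    """Count the fragments won by a and by b.

    a, b: full objective vectors (all objectives to be minimised).
    fragments: list of index lists partitioning the objectives.
    """
    wins_a = wins_b = 0
    for idx in fragments:
        fa = [a[i] for i in idx]
        fb = [b[i] for i in idx]
        if pareto_dominates(fa, fb):
            wins_a += 1
        elif pareto_dominates(fb, fa):
            wins_b += 1
    return wins_a, wins_b

# Four objectives split into two fragments: option a wins the first
# fragment, option b wins the second, so neither dominates overall.
print(fragment_wins([1, 2, 3, 4], [2, 3, 3, 3], [[0, 1], [2, 3]]))  # (1, 1)
```

Because each fragment is low-dimensional, individual comparisons stay meaningful even when the full objective vector is too long for plain Pareto dominance to discriminate between alternatives.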
Procedia PDF Downloads 91
2519 Results Concerning the University-Industry Partnership for a Research Project Implementation (MUROS) in the Romanian Program STAR
Authors: Loretta Ichim, Dan Popescu, Grigore Stamatescu
Abstract:
The paper reports on the collaboration between a top university in Romania and three companies for the implementation of a research project in a multidisciplinary domain, focusing on the impact and benefits for both education and industry. The joint activities were developed under the Space Technology and Advanced Research (STAR) Program, funded by the Romanian Space Agency (ROSA) for a university-industry partnership. The context was defined by linking the European Space Agency optional programs with the development and promotion of national research, and with the educational and industrial capabilities in aeronautics, security and related areas, by increasing the collaboration between academic and industrial entities as well as by realizing high-level scientific production. The project, Multisensory Robotic System for Aerial Monitoring of Critical Infrastructure Systems (MUROS), was carried out from 2013 to 2016. It included the University POLITEHNICA of Bucharest (coordinator) and three companies which manufacture and market unmanned aerial systems. The project's main objective was the development of an integrated system combining ground wireless sensor networks and UAV monitoring in various application scenarios for critical infrastructure surveillance. This included specific activities related to fundamental and applied research, technology transfer, prototype implementation and result dissemination. The core contributions lay in distributed data processing and communication mechanisms, advanced image processing and embedded system development. Special focus is given to analyzing the impact of the project implementation on the educational process, directly or indirectly, through the faculty members (professors and students) involved in the research team.
Three main directions are discussed: a) enabling students to carry out internships at the partner companies, b) handling advanced topics and industry requirements at the master's level, and c) experiments and concept validation for doctoral theses. The impact of the faculty members' research work (as the educational component) on improving the companies' products is highlighted. The collaboration between the university and the companies was well balanced in both contributions and results. The paper also presents the outcomes of the project, which reveal an efficient collaboration between higher education and industry: master's theses, doctoral theses, conference papers, journal papers, technical documentation for technology transfer, a prototype, and a patent. The experience can provide useful practices for blending research and education within an academia-industry cooperation framework, while the lessons learned represent a starting point in debating the new role of advanced research and development performing companies in association with higher education. This partnership model, promoted at EU level, has a broad impact beyond the constrained scope of a single project and can develop into long-lasting collaboration benefiting all stakeholders: students, universities and the surrounding knowledge-based economic and industrial ecosystem. Owing to the exchange of experience between the university (UPB) and the manufacturing company (AFT Design), a new project, SIMUL, under the Bridge Grant Program (Romanian executive agency UEFISCDI), was started (2016-2017). This project will continue the educational research for innovation in master's and doctoral studies on the MUROS theme (a collaborative multi-UAV application for flood detection). Keywords: education process, multisensory robotic system, research and innovation project, technology transfer, university-industry partnership
Procedia PDF Downloads 239
2518 TNF Receptor-Associated Factor 6 (TRAF6) Mediating the Angiotensin-Induced Non-Canonical TGFβ Pathway Activation and Differentiation of c-kit+ Cardiac Stem Cells
Authors: Qing Cao, Fei Wang, Yu-Qiang Wang, Li-Ya Huang, Tian-Tian Sang, Shu-Yan Chen
Abstract:
Aims: TNF Receptor-Associated Factor 6 (TRAF6) acts as a multifunctional regulator of the Transforming Growth Factor (TGF)-β signaling pathway and mediates Smad-independent JNK and p38 activation via TGF-β. This study was performed to test the hypothesis that TGF-β/TRAF6 is essential for angiotensin-II (Ang II)-induced differentiation of rat c-kit+ Cardiac Stem Cells (CSCs). Methods and Results: c-kit+ CSCs were isolated from neonatal Sprague Dawley (SD) rats, and their c-kit status was confirmed by immunofluorescence staining. A TGF-β type I receptor inhibitor (SB431542) or small interfering RNA (siRNA)-mediated knockdown of TRAF6 was used to investigate the role of TRAF6 in TGF-β signaling. Rescue of TRAF6 siRNA-transfected cells with a 3'UTR-deleted, siRNA-insensitive construct was conducted to rule out off-target effects of the siRNA. A TRAF6 dominant-negative (TRAF6DN) vector was constructed and used to infect c-kit+ CSCs, and western blotting was used to assess the expression of TRAF6, JNK, p38, cardiac-specific proteins, and Wnt signaling proteins. Physical interactions between TRAF6 and TGF-β receptors were studied by co-immunoprecipitation. Cardiac differentiation was suppressed in the absence of TRAF6. Forced expression of TRAF6 enhanced the expression of TGF-β-activated kinase 1 (TAK1) and inhibited Wnt signaling. Furthermore, TRAF6 increased the expression of the cardiac-specific proteins cTnT and Cx-43 but inhibited the expression of Wnt3a. Conclusions: Our data suggest that TRAF6 plays an important role in Ang II-induced differentiation of c-kit+ CSCs via the non-canonical signaling pathway. Keywords: cardiac stem cells, differentiation, TGF-β, TRAF6, ubiquitination, Wnt
Procedia PDF Downloads 401
2517 The Analysis of New Town Hillside Development Pattern Guided by Low-Intensity Damage
Authors: Shan Zhou, Wenju Li, Kehui Chai
Abstract:
Along with economic globalization, marketization and regional development, strengthening the planning and construction of new towns has always been a main way to optimize the structure and function of metropolitan spatial configuration. However, new towns are often developed at high intensity, bringing a series of natural, ecological and environmental issues that make sustainable development difficult. In this paper, the administrative center of Jiangping in Dongxing is taken as an example and analyzed from three aspects: the vertical design of road traffic, the spatial layout of mountain buildings, and landscape design. The purpose is to elaborate hillside design methods guided by low-intensity damage and to explore their guiding significance for the sustainable development of future hillside construction. Keywords: low-intensity damage, new town construction, hillside, sustainable development, natural, ecology
Procedia PDF Downloads 470
2516 Lignin Phenol Formaldehyde Resole Resin: Synthesis and Characteristics
Authors: Masoumeh Ghorbani, Falk Liebner, Hendrikus W. G. van Herwijnen, Johannes Konnerth
Abstract:
Phenol formaldehyde (PF) resins are widely used as wood adhesives for a variety of industrial products such as plywood, laminated veneer lumber and others. Lignin, a main constituent of wood, has become well known as a potential substitute for phenol in PF adhesives because of their structural similarity. During the last decades, numerous research approaches have been carried out to substitute phenol with pulping-derived lignin, whereby the lower reactivity of resins synthesized with shares of lignin seems to be one of the major challenges. This work reports a systematic screening of different types of lignin (by plant origin and pulping process) for their suitability to replace phenol in phenolic resins. Lignins from different plant sources (softwood, hardwood and grass) were used, as these should differ significantly in the reactivity of their phenolic core units towards formaldehyde. Additionally, a possible influence of the pulping process was addressed by using lignins from the soda, kraft and organosolv processes and various lignosulfonates (sodium, ammonium, calcium, magnesium). To determine the influence of lignin on adhesive performance, the rate of viscosity development, the development of bond strength over varying hot-pressing times, and other thermal properties were investigated, among others. To evaluate the performance of the cured end product, a few selected properties were studied on the example of solid wood-adhesive bond joints, compact panels and plywood. As main results, it was found that lignin significantly accelerates the viscosity development during adhesive synthesis. Bond strength development during curing decelerated for all lignin types, while this trend was least pronounced for pine kraft lignin and spruce sodium lignosulfonate.
However, the overall performance of the products prepared with the latter adhesives fulfilled the main standard requirements, even after exposing the products to harsh environmental conditions. Thus, a potential application can be considered for processes where reactivity is less critical but adhesive cost and product performance are essential. Keywords: phenol formaldehyde resin, lignin phenol formaldehyde resin, ABES, DSC
Procedia PDF Downloads 237
2515 Code Embedding for Software Vulnerability Discovery Based on Semantic Information
Authors: Joseph Gear, Yue Xu, Ernest Foo, Praveen Gauravaran, Zahra Jadidi, Leonie Simpson
Abstract:
Deep learning methods have seen increasing application to the long-standing security research goal of automatic vulnerability detection in source code. Attention, however, must still be paid to the task of producing vector representations of source code (code embeddings) as input for these deep learning models. Graphical representations of code, most predominantly Abstract Syntax Trees and Code Property Graphs, have received some use in this task of late; however, for very large graphs representing very large code snippets, learning becomes prohibitively computationally expensive. This expense may be reduced by intelligently pruning the input to only vulnerability-relevant information; however, little research in this area has been performed. Additionally, most existing work comprehends code based solely on the structure of the graph, at the expense of the information contained in the graph's nodes. This paper proposes Semantic-enhanced Code Embedding for Vulnerability Discovery (SCEVD), a deep learning model which uses semantic-based feature selection for its vulnerability classification model. It uses information from the nodes as well as the structure of the code graph in order to select the features which are most indicative of the presence or absence of vulnerabilities. The model is implemented and experimentally tested using the SARD Juliet vulnerability test suite to determine its efficacy. It improves on existing code graph feature selection methods, as demonstrated by its improved ability to discover vulnerabilities. Keywords: code representation, deep learning, source code semantics, vulnerability discovery
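The pruning idea described above, keeping only the part of a code graph near vulnerability-relevant nodes, can be sketched as a bounded breadth-first search from seed nodes (for instance, calls to risky APIs such as `strcpy`). This is a generic illustration of graph pruning, not the SCEVD feature-selection procedure itself, and the graph layout is our own assumption.

```python
from collections import deque

def prune_graph(adj, seeds, max_depth):
    """Keep only the nodes within max_depth hops of any seed node.

    adj: adjacency dict {node: [neighbours]} for a code graph.
    seeds: nodes considered vulnerability-relevant.
    Returns the set of retained nodes.
    """
    kept = set(seeds)
    frontier = deque((s, 0) for s in seeds)
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_depth:
            continue  # do not expand past the hop limit
        for nb in adj.get(node, []):
            if nb not in kept:
                kept.add(nb)
                frontier.append((nb, depth + 1))
    return kept

# A toy graph: 'strcpy' is the seed; depth 1 keeps its direct
# neighbours and drops everything further away.
adj = {'strcpy': ['buf', 'fn'], 'buf': ['decl'], 'fn': [], 'decl': []}
print(sorted(prune_graph(adj, ['strcpy'], 1)))  # ['buf', 'fn', 'strcpy']
```

Only the retained subgraph would then be embedded and fed to the classifier, shrinking the learning cost for very large code graphs.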
Procedia PDF Downloads 158
2514 Improving the Efficiency of Repacking Process with Lean Technique: The Study of Read With Me Group Company Limited
Authors: Jirayut Phetchuen, Jongkol Srithorn
Abstract:
The study examines the unloading and repacking process of Read With Me Group Company Limited. The research aims to improve the old work process and to build a new, efficient process with the Lean Technique and new machines for faster delivery without increasing the number of employees. Currently, two employees work on a schedule of five days on and five days off. However, workplace injuries have delayed the delivery time, especially deliveries to neighboring countries. After the process improvement, the working space increased by 25%, the process lead time decreased by 40%, work efficiency increased by 175.82%, and the rate of work injuries was reduced to zero. Keywords: lean technique, plant layout design, U-shaped disassembly line, value stream mapping
Procedia PDF Downloads 104
2513 Identification of Spam Keywords Using Hierarchical Category in C2C E-Commerce
Authors: Shao Bo Cheng, Yong-Jin Han, Se Young Park, Seong-Bae Park
Abstract:
Consumer-to-Consumer (C2C) e-commerce has been growing at a very high speed in recent years. Since identical or nearly identical kinds of products compete with one another by relying on keyword search in C2C e-commerce, some sellers describe their products with spam keywords that are popular but not related to their products. Though such products get more chances to be retrieved and selected by consumers than those without spam keywords, the spam keywords mislead consumers and waste their time. This problem has been reported in many commercial services such as eBay and Taobao, but there has been little research on solving it. As a solution, this paper proposes a method to classify whether the keywords of a product are spam or not. The proposed method assumes that a keyword for a given product is more reliable if the keyword is commonly observed in the specifications of products which are the same as, or of the same kind as, the given product. This is because the hierarchical category of a product is, in general, determined precisely by the seller of the product, and so is the specification of the product. Since higher layers of the hierarchical category represent more general kinds of products, a reliability degree is determined separately for each layer. Hence, reliability degrees from the different layers of a hierarchical category become features for keywords, and they are used together with features derived from specifications alone for classification of the keywords. Support Vector Machines are adopted as the basic classifier using these features, since they are powerful and widely used in many classification tasks. In the experiments, the proposed method is evaluated with a gold-standard dataset from Yi-han-wang, a Chinese C2C e-commerce site, and is compared with a baseline method that does not consider the hierarchical category.
The experimental results show that the proposed method outperforms the baseline in F1-measure, which demonstrates that spam keywords are effectively identified by a hierarchical category in C2C e-commerce. Keywords: spam keyword, e-commerce, keyword features, spam filtering
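The layer-wise reliability feature described above can be sketched simply: at each layer of a product's category path, the reliability of a keyword is the fraction of products sharing that category prefix whose specifications contain the keyword. The data layout below is our own illustrative assumption, not the paper's dataset format.

```python
def layer_reliabilities(keyword, product, catalog):
    """Reliability of a keyword at each layer of a product's category path.

    product and catalog entries are dicts of the form
    {'path': [category, subcategory, ...], 'specs': set of spec keywords}.
    At layer d, peers are all catalog entries sharing path[:d + 1];
    reliability is the fraction of peers whose specs contain the keyword.
    The product itself is always among its peers, so len(peers) >= 1.
    """
    scores = []
    for d in range(len(product['path'])):
        prefix = product['path'][:d + 1]
        peers = [p for p in catalog if p['path'][:d + 1] == prefix]
        scores.append(sum(keyword in p['specs'] for p in peers) / len(peers))
    return scores

catalog = [
    {'path': ['Electronics', 'Phones'], 'specs': {'screen', 'battery'}},
    {'path': ['Electronics', 'Phones'], 'specs': {'screen'}},
    {'path': ['Electronics', 'Laptops'], 'specs': {'keyboard'}},
]
# 'screen' is common among Phones (deep layer) but less common across
# all of Electronics (top layer), so the deeper layer scores higher.
print([round(s, 2) for s in layer_reliabilities('screen', catalog[0], catalog)])  # [0.67, 1.0]
```

A vector of such per-layer scores, concatenated with specification-only features, would then form the input to the SVM classifier.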
Procedia PDF Downloads 294