Search results for: Ti based alloys
24564 Coupling Large Language Models with Disaster Knowledge Graphs for Intelligent Construction
Authors: Zhengrong Wu, Haibo Yang
Abstract:
In the context of escalating global climate change and environmental degradation, the complexity and frequency of natural disasters are continually increasing. Confronted with an abundance of information regarding natural disasters, traditional knowledge graph construction methods, which heavily rely on grammatical rules and prior knowledge, demonstrate suboptimal performance in processing complex, multi-source disaster information. This study, drawing upon past natural disaster reports, disaster-related literature in both English and Chinese, and data from various disaster monitoring stations, constructs question-answer templates based on large language models. Utilizing the P-Tune method, the ChatGLM2-6B model is fine-tuned, leading to the development of a disaster knowledge graph based on large language models. The resulting graph serves as a knowledge base supporting disaster emergency response.
Keywords: large language model, knowledge graph, disaster, deep learning
Procedia PDF Downloads 56
24563 Effect of Social Media on Knowledge Work
Authors: Pekka Makkonen, Georgios Lampropoulos, Kerstin Siakas
Abstract:
This paper examines the impact of social media on knowledge work. It discloses and highlights which specific aspects, areas and tasks of knowledge work can be improved by the use of social media. Moreover, the study includes a survey about higher education students’ viewpoints in regard to the use of social media as a means to enhance knowledge work and knowledge sharing. The analysis is based both on empirical data and on a discussion of the sources dealing with knowledge work and how it can be enhanced by using social media. The results show that social media can improve knowledge work, knowledge building and maintenance tasks in which communication, information sharing and collaboration play a vital role. Additionally, by using social media, personal, collaborative and supplementary work activities can be enhanced. Based on the results of the study, we suggest how knowledge work can be enhanced when using the contemporary information and communications technologies (ICTs) of the 21st century and recommend future directions towards improving knowledge work.
Keywords: knowledge work, social media, social media services, improving work performance
Procedia PDF Downloads 161
24562 Automated Evaluation Approach for Time-Dependent Question Answering Pairs on Web Crawler Based Question Answering System
Authors: Shraddha Chaudhary, Raksha Agarwal, Niladri Chatterjee
Abstract:
This work demonstrates a web crawler-based generalized end-to-end open domain Question Answering (QA) system. An efficient QA system requires a significant amount of domain knowledge to answer any question, with the aim of finding an exact and correct answer in the form of a number, a noun, a short phrase, or a brief piece of text for the user's question. Analysis of the question, searching the relevant documents, and choosing an answer are three important steps in a QA system. This work uses a web scraper (Beautiful Soup) to extract K documents from the web. The value of K can be calibrated on the basis of a trade-off between time and accuracy. This is followed by a passage ranking process, using a model trained on the 500K queries of the MS MARCO dataset, to extract the most relevant text passage and thereby shorten the lengthy documents. Further, a QA system is used to extract the answers from the shortened documents based on the query and return the top 3 answers. For the evaluation of such systems, accuracy is judged by the exact match between predicted answers and gold answers. However, automatic evaluation methods fail due to the linguistic ambiguities inherent in the questions. Moreover, reference answers are often not exhaustive or are out of date. Hence, correct answers predicted by the system are often judged incorrect according to the automated metrics. One such scenario arises from the original Google Natural Questions (GNQ) dataset, which was collected and made available in the year 2016. Use of any such dataset proves to be inefficient with respect to questions that have time-varying answers. For illustration, consider the query "Where will the next Olympics be?" The gold answer for this query as given in the GNQ dataset is "Tokyo". Since the dataset was collected in 2016 and the next Olympics after 2016 were the Tokyo 2020 Games, that answer was correct at the time. But if the same question is asked in 2022, then the answer is "Paris, 2024". Consequently, any evaluation based on the GNQ dataset will be incorrect for such questions. Such erroneous predictions are usually given to human evaluators for further validation, which is expensive and time-consuming. To address this erroneous evaluation, the present work proposes an automated approach for evaluating time-dependent question-answer pairs. In particular, it proposes a metric using the current timestamp along with the top-n predicted answers from a given QA system. To test the proposed approach, the GNQ dataset has been used, and the system achieved an accuracy of 78% for a test dataset comprising 100 QA pairs. This test data was automatically extracted using an analysis-based approach from 10K QA pairs of the GNQ dataset. The results obtained are encouraging. The proposed technique appears to have the possibility of developing into a useful scheme for gathering precise, reliable, and specific information in a real-time and efficient manner. Our subsequent experiments will be directed towards establishing the efficacy of the above system for a larger set of time-dependent QA pairs.
Keywords: web-based information retrieval, open domain question answering system, time-varying QA, QA evaluation
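The proposed time-aware metric is not specified in detail in the abstract; the following is a minimal sketch, assuming the evaluator keeps a small timeline of time-stamped gold answers and judges the top-n predictions against the answer valid at the current timestamp. All function names and the example timeline are hypothetical.

```python
from datetime import datetime
from typing import List, Optional

# Hypothetical time-stamped gold answers for one time-varying question.
# The timeline entries are illustrative, not taken from the GNQ dataset itself.
GOLD_TIMELINE = {
    "where will the next olympics be": [
        (datetime(2016, 1, 1), datetime(2021, 8, 8), "tokyo"),
        (datetime(2021, 8, 9), datetime(2024, 8, 11), "paris"),
    ],
}

def gold_answer_at(question: str, timestamp: datetime) -> Optional[str]:
    """Return the reference answer valid at the query timestamp, if any."""
    for start, end, answer in GOLD_TIMELINE.get(question.lower().strip("? "), []):
        if start <= timestamp <= end:
            return answer
    return None

def time_aware_match(question: str, top_n_predictions: List[str],
                     timestamp: datetime) -> bool:
    """Judge the QA system correct if any of its top-n predictions contains
    the reference answer that is valid at the current timestamp."""
    gold = gold_answer_at(question, timestamp)
    return gold is not None and any(gold in p.lower() for p in top_n_predictions)

# The same predictions judged against the answer that is valid in mid-2022.
print(time_aware_match("Where will the next Olympics be?",
                       ["Paris, 2024", "Tokyo"], datetime(2022, 6, 1)))  # True
```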
Procedia PDF Downloads 101
24561 Detection of Nutrients Using Honeybee-Mimic Bioelectronic Tongue Systems
Authors: Soo Ho Lim, Minju Lee, Dong In Kim, Gi Youn Han, Seunghun Hong, Hyung Wook Kwon
Abstract:
We report a floating electrode-based bioelectronic tongue mimicking honeybee taste systems for the detection and discrimination of various nutrients. Here, carbon nanotube field effect transistors with floating electrodes (CNT-FET) were hybridized with nanovesicles containing honeybee nutrient receptors, gustatory receptors of Apis mellifera. This strategy enables us to detect nutrient substances with high sensitivity and selectivity. It could also be utilized for the detection of nutrients in liquid food. This floating electrode-based bioelectronic tongue mimicking insect taste systems can be a simple but highly effective strategy in many different basic research areas concerning sensory systems. Moreover, our research provides opportunities to develop various applications such as food screening, and it can also provide valuable insights into insect taste systems.
Keywords: taste system, CNT-FET, insect gustatory receptor, bioelectronic tongue
Procedia PDF Downloads 218
24560 Development of Database for Risk Assessment Applying to Ballast Water Management
Authors: Eun-Chan Kim, Jeong-Hwan Oh, Seung-Guk Lee
Abstract:
Billions of tonnes of ballast water, including various aquatic organisms, are being carried around the world by ships. When the ballast water is discharged into new environments, some aquatic organisms discharged with it may become invasive and severely disrupt the native ecology. Thus, the International Maritime Organization (IMO) adopted the Ballast Water Management Convention in 2004. Regulation A-4 of the convention states that a government may grant exemptions from the ballast water management requirements in waters under its jurisdiction, but only when they are granted to a ship or ships on a voyage or voyages between specified ports or locations, or to a ship which operates exclusively between specified ports or locations. In order to grant exemptions, a risk assessment should be conducted based on the guidelines for risk assessment developed by the IMO. For the risk assessment, it is essential to collect the relevant information and establish a database system. This paper studies the database system for ballast water risk assessment. This database consists of the shipping database, ballast water database, port environment database and species database. The shipping database has been established based on the data collected from the port management information system of the Korean Government. For the ballast water database, ballast water discharge has only been estimated from the loading/unloading of cargoes, as the convention has not yet come into effect. The port environment database and species database are being established based on reference documents, and existing and newly collected monitoring data. This database system has proven to be a useful system, capable of appropriately supporting risk assessment for all ports of Korea.
Keywords: ballast water, IMO, risk assessment, shipping, environment, species
Procedia PDF Downloads 520
24559 Higher Education for Sustainable Development and Proposed Performance-based Funding Model for Universities in Ontario: Tensions and Coherence Between Provincial and Federal Policies
Authors: Atiqa Marium
Abstract:
In 2015, all 193 UN Member States adopted the 2030 Agenda for Sustainable Development, an ambitious 15-year plan to address some of the most pressing issues the world faces. Goal 4, Quality Education, highlights the importance of inclusive and quality education for sustainable development. Sustainable Development Goal 10 focuses on reducing inequalities within and among countries. In June 2019, the federal government of Canada released “Towards Canada’s 2030 Agenda National Strategy”, which was an important step in moving the 2030 Agenda forward. In April 2019, the Ontario government announced the performance-based funding model for publicly assisted colleges and universities in Ontario, which is now part of the universities’ 2024-2025 budgets. The literature review has shown that the funding model has been implemented by different governments to achieve policy objectives. However, this model has also had conflicting consequences, such as reduced university autonomy, lower education quality and academic standards, and increased equity concerns. The primary focus of this paper will be to analyze the tensions and coherence between the proposed funding model and the education for sustainable development goals and targets set by Canada’s 2030 Agenda National Strategy. Considering that the literature review has provided evidence that the performance-based funding model has reduced the quality of education and increased equity issues in other countries, it will be interesting to see how this proposed funding aligns with the SDGs of “Quality Education” and “Reduced Inequalities”. This paper is well suited for Volume 4, with the theme of re-visioning institutional impact and sustainability. It underscores the importance of policy coherence between federal and provincial policies for higher education institutions in Ontario for better institutional impact and for helping universities attain the goals set in the 2030 Agenda towards education for sustainable development.
Keywords: performance-based funding model, education for sustainable development, policy coherence, sustainable development goals
Procedia PDF Downloads 116
24558 A FE-Based Scheme for Computing Wave Interaction with Nonlinear Damage and Generation of Harmonics in Layered Composite Structures
Authors: R. K. Apalowo, D. Chronopoulos
Abstract:
A Finite Element (FE) based scheme is presented for quantifying guided wave interaction with Localised Nonlinear Structural Damage (LNSD) within structures of arbitrary layering and geometric complexity. The through-thickness mode-shape of the structure is obtained through a Wave and Finite Element (WFE) method. This is applied in a time domain FE simulation in order to generate time harmonic excitation for a specific wave mode. Interaction of the wave with LNSD within the system is computed through an element activation and deactivation iteration. The scheme is validated against experimental measurements and a WFE-FE methodology for calculating wave interaction with damage. Case studies for guided wave interaction with a crack and a delamination are presented to verify the robustness of the proposed method in classifying and identifying damage.
Keywords: layered structures, nonlinear ultrasound, wave interaction with nonlinear damage, wave finite element, finite element
Procedia PDF Downloads 163
24557 AI-based Optimization Model for Plastics Biodegradable Substitutes
Authors: Zaid Almahmoud, Rana Mahmoud
Abstract:
To mitigate the environmental impacts of discarded plastic waste, there has been recent interest in manufacturing and producing biodegradable plastics. Here, we study a new class of biodegradable plastics which are mixed with external natural additives, including catalytic additives that lead to successful degradation of the resulting material. To recommend the best alternative among multiple materials, we propose a multi-objective AI model that evaluates each material against multiple objectives given its material properties. As a proof of concept, the AI model was implemented in an expert system and evaluated using multiple materials. Our findings showed that Polyethylene Terephthalate is potentially the best biodegradable plastic substitute based on its material properties. Therefore, it is recommended that governments shift attention to the use of Polyethylene Terephthalate in the manufacturing of bottles to gain significant environmental and sustainability benefits.
Keywords: plastic bottles, expert systems, multi-objective model, biodegradable substitutes
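The abstract does not give the model's internals; a minimal sketch of one possible multi-objective scoring step is shown below, assuming each candidate material is described by normalized property scores against hypothetical objectives (degradation rate, strength, cost). All material names, property values and weights are illustrative, not measured data.

```python
# Minimal sketch of weighted multi-objective scoring for candidate substitutes.
# Property scores are normalized to [0, 1]; higher is better. Purely illustrative.
CANDIDATES = {
    "Material A": {"degradation": 0.8, "strength": 0.6, "cost": 0.5},
    "Material B": {"degradation": 0.6, "strength": 0.9, "cost": 0.7},
}
WEIGHTS = {"degradation": 0.5, "strength": 0.3, "cost": 0.2}  # assumed priorities

def score(props: dict) -> float:
    """Weighted sum across objectives (one simple scalarization choice)."""
    return sum(WEIGHTS[k] * props[k] for k in WEIGHTS)

best = max(CANDIDATES, key=lambda name: score(CANDIDATES[name]))
print(best, round(score(CANDIDATES[best]), 3))
```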
Procedia PDF Downloads 115
24556 Data-Driven Crop Advisory – A Use Case on Grapes
Authors: Shailaja Grover, Purvi Tiwari, Vigneshwaran S. R., U. Dinesh Kumar
Abstract:
In India, grapes are one of the most important horticultural crops. Grapes are most vulnerable to downy mildew, which is one of the most devastating diseases. In the absence of a precise weather-based advisory system, farmers spray pesticides on their crops extensively. There are two main challenges associated with using these pesticides. Firstly, most of these sprays are panic sprays, which could have been avoided. Secondly, farmers use more expensive "Preventive and Eradicate" chemicals than "Systemic, Curative and Anti-sporulate" chemicals. When these chemicals are used indiscriminately, they can enter the fruit and cause health problems such as cancer. This paper utilizes decision trees and predictive modeling techniques to provide grape farmers with customized advice on grape disease management. This model is expected to reduce the overall use of chemicals by approximately 50% and the cost by around 70%. Most of the grapes produced will then have relatively low residue levels of pesticides, i.e., below the permissible level.
Keywords: analytics in agriculture, downy mildew, weather-based advisory, decision tree, predictive modelling
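A minimal sketch of a weather-based spray advisory built on a decision tree is shown below. The feature set, training rows and thresholds are synthetic placeholders; the paper's actual weather variables and advisory rules are not given in the abstract.

```python
# Minimal sketch: decision-tree advisory for downy-mildew-favourable weather.
from sklearn.tree import DecisionTreeClassifier

# [temperature_C, relative_humidity_%, leaf_wetness_hours] -- synthetic data
X = [
    [22, 95, 10], [24, 90, 8], [30, 60, 1], [18, 98, 12],
    [33, 50, 0], [25, 85, 6], [28, 70, 2], [20, 92, 9],
]
# 1 = disease-favourable conditions (spray advised), 0 = no spray needed
y = [1, 1, 0, 1, 0, 1, 0, 1]

model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

today = [[23, 93, 7]]  # hypothetical forecast for the vineyard
advice = "spray advised" if model.predict(today)[0] == 1 else "no spray needed"
print(advice)
```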
Procedia PDF Downloads 74
24555 Lead Chalcogenide Quantum Dots for Use in Radiation Detectors
Authors: Tom Nakotte, Hongmei Luo
Abstract:
Lead chalcogenide-based (PbS, PbSe, and PbTe) quantum dots (QDs) were synthesized for the purpose of implementing them in radiation detectors. Pb-based materials have long been of interest for gamma and x-ray detection due to their high absorption cross section and atomic number (Z). The emphasis of the studies was on exploring how to control charge carrier transport within thin films containing the QDs. The properties of the QDs themselves can be altered by changing the size, shape, composition, and surface chemistry of the dots, while the properties of carrier transport within QD films are affected by post-deposition treatment of the films. The QDs were synthesized using colloidal synthesis methods, and films were grown using multiple film coating techniques, such as spin coating and doctor blading. Current QD radiation detectors are based on the QDs acting as fluorophores in a scintillation detector. Here, the viability of using QDs in solid-state radiation detectors, in which the incident radiation causes a direct electronic response within the QD film, is explored. Achieving high sensitivity and accurate energy quantification in QD radiation detectors requires large carrier mobilities and diffusion lengths in the QD films. Pb chalcogenide-based QDs were synthesized with both traditional oleic acid ligands and more weakly binding oleylamine ligands, allowing for in-solution ligand exchange and making the deposition of thick films in a single step possible. The PbS and PbSe QDs showed better air stability than PbTe. After precipitation, the QDs passivated with the shorter ligand are dispersed in 2,6-difluoropyridine, resulting in colloidal solutions with concentrations anywhere from 10 to 100 mg/mL for film processing applications. More concentrated colloidal solutions produce thicker films during spin coating, while an extremely concentrated solution (100 mg/mL) can be used to produce films several micrometers thick using doctor blading. Film thicknesses of micrometers or even millimeters are needed in radiation detectors for high-energy gamma rays, which are of interest for astrophysics and nuclear security, in order to provide sufficient stopping power.
Keywords: colloidal synthesis, lead chalcogenide, radiation detectors, quantum dots
Procedia PDF Downloads 127
24554 Right Solution of Geodesic Equation in Schwarzschild Metric and Overall Examination of Physical Laws
Authors: Kwan U. Kim, Jin Sim, Ryong Jin Jang, Sung Duk Kim
Abstract:
108 years have passed since a great number of physicists explained astronomical and physical phenomena by solving geodesic equations in the Schwarzschild metric. However, when solving the geodesic equations in the Schwarzschild metric, they did not correctly solve one branch of the spatial components of the four-dimensional force, among its spatial and temporal components, and did not correctly derive physical laws by means of physical analysis from the results obtained by solving the geodesic equations. In addition, they did not treat the astronomical and physical phenomena in a physical way based on the correct physical laws obtained from the solution of the geodesic equations in the Schwarzschild metric. Therefore, some former scholars mentioned that the theoretical basis of Einstein's general theory of relativity was obscure and incorrect, but they did not give a correct physical solution to the problems. Furthermore, since the general theory of relativity has not given a quantitative solution to these obscure and incorrect problems, the generalization of gravitational theory has not yet been successfully completed, although former scholars have contemplated and attempted it. In order to solve the problems, it is necessary to explore the obscure and incorrect problems in the general theory of relativity on the basis of physical laws and to find a methodology for solving them. Therefore, as the first step toward achieving this purpose, the right solution of the geodesic equation in the Schwarzschild metric is presented. Next, the correct physical laws found by making a physical analysis of the results are presented, the obscure and incorrect problems are identified, and an analysis of them is made on the basis of the physical laws. In addition, experimental verification of the physical laws found by us is presented.
Keywords: equivalence principle, general relativity, geometrodynamics, Schwarzschild, Poincaré
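For reference, the conventional starting point the abstract refers to, namely the Schwarzschild line element (in Schwarzschild coordinates) and the geodesic equation, can be written as follows; the abstract's claimed corrections to the conventional solution are not reproduced here.

```latex
ds^{2} = -\left(1-\frac{2GM}{c^{2}r}\right)c^{2}\,dt^{2}
         + \left(1-\frac{2GM}{c^{2}r}\right)^{-1}dr^{2}
         + r^{2}\left(d\theta^{2} + \sin^{2}\theta\, d\varphi^{2}\right),
\qquad
\frac{d^{2}x^{\mu}}{d\tau^{2}}
  + \Gamma^{\mu}_{\ \alpha\beta}\,
    \frac{dx^{\alpha}}{d\tau}\frac{dx^{\beta}}{d\tau} = 0 .
```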
Procedia PDF Downloads 14
24553 Numerical Modelling of Dry Stone Masonry Structures Based on Finite-Discrete Element Method
Authors: Ž. Nikolić, H. Smoljanović, N. Živaljić
Abstract:
This paper presents a numerical model based on the finite-discrete element method for the analysis of the structural response of dry stone masonry structures under static and dynamic loads. More precisely, each discrete stone block is discretized by finite elements. Material non-linearity, including fracture and fragmentation of discrete elements as well as cyclic behavior during dynamic load, is considered through contact elements which are implemented within a finite element mesh. The application of the model was conducted on several examples of these structures. The performed analysis shows high accuracy of the numerical results in comparison with the experimental ones and demonstrates the potential of the finite-discrete element method for modelling of the response of dry stone masonry structures.
Keywords: dry stone masonry structures, dynamic load, finite-discrete element method, static load
Procedia PDF Downloads 414
24552 Organizational Challenges Facing a Small Recruitment Agency: Case Study of a Firm Based in South India
Authors: Anirban Sengupta
Abstract:
The recruitment industry plays a critical role in connecting employers with talent. While there are many big recruitment firms, and large organizations can also afford to have their own recruitment teams, small recruitment agencies form an essential part of the ecosystem, serving the vast majority of small and medium-sized clients. These clients utilize the services of recruitment agencies to be able to scale their operations. However, there are significant organizational challenges that a small recruitment agency faces in building a sustainable and growing business. This case study explores the organizational challenges faced by a small recruitment agency in South India in an increasingly competitive landscape. Through this paper, the authors hope to understand, analyze and share the challenges faced by this firm and to suggest a systematic approach to addressing them. The study uses both qualitative and quantitative data collected from the agency’s management and employees for the year 2024. The findings reveal that the agency struggles with limited resources, unpredictable clients, and a lack of scalable processes and systems, which impacts not only the business outcomes but also key areas like employee performance management, compensation and benefits, and employee well-being. Based on these insights, the study proposes several strategies for overcoming these challenges, such as implementing scalable systems and processes. This research contributes to the understanding of the specific obstacles faced by small recruitment agencies in regional contexts and offers actionable recommendations for improving their organizational health, which may, in turn, positively impact their competitiveness.
Keywords: recruitment, organizational challenges, performance management, recruitment technology
Procedia PDF Downloads 9
24551 Student Researchers and Industry Partnerships Improve Health Management with Data Driven Decisions
Authors: Carole A. South-Winter
Abstract:
Research-based learning gives students the opportunity to experience problems that require critical thinking and idea development. The skills they gain by working through these problems 'hands-on' develop into attributes that benefit their careers in the professional field. The partnerships developed between students and industries give advantages to both sides. The students gain knowledge and skills that will increase their likelihood of success in the future, and the industries are given research on new advancements that will give them a competitive advantage in their field of work. The future of these partnerships is dependent on the success of current programs, enabling the enhancement and improvement of the research efforts. Once more students are able to complete research, the reliability of the results for each industry will increase. The overall goal is to continue the support for research-based learning and the partnerships formed between students and industries.
Keywords: global healthcare, industry partnerships, research-driven decisions, short-term study abroad
Procedia PDF Downloads 126
24550 Microwave Assisted Synthesis and Metal Complexes of Some Copolymers Based on Itaconic Acid
Authors: Mohamed H. El-Newehy, Sameh M. Osman, Moamen S. Refat, Salem S. Al-Deyab, Ayman El-Faham
Abstract:
The two copolymers itaconic acid-methyl methacrylate and itaconic acid-acrylamide were prepared in different ratios by radical copolymerization in the presence of azobisisobutyronitrile (AIBN) as initiator, with 2-butanone as the reaction medium, under microwave irradiation. The microwave technique is safe and fast, and gives a high yield of products with high purity in an optimal time compared to traditional conventional heating. All the prepared copolymers were characterized by FT-IR, thermal analysis and elemental microanalysis. The itaconic acid-based copolymers showed good sensitivity in alkaline media for scavenging Cu (II) and Pb (II). The chelation behavior of both the Cu (II) and Pb (II) complexes was checked using FT-IR, thermogravimetric analysis (TGA), and differential scanning calorimetry (DSC). The infrared data are in good agreement with carboxylate-to-metal coordination, in which the copolymers act as bidentate ligands.
Keywords: microwave synthesis, itaconic acid, copolymerization, scavenging, thermal stability
Procedia PDF Downloads 458
24549 Flexible Polyaniline-Based Composite Films for High-Performance Super Capacitors
Authors: A. Khosrozadeh, M. A. Darabi, M. Xing, Q. Wang
Abstract:
Fabrication of a high-performance supercapacitor (SC) using a flexible cellulose-based composite film of polyaniline (PANI), reduced graphene oxide (RGO), and silver nanowires (AgNWs) is reported. The flexibility, high capacitive behaviour, and cyclic stability of the entire device make it a good candidate for wearable SCs. The results show that a capacitance as high as 73.4 F/g (1.6 F/cm²) at a discharge rate of 1.1 A/g is achieved by the device. In addition, the SC demonstrates a power density of up to 468.8 W/kg and an energy density of up to 5.1 Wh/kg. The flexibility of the composite film is attributed to the binding effect of the cellulose fibers as well as the reinforcing effect of the AgNWs. The excellent electrochemical performance of the device is attributed to the synergistic effect of the PANI/RGO/AgNW ternary in a cushiony cellulose matrix and the porous structure of the composite.
Keywords: cellulose, polyaniline, reduced graphene oxide, silver, supercapacitor
Procedia PDF Downloads 430
24548 Location Detection of Vehicular Accident Using Global Navigation Satellite Systems/Inertial Measurement Units Navigator
Authors: Neda Navidi, Rene Jr. Landry
Abstract:
Vehicle tracking and accident recognition are of interest to many industries, such as insurance and vehicle rental companies. The main goal of this paper is to detect the location of a car accident by combining different methods. The methods considered in this paper are Global Navigation Satellite Systems/Inertial Measurement Units (GNSS/IMU)-based navigation and vehicle accident detection algorithms. They are expressed by a set of raw measurements, which are obtained from a designed integrator black box using GNSS and inertial sensors. Another concern of this paper is the definition of an accident detection algorithm based on the vehicle's jerk to identify the position of the accident. In fact, the results convinced us that, even in GNSS blockage areas, the position of the accident could be detected by GNSS/INS integration with a 50% improvement compared to standalone GNSS.
Keywords: driver behavior monitoring, integration, IMU, GNSS, monitoring, tracking
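The abstract does not detail the jerk-based detector; a minimal sketch, assuming a uniformly sampled longitudinal acceleration stream from the IMU and a purely hypothetical jerk threshold, is shown below. The flagged timestamps would then be tagged with the GNSS/INS position fix.

```python
import numpy as np

def detect_accident(accel_ms2: np.ndarray, times_s: np.ndarray,
                    jerk_threshold: float = 400.0):
    """Flag samples whose jerk (time derivative of acceleration) exceeds a
    threshold; returns indices of candidate crash events. The threshold value
    is a placeholder, not taken from the paper."""
    jerk = np.gradient(accel_ms2, times_s)          # m/s^3
    return np.where(np.abs(jerk) > jerk_threshold)[0]

# Synthetic example: 100 Hz acceleration stream with an abrupt impact-like spike.
t = np.arange(0, 2.0, 0.01)
a = np.zeros_like(t)
a[120] = -60.0                                      # sudden deceleration sample
events = detect_accident(a, t)
print(t[events])                                    # timestamps around the spike
```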
Procedia PDF Downloads 234
24547 Study of Harmonics Estimation on Analog kWh Meter Using Fast Fourier Transform Method
Authors: Amien Rahardjo, Faiz Husnayain, Iwa Garniwa
Abstract:
PLN uses kWh meters to determine the amount of energy consumed by household customers. High-precision kWh meters are needed in order to give accurate results, as accuracy can be degraded by the presence of harmonics. In this study, an estimation of the active power consumed was developed. Based on the first-year study results, the largest deviation due to harmonics can reach up to 9.8% for 2200 VA and 12.29% for 3500 VA customers with analog kWh meters. In the second year of the study, the deviation of digital customer meters reaches 2.01% and that of analog meters up to 9.45% for 3500 VA household customers. The aim of this research is to produce an estimation system that calculates the total energy consumed by household customers using analog meters, so that losses due to irregularities in PLN's recording of energy consumption, as measured by the installed analog kWh meters, are avoided.
Keywords: harmonics estimation, harmonic distortion, analog and digital kWh meters, THD, household customers
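As a point of reference for the FFT-based harmonic analysis named in the title, a minimal sketch of extracting harmonic magnitudes and the total harmonic distortion (THD) from a sampled waveform is shown below; the signal, sampling rate and harmonic content are synthetic, not the paper's measurement data.

```python
import numpy as np

def harmonic_spectrum(signal: np.ndarray, fs: float, f0: float, n_harm: int = 10):
    """Magnitudes of the fundamental and its harmonics via an FFT.
    Assumes the record contains an integer number of fundamental cycles."""
    spectrum = np.abs(np.fft.rfft(signal)) * 2 / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    return np.array([spectrum[np.argmin(np.abs(freqs - k * f0))]
                     for k in range(1, n_harm + 1)])

def thd(signal: np.ndarray, fs: float, f0: float = 50.0) -> float:
    """THD = RMS of harmonics 2..N divided by the fundamental magnitude."""
    mags = harmonic_spectrum(signal, fs, f0)
    return np.sqrt(np.sum(mags[1:] ** 2)) / mags[0]

# Synthetic 50 Hz current with a 20% third harmonic (illustrative only).
fs = 10_000
t = np.arange(0, 0.2, 1 / fs)
i = np.sin(2 * np.pi * 50 * t) + 0.2 * np.sin(2 * np.pi * 150 * t)
print(f"THD = {thd(i, fs):.1%}")   # ~20%
```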
Procedia PDF Downloads 483
24546 Cooperative Spectrum Sensing Using Hybrid IWO/PSO Algorithm in Cognitive Radio Networks
Authors: Deepa Das, Susmita Das
Abstract:
Cognitive Radio (CR) is an emerging technology to combat spectrum scarcity issues. This is achieved by continuously sensing the spectrum and detecting the under-utilized frequency bands without causing undue interference to the primary user (PU). In soft decision fusion (SDF) based cooperative spectrum sensing, various evolutionary algorithms have been discussed which optimize the weight coefficient vector for maximizing the detection performance. In this paper, we propose the hybrid invasive weed optimization and particle swarm optimization (IWO/PSO) algorithm as a fast and global optimization method, which improves the detection probability with a shorter sensing time. Then, the efficiency of this algorithm is compared with the standard invasive weed optimization (IWO), particle swarm optimization (PSO), genetic algorithm (GA) and other conventional SDF-based methods on the basis of convergence and detection probability.
Keywords: cognitive radio, spectrum sensing, soft decision fusion, GA, PSO, IWO, hybrid IWO/PSO
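The hybrid IWO/PSO update rules are not given in the abstract; as a baseline for comparison, a minimal plain-PSO sketch for optimizing the SDF weight vector is shown below, using the modified deflection coefficient as a stand-in fitness function (the paper's actual objective may differ). The network size, mean shift and covariance are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
M = 5                                        # cooperating CR users (assumed)
dmu = rng.uniform(0.5, 2.0, M)               # assumed mean shift when PU present
sigma = np.diag(rng.uniform(0.5, 1.5, M))    # assumed noise covariance

def fitness(w: np.ndarray) -> float:
    """Modified deflection coefficient (w^T dmu)^2 / (w^T Sigma w), a common
    surrogate for detection probability in SDF weight optimization."""
    w = w / np.linalg.norm(w)
    return float((w @ dmu) ** 2 / (w @ sigma @ w))

# Plain particle swarm optimization over the weight vector.
n_particles, n_iter = 20, 100
x = rng.uniform(0, 1, (n_particles, M))
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), np.array([fitness(p) for p in x])
gbest = pbest[np.argmax(pbest_val)]

for _ in range(n_iter):
    r1, r2 = rng.uniform(size=(2, n_particles, M))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, 1e-6, 1.0)
    vals = np.array([fitness(p) for p in x])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    gbest = pbest[np.argmax(pbest_val)]

print("optimized weights:", np.round(gbest / np.linalg.norm(gbest), 3))
```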
Procedia PDF Downloads 467
24545 Peer-To-Peer Lending and Macroeconomics: Searching for a Link
Authors: Asror Nigmonov Asqar Ogli, Sitora Inoyatova Amonovna
Abstract:
It has been a decade since crowdfunding and P2P lending opportunities were created. Today, the market for these modern alternative investments is becoming increasingly complex to navigate. There is an overwhelming number of peer-to-peer lending platforms in both developed and emerging economies. This study looks into this market via a cross-country empirical study. In this respect, it tests the effect of various macroeconomic factors on P2P lending. Based on the existing literature, which largely lacks empirical investigations, it builds a regression model that aims to explore the relationship between the economy and P2P lending. Though the author found it extremely difficult to compare the findings with earlier studies, this paper identified certain tendencies in the data that have policy implications. However, the paper could not find any significant effect of economic variables on P2P lending. The paper can be considered a starting point in the empirical investigation of P2P lending and highlights room for further research based on the limitations of the study.
Keywords: peer-to-peer lending, crowdfunding, marketplace lending, alternative finance, fintech
Procedia PDF Downloads 199
24544 2106 kA/cm² Peak Tunneling Current Density in GaN-Based Resonant Tunneling Diode with an Intrinsic Oscillation Frequency of ~260GHz at Room Temperature
Authors: Fang Liu, JunShuai Xue, JiaJia Yao, GuanLin Wu, ZuMaoLi, XueYan Yang, HePeng Zhang, ZhiPeng Sun
Abstract:
Terahertz sources have been in great demand over the last two decades for many photonic and electronic applications. The III-nitride resonant tunneling diode is one of the promising candidates for portable and compact THz sources. A room-temperature microwave oscillator based on a GaN/AlN resonant tunneling diode is reported in this work. The devices, grown by plasma-assisted molecular-beam epitaxy on free-standing c-plane GaN substrates, exhibit highly repeatable and robust negative differential resistance (NDR) characteristics at room temperature. To improve the interface quality in the active region of the RTD, indium-surfactant-assisted growth is adopted to enhance the surface mobility of metal atoms on the growing film front. Thanks to the lowered valley current associated with the suppression of threading dislocation scattering on the low-dislocation GaN substrate, a record-high peak current density of 2.1 MA/cm² in conjunction with a peak-to-valley current ratio (PVCR) of 1.2 is obtained, which is the best result reported for nitride-based RTDs up to now when the peak current density and PVCR values are considered simultaneously. When biased within the NDR region, microwave oscillations are measured with a fundamental frequency of 0.31 GHz, yielding an output power of 5.37 µW. Impedance mismatch results in the limited output power and oscillation frequency described above. The actual measured intrinsic capacitance is only 30 fF. Using a small-signal equivalent circuit model, the maximum intrinsic frequency of oscillation for these diodes is estimated to be ~260 GHz. This work demonstrates a microwave oscillator based on the resonant tunneling effect, which can meet the demands of terahertz spectral devices and, more importantly, provides guidance for the fabrication of complex nitride terahertz and quantum-effect devices.
Keywords: GaN resonant tunneling diode, peak current density, microwave oscillation, intrinsic capacitance
Procedia PDF Downloads 139
24543 iCCS: Development of a Mobile Web-Based Student Integrated Information System using Hill Climbing Algorithm
Authors: Maria Cecilia G. Cantos, Lorena W. Rabago, Bartolome T. Tanguilig III
Abstract:
This paper describes a conducive and structured information exchange environment for the students of the College of Computer Studies at Manuel S. Enverga University Foundation. The system was developed to help students check their academic results, manage their profiles and perform self-enlistment, and to assist them in managing their academic status, which can also be viewed on mobile phones. Developing class schedules in the traditional way is a long process that involves making a large number of choices. With the hill climbing algorithm, however, the process of class scheduling, particularly with regard to the courses to be taken by the student in alignment with the curriculum, can be carried out and ends with an optimum solution. The proponent used Rapid Application Development (RAD) as the system development method, PHP as the programming language and MySQL as the database.
Keywords: hill climbing algorithm, integrated system, mobile web-based, student information system
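The abstract does not spell out the scheduling objective; a minimal hill climbing sketch for selecting a set of courses aligned with the curriculum and free of timetable conflicts is shown below. The course offerings, slots and scoring rule are hypothetical, not the system's actual data model.

```python
import random

# Hypothetical offerings: course -> (curriculum priority, timetable slot)
OFFERINGS = {
    "CS301": (3, "Mon-9"), "CS302": (3, "Mon-9"), "CS310": (2, "Tue-9"),
    "MATH21": (2, "Wed-9"), "ELEC1": (1, "Tue-9"), "PE2": (1, "Fri-9"),
}
LOAD = 4  # number of courses to enlist

def score(selection):
    """Sum curriculum priorities, penalizing timetable-slot conflicts."""
    slots = [OFFERINGS[c][1] for c in selection]
    conflicts = len(slots) - len(set(slots))
    return sum(OFFERINGS[c][0] for c in selection) - 10 * conflicts

def hill_climb(iterations=200, seed=1):
    random.seed(seed)
    current = random.sample(list(OFFERINGS), LOAD)
    for _ in range(iterations):
        # Neighbor: swap one enlisted course for one not yet enlisted.
        out_course = random.choice(current)
        in_course = random.choice([c for c in OFFERINGS if c not in current])
        neighbor = [c for c in current if c != out_course] + [in_course]
        if score(neighbor) >= score(current):   # accept non-worse moves
            current = neighbor
    return sorted(current), score(current)

print(hill_climb())
```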
Procedia PDF Downloads 384
24542 Security Issues in Long Term Evolution-Based Vehicle-To-Everything Communication Networks
Authors: Mujahid Muhammad, Paul Kearney, Adel Aneiba
Abstract:
The ability for vehicles to communicate with other vehicles (V2V), the physical (V2I) and network (V2N) infrastructures, pedestrians (V2P), etc. – collectively known as V2X (Vehicle to Everything) – will enable a broad and growing set of applications and services within the intelligent transport domain for improving road safety, alleviating traffic congestion and supporting autonomous driving. The telecommunication research and industry communities and standardization bodies (notably 3GPP) have finally approved, in Release 14, cellular communications connectivity to support V2X communication (known as LTE-V2X). The LTE-V2X system will combine simultaneous connectivity across existing LTE network infrastructures via the LTE-Uu interface and direct device-to-device (D2D) communications. In order for V2X services to function effectively, a robust security mechanism is needed to ensure legal and safe interaction among authenticated V2X entities in the LTE-based V2X architecture. The characteristics of vehicular networks, and the nature of most V2X applications, which involve human safety, make it essential to protect V2X messages from attacks that can result in catastrophically wrong decisions/actions, including ones affecting road safety. Attack vectors include impersonation, modification, masquerading, replay, man-in-the-middle (MitM) and Sybil attacks. In this paper, we focus our attention on LTE-based V2X security and access control mechanisms. The current LTE-A security framework provides its own access authentication scheme, the AKA protocol, for mutual authentication and other essential cryptographic operations between UEs and the network. V2N systems can leverage this protocol to achieve mutual authentication between vehicles and the mobile core network. However, this protocol faces technical challenges, such as high signaling overhead, lack of synchronization, handover delay and potential control plane signaling overloads, as well as privacy preservation issues, and cannot satisfy the security requirements of the majority of LTE-based V2X services. This paper examines these challenges and points to possible ways by which they can be addressed. One possible solution is the implementation of a distributed peer-to-peer LTE security mechanism based on the Bitcoin/Namecoin framework, to allow for security operations with minimal overhead cost, which is desirable for V2X services. The proposed architecture can ensure fast, secure and robust V2X services under the LTE network while meeting V2X security requirements.
Keywords: authentication, long term evolution, security, vehicle-to-everything
Procedia PDF Downloads 167
24541 Code Embedding for Software Vulnerability Discovery Based on Semantic Information
Authors: Joseph Gear, Yue Xu, Ernest Foo, Praveen Gauravaran, Zahra Jadidi, Leonie Simpson
Abstract:
Deep learning methods have seen increasing application to the long-standing security research goal of automatic vulnerability detection in source code. Attention, however, must still be paid to the task of producing vector representations of source code (code embeddings) as input for these deep learning models. Graphical representations of code, most predominantly Abstract Syntax Trees and Code Property Graphs, have received some use in this task of late; however, for very large graphs representing very large code snippets, learning becomes prohibitively computationally expensive. This expense may be reduced by intelligently pruning the input to only vulnerability-relevant information; however, little research in this area has been performed. Additionally, most existing work comprehends code based solely on the structure of the graph, at the expense of the information contained in the graph's nodes. This paper proposes Semantic-enhanced Code Embedding for Vulnerability Discovery (SCEVD), a deep learning model which uses semantic-based feature selection for its vulnerability classification model. It uses information from the nodes as well as the structure of the code graph in order to select features which are most indicative of the presence or absence of vulnerabilities. This model is implemented and experimentally tested using the SARD Juliet vulnerability test suite to determine its efficacy. It is able to improve on existing code graph feature selection methods, as demonstrated by its improved ability to discover vulnerabilities.
Keywords: code representation, deep learning, source code semantics, vulnerability discovery
Procedia PDF Downloads 159
24540 Thermoelectric Properties of Doped Polycrystalline Silicon Film
Authors: Li Long, Thomas Ortlepp
Abstract:
The transport properties of carriers in polycrystalline silicon films affect the performance of polycrystalline silicon-based devices. They depend strongly on the grain structure, grain boundary trap properties and doping concentration, which in turn are determined by the film deposition and processing conditions. Based on the properties of charge carriers, phonons, grain boundaries and their interactions, the thermoelectric properties of polycrystalline silicon are analyzed with the relaxation time approximation of the Boltzmann transport equation. With this approach, the thermal conductivity, electrical conductivity and Seebeck coefficient can be determined as functions of grain size, trap properties and doping concentration. Experiments on heavily doped polycrystalline silicon are carried out, and the measurement results are compared with the model.
Keywords: conductivity, polycrystalline silicon, relaxation time approximation, Seebeck coefficient, thermoelectric property
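For context, one standard relaxation-time-approximation formulation of the coefficients mentioned above (written for an isotropic band, with the factor 1/3 from angular averaging; not necessarily the exact form used in the paper) expresses them through transport integrals:

```latex
\mathcal{K}_n=\frac{1}{3}\int \tau(E)\,v^{2}(E)\,g(E)\,(E-E_F)^{n}
  \left(-\frac{\partial f_0}{\partial E}\right)dE,
\qquad
\sigma = e^{2}\,\mathcal{K}_0,\quad
S = -\frac{1}{eT}\,\frac{\mathcal{K}_1}{\mathcal{K}_0},\quad
\kappa_e = \frac{1}{T}\left(\mathcal{K}_2-\frac{\mathcal{K}_1^{2}}{\mathcal{K}_0}\right).
```

In such models, grain-boundary barriers and traps enter mainly through the energy-dependent relaxation time τ(E) and the effective carrier concentration.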
Procedia PDF Downloads 125
24539 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption
Authors: Waziri Victor Onomza, John K. Alhassan, Idris Ismaila, Noel Dogonyaro Moses
Abstract:
This paper describes the problem of building secure computational services for encrypted information in Cloud Computing without decrypting the encrypted data; it thereby addresses the need for a computational encryption model that can enhance the security of big data in terms of the privacy, confidentiality and availability of the users. The cryptographic model applied for the computational processing of the encrypted data is the Fully Homomorphic Encryption Scheme. We contribute theoretical presentations of high-level computational processes that are based on number theory and algebra, can easily be integrated and leveraged in Cloud computing, and relate detailed theoretical mathematical concepts to fully homomorphic encryption models. This contribution supports the full implementation of a big data analytics-based cryptographic security algorithm.
Keywords: big data analytics, security, privacy, bootstrapping, homomorphic, homomorphic encryption scheme
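Fully homomorphic schemes are too involved to reproduce here; purely as an illustration of the underlying idea of computing on ciphertexts without decryption, a minimal textbook Paillier sketch is shown below. Paillier is only additively homomorphic (FHE generalizes this to arbitrary circuits), and the toy primes are far too small for any real security.

```python
from math import gcd, lcm
import random

# Toy Paillier keypair -- insecure illustrative primes only.
p, q = 293, 433
n, n2 = p * q, (p * q) ** 2
lam = lcm(p - 1, q - 1)
g = n + 1
mu = pow(lam, -1, n)                      # valid because g = n + 1

def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    # L(x) = (x - 1) // n, then multiply by mu modulo n.
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts.
c1, c2 = encrypt(1234), encrypt(4321)
print(decrypt((c1 * c2) % n2))            # 5555, computed without decrypting inputs
```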
Procedia PDF Downloads 380
24538 Network Functions Virtualization-Based Virtual Routing Function Deployment under Network Delay Constraints
Authors: Kenichiro Hida, Shin-Ichi Kuribayashi
Abstract:
An NFV-based network implements a variety of network functions with software on general-purpose servers, and this allows the network operator to select any capabilities and locations of network functions without physical constraints. In this paper, we evaluate the influence of the maximum tolerable network delay on the virtual routing function deployment guidelines which the authors proposed previously. Our evaluation results have revealed the following: (1) the more severe the maximum tolerable network delay condition becomes, the more the number of areas where the route selection function is installed increases and the higher the total network cost becomes; (2) the higher the routing function cost relative to the circuit bandwidth cost, the larger the increase ratio of the total network cost becomes under the maximum tolerable network delay condition.
Keywords: NFV (Network Functions Virtualization), resource allocation, virtual routing function, minimum total network cost
Procedia PDF Downloads 247
24537 Improving Vocabulary and Listening Comprehension via Watching French Films without Subtitles: Positive Results
Authors: Yelena Mazour-Matusevich, Jean-Robert Ancheta
Abstract:
This study is based on more than fifteen years of experience of teaching a foreign language, in my case French, to English-speaking students. It represents qualitative research on foreign language learners’ reactions and their gains in vocabulary and listening comprehension through repeated viewing of foreign feature films with the original soundtrack but without English subtitles. The initial idea emerged upon the realization that the first challenge faced by my students when they find themselves in a francophone environment has been their lack of listening comprehension. Their inability to understand colloquial speech affects not only their academic performance but their psychological health as well. To remedy this problem, I have designed and applied for many years my own teaching method based on one particular French film, exceptionally well suited, for the reasons described in detail in the paper, to intermediate-advanced level foreign language learners. This project, conducted together with my undergraduate assistant and mentee J-R Ancheta, aims at showing how paralinguistic features, such as characters’ facial expressions, settings, music, historical background, images provided before the actual viewing, etc., offer crucial support and enhance students’ listening comprehension. The study, based on students’ interviews, also offers special pedagogical techniques, such as ‘anticipatory’ vocabulary lists and exercises, drills, quizzes and composition topics that have proven to boost students’ performance. For this study, only the listening proficiency and vocabulary gains of the interviewed participants were assessed.
Keywords: comprehension, film, listening, subtitles, vocabulary
Procedia PDF Downloads 625
24536 Speedup Breadth-First Search by Graph Ordering
Abstract:
Breadth-First Search (BFS) is a core graph algorithm that is widely used for graph analysis. As it is frequently used in many graph applications, improving BFS performance is essential. In this paper, we present a graph ordering method that reorders the graph nodes to achieve better data locality, thus improving BFS performance. Our method is based on the observation that sibling relationships dominate the cache access pattern during BFS traversal. Therefore, we propose a frequency-based model to construct the graph order. First, we optimize the graph order according to the nodes’ visit frequency. Nodes with high visit frequency are processed with priority. Second, we try to maximize the overlap of child nodes layer by layer. As this problem is proven to be NP-hard, we propose a heuristic method that greatly reduces the preprocessing overheads. We conduct extensive experiments on 16 real-world datasets. The results show that our method achieves performance comparable to the state-of-the-art methods, while the graph ordering overheads are only about 1/15 of theirs.
Keywords: breadth-first search, BFS, graph ordering, graph algorithm
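The paper's frequency model is not given in the abstract; the sketch below uses in-degree as a crude stand-in for expected visit frequency, relabels nodes in that order so that frequently referenced nodes sit close together in memory, and then runs a standard BFS over the relabeled adjacency lists. The tiny example graph is illustrative only.

```python
from collections import deque

def reorder_by_frequency(adj: dict) -> dict:
    """Map old node ids -> new ids, most frequently referenced nodes first.
    In-degree is used here as a proxy for the paper's visit-frequency model."""
    indeg = {u: 0 for u in adj}
    for u in adj:
        for v in adj[u]:
            indeg[v] = indeg.get(v, 0) + 1
    order = sorted(indeg, key=indeg.get, reverse=True)
    return {old: new for new, old in enumerate(order)}

def relabel(adj, mapping):
    return {mapping[u]: sorted(mapping[v] for v in vs) for u, vs in adj.items()}

def bfs(adj, source):
    seen, order, queue = {source}, [], deque([source])
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return order

graph = {0: [1, 2], 1: [2, 3], 2: [3], 3: [0]}
mapping = reorder_by_frequency(graph)
print(bfs(relabel(graph, mapping), mapping[0]))
```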
Procedia PDF Downloads 138
24535 Ensemble-Based SVM Classification Approach for miRNA Prediction
Authors: Sondos M. Hammad, Sherin M. ElGokhy, Mahmoud M. Fahmy, Elsayed A. Sallam
Abstract:
In this paper, an ensemble-based Support Vector Machine (SVM) classification approach is proposed and used for miRNA prediction. Three problems commonly associated with previous approaches are alleviated. These problems arise from imposing assumptions on the secondary structure of pre-miRNA, the imbalance between the numbers of laboratory-verified miRNAs and pseudo-hairpins, and, finally, the use of a training data set that does not consider all the varieties of samples in different species. We aggregate the predicted outputs of three well-known SVM classifiers, namely Triplet-SVM, Virgo and Mirident, weighted by their variant features, without any structural assumptions. An additional SVM layer is used in aggregating the final output. The proposed approach is trained and then tested with balanced data sets, and its results outperform the three base classifiers. Improved metric values of 88.88% F-score, 92.73% accuracy, 90.64% precision, 96.64% specificity, 87.2% sensitivity, and an area under the ROC curve of 0.91 are achieved.
Keywords: miRNAs, SVM classification, ensemble algorithm, assumption problem, imbalanced data
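The exact variant features used to weight the three base classifiers are not described in the abstract; a minimal stacking sketch in which the outputs of three SVMs with different kernels are aggregated by an additional SVM layer (scikit-learn, synthetic data) is shown below. The base learners here are generic stand-ins for Triplet-SVM, Virgo and Mirident, not reimplementations of them.

```python
# Minimal sketch of an SVM-over-SVMs ensemble with synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

base_learners = [
    ("svm_rbf", SVC(kernel="rbf", probability=True)),
    ("svm_poly", SVC(kernel="poly", degree=3, probability=True)),
    ("svm_linear", SVC(kernel="linear", probability=True)),
]
# The meta-learner is itself an SVM, mirroring the "additional SVM layer".
ensemble = StackingClassifier(estimators=base_learners,
                              final_estimator=SVC(kernel="rbf"))
ensemble.fit(X_tr, y_tr)
print(f"held-out accuracy: {ensemble.score(X_te, y_te):.3f}")
```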
Procedia PDF Downloads 349