Search results for: vector information
10541 A GIS-Based Study on Geographical Divisions of Sustainable Human Settlements in China
Authors: Wu Yiqun, Weng Jiantao
Abstract:
The human settlements of China were extracted from the land use vector map by interpreting the Thematic Map of 2014. This paper establishes a geographical division evaluation system and division model for sustainable human settlements using GIS. The results show that the density of human residential areas in China is uneven, being higher in the east and lower in the west. The regional differences of sustainable human settlements are obvious: values in the north are larger than in the south, in plain regions larger than in hilly regions, and in economically developed regions larger than in less developed regions. The geographical distribution of the sustainable human settlements is measured by the degree of porosity, which correlates with settlement density: where the sustainable human settlement density is high, the porosity is low, the distribution is even, and the gaps between settlements are small.
Keywords: GIS, geographical division, sustainable human settlements, China
Procedia PDF Downloads 600
10540 Self-Efficacy Perceptions of Pre-Service Art and Music Teachers towards the Use of Information and Communication Technologies
Authors: Agah Tugrul Korucu
Abstract:
Information and communication technologies have become an important part of our daily lives, with significant investments in technology in the 21st century. Self-efficacy is a significant factor that determines how individuals act in events, situations, and difficult processes: individuals with a higher perception of computer self-efficacy overcome problems related to computer use more easily and carry out computer-related activities more successfully. Therefore, this study aimed to examine the self-efficacy perceptions of pre-service art and music teachers towards the use of information and communication technologies in terms of different variables. The research group consists of 60 pre-service teachers studying at the Art and Music department of Necmettin Erbakan University, Ahmet Keleşoğlu Faculty of Education. As data collection tools, a "personal information form" developed by the researcher to collect demographic data and "the perception scale related to self-efficacy of informational technology" were used. The scale is a 5-point Likert-type scale consisting of 27 items. The Kaiser-Meyer-Olkin (KMO) sample compliance value was found to be 0.959, and the Cronbach alpha reliability coefficient of the scale was 0.97. A computer-based statistical software package (SPSS 21.0) was used to analyze the data; descriptive statistics, t-test, and analysis of variance were used as statistical techniques.
Keywords: self-efficacy perceptions, teacher candidate, information and communication technologies, art teacher
Procedia PDF Downloads 326
10539 Energy Efficient Routing Protocol with Ad Hoc On-Demand Distance Vector for MANET
Authors: K. Thamizhmaran, Akshaya Devi Arivazhagan, M. Anitha
Abstract:
One of the most important systematic issues to be solved when implementing a data transmission algorithm in mobile ad hoc networks (MANETs) is how to save the energy of mobile nodes, which run on limited battery power, while still meeting the requirements of applications and users. While satisfying the energy-saving requirement, it is also necessary to achieve quality of service; in emergency work, for example, data must be delivered on time. To meet these requirements, we implement a proposed energy-aware routing protocol for mobile ad hoc networks that saves energy at every node by efficiently selecting an energy-efficient path in the routing process by means of an Enhanced AODV (EAODV) routing protocol.
Keywords: ad hoc networks, MANET, routing, AODV, EAODV
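The abstract does not spell out how EAODV scores candidate paths. As a hedged sketch (the max-min residual-energy metric, node names, and energy figures below are illustrative assumptions, not the paper's definition), an energy-efficient route can be chosen by preferring the route reply whose weakest node has the most battery left:

```python
def best_route(routes, residual_energy):
    """Pick the route whose minimum (bottleneck) node energy is highest;
    ties are broken in favor of fewer hops (a max-min energy metric)."""
    return max(routes, key=lambda r: (min(residual_energy[n] for n in r), -len(r)))

# hypothetical battery levels (percent) for nodes seen in two route replies
residual_energy = {'S': 100, 'A': 90, 'B': 20, 'C': 70, 'D': 80, 'T': 100}
routes = [['S', 'A', 'B', 'T'],   # would drain nearly-empty node B
          ['S', 'C', 'D', 'T']]   # bottleneck energy is 70 %
path = best_route(routes, residual_energy)  # → ['S', 'C', 'D', 'T']
```

This avoids routes that repeatedly exhaust the same low-energy node, which is the failure mode plain hop-count AODV suffers from.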
Procedia PDF Downloads 372
10538 End to End Monitoring in Oracle Fusion Middleware for Data Verification
Authors: Syed Kashif Ali, Usman Javaid, Abdullah Chohan
Abstract:
In large enterprises, multiple departments use different kinds of information systems and databases according to their needs. These systems are independent and heterogeneous in nature, and sharing information/data between them is not an easy task. The usage of middleware technologies has made data sharing between systems much easier. However, monitoring the exchange of data/information between target and source systems for verification purposes is often complex or impossible for the maintenance department due to security/access privileges on those systems. In this paper, we present our experience with an end-to-end data monitoring approach at the middleware level, implemented in Oracle BPEL for data verification without the help of any monitoring tool.
Keywords: service level agreement, SOA, BPEL, oracle fusion middleware, web service monitoring
Procedia PDF Downloads 482
10537 Social Media Influencers and Tourist’s Hotel Booking Decisions: A Case Study of Facebook
Authors: Fahsai Pawapootanont, Sasithon Yuwakosol
Abstract:
The objectives of this research are as follows: 1) to study the information-seeking behavior of followers of influencers on Facebook when making hotel booking decisions, and 2) to study the characteristics of travel influencers that affect their followers' hotel booking decisions. Data were collected by interviewing 35 key informants, consisting of 25 Thai tourists who follow travel influencers and 10 travel influencers, as well as through online questionnaires completed by a sample of 400 Thai tourists; statistical analysis used percentage, standard deviation, mean, t-test, and one-way analysis of variance (ANOVA). The results on the influence of travel influencers on Facebook on hotel booking decisions in Thailand revealed the following. People in different age groups have different information-seeking behaviors, depending on their experience and aptitude with technology. The sample group did not seek information from only one source; they also searched various other places in order to obtain comparative and truthful information for making decisions. In addition, travel influencers should present honest, clear, and complete content and describe services honestly. Besides the characteristics of travel influencers, presentation formats and platforms also affect hotel booking decisions, but they must be designed and presented to suit the behavior of the target group. As for the influence of travel influencers, it can be concluded that they can influence their followers' interests and hotel booking decisions.
However, it was found that followers of travel influencers on Facebook factor other considerations into their decision to book a hotel, such as whether the hotel's comfort meets their needs; location, price, and promotions also play an important role.
Keywords: influencer, travel, Facebook, hotel booking decisions, Thailand
Procedia PDF Downloads 53
10536 Study on Safety Management of Deep Foundation Pit Construction Site Based on Building Information Modeling
Authors: Xuewei Li, Jingfeng Yuan, Jianliang Zhou
Abstract:
The 21st century has been called the century of human exploitation of underground space. Due to the large quantities, tight schedules, low safety reserves, and high uncertainty of deep foundation pit engineering, accidents occur frequently, causing huge economic losses and casualties. With the successful application of information technology in the construction industry, building information modeling has become a research hotspot in the field of architectural engineering. Therefore, the application of building information modeling (BIM) and other information and communication technologies (ICTs) in construction safety management is of great significance for improving the level of safety management. This research summarized the accident mechanism of deep foundation pit engineering through fault tree analysis to identify the control factors of safety management and the deficiencies of traditional construction site safety management for deep foundation pits. According to the accident cause mechanism and the specific process of deep foundation pit construction, the hazard information of the construction site was identified, and a hazard list, including early warning information, was obtained. After that, the system framework was constructed by analyzing the early warning information demand and early warning function demand of the safety management system for deep foundation pits. Finally, a BIM-based safety management system for deep foundation pit construction sites was developed by combining a database with Web-BIM technology, realizing three functions: real-time positioning of construction site personnel, automatic warning on entry into a dangerous area, and real-time monitoring of deep foundation pit structure deformation with automatic warning.
This study can initially improve the current state of safety management on deep foundation pit construction sites. Additionally, active control before the occurrence of accidents and dynamic control of the whole construction process can be realized, so as to prevent and control safety accidents in deep foundation pit engineering.
Keywords: Web-BIM, safety management, deep foundation pit, construction
Procedia PDF Downloads 154
10535 LIZTOXD: Inclusive Lizard Toxin Database by Using MySQL Protocol
Authors: Iftikhar A. Tayubi, Tabrej Khan, Mansoor M. Alsubei, Fahad A. Alsaferi
Abstract:
LIZTOXD provides a single source of high-quality information about proteinaceous lizard toxins that will be an invaluable resource for pharmacologists, neuroscientists, toxicologists, medicinal chemists, ion channel scientists, clinicians, and structural biologists. We provide an intuitive, well-organized, and user-friendly web interface that allows users to explore detailed information on lizards and their toxin proteins, including common name, scientific name, entry id, entry name, protein name, and length of the protein sequence. The utility of this database is that it offers a user-friendly interface for retrieving information about the lizards, toxins, and toxin proteins of different lizard species, satisfying the demands of the scientific community by providing in-depth knowledge about lizards and their toxins. In the next phase of our project, we will implement the database using MySQL and the Hypertext Preprocessor (PHP), with SmartDraw used for design. A database is an excellent tool for storing large quantities of data efficiently, and users can navigate from one section to another depending on their field of interest. The database contains a wealth of information on species, toxins, clinical data, etc. The combination of specific classification schemes and a rich user interface allows researchers to easily locate and view information on the sequence, structure, and biological activity of these toxins. This manually curated database will be a valuable resource both for basic researchers and for those interested in potential pharmaceutical and agricultural applications of lizard toxins.
Keywords: LIZTOXD, MySQL, PHP, smart draw
Procedia PDF Downloads 162
10534 Enhancing Information Technologies with AI: Unlocking Efficiency, Scalability, and Innovation
Authors: Abdal-Hafeez Alhussein
Abstract:
Artificial Intelligence (AI) has become a transformative force in the field of information technologies, reshaping how data is processed, analyzed, and utilized across various domains. This paper explores the multifaceted applications of AI within information technology, focusing on three key areas: automation, scalability, and data-driven decision-making. We delve into how AI-powered automation is optimizing operational efficiency in IT infrastructures, from automated network management to self-healing systems that reduce downtime and enhance performance. Scalability, another critical aspect, is addressed through AI’s role in cloud computing and distributed systems, enabling the seamless handling of increasing data loads and user demands. Additionally, the paper highlights the use of AI in cybersecurity, where real-time threat detection and adaptive response mechanisms significantly improve resilience against sophisticated cyberattacks. In the realm of data analytics, AI models—especially machine learning and natural language processing—are driving innovation by enabling more precise predictions, automated insights extraction, and enhanced user experiences. The paper concludes with a discussion on the ethical implications of AI in information technologies, underscoring the importance of transparency, fairness, and responsible AI use. It also offers insights into future trends, emphasizing the potential of AI to further revolutionize the IT landscape by integrating with emerging technologies like quantum computing and IoT.
Keywords: artificial intelligence, information technology, automation, scalability
Procedia PDF Downloads 19
10533 Variational Explanation Generator: Generating Explanation for Natural Language Inference Using Variational Auto-Encoder
Authors: Zhen Cheng, Xinyu Dai, Shujian Huang, Jiajun Chen
Abstract:
Recently, explanatory natural language inference has attracted much attention for the interpretability of logic relationship prediction; it is also known as explanation generation for Natural Language Inference (NLI). Existing explanation generators based on a discriminative encoder-decoder architecture have achieved noticeable results. However, we find that these discriminative generators usually generate explanations with correct evidence but incorrect logic semantics. This is because logic information is implicitly encoded in the premise-hypothesis pairs and is difficult to model. In fact, the same logic information exists in both the premise-hypothesis pair and the explanation, and the logic information explicitly contained in the target explanation is easy to extract. Hence we assume that there exists a latent space of logic information while generating explanations. Specifically, we propose a generative model called Variational Explanation Generator (VariationalEG) with a latent variable to model this space. Trained with the guidance of explicit logic information in target explanations, the latent variable in VariationalEG can capture the implicit logic information in premise-hypothesis pairs effectively. Additionally, to tackle the problem of posterior collapse while training VariationalEG, we propose a simple yet effective approach called logic supervision on the latent variable to force it to encode logic information. Experiments on the explanation generation benchmark e-SNLI (explanation-Stanford Natural Language Inference) demonstrate that the proposed VariationalEG achieves significant improvement compared to previous studies and yields a state-of-the-art result. Furthermore, we analyze the generated explanations to demonstrate the effect of the latent variable.
Keywords: natural language inference, explanation generation, variational auto-encoder, generative model
Procedia PDF Downloads 151
10532 Vibration Propagation in Body-in-White Structures Through Structural Intensity Analysis
Authors: Jamal Takhchi
Abstract:
Understanding vibration propagation in complex structures such as an automotive body in white remains a challenging issue in car design with regard to NVH performance. Current analysis is limited to the low-frequency range, where modal concepts are dominant. Higher frequencies, between 200 and 1000 Hz, will become critical with the rise of electrification: the annoying sounds of EVs are mostly whines created by either gears or e-motors between 300 Hz and 2 kHz. Structural intensity analysis was experimented with a few years ago on finite element models. The application was promising but limited by the fact that the propagating 3D intensity vector field is masked by a rotational intensity field. This rotational field should be filtered out using a differential operator, whose expression in the framework of finite element modeling is not yet known. The aim of the proposed work is to implement this operator in the current dynamic solver (NASTRAN) of Stellantis and to develop the expected methodology for the mid-frequency structural analysis of electrified vehicles.
Keywords: structural intensity, NVH, body in white, irrotational intensity
Procedia PDF Downloads 155
10531 A Model to Assess Sustainability Using Multi-Criteria Analysis and Geographic Information Systems: A Case Study
Authors: Antonio Boggia, Luisa Paolotti, Gianluca Massei, Lucia Rocchi, Elaine Pace, Maria Attard
Abstract:
The aim of this paper is to present a methodology and a computer model for sustainability assessment based on the integration of Multi-criteria Decision Analysis (MCDA) with a Geographic Information System (GIS). It presents the results of a study on the implementation of a model for measuring sustainability, intended to address policy actions for the improvement of sustainability at the territorial level. The aim is to rank areas in order to understand the specific technical and/or financial support required to develop sustainable growth. Assessing sustainable development is a multidimensional problem: economic, social, and environmental aspects have to be taken into account at the same time. The tool for a multidimensional representation is a proper set of indicators, which must be integrated into a model, that is, an assessment methodology, to be used for measuring sustainability. The model, developed by the Environmental Laboratory of the University of Perugia, is called GeoUmbriaSUIT. It is a calculation procedure developed as a plugin working in the open-source GIS software QuantumGIS. The multi-criteria method used within GeoUmbriaSUIT is the TOPSIS algorithm (Technique for Order Preference by Similarity to Ideal Solution), which defines a ranking based on the distance from the worst point and the closeness to an ideal point, for each of the criteria used. For the sustainability assessment procedure, GeoUmbriaSUIT uses a geographic vector file where the graphic data represent the study area and the single evaluation units within it (the alternatives, e.g. the regions of a country, or the municipalities of a region), while the alphanumeric data (attribute table) describe the environmental, economic, and social aspects related to the evaluation units by means of a set of indicators (criteria).
The use of the algorithm available in the plugin makes it possible to treat individually the indicators representing the three dimensions of sustainability and to compute three different indices: an environmental index, an economic index, and a social index. The graphic output of the model allows for an integrated assessment of the three dimensions while avoiding aggregation. The separate indices and graphic output make GeoUmbriaSUIT a readable and transparent tool, since it does not produce an aggregate sustainability index as the final result of the calculations, which is often cryptic and difficult to interpret. In addition, it is possible to develop a "back analysis", able to explain the positions obtained by the alternatives in the ranking, based on the criteria used. The case study presented is an assessment of the level of sustainability in the six regions of Malta, an island state in the middle of the Mediterranean Sea and the southernmost member of the European Union. The results show that the integration of MCDA and GIS is an adequate approach for sustainability assessment. In particular, the implemented model is able to provide easy-to-understand results. This is a very important condition for a sound decision support tool, since most of the time decision makers are not experts and need understandable output. In addition, the evaluation path is traceable and transparent.
Keywords: GIS, multi-criteria analysis, sustainability assessment, sustainable development
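The TOPSIS ranking step described above can be sketched in a few lines. This is a minimal illustration of the standard algorithm (vector normalization, weighting, distances to ideal and anti-ideal points, closeness coefficient), not GeoUmbriaSUIT's actual code; the three "regions" and their indicator values are hypothetical:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.
    matrix: (n_alternatives, n_criteria); benefit[j] is True when a higher
    value on criterion j is better. Returns a closeness score in [0, 1]."""
    m = np.asarray(matrix, dtype=float)
    v = (m / np.linalg.norm(m, axis=0)) * weights      # normalize, then weight
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)          # distance to ideal point
    d_worst = np.linalg.norm(v - worst, axis=1)         # distance to worst point
    return d_worst / (d_best + d_worst)                 # higher = better

# three hypothetical regions scored on one environmental, one economic,
# and one social indicator, all treated as "more is better"
scores = topsis([[0.7, 120, 0.5],
                 [0.9,  80, 0.6],
                 [0.4, 150, 0.3]],
                weights=[1/3, 1/3, 1/3],
                benefit=[True, True, True])
ranking = scores.argsort()[::-1]  # best region first
```

In GeoUmbriaSUIT this computation is run three times, once per group of indicators, which is what yields the separate environmental, economic, and social indices.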
Procedia PDF Downloads 292
10530 Artificial Neural Networks Based Calibration Approach for Six-Port Receiver
Authors: Nadia Chagtmi, Nejla Rejab, Noureddine Boulejfen
Abstract:
This paper presents a calibration approach based on artificial neural networks (ANN) to determine the envelope signal (I+jQ) of a six-port based receiver (SPR). The memory effects (also called dynamic behavior) and the nonlinearity introduced by the diode-based power detectors are taken into account by the ANN. An experimental set-up was built to validate the efficiency of this method, which was confirmed by the obtained results in terms of waveforms. Moreover, the error vector magnitude (EVM) and the mean absolute error (MAE) were calculated in order to test the ANN's performance in achieving I/Q recovery from the output voltages of the power detectors. The baseband signal was recovered using the ANN with EVMs no higher than 1% and an MAE no higher than 17.26 when the SPR was excited with different types of signals, such as QAM (quadrature amplitude modulation) and LTE (Long Term Evolution).
Keywords: six-port based receiver, calibration, nonlinearity, memory effect, artificial neural network
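The abstract does not state which EVM normalization the authors use; as a hedged illustration, the common RMS definition (error power relative to reference power, in percent) can be computed on recovered I/Q samples like this, with the QPSK reference and the 0.5 % gain error below being made-up test data:

```python
import numpy as np

def evm_percent(measured, reference):
    """RMS error vector magnitude as a percentage of the reference RMS power."""
    measured = np.asarray(measured, dtype=complex)
    reference = np.asarray(reference, dtype=complex)
    err = measured - reference
    return 100 * np.sqrt(np.mean(np.abs(err) ** 2) / np.mean(np.abs(reference) ** 2))

# ideal QPSK constellation vs. recovered samples with a small gain error
ref = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
meas = ref * 1.005          # 0.5 % gain error on the recovered I/Q
evm = evm_percent(meas, ref)  # ≈ 0.5 %
```

A uniform 0.5 % amplitude error yields an EVM of 0.5 %, comfortably under the 1 % bound reported in the abstract.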
Procedia PDF Downloads 78
10529 Integrating Dependent Material Planning Cycle into Building Information Management: A Building Information Management-Based Material Management Automation Framework
Authors: Faris Elghaish, Sepehr Abrishami, Mark Gaterell, Richard Wise
Abstract:
Collaboration and integration between all building information management (BIM) processes and tasks are necessary to ensure that all project objectives can be delivered. A literature review was used to explore state-of-the-art BIM technologies for managing construction materials, as well as the challenges the construction process has faced when using traditional methods. Thus, this paper aims to articulate a framework that integrates traditional material planning methods, such as ABC analysis (the Pareto principle) for analyzing and categorizing project materials, and independent material planning methods, such as Economic Order Quantity (EOQ) and Fixed Order Point (FOP), into BIM 4D and 5D capabilities, in order to articulate a dependent material planning cycle within BIM that relies on the constructability method. Moreover, we build a model that connects the material planning outputs with the BIM 4D and 5D data to ensure that all project information is accurately presented through integrated and complementary BIM reporting formats. Furthermore, this paper presents a method to integrate the risk management output with the material management process so that all critical materials are monitored and managed across all project stages. The paper includes browsers which are proposed to be embedded in any 4D BIM platform in order to predict the EOQ as well as the FOP and alert the user during the construction stage. This enables the planner to check the status of the materials on site and to be alerted when a new order should be requested. Therefore, this will lead to managing all the project information in a single context and avoid missing any information at the early design stage.
Subsequently, the planner will be capable of building a more reliable 4D schedule by allocating the categorized materials with the required EOQ, and of checking the optimum locations for inventory and the temporary construction facilities.
Keywords: building information management, BIM, economic order quantity, EOQ, fixed order point, FOP, BIM 4D, BIM 5D
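The EOQ and FOP quantities that the proposed browsers would predict follow from the classic inventory formulas. The sketch below uses the standard Wilson EOQ and reorder-point expressions; the demand, cost, and lead-time figures are hypothetical, not from the paper:

```python
import math

def economic_order_quantity(annual_demand, order_cost, holding_cost):
    """Wilson EOQ: the order size that minimizes ordering plus holding cost."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

def fixed_order_point(daily_usage, lead_time_days, safety_stock=0):
    """Stock level at which a new order should be triggered."""
    return daily_usage * lead_time_days + safety_stock

# hypothetical figures for one ABC-categorized material (e.g. rebar, in tonnes)
eoq = economic_order_quantity(annual_demand=1200, order_cost=150, holding_cost=4)  # → 300.0
rop = fixed_order_point(daily_usage=5, lead_time_days=10, safety_stock=15)         # → 65
```

Linked to the 4D schedule, the planner would be alerted whenever the site stock of a material falls to `rop`, and the suggested order size would be `eoq`.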
Procedia PDF Downloads 173
10528 Recognition of Grocery Products in Images Captured by Cellular Phones
Authors: Farshideh Einsele, Hassan Foroosh
Abstract:
In this paper, we present a robust algorithm to recognize text extracted from grocery product images captured by mobile phone cameras. Recognizing such text is challenging since text in grocery product images varies in size, orientation, style, and illumination, and can suffer from perspective distortion. Pre-processing is performed to make the characters scale- and rotation-invariant. Since text degradations cannot be appropriately described by well-known geometric transformations such as translation, rotation, affine transformation, and shearing, we use all of a character's black pixels as our feature vector. Classification is performed with a minimum distance classifier using the maximum likelihood criterion, which delivers a very promising Character Recognition Rate (CRR) of 89%. We achieve a considerably higher Word Recognition Rate (WRR) of 99% when using lower-level linguistic knowledge about product words during the recognition process.
Keywords: camera-based OCR, feature extraction, document image processing, grocery products
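A minimum distance classifier of the kind described above compares a character's feature vector against a stored mean template per class and picks the nearest one. The sketch below is a generic illustration (the maximum likelihood weighting the paper mentions is omitted, and the 4-element "black pixel" vectors are toy data, far shorter than real character bitmaps):

```python
import numpy as np

def train(samples, labels):
    """Build one mean feature vector (template) per character class."""
    return {c: np.mean([s for s, l in zip(samples, labels) if l == c], axis=0)
            for c in set(labels)}

def classify(templates, x):
    """Assign x to the class whose mean template is nearest (Euclidean)."""
    x = np.asarray(x, dtype=float)
    return min(templates, key=lambda c: np.linalg.norm(x - templates[c]))

# toy binarized black-pixel feature vectors for two character classes
samples = [[1, 1, 0, 0], [1, 0, 0, 0],   # class 'A'
           [0, 0, 1, 1], [0, 1, 1, 1]]   # class 'B'
labels = ['A', 'A', 'B', 'B']
templates = train(samples, labels)
predicted = classify(templates, [1, 1, 0, 1])  # → 'A'
```

Because the templates are class means, a noisy character (here one stray black pixel) still lands on the correct class as long as it stays closer to its own mean than to any other.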
Procedia PDF Downloads 406
10527 Algorithmic Fault Location in Complex Gas Networks
Authors: Soban Najam, S. M. Jahanzeb, Ahmed Sohail, Faraz Idris Khan
Abstract:
With the recent increase in reliance on gas as a primary source of energy across the world, a lot of research has been conducted on gas distribution networks. As the complexity and size of these networks grow, so does the leakage of gas in the distribution network. One of the most crucial factors in the production and distribution of gas is UFG, or Unaccounted-for Gas. The presence of UFG signifies a difference between the amount of gas distributed and the amount of gas billed. Our approach is to use information acquired from several specified points in the network. This information is used to calculate the loss occurring in the network with the developed algorithm, which can also identify leakages at any point of the pipeline, so that faults can be detected and rectified with minimal time, effort, and resources.
Keywords: FLA, fault location analysis, GDN, gas distribution network, GIS, geographic information system, NMS, network management system, OMS, outage management system, SSGC, Sui Southern Gas Company, UFG, unaccounted for gas
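The paper's algorithm is not given in the abstract; as a hedged, minimal illustration of the underlying idea (a mass balance between metered points, with segment names, flow values, and the tolerance all made up), a leaky segment can be flagged when its metered inflow exceeds its outflow by more than the metering tolerance:

```python
def locate_loss(flow_in, flow_out, tolerance=0.5):
    """Compare metered inflow/outflow per pipeline segment; segments whose
    imbalance exceeds the metering tolerance are leak candidates, sorted
    by loss. Flows must share one unit (e.g. MMSCF per day)."""
    suspects = []
    for seg, fin in flow_in.items():
        loss = fin - flow_out[seg]
        if loss > tolerance:
            suspects.append((seg, loss))
    return sorted(suspects, key=lambda s: s[1], reverse=True)

# hypothetical meter readings at the specified network points
flow_in  = {'S1-S2': 100.0, 'S2-S3': 60.0, 'S3-S4': 58.0}
flow_out = {'S1-S2': 99.8,  'S2-S3': 57.1, 'S3-S4': 57.9}
leaks = locate_loss(flow_in, flow_out)  # only S2-S3 exceeds the tolerance
```

Summing the per-segment losses gives the network's UFG, while the per-segment view narrows the leak search to one stretch of pipeline.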
Procedia PDF Downloads 629
10526 Stream Extraction from 1m-DTM Using ArcGIS
Authors: Jerald Ruta, Ricardo Villar, Jojemar Bantugan, Nycel Barbadillo, Jigg Pelayo
Abstract:
Streams are important in providing water for industrial, agricultural, and human consumption; in short, where there are streams, there is life. Identifying streams is essential since many developed cities are situated in the vicinity of these bodies of water, and in flood management streams serve as basins for surface runoff within the area. This study aims to process a high-resolution digital terrain model (DTM) with 1-meter resolution and generate stream features from it using the Hydrology tools of ArcGIS. The raster was filled, flow direction and flow accumulation were processed, stream order was assigned using the raster calculator, the result was converted to vector, and undesirable features were cleared using ancillary data or Google Earth. In field validation, streams were classified as perennial, intermittent, or ephemeral. Results show that more than 90% of the extracted features were found accurate through field validation.
Keywords: digital terrain models, hydrology tools, Strahler method, stream classification
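The Strahler ordering named in the keywords follows a simple rule: a headwater link is order 1, and when two links of equal order meet, the order increases by one, otherwise the higher order carries through. A minimal sketch on a stream network stored as child lists per junction (the network below is hypothetical):

```python
def strahler(children, node):
    """Strahler order of the stream link draining `node`.
    `children` maps each node to the upstream links joining at it."""
    if not children.get(node):
        return 1  # headwater link
    orders = sorted((strahler(children, c) for c in children[node]), reverse=True)
    if len(orders) >= 2 and orders[0] == orders[1]:
        return orders[0] + 1  # two equal-order links meet: promote
    return orders[0]          # otherwise the highest order carries through

# two first-order headwaters (h1, h2) join at j1 into a second-order stream;
# a first-order tributary t1 then enters without raising the order
net = {'outlet': ['j1', 't1'], 'j1': ['h1', 'h2'],
       'h1': [], 'h2': [], 't1': []}
order = strahler(net, 'outlet')  # → 2
```

In the ArcGIS workflow this logic is applied cell-by-cell by the Stream Order tool over the flow direction and thresholded flow accumulation rasters.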
Procedia PDF Downloads 274
10525 The Contribution of Edgeworth, Bootstrap and Monte Carlo Methods in Financial Data
Authors: Edlira Donefski, Tina Donefski, Lorenc Ekonomi
Abstract:
Edgeworth approximation, the bootstrap, and Monte Carlo simulation have a considerable impact on the results obtained for many problems. In our paper, we treat a financial case: the effect that the components of the cash flow of one of the most successful businesses in the world, namely financing activity, operating activity, and investing activity, have on cash and cash equivalents at the end of a three-month period. To obtain a better view of this case, we created a vector autoregression model and then generated the impulse responses in terms of asymptotic analysis (Edgeworth approximation), Monte Carlo simulation, and residual bootstrap based on the standard errors of every series created. The generated results showed common tendencies for the three methods applied, which consequently verified the advantage of the three methods in the optimization of a model that contains many variants.
Keywords: autoregression, bootstrap, Edgeworth expansion, Monte Carlo method
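The residual-bootstrap step mentioned above can be illustrated on a univariate AR(1) series standing in for one cash-flow component (the full paper uses a multivariate VAR; this single-equation sketch with simulated data is only an assumed simplification): fit the model, resample its residuals with replacement, rebuild artificial series, and refit to get a bootstrap standard error.

```python
import numpy as np

rng = np.random.default_rng(0)

# simulate an AR(1) series as a stand-in for one cash-flow component
n, phi = 200, 0.6
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal()

def fit_ar1(series):
    """OLS slope of y_t on y_{t-1}."""
    x, z = series[:-1], series[1:]
    return np.dot(x, z) / np.dot(x, x)

phi_hat = fit_ar1(y)
resid = y[1:] - phi_hat * y[:-1]

# residual bootstrap: resample residuals, rebuild the series, refit
boot = []
for _ in range(200):
    e = rng.choice(resid, size=n - 1, replace=True)
    yb = np.zeros(n)
    for t in range(1, n):
        yb[t] = phi_hat * yb[t - 1] + e[t - 1]
    boot.append(fit_ar1(yb))
se = np.std(boot)  # bootstrap standard error of the AR coefficient
```

The same resampling scheme, applied equation-by-equation to the VAR residuals, is what produces the bootstrap confidence bands around impulse responses that the abstract compares against the Edgeworth and Monte Carlo versions.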
Procedia PDF Downloads 155
10524 Neuron Imaging in Lateral Geniculate Nucleus
Authors: Sandy Bao, Yankang Bao
Abstract:
The understanding of the information being processed in the brain, especially in the lateral geniculate nucleus (LGN), has proven challenging for modern neuroscience and for researchers focused on how neurons process signals and images. In this paper, we propose a method to process the different colors within the different layers of the LGN, that is, green information in layers 4 and 6 and red and blue information in layers 3 and 5, based on the surface dimension of the layers. We take into consideration the images in the LGN and the visual cortex: the edge information detected in the visual cortex is fed back to the layers of the LGN and combined with the LGN image to form a new image that is clearer and sharper, making it easier to identify objects. A Matrix Laboratory (MATLAB) simulation is performed, and the results show that the clarity of the output image improves significantly.
Keywords: lateral geniculate nucleus, matrix laboratory, neuroscience, visual cortex
Procedia PDF Downloads 280
10523 Information Seeking and Evaluation Tasks to Enhance Multiliteracies in Health Education
Authors: Tuula Nygard
Abstract:
This study contributes to the pedagogical discussion of how to promote adolescents' multiliteracies, with an emphasis on information seeking and evaluation skills in contemporary media environments. The study is conducted in the school environment, applying perspectives from educational sciences and information studies to health communication and teaching. The research focus is on the teacher's role as a trusted person who guides students to choose and use credible information sources. Evaluating the credibility of information is often challenging; children and adolescents in particular may find it difficult to know what to believe and whom to trust, for instance in communication about health and well-being. Thus, advanced multiliteracy skills are needed. In the school environment, trust is based on the teacher's subject content knowledge, but also on the teacher's character and caring: benevolence and approachability generate trustworthiness, which lays the foundation for good interaction with students and, further, for the teacher's pedagogical authority. The study explores teachers' perceptions of their pedagogical authority and their role as a trustee, as well as the kinds of multiliteracy practices teachers utilize in their teaching. The data will be collected by interviewing secondary school health education teachers during spring 2019. The analysis method is nexus analysis, an ethnographic research orientation. Classroom interaction, as the interviewed teachers see it, is scrutinized through a nexus analysis lens in order to expound social action in which people, places, discourses, and objects are intertwined. The crucial social actions in this study are information seeking and evaluation situations, where the teacher and the students together assess the credibility of information sources.
The study is based on the hypothesis that a trustee’s opinions of credible sources and guidance in information seeking and evaluation affect students’, that is, trustors’, choices. In the school context, the teacher’s own experiences and perceptions of health-related issues cannot be brushed aside. Furthermore, adolescents are accustomed to using digital technology for day-to-day information seeking, but the chosen information sources are often not of very high quality. In school, teachers are inclined to recommend familiar sources, such as the health education textbook and the web pages of well-known health authorities. Students, in turn, rely on the teacher’s guidance toward credible information sources without using their own judgment. In terms of students’ multiliteracy competences, information seeking and evaluation tasks in health education are excellent opportunities to practice and enhance these skills. Distinguishing right information from wrong is particularly important in health communication because experts by experience are easy to find and their opinions are convincing. This can be addressed by employing the ideas of multiliteracy in the school subject of health education and in teacher education and training.Keywords: multiliteracies, nexus analysis, pedagogical authority, trust
Procedia PDF Downloads 10910522 Intuitive Decision Making When Facing Risks
Authors: Katharina Fellnhofer
Abstract:
The more information and knowledge technology provides, the more important profoundly human skills like intuition, the skill of using nonconscious information, become. As our world becomes more complex, shaken by crises, and characterized by uncertainty, time pressure, ambiguity, and rapidly changing conditions, intuition is increasingly recognized as a key human asset. However, due to methodological limitations of sample size or time frame, or a lack of real-world or cross-cultural scope, precisely how to measure intuition when facing risks at a nonconscious level remains unclear. In light of the measurement challenge related to intuition’s nonconscious nature, a technique is introduced that uses hidden images as nonconscious additional information to trigger intuition. This technique has been tested in a within-subject, fully online design with 62,721 real-world investment decisions made by 657 subjects in Europe and the United States. Bayesian models highlight the technique’s potential to measure the skill of using nonconscious information for conscious decision making. Over the long term, solving the mysteries of intuition and mastering its use could be of immense value in personal and organizational decision-making contexts.Keywords: cognition, intuition, investment decisions, methodology
Procedia PDF Downloads 8710521 Neural Nets Based Approach for 2-Cells Power Converter Control
Authors: Kamel Laidi, Khelifa Benmansour, Ouahid Bouchhida
Abstract:
A neural networks-based approach for a 2-cell serial converter has been developed and implemented. The approach is based on a behavioural description of the different operating modes of the converter. Each operating mode represents a well-defined configuration, matched to an operating zone satisfying given invariance conditions that depend on the capacitor voltages and the load current of the converter. With each mode is associated a control vector whose components are the control signals to be applied to the converter switches. The problem is therefore reduced to a classification task over the different operating modes of the converter. The artificial neural nets-based approach, a powerful tool for this kind of task, has been adopted and implemented. Application to a 2-cell chopper ensures efficient and robust control of the load current and accurate balancing of the capacitor voltages.Keywords: neural nets, control, multicellular converters, 2-cells chopper
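The mode-classification idea can be sketched with a small neural net. Everything below is a hypothetical stand-in: the normalized state thresholds, the quadrant labeling rule, and the mode-to-switch mapping are placeholders, not the paper’s invariance conditions.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 2000
# Hypothetical normalized state measurements: capacitor voltage vc and load current i.
vc = rng.uniform(0.0, 1.0, n)
i = rng.uniform(0.0, 1.0, n)
X = np.column_stack([vc, i])

# Illustrative mode-selection rule (a stand-in for the invariance conditions):
# quadrants of the (vc, i) plane map to the four switch configurations.
mode = (vc > 0.5).astype(int) * 2 + (i > 0.5).astype(int)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
clf.fit(X, mode)

# Each predicted mode maps back to a control vector (u1, u2) for the two cells.
control_vectors = {0: (0, 0), 1: (0, 1), 2: (1, 0), 3: (1, 1)}
u = control_vectors[int(clf.predict([[0.8, 0.3]])[0])]
```

At runtime, the trained net plays the role of the mode classifier: measured state in, switch-control vector out.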
Procedia PDF Downloads 83610520 Audio Information Retrieval in Mobile Environment with Fast Audio Classifier
Authors: Bruno T. Gomes, José A. Menezes, Giordano Cabral
Abstract:
With the popularity of smartphones, mobile apps have emerged to meet diverse needs; however, the resources at their disposal are limited, either by the hardware, due to low computing power, or by the software, which lacks the robustness of the desktop environment. For example, automatic audio classification (AC) tasks, a subarea of musical information retrieval (MIR), require fast processing and a good success rate, yet the mobile platform has limited computing power and the best AC tools are only available for desktop. To solve these problems, a fast classifier adapts the most widespread MIR technologies to mobile environments, seeking a balance between speed and robustness. In the end, we found that it is possible to enjoy the best of MIR in mobile environments. This paper presents the results obtained and the difficulties encountered.Keywords: audio classification, audio extraction, mobile environment, musical information retrieval
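As an illustration of the kind of lightweight feature extraction a mobile-friendly audio classifier might rely on (a sketch with assumed features, not the paper’s toolchain), zero-crossing rate and spectral centroid can be computed with plain NumPy:

```python
import numpy as np

def features(signal, sr):
    """Two cheap frame-level features suited to low-power devices."""
    # Zero-crossing rate: fraction of sign changes (high for noise-like audio).
    zcr = float(np.mean(np.abs(np.diff(np.sign(signal)))) / 2.0)
    # Spectral centroid: magnitude-weighted mean frequency of the spectrum.
    mag = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    centroid = float(np.sum(freqs * mag) / (np.sum(mag) + 1e-12))
    return zcr, centroid

sr = 22050
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)                     # harmonic: low ZCR, low centroid
noise = np.random.default_rng(0).standard_normal(sr)   # broadband: high ZCR, high centroid

z_tone, c_tone = features(tone, sr)
z_noise, c_noise = features(noise, sr)
```

Features this cheap separate broadband from harmonic content in a single pass, which is the kind of speed/robustness trade-off the abstract describes.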
Procedia PDF Downloads 54710519 Agriculture and Global Economy vis-à-vis the Climate Change
Authors: Assaad Ghazouani, Ati Abdessatar
Abstract:
Around the world, agriculture maintains a social and economic importance in national economies. Its importance is distinguished by its ripple effects, not only downstream but also upstream, vis-à-vis the non-agricultural sector. However, the situation is relatively fragile because of its sensitivity to weather conditions. In this work, we propose a model to highlight the impacts of climate change (CC) on economic growth in economies where agriculture is considered a strategic sector. The CC is supposed to directly and indirectly affect economic growth by reducing the performance of the agricultural sector. The model is tested for Tunisia. The results validate the hypothesis that the potential economic damage of the CC is important. Indeed, an increase in CO2 concentration (with the associated rise in temperatures and disruption of rainfall patterns) will have an impact on global economic growth, particularly by reducing the performance of the agricultural sector. Analysis with a vector error correction model also highlights the magnitude of the climate impact on the performance of the agricultural sector and its repercussions on economic growth.Keywords: climate change, agriculture, economic growth, world, VECM, cointegration
Procedia PDF Downloads 62110518 The Use of Online Courses as a Tool for Teaching in Education for Youth and Adults
Authors: Elineuda Do Socorro Santos Picanço Sousa, Ana Kerlly Souza da Costa
Abstract:
This paper presents an analysis of the information society as a plural, inclusive and participatory society, in which it is necessary to give all citizens, especially young people, the right skills so that they can understand and use information through contemporary technologies, as well as carry out critical analysis, using and producing information and all sorts of messages and/or informational language codes. This conviction inspired this article, whose aim is to present current trends in the use of technology in distance education, applied as an alternative and/or supplement to classroom teaching for youth and adults, together with concepts and actions, seeking to contribute to its development in the state of Amapá and, specifically, at the Center for Professional Teaching of Amapá Professor Josinete Oliveira Barroso (CEPAJOB).Keywords: youth and adults education, EaD, professional education, online courses, CEPAJOB
Procedia PDF Downloads 64210517 Usage of Military Spending, Debt Servicing and Growth for Dealing with Emergency Plan of Indian External Debt
Authors: Sahbi Farhani
Abstract:
This study investigates the relationship between external debt and military spending in the case of India over the period 1970–2012. In doing so, we have applied structural break unit root tests to examine the stationarity properties of the variables. The Auto-Regressive Distributed Lag (ARDL) bounds testing approach is used to test whether cointegration exists in the presence of structural breaks in the series. Our results indicate cointegration among external debt, military spending, debt servicing, and economic growth. Moreover, military spending and debt servicing add to external debt, while economic growth helps in lowering external debt. The Vector Error Correction Model (VECM) analysis and the Granger causality test reveal that military spending and economic growth cause external debt. A feedback effect also exists between external debt and debt servicing in the case of India.Keywords: external debt, military spending, ARDL approach, India
Procedia PDF Downloads 29710516 Implementation of an IoT Sensor Data Collection and Analysis Library
Authors: Jihyun Song, Kyeongjoo Kim, Minsoo Lee
Abstract:
Due to the development of information technology and wireless Internet technology, various data are being generated in many fields. These data are advantageous in that they provide real-time information to the users themselves; moreover, when the data are accumulated and analyzed, much more information can be extracted. In addition, the development and dissemination of boards such as the Arduino and the Raspberry Pi have made it possible to easily test various sensors, and sensor data can be collected directly by using database tools such as MySQL. These directly collected data can be used for various research purposes and can be useful for data mining. However, there are many difficulties in using such boards to collect data, especially when the user is not a computer programmer or is using them for the first time. Even if data are collected, a lack of expert knowledge or experience may cause difficulties in data analysis and visualization. In this paper, we aim to construct a library for sensor data collection and analysis to overcome these problems.Keywords: clustering, data mining, DBSCAN, k-means, k-medoids, sensor data
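A minimal sketch of such a collection-and-analysis pipeline, using sqlite3 in place of MySQL so the example is self-contained, and DBSCAN from the keywords on the analysis side; the table schema and sensor values are hypothetical:

```python
import sqlite3
import numpy as np
from sklearn.cluster import DBSCAN

# Collection side: sqlite3 stands in for MySQL to keep the sketch self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (temp REAL, humidity REAL)")

rng = np.random.default_rng(0)
normal = rng.normal([22.0, 45.0], [0.5, 2.0], size=(100, 2))   # typical room readings
anomaly = rng.normal([35.0, 80.0], [0.5, 2.0], size=(5, 2))    # overheating events
conn.executemany("INSERT INTO readings VALUES (?, ?)",
                 np.vstack([normal, anomaly]).tolist())

# Analysis side: pull the rows back and cluster; DBSCAN marks sparse
# outliers with label -1 without fixing the number of clusters in advance.
data = np.array(conn.execute("SELECT temp, humidity FROM readings").fetchall())
labels = DBSCAN(eps=3.0, min_samples=5).fit_predict(data)
```

DBSCAN suits sensor streams because it needs no preset cluster count, unlike the k-means and k-medoids alternatives also listed in the keywords.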
Procedia PDF Downloads 37910515 Development of a Decision-Making Method by Using Machine Learning Algorithms in the Early Stage of School Building Design
Authors: Pegah Eshraghi, Zahra Sadat Zomorodian, Mohammad Tahsildoost
Abstract:
Over the past decade, energy consumption in educational buildings has steadily increased. The purpose of this research is to provide a method to quickly predict the energy consumption of buildings at the early design stage by evaluating zones separately and decomposing the building to eliminate the complexity of its geometry. To produce this framework, machine learning algorithms such as support vector regression (SVR) and artificial neural networks (ANN) are used to predict energy consumption and thermal comfort metrics in a school as a case study. The database consists of more than 55,000 samples across three climates of Iran. Cross-validation and unseen data have been used for validation. For one specific label, cooling energy, prediction accuracy is at least 84% with SVR and 89% with ANN. Overall, the results show that SVR performed much better than ANN.Keywords: early stage of design, energy, thermal comfort, validation, machine learning
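The SVR prediction step can be sketched with scikit-learn on synthetic zone features; the feature set (window-to-wall ratio, envelope U-value, occupant density) and the target function are assumptions for illustration, not the paper’s 55,000-sample database:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n = 500
# Assumed zone-level design features: window-to-wall ratio, envelope U-value,
# occupant density (illustrative inputs, not the paper's database).
X = rng.uniform([0.1, 0.2, 0.05], [0.9, 3.0, 0.5], size=(n, 3))
# Synthetic cooling-energy target with a nonlinear interaction and noise.
y = 50 * X[:, 0] + 20 * X[:, 1] + 80 * X[:, 0] * X[:, 2] + rng.normal(0, 1.0, n)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=0.5))
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
```

Scoring with cross-validation, as here, mirrors the abstract’s validation protocol of cross-validation plus unseen data.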
Procedia PDF Downloads 10010514 Cooperative Spectrum Sensing Using Hybrid IWO/PSO Algorithm in Cognitive Radio Networks
Authors: Deepa Das, Susmita Das
Abstract:
Cognitive radio (CR) is an emerging technology to combat spectrum scarcity. This is achieved by continuously sensing the spectrum and detecting under-utilized frequency bands without causing undue interference to the primary user (PU). In soft decision fusion (SDF) based cooperative spectrum sensing, various evolutionary algorithms have been discussed that optimize the weight coefficient vector to maximize detection performance. In this paper, we propose the hybrid invasive weed optimization and particle swarm optimization (IWO/PSO) algorithm as a fast, global optimization method that improves the detection probability with a shorter sensing time. The efficiency of this algorithm is then compared with standard invasive weed optimization (IWO), particle swarm optimization (PSO), the genetic algorithm (GA) and other conventional SDF-based methods on the basis of convergence and detection probability.Keywords: cognitive radio, spectrum sensing, soft decision fusion, GA, PSO, IWO, hybrid IWO/PSO
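The PSO half of the hybrid can be sketched as follows; the deflection-coefficient-style objective, the per-radio gains, and the noise covariance are illustrative stand-ins, not the paper’s detection-probability formulation, and the IWO component is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sensing setup: per-radio signal gains s and noise covariance
# Sigma at the fusion center (toy numbers, not the paper's model).
s = np.array([1.0, 0.5, 2.0, 1.5])
Sigma = np.diag([1.0, 2.0, 0.5, 1.0])

def deflection(w):
    """Deflection-coefficient-style objective: larger means better detection."""
    return (w @ s) ** 2 / (w @ Sigma @ w)

# Plain PSO over the weight coefficient vector (the paper hybridizes this with IWO).
n_particles, dim, iters = 30, 4, 200
pos = rng.uniform(-1.0, 1.0, (n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()
pbest_val = np.array([deflection(p) for p in pos])
gbest = pbest[pbest_val.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([deflection(p) for p in pos])
    better = vals > pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmax()].copy()

# For this toy objective, the optimum s^T Sigma^-1 s = 11.375 is attained
# at any w proportional to Sigma^-1 s, so the swarm should approach it.
```

In the hybrid scheme, IWO’s seed-dispersal step would be interleaved with these velocity updates to improve global exploration; the swarm above shows only the weight-vector optimization itself.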
Procedia PDF Downloads 46910513 Ensemble-Based SVM Classification Approach for miRNA Prediction
Authors: Sondos M. Hammad, Sherin M. ElGokhy, Mahmoud M. Fahmy, Elsayed A. Sallam
Abstract:
In this paper, an ensemble-based support vector machine (SVM) classification approach is proposed for miRNA prediction. Three problems commonly associated with previous approaches are alleviated. These problems arise from imposing assumptions on the secondary structure of pre-miRNA, from the imbalance between the numbers of laboratory-verified miRNAs and pseudo-hairpins, and from using a training data set that does not cover the variety of samples across species. We aggregate the predicted outputs of three well-known SVM classifiers, namely Triplet-SVM, Virgo and Mirident, weighted by their variant features and without any structural assumptions. An additional SVM layer is used to aggregate the final output. The proposed approach is trained and then tested with balanced data sets, and its results outperform the three base classifiers. Improved metric values are achieved: 88.88% F-score, 92.73% accuracy, 90.64% precision, 96.64% specificity, 87.2% sensitivity, and an area under the ROC curve of 0.91.Keywords: miRNAs, SVM classification, ensemble algorithm, assumption problem, imbalanced data
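The aggregation scheme resembles stacking; below is a scikit-learn sketch in which three differently configured SVMs stand in for Triplet-SVM, Virgo and Mirident, with an additional SVM layer aggregating their outputs. The features are synthetic, not real miRNA data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Balanced synthetic stand-in for miRNA vs. pseudo-hairpin feature vectors.
X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           weights=[0.5, 0.5], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Three differently configured SVMs play the roles of the base classifiers;
# an additional SVM layer aggregates their cross-validated outputs.
base = [("rbf", SVC(kernel="rbf", probability=True, random_state=0)),
        ("lin", SVC(kernel="linear", probability=True, random_state=0)),
        ("poly", SVC(kernel="poly", degree=3, probability=True, random_state=0))]
ensemble = StackingClassifier(estimators=base, final_estimator=SVC(), cv=5)
ensemble.fit(X_tr, y_tr)
acc = ensemble.score(X_te, y_te)
```

Training on a balanced synthetic set mirrors the abstract’s handling of the miRNA/pseudo-hairpin imbalance problem.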
Procedia PDF Downloads 34910512 Several Aspects of the Conceptual Framework of Financial Reporting
Authors: Nadezhda Kvatashidze
Abstract:
The conceptual framework of the International Financial Reporting Standards determines the basic principles of accounting. These principles have multiple applications, professional judgment being one of them. Recognition and assessment of the information contained in financial reporting, especially of somewhat uncertain events and transactions and/or those for which there is no standard or interpretation, are based on professional judgment. Professional judgment aims at formulating expert assumptions regarding the specifics of the circumstances and events to be entered into the report, based on the terms and principles of the conceptual framework. Experts have to choose among these assumptions and simulate situations by applying multi-variant accounting estimates and judgment. In making the choice, one should consider all the factors that may help represent the information in the best way possible. Professional judgment determines the relevance and faithful representation of the presented information, which makes it more useful for existing and potential investors. In order to assess prospective net cash flows, the information must be predictable and reliable. The publication contains a critical analysis of the aforementioned problems. The fact that the International Financial Reporting Standards are developed continuously makes the issue all the more important, and that is another point discussed in the study.Keywords: conceptual framework, faithful representation, professional judgement, relevance
Procedia PDF Downloads 215