Search results for: incidental information processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13341

12261 Artificial Intelligence in Enterprise Information Systems: A Review

Authors: Danah S. Alabdulmohsin

Abstract:

Due to the fast growth of organizational data and the emergence of new technologies such as artificial intelligence (AI), organizations tend to utilize these technologies in their enterprise information systems (EIS), either to overcome the issues they struggle with or to enhance their functions. The aim of this paper is to review the potential role of AI technologies in EIS, namely: enterprise resource planning (ERP) systems, customer relationship management (CRM) systems, supply chain management (SCM) systems, knowledge management (KM) systems, and human resource management (HRM) systems. The paper provides definitions of these systems as well as of the AI technologies that have been used in EIS. In addition, the paper discusses the challenges that organizations might face while integrating AI with their information systems and explains why some organizations fail to achieve successful implementations of the integration.

Keywords: artificial intelligence, AI, enterprise information system, EIS, integration

Procedia PDF Downloads 97
12260 From News Breakers to News Followers: The Influence of Facebook on the Coverage of the January 2010 Crisis in Jos

Authors: T. Obateru, Samuel Olaniran

Abstract:

In an era when the new media afford easy access to the packaging and dissemination of information, social media have become a popular avenue for sharing information, for good or ill. It is evident that the traditional role of journalists as ‘news breakers’ is fast being eroded. People now share information on happenings via social media like Facebook and Twitter, such that journalists themselves now get leads on happenings from such sources. Beyond the access to information provided by the new media is the erosion of the gatekeeping role of journalists, who, by their training and calling, are supposed to handle information with responsibility. Thus, sensitive information that journalists would normally filter is randomly shared by social media activists. This was the experience of journalists in Jos, Plateau State, in January 2010, when another of the recurring ethnoreligious crises that engulf the state resulted in widespread killing, vandalism, looting, and displacement. The episode is considered one of the high points of crises in the state, and the journalists who had the duty of covering it also relied on some of these sources to get their bearing on the violence. This paper examined the role of Facebook in the work of journalists who covered the 2010 crisis. Taking the gatekeeping perspective, it interrogated the extent to which Facebook impacted their professional duty, positively or negatively, vis-à-vis the peace journalism model. It employed a survey, using a questionnaire as the instrument, to elicit information from 50 journalists who covered the crisis. The paper revealed that the dissemination of hate information via mobile phones and social media, especially Facebook, aggravated the crisis. Journalists became news followers rather than news breakers because many of them were put on their toes by information (much of it inaccurate or false) circulated on Facebook. It recommended that journalists remain true to their calling by upholding their ‘gatekeeping’ role of disseminating only accurate and responsible information if they are to remain the main source of credible information on which their audience relies.

Keywords: crisis, ethnoreligious, Facebook, journalists

Procedia PDF Downloads 294
12259 Studying the Effects of Conditional Conservatism and Lack of Information Asymmetry on the Cost of Capital of the Accepted Companies in Tehran Stock Exchange

Authors: Fayaz Moosavi, Saeid Moradyfard

Abstract:

One of the methods of avoiding management fraud and increasing the quality of financial information is attention to the qualitative features of financial information, including conservatism. A conservative approach, while boosting the quality of financial information and thereby reducing informational risk and the cost of equity capital of a firm, may also, by presenting an unduly unfavorable image of the firm's situation, raise the perceived risk of failure to return principal and interest, and consequently increase the firm's cost of capital. In order to determine whether conservatism ultimately increases or decreases the cost of capital, or has no influence on it, data on companies accepted in the Tehran Stock Exchange were analyzed using the pooling method over the period from 2007 to 2012, covering 124 companies. The results of the study revealed a negative and significant relationship between conditional conservatism and the cost of capital of the company. In other words, if bad and unfavorable news and signs are reflected in accounting profit sooner than good news, the cost of capital of the company increases. In addition, there is a positive and significant relationship between the cost of capital and the lack of information asymmetry.

Keywords: conditional conservatism, lack of information asymmetry, the cost of capital, stock exchange

Procedia PDF Downloads 265
12258 Dynamic Foot Pressure Measurement System Using Optical Sensors

Authors: Tanapon Keatsamarn, Chuchart Pintavirooj

Abstract:

Foot pressure measurement provides necessary information for diagnosing diseases, designing foot insoles, preventing disorders, and other applications. In this paper, a dynamic foot pressure measurement system is presented for pressure measurement with high resolution and accuracy. The system consists of a hardware and a software subsystem. The hardware uses a transparent acrylic plate mounted on a steel base. Glossy white paper is placed on top of the transparent acrylic plate, and the system is covered with black acrylic to block external light. Light from an LED strip enters around the edges of the transparent acrylic plate. The optical sensors, digital cameras, are placed underneath the acrylic plate facing upwards. They are connected to the software subsystem, which processes the images and records the foot pressure video to an AVI file. The software is implemented in Visual Studio 2017 using the OpenCV library.
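The brightness-to-pressure conversion at the heart of such a frustrated-illumination system can be sketched as follows. This is an illustrative assumption, not the authors' implementation: stronger foot contact scatters more of the LED light toward the camera, so brighter pixels stand for higher pressure, and a linear ramp stands in for a real per-system calibration curve.

```python
# Hedged sketch: map 8-bit pixel brightness to a relative pressure scale.
# The thresholds i_min/i_max and the linear ramp are assumptions standing in
# for a real calibration of the optical setup described in the abstract.

def intensity_to_pressure(frame, i_min=10, i_max=250, p_max=100.0):
    """frame: 2D list of grayscale values (0-255).

    Pixels at or below i_min are treated as background (zero pressure);
    intensities in (i_min, i_max] are scaled linearly to 0..p_max.
    """
    result = []
    for row in frame:
        out = []
        for v in row:
            if v <= i_min:
                out.append(0.0)          # background / no contact
            else:
                v = min(v, i_max)        # clip saturated pixels
                out.append(p_max * (v - i_min) / (i_max - i_min))
        result.append(out)
    return result
```

In a real pipeline this mapping would run per frame on images grabbed by the cameras (e.g. via OpenCV), with the calibration curve fitted against known loads.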

Keywords: foot, foot pressure, image processing, optical sensors

Procedia PDF Downloads 248
12257 In Situ Volume Imaging of Cleared Mice Seminiferous Tubules Opens New Window to Study Spermatogenic Process in 3D

Authors: Lukas Ded

Abstract:

Studying tissue structure and histogenesis in their natural, 3D context is a challenging but highly beneficial process. Contrary to the classical approach of physical tissue sectioning and subsequent imaging, it enables studying the relationships of individual cellular and histological structures in their native context. Recent developments in tissue clearing approaches and microscopic volume imaging/data processing enable the application of these methods also in the areas of developmental and reproductive biology. Here, using the CLARITY tissue clearing procedure and 3D confocal volume imaging, we optimized a protocol for clearing, staining, and imaging mouse seminiferous tubules isolated from the testes without a cardiac perfusion procedure. Our approach enables high-magnification, fine-resolution axial imaging of the whole diameter of the seminiferous tubules, with potentially unlimited imaging along their lateral length. Hence, large continuous pieces of a seminiferous tubule can be scanned and digitally reconstructed for the study of single-tubule seminiferous stages using nuclear dyes. Furthermore, antibodies and various molecular dyes can be applied for molecular labeling of individual cellular and subcellular structures, and the resulting 3D images can greatly increase our understanding of the spatiotemporal aspects of seminiferous tubule development and sperm ultrastructure formation. Finally, our newly developed algorithms for 3D data processing enable massively parallel processing of large numbers of individual cell and tissue fluorescent signatures and the building of robust spermatogenic models under physiological and pathological conditions.

Keywords: CLARITY, spermatogenesis, testis, tissue clearing, volume imaging

Procedia PDF Downloads 136
12256 Mobile Microscope for the Detection of Pathogenic Cells Using Image Processing

Authors: P. S. Surya Meghana, K. Lingeshwaran, C. Kannan, V. Raghavendran, C. Priya

Abstract:

One of the most basic and powerful tools in all of science and medicine is the light microscope, the fundamental device for laboratory as well as research purposes. With improving technology, portable, economical, and user-friendly instruments are in high demand, and the conventional microscope fails to live up to this emerging trend. Moreover, adequate access to healthcare is not widely available, especially in developing countries, and the most basic step towards curing a malady is the diagnosis of the disease itself. The main aim of this paper is to diagnose malaria with one of the most common devices, the cell phone, which has proved to be an immediate solution for many modern-day needs as the development of wireless infrastructure allows computing and communicating on the move. This has opened up the opportunity to develop novel imaging, sensing, and diagnostic platforms using mobile phones as an underlying platform, addressing the global demand for accurate, sensitive, cost-effective, and field-portable measurement devices for use in remote and resource-limited settings around the world.

Keywords: cellular, hand-held, health care, image processing, malarial parasites, microscope

Procedia PDF Downloads 267
12255 Edge Detection Using Multi-Agent System: Evaluation on Synthetic and Medical MR Images

Authors: A. Nachour, L. Ouzizi, Y. Aoura

Abstract:

Recent developments in multi-agent systems have opened a new research field in image processing. Several algorithms are used simultaneously and improved in different applications, while new methods are investigated. This paper presents a new automatic method for edge detection using several agents with many different actions. The proposed multi-agent system is based on parallel agents that locally perceive their environment, that is to say, pixels and additional environmental information. This environment is built using Vector Field Convolution, which attracts free agents to the edges. Problems of partial or hidden edges and of edge linking are solved through cooperation between agents. The presented method was implemented and evaluated on several synthetic and medical images. The obtained experimental results confirm the efficiency and accuracy of the detected edges.
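The basic attraction mechanism described above can be sketched for a single agent. This is a hedged simplification: Vector Field Convolution builds a convolved force field, whereas here a raw image gradient stands in for it, and the neighbourhood walk is an assumed, minimal agent action.

```python
# Illustrative sketch (not the paper's algorithm): one agent on a small
# grayscale image, stepping greedily toward the 8-neighbour with the largest
# local gradient magnitude -- a stand-in for the VFC force that attracts
# free agents to edges.

def gradient_magnitude(img, x, y):
    """L1 magnitude of central differences at interior pixel (x, y)."""
    gx = img[y][x + 1] - img[y][x - 1]
    gy = img[y + 1][x] - img[y - 1][x]
    return abs(gx) + abs(gy)

def agent_step(img, x, y):
    """Move the agent to the interior 8-neighbour with the strongest
    gradient; stay put if no neighbour improves on the current pixel."""
    best, best_g = (x, y), gradient_magnitude(img, x, y)
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            nx, ny = x + dx, y + dy
            if 1 <= nx < len(img[0]) - 1 and 1 <= ny < len(img) - 1:
                g = gradient_magnitude(img, nx, ny)
                if g > best_g:
                    best, best_g = (nx, ny), g
    return best
```

Iterating this step until the agent stops yields an edge pixel; the cooperative behaviour for hidden edges and edge linking would sit on top of this primitive.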

Keywords: edge detection, medical MR images, multi-agent systems, vector field convolution

Procedia PDF Downloads 391
12254 Enhancing Cybersecurity Protective Behaviour: Role of Information Security Competencies and Procedural Information Security Countermeasure Awareness

Authors: Norshima Humaidi, Saif Hussein Abdallah Alghazo

Abstract:

Cybersecurity threats have become a serious issue recently, and one of the causes is human error, usually constituted by carelessness, ignorance, and failure to practice cybersecurity behaviour adequately. Using data from a quantitative survey, Partial Least Squares-Structural Equation Modelling (PLS-SEM) analysis was used to determine the factors that affect cybersecurity protective behaviour (CPB). This study adapts a cybersecurity protective behaviour model by focusing on two constructs that can enhance CPB: managers' information security competencies (MISI) and procedural information security countermeasure (PCM) awareness. The theory of leadership competencies was adapted to measure users' perception of competencies among security managers/leaders in the organization. Confirmatory factor analysis (CFA) testing shows that all the measurement items of each construct were individually adequate in their validity based on their factor loading values. Moreover, each construct is valid based on its parameter estimates and statistical significance. The quantitative research findings show that PCM awareness influences CPB more strongly than MISI, while MISI significantly influences PCM awareness. This study believes that the research findings can contribute to human behaviour in IS studies and are particularly beneficial to policy makers in improving organizations' strategic plans for information security, especially in this new era. Most organizations spend time and resources to provide and establish strategic plans for information security; however, if employees are not willing to comply and practice information security behaviour appropriately, then these efforts are in vain.
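The CFA adequacy check mentioned above amounts to screening factor loadings against a cutoff. The sketch below is an assumption, not the study's data or criterion: the 0.7 threshold is a commonly used rule of thumb, and the item names are hypothetical.

```python
# Hedged sketch: flag measurement items whose factor loadings fall below a
# commonly used adequacy cutoff (0.7 here -- an assumed convention, not the
# study's criterion).  Item names and loadings are illustrative.

def inadequate_items(loadings, threshold=0.7):
    """loadings: {item_name: factor_loading}.

    Return (sorted) the items whose loading is below the threshold,
    i.e. candidates for removal or re-specification before PLS-SEM.
    """
    return sorted(item for item, val in loadings.items() if val < threshold)
```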

Keywords: cybersecurity, protection behaviour, information security, information security competencies, countermeasure awareness

Procedia PDF Downloads 95
12253 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory

Authors: Xiaochen Mu

Abstract:

Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. 
However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders with the prosperity and long-term development of society's public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This aligns closely with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, the granting of data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the "bundle of rights" theory, it establishes a specific three-level scheme of data rights. The paper analyzes the cases Google v. Vidal-Hall, Halliday v. Creation Consumer Finance, Douglas v. Hello! Ltd, Campbell v. MGN, and Imerman v. Tchenguiz, and concludes that recognizing property rights over personal data and protecting data under the framework of intellectual property will be beneficial for establishing the tort of misuse of personal information.

Keywords: data protection, property rights, intellectual property, big data

Procedia PDF Downloads 39
12252 Implementation of Statistical Parameters to Form Entropic Mathematical Models

Authors: Gurcharan Singh Buttar

Abstract:

Although the two areas, statistics and information theory, are independent in nature, it has been discovered that they can be combined to create applications in multidisciplinary mathematics. In the field of statistics, statistical parameters (measures) play an essential role with reference to the population (distribution) under investigation, while information measures are crucial in the study of the ambiguity, assortment, and unpredictability present in an array of phenomena. The following communication is a link between the two: it is demonstrated that the well-known conventional statistical measures can be used as measures of information.
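The parallel drawn above can be made concrete by computing a classical statistical parameter and an information measure side by side for the same discrete distribution. This is an illustrative example of the general idea, not a formula from the paper.

```python
# Illustrative sketch: variance (a statistical parameter of a distribution)
# alongside Shannon entropy (an information measure of the same
# distribution).  Both are functionals of the probability vector, which is
# the sense in which statistical measures can serve as measures of
# information.
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def variance(values, probs):
    """Variance of a discrete random variable taking `values` with `probs`."""
    mean = sum(v * p for v, p in zip(values, probs))
    return sum(p * (v - mean) ** 2 for v, p in zip(values, probs))
```

For instance, both quantities are maximized, among two-point distributions on fixed values, by the uniform distribution, reflecting maximal unpredictability.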

Keywords: probability distribution, entropy, concavity, symmetry, variance, central tendency

Procedia PDF Downloads 156
12251 Investigating the Effect of Orthographic Transparency on Phonological Awareness in Bilingual Children with Dyslexia

Authors: Sruthi Raveendran

Abstract:

Developmental dyslexia, characterized by reading difficulties despite normal intelligence, presents a significant challenge for bilingual children navigating languages with varying degrees of orthographic transparency. This study bridges a critical gap in dyslexia interventions for bilingual populations in India by examining how consistency and predictability of letter-sound relationships in a writing system (orthographic transparency) influence the ability to understand and manipulate the building blocks of sound in language (phonological processing). The study employed a computerized visual rhyme-judgment task with concurrent EEG (electroencephalogram) recording. The task compared reaction times, accuracy of performance, and event-related potential (ERP) components (N170, N400, and LPC) for rhyming and non-rhyming stimuli in two orthographies: English (opaque orthography) and Kannada (transparent orthography). As hypothesized, the results revealed advantages in phonological processing tasks for transparent orthography (Kannada). Children with dyslexia were faster and more accurate when judging rhymes in Kannada compared to English. This suggests that a language with consistent letter-sound relationships (transparent orthography) facilitates processing, especially for tasks that involve manipulating sounds within words (rhyming). Furthermore, brain activity measured by event-related potentials (ERP) showed less effort required for processing words in Kannada, as reflected by smaller N170, N400, and LPC amplitudes. These findings highlight the crucial role of orthographic transparency in optimizing reading performance for bilingual children with dyslexia. These findings emphasize the need for language-specific intervention strategies that consider the unique linguistic characteristics of each language. 
While acknowledging the complexity of factors influencing dyslexia, this research contributes valuable insights into the impact of orthographic transparency on phonological awareness in bilingual children. This knowledge paves the way for developing tailored interventions that promote linguistic inclusivity and optimize literacy outcomes for children with dyslexia.

Keywords: developmental dyslexia, phonological awareness, rhyme judgment, orthographic transparency, Kannada, English, N170, N400, LPC

Procedia PDF Downloads 9
12250 Globalization as an Instrument for Multi-National Corporations in Transforming Asians' Perspective towards Clean Water Consumption

Authors: Atanta Gian

Abstract:

It is inevitable that globalization has transformed the world today. The influence of globalization has emerged in almost every aspect of life, especially in shaping the perception of people, as can be seen in how easily people are affected by the information surrounding them. Due to globalization, the flow of information has become more rapid along with the development of technology. People tend to believe information that they obtain by themselves; if most people believe a piece of information is true, it comes to be treated as factual and relevant. Therefore, if people gain information on what is best for them in terms of daily consumption, this information can transform their perspective and become a consideration in selecting their daily needs. Looking at this trend, the author sees that globalization can be used by multi-national corporations (MNCs) to enhance the promotion of their products by shaping the world's perception of what is best for it. MNCs with better technology for external promotion can utilize this opportunity to shape people's perspectives as they wish. In this paper, the author elaborates how globalization is applied by MNCs to shape people's perspectives regarding what is best for them. A case study is used to analyze how MNCs have transformed the perspective of Asian people regarding the necessity of better-quality drinking water; in this case, MNCs have shaped the perspective of Asian people in choosing their product by promoting bottled water as the best choice for them. The paper concludes that MNCs are able to shape the world's perspective regarding the need for their products, supported by the globalization that is happening now.

Keywords: consumption, globalisation, influence, information technology, multi-national corporations

Procedia PDF Downloads 209
12249 Audio-Visual Co-Data Processing Pipeline

Authors: Rita Chattopadhyay, Vivek Anand Thoutam

Abstract:

Speech is the most acceptable means of communication, allowing us to quickly exchange feelings and thoughts. Quite often, people can communicate orally but cannot interact or work with computers or devices. It is easier and quicker to give speech commands than to type commands to computers, and likewise easier to listen to audio played from a device than to read output from it. Especially with robotics being an emerging market, with applications in warehouses, the hospitality industry, consumer electronics, assistive technology, etc., speech-based human-machine interaction is emerging as a lucrative feature for robot manufacturers. Considering this factor, the objective of this paper is to design an "Audio-Visual Co-Data Processing Pipeline." This pipeline integrates automatic speech recognition, a natural language model for text understanding, object detection, and text-to-speech modules. There are many deep learning models for each of the modules mentioned above, but OpenVINO Model Zoo models are used because the OpenVINO toolkit covers both computer vision and non-computer vision workloads across Intel hardware, maximizes performance, and accelerates application development. A speech command is given as input containing the target objects to be detected and the start and end times of the interval to extract from the video. Speech is converted to text using the QuartzNet automatic speech recognition model. A summary is extracted from the text using the natural language model Generative Pre-Trained Transformer-3 (GPT-3). Based on the summary, the relevant frames are extracted from the video, and the You Only Look Once (YOLO) object detection model detects objects in these frames. Frame numbers that contain target objects (the objects specified in the speech command) are saved as text. Finally, this text (the frame numbers) is converted to speech using a text-to-speech model and played from the device. The pipeline is developed for the 80 YOLO labels, and the user can extract frames based on one or two target labels; it can easily be extended to more than two target labels by making appropriate changes in the object detection module. The pipeline supports four different speech command formats by including sample examples in the prompt used by the GPT-3 model; based on user preference, a new speech command format can be supported by including examples of that format in the prompt. This pipeline can be used in many projects, such as human-machine interfaces, human-robot interaction, and surveillance through speech commands. Any object detection project can be upgraded with this pipeline so that speech commands can be given and the output played from the device.
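The stage ordering of such a pipeline can be sketched as plain orchestration code. All function and key names below are hypothetical placeholders, not OpenVINO or GPT-3 APIs: each stage is passed in as a callable so that real models (QuartzNet for ASR, GPT-3 for summarization, YOLO for detection, a TTS model) could be plugged in.

```python
# Hedged sketch of the pipeline's control flow (stage names are assumptions,
# not library calls): ASR -> summary (interval + target labels) -> object
# detection on the selected frames -> TTS of the matching frame numbers.

def run_pipeline(audio_cmd, video_frames, transcribe, summarize,
                 detect_objects, synthesize_speech):
    """Each stage is an injected callable so real models can be swapped in.

    summarize() is assumed to return a dict with the extraction interval
    and the target labels parsed from the command.
    """
    text = transcribe(audio_cmd)                      # speech -> text
    spec = summarize(text)                            # text -> structured spec
    start, end, targets = spec["start"], spec["end"], spec["labels"]
    hits = []
    for idx, frame in enumerate(video_frames):
        if start <= idx <= end:                       # only the requested interval
            found = detect_objects(frame)             # labels present in frame
            if any(lbl in targets for lbl in found):
                hits.append(idx)
    # frame numbers -> speech (here just text, in the real system audio)
    return synthesize_speech(", ".join(map(str, hits)))
```

With stub callables, commanding "dog" over frames 1-3 of a toy label sequence would return the matching frame numbers as the string handed to TTS.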

Keywords: OpenVINO, automatic speech recognition, natural language processing, object detection, text to speech

Procedia PDF Downloads 80
12248 Quantifying Meaning in Biological Systems

Authors: Richard L. Summers

Abstract:

The advanced computational analysis of biological systems is becoming increasingly dependent upon an understanding of the information-theoretic structure of the materials, energy, and interactive processes that comprise those systems. The stability and survival of these living systems are fundamentally contingent upon their ability to acquire and process the meaning of information concerning the physical state of their biological continuum (biocontinuum). The drive for adaptive system reconciliation of a divergence from steady state within this biocontinuum can be described by an information metric-based formulation of the process of actionable knowledge acquisition that incorporates the axiomatic inference of Kullback-Leibler information minimization driven by survival replicator dynamics. If the mathematical expression of this process is the Lagrangian integrand for any change within the biocontinuum, then it can also be considered an action functional for the living system. In the direct method of Lyapunov, such a summarizing mathematical formulation of global system behavior, based on the driving forces of energy currents and constraints within the system, can serve as a platform for the analysis of stability. As the system evolves in time in response to biocontinuum perturbations, the summarizing function conveys information about its overall stability. This stability information portends survival and therefore has absolute existential meaning for the living system. The first derivative of the Lyapunov energy information function will have a negative trajectory toward the system's steady state if the driving force is dissipating; by contrast, system instability leading to system dissolution will have a positive trajectory. The direction and magnitude of the trajectory vector then serve as a quantifiable signature of the meaning associated with the living system's stability information, homeostasis, and survival potential.
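A toy version of the sign-of-the-derivative criterion described above can be sketched numerically. This is an illustrative assumption, not the paper's model: the Kullback-Leibler divergence of the system state from a steady-state distribution plays the role of the Lyapunov function, and the sign of its first difference along a sampled trajectory serves as the stability signature.

```python
# Hedged sketch: KL divergence from steady state as a Lyapunov-style
# function.  A run of negative first differences signals convergence
# (stability); positive differences signal divergence.  The distributions
# below are illustrative stand-ins for biocontinuum states.
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def stability_signature(states, steady):
    """Sign of the first difference of the KL trajectory:
    -1 = moving toward steady state, +1 = moving away."""
    traj = [kl_divergence(s, steady) for s in states]
    diffs = [b - a for a, b in zip(traj, traj[1:])]
    return [-1 if d < 0 else 1 for d in diffs]
```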

Keywords: meaning, information, Lyapunov, living systems

Procedia PDF Downloads 131
12247 A Query Optimization Strategy for Autonomous Distributed Database Systems

Authors: Dina K. Badawy, Dina M. Ibrahim, Alsayed A. Sallam

Abstract:

A distributed database is a collection of logically related databases that cooperate in a transparent manner. Query processing, which uses a communication network for transmitting data between sites, is one of the challenges in the database world. The development of sophisticated query optimization technology is a reason for the commercial success of database systems, and its complexity and cost increase with the number of relations in the query. Mariposa, query trading, and query trading with processing task-trading are strategies developed for autonomous distributed database systems, but they incur high optimization cost because all nodes are involved in generating an optimal plan. In this paper, we propose a modification of the autonomous strategy K-QTPT that gradually gives the seller nodes with the lowest cost higher priorities, in order to reduce the optimization time. We implemented our proposed strategy and present the results and analysis based on them.
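The prioritization idea described above can be sketched in a few lines. This is an illustrative assumption, not the K-QTPT implementation: seller nodes are ranked by their estimated bid cost so that only the cheapest few participate in plan generation, which is what shrinks the optimization time.

```python
# Hedged sketch: rank seller nodes by estimated cost so that the
# lowest-cost bids receive the highest priority, and only the top_k
# cheapest nodes are consulted when building a plan.  Node names, costs,
# and top_k are illustrative.

def prioritize_sellers(bids, top_k=2):
    """bids: {node_name: estimated_cost}.

    Return the top_k cheapest nodes, highest priority first; the remaining
    nodes are excluded from plan generation for this query.
    """
    ranked = sorted(bids.items(), key=lambda kv: kv[1])
    return [node for node, _cost in ranked[:top_k]]
```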

Keywords: autonomous strategies, distributed database systems, high priority, query optimization

Procedia PDF Downloads 524
12246 Automated Fact-Checking by Incorporating Contextual Knowledge and Multi-Faceted Search

Authors: Wenbo Wang, Yi-Fang Brook Wu

Abstract:

The spread of misinformation and disinformation has become a major concern, particularly with the rise of social media as a primary source of information for many people. As a means to address this phenomenon, automated fact-checking has emerged as a safeguard against the spread of misinformation and disinformation. Existing fact-checking approaches aim to determine whether a news claim is true or false, and they have achieved decent veracity prediction accuracy. However, the state-of-the-art methods rely on manually verified external information to assist the checking model in making judgments, which requires significant human resources. This study introduces a framework, SAC, which focuses on 1) augmenting the representation of a claim by incorporating additional context from general-purpose, comprehensive, and authoritative data; 2) developing a search function to automatically select relevant, new, and credible references; and 3) focusing on the parts of the representations of a claim and its references that are most relevant to the fact-checking task. The experimental results demonstrate that 1) augmenting the representations of claims and references through the use of a knowledge base, combined with the multi-head attention technique, improves fact-checking performance, and 2) SAC with auto-selected references outperforms existing fact-checking approaches that use manually selected references. Future directions of this study include 1) exploring knowledge graphs in Wikidata to dynamically augment the representations of claims and references without introducing too much noise, and 2) exploring semantic relations in claims and references to further enhance fact-checking.
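The automatic reference-selection step (point 2 above) can be sketched with the simplest possible relevance score. This is a hedged assumption about the general shape of such a search function, not the SAC implementation, which would use learned representations rather than bag-of-words vectors.

```python
# Illustrative sketch: rank candidate references by cosine similarity of
# bag-of-words vectors against the claim, and keep the top_k most relevant.
# A real system would use embeddings plus recency/credibility filters.
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two texts under a bag-of-words model."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def select_references(claim, references, top_k=1):
    """Return the top_k references most similar to the claim."""
    ranked = sorted(references, key=lambda r: cosine(claim, r), reverse=True)
    return ranked[:top_k]
```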

Keywords: fact checking, claim verification, deep learning, natural language processing

Procedia PDF Downloads 62
12245 Transforming Healthcare with Immersive Visualization: An Analysis of Virtual and Holographic Health Information Platforms

Authors: Hossein Miri, Zhou YongQi, Chan Bormei-Suy

Abstract:

The development of advanced technologies and innovative solutions has opened up exciting new possibilities for revolutionizing healthcare systems. One such emerging concept is the use of virtual and holographic health information platforms that aim to provide interactive and personalized medical information to users. This paper provides a review of notable virtual and holographic health information platforms. It begins by highlighting the need for information visualization and 3D representation in healthcare. It then proceeds to provide background knowledge on information visualization and historical developments in 3D visualization technology. Additional domain knowledge concerning holography, holographic computing, and mixed reality is then introduced, followed by some of their common applications and use cases. After setting the scene and defining the context, the need for and importance of virtual and holographic visualization in medicine are discussed. Subsequently, some of the current research areas and applications of digital holography and holographic technology are explored, alongside the importance and role of virtual and holographic visualization in genetics and genomics. An analysis of the key principles and concepts underlying virtual and holographic health information systems is presented, and their potential implications for healthcare are pointed out. The paper concludes by examining the most notable existing mixed-reality applications and systems that help doctors visualize diagnostic and genetic data and assist in patient education and communication. This paper is intended to be a valuable resource for researchers, developers, and healthcare professionals who are interested in the use of virtual and holographic technologies to improve healthcare.

Keywords: virtual, holographic, health information platform, personalized interactive medical information

Procedia PDF Downloads 89
12244 Retaining Users in a Commercially-Supported Social Network

Authors: Sasiphan Nitayaprapha

Abstract:

A commercially-supported social network has become an emerging channel for an organization to communicate with and provide services to customers. The success of a commercially-supported social network depends on the ability of the organization to keep customers participating in the network. Drawing from the theories of information adoption, information systems continuance, and web usability, the author develops a model to explore how a commercially-supported social network can encourage customers to continue participating in the network and using the information in it. The theoretical model will be validated through an online survey of customers using the commercially-supported social networking sites of several high-technology companies operating in the same sector. The results will be compared with previous studies to assess the explanatory power of the research model and to identify the main factors determining users' intention to continue using a commercially-supported social network. Theoretical and practical implications, as well as limitations, are discussed.

Keywords: social network, information adoption, information systems continuance, web usability, user satisfaction

Procedia PDF Downloads 316
12243 Investigation of the Unbiased Characteristic of Doppler Frequency to Different Antenna Array Geometries

Authors: Somayeh Komeylian

Abstract:

Array signal processing techniques have recently been developed for a variety of applications that enhance receiver performance by suppressing the power of jamming and interference signals. In this scenario, biases induced in the antenna array receiver significantly degrade the accurate estimation of the carrier phase. Since the carrier phase is the integral of frequency, we have obtained an unbiased Doppler frequency for high-precision estimation of the carrier phase. The unbiased characteristic of the Doppler frequency with respect to jamming power and other interference signals enables a highly accurate estimation of the carrier phase. In this study, we have rigorously investigated the unbiased characteristic of the Doppler frequency under variations of the antenna array geometry. The simulation results have verified that the Doppler frequency remains unbiased and accurate across variations of the antenna array geometry.
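Because the carrier phase is the time integral of frequency, a Doppler estimate formed from phase differences is insensitive to any constant phase bias: the bias cancels in the subtraction. A minimal numerical sketch of this property (illustrative values only, not the authors' simulation setup):

```python
def doppler_from_phase(phases_cycles, dt):
    """Estimate Doppler frequency (Hz) from successive carrier-phase
    measurements (in cycles) by finite differencing."""
    return [(phases_cycles[i + 1] - phases_cycles[i]) / dt
            for i in range(len(phases_cycles) - 1)]

# A true Doppler of 100 Hz sampled every 1 ms, plus a constant phase
# bias (e.g. induced by the antenna array receiver chain).
f_d, dt, bias = 100.0, 1e-3, 0.37
biased_phase = [f_d * k * dt + bias for k in range(6)]  # cycles

# The constant bias cancels in the difference, so the estimate is unbiased.
est = doppler_from_phase(biased_phase, dt)
print(est)  # each entry ≈ 100.0
```

The same cancellation is why a phase bias that would corrupt a direct carrier-phase measurement leaves the differenced frequency estimate untouched.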

Keywords: array signal processing, unbiased Doppler frequency, GNSS, carrier phase, slowly fluctuating point target

Procedia PDF Downloads 159
12242 Lamb Waves Wireless Communication in Healthy Plates Using Coherent Demodulation

Authors: Rudy Bahouth, Farouk Benmeddour, Emmanuel Moulin, Jamal Assaad

Abstract:

Guided ultrasonic waves are used in Non-Destructive Testing (NDT) and Structural Health Monitoring (SHM) for inspection and damage detection. Recently, wireless data transmission using ultrasonic waves in solid metallic channels has gained popularity in industrial applications such as nuclear, aerospace, and smart vehicles. The idea is to find a good substitute for electromagnetic waves, since they are highly attenuated near metallic components due to Faraday shielding. The proposed solution is to use ultrasonic guided waves such as Lamb waves as an information carrier, owing to their capability of propagating over long distances. In addition, valuable information about the health of the structure can be extracted simultaneously. In this work, the reliable frequency bandwidth for communication is first extracted experimentally from dispersion curves. Then, an experimental platform for wireless communication using Lamb waves is described and built. After this, the coherent demodulation algorithm used in telecommunications is tested for the Amplitude Shift Keying, On-Off Keying, and Binary Phase Shift Keying modulation techniques. Signal processing parameters such as threshold choice, number of cycles per bit, and bit rate are optimized. Experimental results are compared based on the average Bit Error Rate. The results have shown high sensitivity to threshold selection for the Amplitude Shift Keying and On-Off Keying techniques, resulting in a decrease in bit rate. The Binary Phase Shift Keying technique shows the highest stability and data rate among all tested modulation techniques.
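Coherent demodulation of Binary Phase Shift Keying correlates the received signal with a phase-aligned reference carrier and decides each bit on the sign of the correlation. The sketch below illustrates the principle on a clean synthetic channel; the carrier frequency, sample rate, and cycles-per-bit values are illustrative placeholders, not the Lamb-wave platform's actual parameters:

```python
import math

def bpsk_modulate(bits, fc, fs, cycles_per_bit):
    """Map each bit to a carrier burst of phase 0 (bit 1) or pi (bit 0)."""
    n = int(cycles_per_bit * fs / fc)  # samples per bit period
    sig = []
    for b in bits:
        phase = 0.0 if b else math.pi
        sig += [math.cos(2 * math.pi * fc * k / fs + phase) for k in range(n)]
    return sig, n

def bpsk_demodulate(sig, fc, fs, n):
    """Coherent demodulation: multiply by a reference carrier, integrate
    over each bit period, and decide on the sign of the correlation."""
    bits = []
    for start in range(0, len(sig), n):
        corr = sum(sig[start + k] * math.cos(2 * math.pi * fc * k / fs)
                   for k in range(n))
        bits.append(1 if corr > 0 else 0)
    return bits

bits = [1, 0, 1, 1, 0]
sig, n = bpsk_modulate(bits, fc=100e3, fs=1e6, cycles_per_bit=10)
print(bpsk_demodulate(sig, 100e3, 1e6, n))  # [1, 0, 1, 1, 0]
```

Unlike the amplitude-based schemes, the decision here depends only on the sign of the correlation, not on an amplitude threshold, which is consistent with the reported threshold robustness of BPSK.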

Keywords: lamb waves communication, wireless communication, coherent demodulation, bit error rate

Procedia PDF Downloads 260
12241 An Information System for Strategic Performance Scoring in Municipal Management

Authors: Emin Gundogar, Aysegul Yilmaz

Abstract:

Strategic performance scoring is a significant procedure in management, and there are various methods to improve it. This study introduces an information system developed to score performance for municipal management. The application of the system is illustrated with examples of municipal processes.

Keywords: management information system, municipal management, performance scoring

Procedia PDF Downloads 769
12240 Designing a Tool for Software Maintenance

Authors: Amir Ngah, Masita Abdul Jalil, Zailani Abdullah

Abstract:

The aim of software maintenance is to keep the software system in step with advances in software and hardware technology. One of the early tasks in software maintenance is to extract information at a higher level of abstraction. In this paper, we present the process of designing an information extraction tool for software maintenance. The tool can extract basic information from an old program, such as variables, base classes, derived classes, objects of classes, and functions. The tool has two main parts: a lexical analyzer module that reads the input file character by character, and a searching module through which the user can retrieve the basic information from an existing program. We implemented this tool for a patterned sub-C++ language as the input.
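The kind of extraction the searching module performs can be sketched with simple pattern matching over a sub-C++ source. The patterns below are a hypothetical illustration of the idea (the paper's tool uses its own lexical analyzer), covering class names, base classes, and function names:

```python
import re

# Illustrative patterns for a restricted sub-C++ input: class declarations
# with an optional single base class, and free functions with simple
# built-in return types.
CLASS_RE = re.compile(
    r'class\s+(\w+)(?:\s*:\s*(?:public|protected|private)\s+(\w+))?')
FUNC_RE = re.compile(r'\b(?:int|void|float|double|bool)\s+(\w+)\s*\(')

def extract_info(source):
    """Collect class names, base classes, and function names."""
    classes, bases = [], []
    for name, base in CLASS_RE.findall(source):
        classes.append(name)
        if base:
            bases.append(base)
    functions = FUNC_RE.findall(source)
    return {'classes': classes, 'base_classes': bases, 'functions': functions}

code = """
class Shape { };
class Circle : public Shape { };
int area(int r);
void draw();
"""
print(extract_info(code))
```

A real maintenance tool would tokenize character by character as described in the abstract; regex over whole declarations is merely the quickest way to show what "basic information" looks like once extracted.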

Keywords: extraction tool, software maintenance, reverse engineering, C++

Procedia PDF Downloads 492
12239 A Preliminary Study for Building an Arabic Corpus of Pair Questions-Texts from the Web: Aqa-Webcorp

Authors: Wided Bakari, Patrice Bellot, Mahmoud Neji

Abstract:

With the development of electronic media and the heterogeneity of Arabic data on the Web, the idea of building a clean corpus for certain applications of natural language processing, including machine translation, information retrieval, and question answering, becomes more and more pressing. In this manuscript, we seek to create and develop our own corpus of question–text pairs. This corpus will then provide a better basis for our experimentation step. Thus, we model its construction with a method for Arabic that retrieves from the web texts that could prove to be answers to our factual questions. To do this, we developed a Java script that can extract, from a given query, a list of HTML pages. These pages are then cleaned, to the point of obtaining a database of texts and a corpus of question–text pairs. In addition, we give preliminary results of our proposed method. Some investigations into the construction of Arabic corpora are also presented in this document.
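The cleaning step, turning retrieved HTML pages into plain text for the corpus, can be sketched with Python's standard `html.parser` (the paper's pipeline uses a Java script; this is an illustrative stand-in, and the sample page is invented):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Strip tags, scripts, and styles from an HTML page, keeping the text."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self.skip = 0  # depth inside <script>/<style> elements

    def handle_starttag(self, tag, attrs):
        if tag in ('script', 'style'):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ('script', 'style') and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip and data.strip():
            self.parts.append(data.strip())

def html_to_text(html):
    parser = TextExtractor()
    parser.feed(html)
    return ' '.join(parser.parts)

page = ("<html><head><script>var x=1;</script></head>"
        "<body><p>من اخترع الهاتف؟</p></body></html>")
print(html_to_text(page))  # من اخترع الهاتف؟
```

Each cleaned page then becomes a candidate answer text to pair with the factual question that produced the query.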

Keywords: Arabic, web, corpus, search engine, URL, question, corpus building, script, Google, html, txt

Procedia PDF Downloads 323
12238 Preparation on Sentimental Analysis on Social Media Comments with Bidirectional Long Short-Term Memory Gated Recurrent Unit and Model Glove in Portuguese

Authors: Leonardo Alfredo Mendoza, Cristian Munoz, Marco Aurelio Pacheco, Manoela Kohler, Evelyn Batista, Rodrigo Moura

Abstract:

Natural Language Processing (NLP) techniques are increasingly powerful at interpreting the feelings and reactions of a person to a product or service. Sentiment analysis has become a fundamental tool for this interpretation but has few applications in languages other than English. This paper presents a sentiment analysis classification in Portuguese based on comments from social networks in Portuguese. A word-embedding representation was used with a 50-dimension GloVe pre-trained model generated from a corpus entirely in Portuguese. To generate this classification, bidirectional Long Short-Term Memory (BiLSTM) and bidirectional Gated Recurrent Unit (GRU) models are used, reaching results of 99.1%.
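The gated recurrent unit at the core of the BiGRU model can be summarized by its update equations. Below is a minimal single-unit (scalar) GRU step in plain Python, with made-up weights rather than trained ones, purely to show the gating mechanics:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h, w):
    """One scalar GRU step: update gate z, reset gate r, candidate state."""
    z = sigmoid(w['wz'] * x + w['uz'] * h + w['bz'])        # update gate
    r = sigmoid(w['wr'] * x + w['ur'] * h + w['br'])        # reset gate
    h_tilde = math.tanh(w['wh'] * x + w['uh'] * (r * h) + w['bh'])
    return (1.0 - z) * h + z * h_tilde                      # interpolation

# Hypothetical weights; in the paper these would be learned from the
# Portuguese corpus, and x would be a 50-d GloVe embedding, not a scalar.
w = dict(wz=0.5, uz=0.1, bz=0.0, wr=0.8, ur=0.2, br=0.0,
         wh=1.0, uh=0.9, bh=0.0)
h = 0.0
for x in [1.0, -0.5, 0.3]:  # a toy input sequence
    h = gru_step(x, h, w)
print(h)  # final hidden state, bounded in (-1, 1)
```

A bidirectional GRU simply runs one such recurrence forward over the sequence and another backward, concatenating the two final states before the classification layer.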

Keywords: natural language processing, sentiment analysis, bidirectional long short-term memory, BI-LSTM, gated recurrent unit, GRU

Procedia PDF Downloads 159
12237 Thermal Image Segmentation Method for Stratification of Freezing Temperatures

Authors: Azam Fazelpour, Saeed R. Dehghani, Vlastimil Masek, Yuri S. Muzychka

Abstract:

The study uses an image analysis technique employing thermal imaging to measure the percentage of areas at various temperatures on a freezing surface. An image segmentation method using threshold values is applied to a sequence of images recording the freezing process. The phenomenon is transient, and temperatures vary quickly as the surface reaches the freezing point and completes the freezing process. Freezing salt water is subject to salt rejection, which makes the freezing point dynamic and dependent on the salinity at the phase interface. For a specific freezing area, nucleation starts from one side and ends at another, which causes a dynamic and transient temperature in that area. Thermal cameras are able to reveal differences in temperature owing to their sensitivity to infrared radiance. Using an experimental setup, a video is recorded by a thermal camera to monitor radiance and temperatures during the freezing process. Image processing techniques are applied to all frames to detect and classify temperatures on the surface. An image segmentation method is used to find contours of equal temperature on the icing surface. Each segment is obtained using the temperature range appearing in the image and the corresponding pixel values. Using the contours extracted from the image and the camera parameters, stratified areas with different temperatures are calculated. To observe temperature contours on the icing surface with the thermal camera, a salt water sample is dropped on a cold surface at a temperature of -20 °C. A thermal video is recorded for 2 minutes to observe the temperature field. Examining the results obtained by the method against the experimental observations verifies the accuracy and applicability of the method.
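The core of the threshold-based stratification, assigning each pixel temperature to a band and reporting the percentage of surface area per band, can be sketched as follows. The frame and band edges below are toy values, not the experimental data:

```python
def stratify(temps, bands):
    """Classify each pixel temperature into a half-open band [lo, hi)
    and return the percentage of surface area occupied by each band."""
    counts = {band: 0 for band in bands}
    total = 0
    for row in temps:
        for t in row:
            total += 1
            for lo, hi in bands:
                if lo <= t < hi:
                    counts[(lo, hi)] += 1
                    break
    return {band: 100.0 * c / total for band, c in counts.items()}

# A toy 3x4 "thermal frame" (degrees C) and three temperature bands.
frame = [[-18, -12, -4, -2],
         [-16, -11, -5, -1],
         [-19, -13, -6, -3]]
bands = [(-20, -10), (-10, -4), (-4, 0)]
print(stratify(frame, bands))
```

In the actual method the pixel values come from the calibrated thermal video, and the contour areas are converted to physical areas using the camera parameters.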

Keywords: ice contour boundary, image processing, image segmentation, salt ice, thermal image

Procedia PDF Downloads 321
12236 Rheological Properties of Cellulose/TBAF/DMSO Solutions and Their Application to Fabrication of Cellulose Hydrogel

Authors: Deokyeong Choe, Jae Eun Nam, Young Hoon Roh, Chul Soo Shin

Abstract:

The development of hydrogels with high mechanical strength is important for numerous applications of hydrogels. As a material for tough hydrogels, cellulose has attracted much interest. However, cellulose cannot be melted and is very difficult to dissolve in most solvents. Therefore, its dissolution in tetrabutylammonium fluoride/dimethyl sulfoxide (TBAF/DMSO) solvents has attracted researchers working on the chemical processing of cellulose. For this reason, studies of the rheological properties of cellulose/TBAF/DMSO solutions provide useful information. In this study, the viscosities of cellulose solutions prepared using different amounts of cellulose and TBAF in DMSO were measured. As expected, the viscosity of the cellulose solution decreased with increasing volume of DMSO. The most viscous cellulose solution was achieved at a 1:1 mass ratio of cellulose to TBAF, regardless of their contents in DMSO. At this 1:1 mass ratio, the formation of cellulose nanoparticles (467 nm) resulted in a dramatic increase in viscosity, which led to the fabrication of 3D cellulose hydrogels.

Keywords: cellulose, TBAF/DMSO, viscosity, hydrogel

Procedia PDF Downloads 253
12235 Automatic Extraction of Arbitrarily Shaped Buildings from VHR Satellite Imagery

Authors: Evans Belly, Imdad Rizvi, M. M. Kadam

Abstract:

Satellite imagery is one of the emerging technologies extensively utilized in various applications such as the detection and extraction of man-made structures, the monitoring of sensitive areas, and the creation of graphic maps. The main approach here is the automated detection of buildings from very high resolution (VHR) optical satellite images. Initially, the shadow, building, and non-building regions (roads, vegetation, etc.) are investigated, with building extraction as the main focus. Once the whole landscape is collected, a trimming process is performed to eliminate regions that arise from non-building objects. Finally, a labeling method is used to extract the building regions; the labeling method may be adapted for more efficient building extraction. The images used for the analysis are extracted from sensors with a resolution finer than 1 meter (VHR). This method provides an efficient way to produce good results: the overhead of intermediate processing is eliminated without compromising the quality of the output, reducing the processing steps required and the time consumed.
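The labeling step for extracting building regions can be sketched as connected-component labeling over a binary building mask. The mask below is a toy example, not actual VHR imagery, and the paper's label method may differ in its connectivity and refinements:

```python
from collections import deque

def label_regions(mask):
    """4-connected component labeling of a binary mask; returns the label
    map (0 = background) and the number of regions found."""
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for i in range(rows):
        for j in range(cols):
            if mask[i][j] and not labels[i][j]:
                current += 1                     # start a new region
                queue = deque([(i, j)])
                labels[i][j] = current
                while queue:                     # flood-fill the region
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return labels, current

# Two separate "buildings" in a toy binary mask.
mask = [[1, 1, 0, 0],
        [1, 0, 0, 1],
        [0, 0, 1, 1]]
labels, n = label_regions(mask)
print(n)  # 2
```

Each labeled component can then be filtered by shape or area to discard residual non-building objects left after trimming.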

Keywords: building detection, shadow detection, landscape generation, label, partitioning, very high resolution (VHR) satellite imagery

Procedia PDF Downloads 314
12234 Evaluation of Mechanical Properties and Analysis of Rapidly Heat Treated M-42 High Speed Steel

Authors: R. N. Karthik Babu, R. Sarvesh, A. Rajendra Prasad, G. Swaminathan

Abstract:

M42 is a molybdenum-series high-speed alloy steel widely used because of its superior hot hardness and wear resistance. These steels are conventionally heat treated in a salt bath furnace with up to three stages of preheating and predetermined soaking and holding periods. Such methods often involve long processing times and a large amount of energy. In this study, the M42 steel samples were heat treated by rapidly heating the specimens to the austenitising temperature of 1260 °C, with the aid of a hybrid microwave furnace, and cooling them conventionally by quenching in a neutral salt bath at a temperature of 550 °C. As metals reflect microwaves, they cannot be heated directly when placed in a microwave furnace. The technology used herein requires the specimens to be placed in a crucible lined with SiC, which is a good absorber of microwaves; the SiC lining heats the metal through radiation, which facilitates volumetric heating of the metal. A sample of similar dimensions was heat treated conventionally and cooled in the same manner. A conventional tempering process was then carried out on both samples, which were analysed for various parameters such as micro-hardness and processing time. Microstructure analysis and scanning electron microscopy were also carried out. The objective of the study is to show that similar or better properties are achievable, with substantial time, energy, and cost savings, through rapid heat treatment in hybrid microwave furnaces. It is observed that the heat treatment is completed with substantial time and energy savings, and with a slight improvement in the mechanical properties of the heat-treated tool steel.

Keywords: rapid heating, heat treatment, metal processing, microwave heating

Procedia PDF Downloads 286
12233 Proposal of a Model Supporting Decision-Making on Information Security Risk Treatment

Authors: Ritsuko Kawasaki, Takeshi Hiromatsu

Abstract:

Management is required to understand all information security risks within an organization and to decide which risks should be treated, to what level, and at what cost. However, such decision-making is not usually easy, because various risk-treatment measures must be selected at suitable application levels. In addition, some measures may have objectives that conflict with each other, which also makes the selection difficult. Therefore, this paper provides a model that supports the selection of measures by applying multi-objective analysis to find an optimal solution. Additionally, a list of measures is provided to make the selection easier and more effective, without any measures being overlooked.
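One common way to operationalize such a multi-objective selection is to scalarize the conflicting objectives with weights and search the feasible subsets under a cost budget. The sketch below uses entirely hypothetical measures and weights, and brute-force enumeration rather than the paper's optimization method:

```python
from itertools import combinations

# Hypothetical measures: (name, cost, risk_reduction, usability_impact).
measures = [
    ('firewall',   30, 40, 5),
    ('training',   20, 25, 2),
    ('encryption', 25, 30, 8),
    ('monitoring', 15, 20, 3),
]

def best_selection(budget, w_risk=1.0, w_usability=0.5):
    """Weighted-sum scalarization of two objectives (maximize risk
    reduction, minimize usability impact) over subsets within budget."""
    best, best_score = [], float('-inf')
    for r in range(len(measures) + 1):
        for subset in combinations(measures, r):
            cost = sum(m[1] for m in subset)
            if cost > budget:
                continue
            score = sum(w_risk * m[2] - w_usability * m[3] for m in subset)
            if score > best_score:
                best, best_score = [m[0] for m in subset], score
    return best, best_score

print(best_selection(budget=60))
```

Changing the weights traces out different trade-offs between the conflicting objectives, which is the essence of presenting management with a multi-objective decision model rather than a single fixed ranking.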

Keywords: information security risk treatment, selection of risk measures, risk acceptance, multi-objective optimization

Procedia PDF Downloads 379
12232 Price Setting and the Role of Accounting Information

Authors: Chris Durden, Peter Lane

Abstract:

Cost accounting information potentially plays an important role in price setting. According to prior research, fixed and variable cost information is often a key influence on pricing decisions. The literature highlights the benefits of applying systematic costing systems for enhanced price-setting processes. This paper explores how costing systems are used for pricing decisions in the tourism and hospitality industry relative to other sources of price-setting information. Pricing based on full cost information was found to have relatively greater importance, and short-term survival and customer-oriented objectives were found to be the most important pricing objectives. This paper contributes to the literature by providing a recent analysis of accounting's role in price setting within the tourism and hospitality industry.
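Full-cost (cost-plus) pricing, the approach found most important here, absorbs fixed costs into the per-unit cost and applies a markup. A minimal sketch with invented numbers, purely to illustrate the arithmetic:

```python
def cost_plus_price(fixed_cost, variable_cost_per_unit, units, markup):
    """Full-cost price per unit: absorb fixed costs over expected volume,
    add the variable cost, then apply a percentage markup."""
    full_cost_per_unit = variable_cost_per_unit + fixed_cost / units
    return full_cost_per_unit * (1 + markup)

# e.g. a hypothetical tour operator: $50,000 of fixed costs spread over
# 1,000 bookings, $120 variable cost per booking, 25% markup.
print(cost_plus_price(50_000, 120, 1_000, 0.25))  # 212.5
```

Market-based pricing, by contrast, would start from what customers will pay and work back to an allowable cost, which is why the choice of costing system matters for the pricing objective pursued.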

Keywords: cost accounting systems, pricing decisions, cost-plus pricing, market pricing, tourism industry

Procedia PDF Downloads 387