Search results for: minority forms of information processing

14073 Outline of a Technique for the Recommendation of Tourism Products in Cuba Using GIS

Authors: Jesse D. Cano, Marlon J. Remedios

Abstract:

Cuban tourism has developed so much over the last 30 years that it has become one of the engines of the Cuban economy. With such development, the number of Cuban companies opting for e-tourism as a way to publicize their products and attract customers has also grown. Despite this fact, the majority of Cuban tourism-themed websites simply provide information on the different products and services they offer, which in many cases overwhelms the user with the amount of information available and leads the user to abandon the search before finding a product that fits his needs. Customization has been recognized as a critical factor for a successful electronic tourism business, and the use of recommender systems is the best approach to address the problem of personalization. This paper aims to outline a preliminary technique for predicting which products a particular user would rate more highly; these products would then be the ones the website shows first. To achieve this, the theoretical elements of the Cuban tourism environment are discussed; recommendation systems and geographic information systems as tools for information representation are also discussed. Finally, for each structural component identified, we define a set of rules that allows obtaining an electronic tourism system that handles the personalization of the service provided effectively.

Keywords: geographic information system, technique, tourism products, recommendation

Procedia PDF Downloads 503
14072 Response of Broiler Chickens Fed Pelleted or Non-Pelleted Diets Containing Graded Levels of Raw Full-Fat Soybean

Authors: G. Berhane, F. Kebede

Abstract:

A feeding trial was conducted to enhance the utilization of locally produced full-fat soybean by the broiler industry. The study had three phases: starter (1–14 d), grower (15–28 d), and finisher (29–49 d). A completely randomized design (CRD) was used in the starter phase with three treatments (commercial soybean meal (SBM) was replaced by raw full-fat soybean (RFSB) at 0, 10, or 20%), each replicated eight times. A total of 408 unsexed one-day-old Cobb-500 broiler chicks were randomly allocated to the replicates. A 2 x 3 factorial arrangement was used in both the second (grower) and third (finisher) phase trials, which had six experimental diets. These six treatments were formed by dividing each of the original three diets (containing 0, 10, or 20% RFSB) into two, pelleting one diet from each respective pair and leaving the other as mash. Every treatment had four replications with 17 birds each. The chemical compositions of the feed ingredients were analyzed, and data on the initial body weight of chicks, feed offered, feed leftover, body weight (BW) of chickens, and mortality were collected. At the end of the experiment, two birds (one male and one female) per replicate were randomly selected and humanely slaughtered. The weights of the dressed carcass, eviscerated carcass, carcass cut parts, and visceral organs were recorded. Results indicated that feed intake (FI), body weight gain (BWG), BW, and feed conversion ratio (FCR) of broilers were not significantly affected (P>0.05) by graded levels of RFSB in the diets at the starter, grower, and finisher phases. The FI at the finisher stage was, however, significantly (P<0.05) influenced by the feed forms. The weights of the dressed carcass, eviscerated carcass, carcass cut parts, and visceral organs were not significantly (P>0.05) affected by either RFSB supplementation, up to 20%, or feed forms. It is concluded that commercial SBM can be replaced by locally produced RFSB up to 20% without pelleting the diets.

Keywords: broilers, carcass characteristics, raw full-fat soybean, weight gain

Procedia PDF Downloads 143
14071 Analysis of Information Sharing and Capacity Constraint on Backlog Bullwhip Effect in Two Level Supply Chain

Authors: Matloub Hussaina

Abstract:

This paper investigates the impact of information sharing and capacity constraints on the backlog bullwhip effect of the Automatic Pipeline Inventory and Order Based Production Control System (APIOBPCS). System dynamics simulation using iThink software has been applied. It has been found that smooth ordering by Tier 1 can be achieved when Tier 1 has medium capacity constraints. Simulation experiments also show that information sharing helps to reduce the backlog bullwhip effect by 50% in capacitated supply chains. This knowledge is of value per se, giving supply chain operations managers and designers a practical way to control the backlog bullwhip effect. Future work should investigate the total cost implications of capacity constraints and safety stocks in multi-echelon supply chains.
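
To make the ordering logic concrete, here is a minimal discrete-time sketch of an APIOBPCS-style simulation in Python; the parameter values, capacity level, and demand stream are illustrative assumptions, not the paper's settings (the paper itself uses iThink).

```python
import random
import statistics

# Illustrative APIOBPCS parameters; values are assumptions, not the paper's.
Tp = 4       # production/pipeline lead time (periods)
Ta = 8       # demand-forecast smoothing time constant
Ti = 4       # net-stock (inventory) feedback divisor
Tw = 4       # WIP feedback divisor
CAP = 120    # Tier-1 capacity constraint on the order rate

random.seed(1)
demand = [100 + random.gauss(0, 10) for _ in range(500)]

forecast = demand[0]
net_stock = 0.0
pipeline = [demand[0]] * Tp           # orders placed but not yet completed
wip = float(sum(pipeline))
orders = []

for d in demand:
    completed = pipeline.pop(0)       # production completed this period
    wip -= completed
    net_stock += completed - d
    forecast += (d - forecast) / Ta   # exponential smoothing of demand
    desired_wip = Tp * forecast
    o = forecast + (0.0 - net_stock) / Ti + (desired_wip - wip) / Tw
    o = max(0.0, min(o, CAP))         # non-negative, capacity-limited order
    pipeline.append(o)
    wip += o
    orders.append(o)

bullwhip = statistics.variance(orders) / statistics.variance(demand)
print(f"order-rate bullwhip ratio: {bullwhip:.2f}")
```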

Keywords: supply chain dynamics, information sharing, capacity constraints, simulation, APIOBPCS

Procedia PDF Downloads 318
14070 Design Criteria for an Internal Information Technology Cost Allocation to Support Business Information Technology Alignment

Authors: Andrea Schnabl, Mario Bernhart

Abstract:

The controlling instrument of an internal cost allocation (IT chargeback) is commonly used to make IT costs transparent and controllable. Information Technology (IT) has become, especially for information industries, a central competitive factor. Consequently, the focus is not on minimizing IT costs but on the strategically aligned application of IT. Hence, an internal IT cost allocation should be designed to enhance business-IT alignment (the strategic alignment of IT) in order to support the effective application of IT from a company's point of view. To identify design criteria for an internal cost allocation that supports business-IT alignment, a case study analysis at a typical medium-sized firm in the information industry is performed. Documents, key performance indicators, and cost accounting data over a period of 10 years are analyzed, and interviews are performed. The derived design criteria are evaluated by six heads of IT departments from six different companies that have an internal IT cost allocation in use. By applying these design criteria, an internal cost allocation serves not only for cost controlling but also as an instrument in strategic IT management.

Keywords: accounting for IT services, Business IT Alignment, internal cost allocation, IT controlling, IT governance, strategic IT management

Procedia PDF Downloads 155
14069 FPGA Implementation of the BB84 Protocol

Authors: Jaouadi Ikram, Machhout Mohsen

Abstract:

The development of a quantum key distribution (QKD) system on a field-programmable gate array (FPGA) platform is the subject of this paper. A quantum cryptographic protocol is designed based on the properties of quantum information and the characteristics of FPGAs. The proposed protocol performs key extraction, reconciliation, error correction, and privacy amplification tasks to generate a perfectly secret final key. We modeled the presence of a spy in our system with a strategy that reveals some of the exchanged information without being noticed. Using an FPGA card with a 100 MHz clock frequency, we demonstrate the evolution of the error rate as well as the amount of mutual information (between the two interlocutors, and that available to the spy) passing from one step to another in the key generation process.
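
For readers unfamiliar with the protocol, the following is a minimal software sketch of BB84 sifting and error-rate (QBER) estimation under an intercept-resend spy; it is an illustrative simulation, not the paper's FPGA design, and the eavesdropping fraction is an assumed value.

```python
import random

random.seed(42)
N = 1000
EVE_FRACTION = 0.25  # assumed: fraction of qubits the spy intercepts

alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("XZ") for _ in range(N)]
bob_bases   = [random.choice("XZ") for _ in range(N)]

bob_bits = []
for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
    # Intercept-resend eavesdropping on a fraction of the qubits
    if random.random() < EVE_FRACTION:
        eve_basis = random.choice("XZ")
        bit = bit if eve_basis == a_basis else random.randint(0, 1)
        a_basis = eve_basis       # photon is re-prepared in Eve's basis
    bob_bits.append(bit if b_basis == a_basis else random.randint(0, 1))

# Sifting: keep positions where Alice's and Bob's bases agree
sifted = [(a, b) for a, b, ab, bb in
          zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
errors = sum(a != b for a, b in sifted)
print(f"sifted key length: {len(sifted)}, QBER: {errors / len(sifted):.3f}")
```

With no spy the QBER stays near zero; intercept-resend on a fraction f of the qubits raises it to roughly f/4, which is the signature the error-rate monitoring step looks for.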

Keywords: QKD, BB84, protocol, cryptography, FPGA, key, security, communication

Procedia PDF Downloads 183
14068 Comparative Efficacy of Gas Phase Sanitizers for Inactivating Salmonella, Escherichia coli O157:H7 and Listeria monocytogenes on Intact Lettuce Heads

Authors: Kayla Murray, Andrew Green, Gopi Paliyath, Keith Warriner

Abstract:

Introduction: It is now acknowledged that control of human pathogens associated with fresh produce requires an integrated approach of several interventions, as opposed to relying on post-harvest washes to remove field-acquired contamination. To this end, current research is directed towards identifying interventions that can be applied at different points in leafy green processing. Purpose: In the following, the efficacy of different gas phase treatments to decontaminate whole lettuce heads during pre-processing storage was evaluated. Methods: Whole Cos lettuce heads were spot inoculated with L. monocytogenes, E. coli O157:H7 or Salmonella spp. The inoculated lettuce heads were then placed in a treatment chamber and exposed to ozone, chlorine dioxide or hydroxyl radicals for different time periods under a range of relative humidity. Survivors of the treatments were enumerated, and sensory analysis was performed on the treated lettuce. Results: Ozone gas reduced L. monocytogenes by 2 log10 after ten minutes of exposure, with Salmonella and E. coli O157:H7 being decreased by 0.66 and 0.56 log cfu, respectively. Chlorine dioxide gas treatment reduced L. monocytogenes and Salmonella on lettuce heads by 4 log cfu but only supported a 0.8 log cfu reduction in E. coli O157:H7 numbers. In comparison, hydroxyl radicals supported a 2.9–4.8 log cfu reduction of model human pathogens inoculated onto lettuce heads but required extended exposure times and relative humidity < 0.8. Significance: Of the gas phase sanitizers tested, chlorine dioxide and hydroxyl radicals are the most effective. The latter process holds the most promise based on ease of delivery, worker safety, and preservation of lettuce sensory characteristics. Although the exposure time for hydroxyl radicals was relatively long (24 h), this should not be considered a limitation, given the intervention is applied in storerooms or in transport containers during transit.
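
For readers unfamiliar with the unit, the log reductions reported above follow the usual definition (our notation):

```latex
\text{log reduction} = \log_{10}\!\left(\frac{N_0}{N}\right),
% where N_0 and N are the viable counts (cfu) before and after treatment;
% a 2-log_{10} reduction therefore leaves N/N_0 = 10^{-2}, i.e. 99\% inactivation.
```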

Keywords: gas phase sanitizers, iceberg lettuce heads, leafy green processing

Procedia PDF Downloads 408
14067 The Automatisation of Dictionary-Based Annotation in a Parallel Corpus of Old English

Authors: Ana Elvira Ojanguren Lopez, Javier Martin Arista

Abstract:

The aims of this paper are to present the automatisation procedure adopted in the implementation of a parallel corpus of Old English, as well as to assess the progress of automatisation with respect to tagging, annotation, and lemmatisation. The corpus consists of an aligned parallel text with a word-for-word Old English-English comparison, which provides the Old English segment with inflectional form tagging (gloss, lemma, category, and inflection) and lemma annotation (spelling, meaning, inflectional class, paradigm, word-formation and secondary sources). This parallel corpus is intended to fill a gap in the field of Old English, in which no parallel and/or lemmatised corpora are available, while the average amount of corpus annotation is low. With this background, this presentation has two main parts. The first part, which focuses on tagging and annotation, selects the layouts and fields of lexical databases that are relevant for these tasks. Most information used for the annotation of the corpus can be retrieved from the lexical and morphological database Nerthus and the database of secondary sources Freya. These are the sources of linguistic and metalinguistic information that will be used for the annotation of the lemmas of the corpus, including morphological and semantic aspects as well as the references to the secondary sources that deal with the lemmas in question. Although substantially adapted and re-interpreted, the lemmatised part of these databases draws on the standard dictionaries of Old English, including The Student's Dictionary of Anglo-Saxon, An Anglo-Saxon Dictionary, and A Concise Anglo-Saxon Dictionary. The second part of this paper deals with lemmatisation. It presents the lemmatiser Norna, which has been implemented in FileMaker software. It is based on a concordance and an index to the Dictionary of Old English Corpus, which comprises around three thousand texts and three million words. In its present state, the lemmatiser Norna can assign a lemma to around 80% of textual forms on an automatic basis, by searching the index and the concordance for prefixes, stems and inflectional endings. The conclusions of this presentation stress the limits of the automatisation of dictionary-based annotation in a parallel corpus. While tagging and annotation are largely automatic even at the present stage, the automatisation of alignment is pending for future research. Lemmatisation and morphological tagging are expected to be fully automatic in the near future, once the database of secondary sources Freya and the lemmatiser Norna have been completed.
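
The index-and-affix lookup that Norna performs can be illustrated with a toy sketch; the mini lemma index, prefix list, and ending list below are invented for illustration and are not data from Nerthus, Freya, or the Dictionary of Old English Corpus.

```python
# Toy sketch of index-driven lemma assignment in the spirit of Norna.
LEMMA_INDEX = {"lufian": "lufian", "cyning": "cyning", "god": "god"}
INFLECTIONAL_ENDINGS = ["ode", "est", "as", "es", "e", "a"]
PREFIXES = ["ge", "be", "for"]

def lemmatise(form: str) -> str | None:
    candidates = [form]
    for pre in PREFIXES:                       # strip known prefixes
        if form.startswith(pre):
            candidates.append(form[len(pre):])
    for cand in list(candidates):              # strip inflectional endings
        for end in INFLECTIONAL_ENDINGS:
            if cand.endswith(end):
                candidates.append(cand[: -len(end)])
    for cand in candidates:                    # first hit in the index wins
        for lemma in LEMMA_INDEX:
            if lemma.startswith(cand) and len(cand) >= 3:
                return LEMMA_INDEX[lemma]
    return None                                # left for manual lemmatisation

print(lemmatise("lufode"))    # -> lufian
print(lemmatise("cyninges"))  # -> cyning
```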

Keywords: corpus linguistics, historical linguistics, old English, parallel corpus

Procedia PDF Downloads 212
14066 Communicating Through Symbolism in Anthropological Medicine with Reference to Traditional Performances of Wayang Kulit, Main Puteri and Kuda Kepang

Authors: M. G. Nasuruddin, S. Ishak

Abstract:

In anthropological medicine (traditional therapeutic healing), symbolic interfaces are used to connect with cognitive and metacognitive mechanisms to activate the conscious and unconscious responses of patients or other recipients. At the same time, they are used to communicate with the inhabitants of the nether world, to whom almost all cases of psychosomatic illness are ascribed. The symbols, which are culturally specific, are divided into verbal and non-verbal forms of communication. The verbal forms are the chanting of mantra and doa and the invocation to invoke the spirits, while the non-verbal ones are physical materials such as the offerings, props and decorative elements, music, movements, olfactory sensations and the performance space. The process of communication through these symbols is effected by the shaman, who acts as a link or intermediary between the healer and the patients, and between the healer and the spirits of the nether world. The paper also examines the scientific perspective of traditional healing through the use of these symbols. The response to these symbols as external stimuli is embedded in the genes, linked to the hereditary factor in a person's DNA. When patients are tuned in to external stimuli such as music, chanting and singing (sonic orders), this can trigger a response from the brain, which may activate its inner pharmacy by releasing substances such as dopamine and/or opioids to ameliorate pain, counter depression and anxiety, and create a feel-good sensation. These symbols act like placebos, evoking the power of the mind over the body and triggering the innate self-healing energy. At the same time, they could also be used as nocebos, for example in black magic, which has the opposite effect of a placebo. In whatever capacity they operate, these symbols, which are either visual or auditory, are an integral part of anthropological medicine, for they communicate and conjure emotional responses that are conducive to healing by activating the internal brain pharmacy.

Keywords: communication, healing, placebo, nocebo, symbol

Procedia PDF Downloads 442
14065 Human Identification and Detection of Suspicious Incidents Based on Outfit Colors: Image Processing Approach in CCTV Videos

Authors: Thilini M. Yatanwala

Abstract:

CCTV (Closed-Circuit Television) surveillance systems have been used in public places for decades, and a large variety of data is produced every moment. However, most CCTV data is stored in isolation without integrity, and as a result, identifying the behavior of suspicious people along with their location has become strenuous. This research was conducted to acquire more accurate and reliable timely information from CCTV video records. The implemented system can identify human objects in public places based on outfit colors. Inter-process communication technologies were used to implement the CCTV camera network to track people on the premises. The research was conducted in three stages: in the first stage, human objects were filtered from other movable objects present in public places; in the second stage, people were uniquely identified based on their outfit colors; and in the third stage, an individual was continuously tracked across the CCTV network. A face detection algorithm was implemented using a cascade classifier based on the training model to detect human objects. A Haar-feature-based two-dimensional convolution operator was introduced to identify features of the human face, such as the region of the eyes, the region of the nose and the bridge of the nose, based on the darkness and lightness of the facial area. In the second stage, the outfit colors of human objects were analyzed by dividing the body area into upper left, upper right, lower left and lower right regions. The mean color, mode color and standard deviation of each area were extracted as crucial factors to uniquely identify a human object using a histogram-based approach. Color-based measurements were written into XML files, and separate directories were maintained to store the XML files related to each camera according to time stamp. As the third stage of the approach, inter-process communication techniques were used to implement an acknowledgement-based CCTV camera network to continuously track individuals across a network of cameras. Real-time analysis of the XML files generated at each camera can determine the path of an individual and monitor the full activity sequence. Higher efficiency was achieved by sending and receiving acknowledgments only among adjacent cameras. Suspicious incidents, such as a person staying in a sensitive area for a longer period or a person disappearing from camera coverage, can be detected in this approach. The system was tested on 150 people with an accuracy level of 82%. However, this approach was unable to produce the expected results in the presence of groups of people wearing similar types of outfits. This approach can be applied to any existing camera network without changing the physical arrangement of the CCTV cameras. The study of human identification and suspicious incident detection using outfit color analysis can achieve a higher level of accuracy, and the project will be continued by integrating motion and gait feature analysis techniques to derive more information from CCTV videos.
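
A minimal sketch of the two ingredients described above (Haar-cascade face detection, then per-quadrant colour statistics) might look as follows; the torso proportions, histogram binning, and file name are assumptions, and OpenCV's bundled cascade stands in for the authors' trained model.

```python
import cv2
import numpy as np

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("frame.jpg")                     # assumed input frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    # Rough torso region below the detected face (proportions are assumptions)
    top, bottom = y + h, min(y + 4 * h, frame.shape[0])
    left, right = max(x - w // 2, 0), min(x + w + w // 2, frame.shape[1])
    torso = frame[top:bottom, left:right]
    if torso.size == 0:
        continue
    mid_r, mid_c = torso.shape[0] // 2, torso.shape[1] // 2
    quads = {"upper_left": torso[:mid_r, :mid_c],
             "upper_right": torso[:mid_r, mid_c:],
             "lower_left": torso[mid_r:, :mid_c],
             "lower_right": torso[mid_r:, mid_c:]}
    for name, q in quads.items():
        pixels = q.reshape(-1, 3)
        mean = pixels.mean(axis=0)
        std = pixels.std(axis=0)
        # Mode colour via a coarse 16-level histogram per channel
        mode = [np.bincount(pixels[:, c] // 16, minlength=16).argmax() * 16
                for c in range(3)]
        print(name, mean.round(1), mode, std.round(1))
```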

Keywords: CCTV surveillance, human detection and identification, image processing, inter-process communication, security, suspicious detection

Procedia PDF Downloads 183
14064 The Role of Planning and Memory in the Navigational Ability

Authors: Greeshma Sharma, Sushil Chandra, Vijander Singh, Alok Prakash Mittal

Abstract:

Navigational ability requires spatial representation, planning, and memory. It covers three interdependent domains, i.e., cognitive and perceptual factors, neural information processing, and variability in brain microstructure. Many attempts have been made to examine the role of spatial representation in navigational ability, and individual differences have been identified in the neural substrate. However, there is also a need to address the influence of planning and memory on navigational ability. The present study aims to evaluate the relations of the aforementioned factors to navigational ability. A total of 30 participants volunteered for a study in a virtual shopping complex and were subsequently classified into good and bad navigators based on their performance. The results showed that planning ability was the factor most correlated with navigational ability and also the discriminating factor between good and bad navigators. Correlations were also found between spatial memory recall and navigational ability. However, non-verbal episodic memory and spatial memory recall were also found to be correlated with the learning variable. This study attempts to identify differences between people with more and less navigational ability on the basis of planning and memory.

Keywords: memory, planning, navigational ability, virtual reality

Procedia PDF Downloads 338
14063 Applications of Building Information Modeling (BIM) in Knowledge Sharing and Management in Construction

Authors: Shu-Hui Jan, Shih-Ping Ho, Hui-Ping Tserng

Abstract:

Construction knowledge can be referred to and reused among involved project managers and job-site engineers to alleviate problems on a construction job-site and reduce the time and cost of solving problems related to constructability. This paper proposes a new methodology for sharing construction knowledge using the Building Information Modeling (BIM) approach. The main characteristics of BIM include illustrating 3D CAD-based presentations, keeping information in a digital format, and facilitating easy updating and transfer of information in the 3D BIM environment. Using the BIM approach, project managers and engineers can gain knowledge related to 3D BIM and obtain feedback provided by job-site engineers for future reference. This study addresses the application of knowledge sharing management in the construction phase of construction projects and proposes a BIM-based Knowledge Sharing Management (BIMKSM) system for project managers and engineers. The BIMKSM system is then applied in a selected case study of a construction project in Taiwan to verify the proposed methodology and demonstrate the effectiveness of sharing knowledge in the BIM environment. The combined results demonstrate that the BIMKSM system can be used as a visual BIM-based knowledge sharing management platform by utilizing the BIM approach and web technology.

Keywords: construction knowledge management, building information modeling, project management, web-based information system

Procedia PDF Downloads 353
14062 Change in Value System: The Way Forward for Africa

Authors: Awe Ayodeji Samson, Adeuja Yetunde Omowunmi

Abstract:

Corruption is a 'monster' that can consume a whole nation, a continent and even the world if it is not destroyed while it is still immature; it grows in the minds of people, takes over their thinking and guides their decision-making process. Corruption snowballs into a socio-economic catastrophe that may be difficult to deal with. Corruption, which is a disease of the mind, can be alleviated in Africa and the world at large by transforming a corruption-prone mind into a corruption-immune mind, and to achieve this, we have to change our value system, because the use of anti-graft agencies alone is not enough; we have to fight corruption from the inside as well as the outside. A value system is the set of principles of right and wrong that are accepted by an individual or a social group. The reviewing and reordering of our value system is the solution to the problem of corruption proposed by this research, because African society has become a 'money and power driven society' in which the 'I am worth' concept, a problematic concept, has created an 'aggressive society' of grasping and money-grabbing individuals. We place more priority on money and the display of opulence. Hence, this has led to a 'triangular society' in which a minority is lavishing in plenty and the majority is gasping for little. The get-rich-quick syndrome, the ethnicity syndrome and a weakened educational system are signs of the prevalence of corruption in Africa. This research analyzes the role and impact of a change in our value system in the fight against corruption in Africa and therefore proposes a change in our value system as the way forward in the fight against corruption in Africa.

Keywords: corruption-prone mind, corruption-immune mind, triangular society, aggressive society, money and power-driven society

Procedia PDF Downloads 313
14061 A Comparative Study of Natural Language Processing Models for Detecting Obfuscated Text

Authors: Rubén Valcarce-Álvarez, Francisco Jáñez-Martino, Rocío Alaiz-Rodríguez

Abstract:

Cybersecurity challenges, including scams, drug sales, the distribution of child sexual abuse material, fake news, and hate speech on both the surface and deep web, have significantly increased over the past decade. Users who post such content often employ strategies to evade detection by automated filters. Among these tactics, text obfuscation plays an essential role in deceiving detection systems. This approach involves modifying words to make them more difficult for automated systems to interpret while remaining sufficiently readable for human users. In this work, we aim to spot obfuscated words and the techniques employed, such as leetspeak, word inversion, punctuation changes, and mixed techniques. We benchmark Named Entity Recognition (NER) using models from the BERT family, as well as two large language models (LLMs), Llama and Mistral, on the XX_NER_WordCamouflage dataset. Our experiments evaluate these models by comparing their precision, recall, F1 scores, and accuracy, both overall and for each individual class.
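
To illustrate the obfuscation techniques named above, here are toy generators for each; the substitution table and example word are our own illustrations, not the XX_NER_WordCamouflage generation rules.

```python
# Toy generators for leetspeak, word inversion, punctuation changes,
# and a mixed technique.
LEET = str.maketrans({"a": "4", "e": "3", "i": "1", "o": "0", "s": "5"})

def leetspeak(word: str) -> str:
    return word.lower().translate(LEET)

def inversion(word: str) -> str:
    return word[::-1]

def punctuation(word: str, sep: str = ".") -> str:
    return sep.join(word)

def mixed(word: str) -> str:
    return punctuation(leetspeak(word))

for technique in (leetspeak, inversion, punctuation, mixed):
    print(f"{technique.__name__:12s} {technique('viagra')}")
# leetspeak    v14gr4
# inversion    argaiv
# punctuation  v.i.a.g.r.a
# mixed        v.1.4.g.r.4
```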

Keywords: natural language processing (NLP), text obfuscation, named entity recognition (NER), deep learning

Procedia PDF Downloads 3
14060 Novel Recommender Systems Using Hybrid CF and Social Network Information

Authors: Kyoung-Jae Kim

Abstract:

Collaborative Filtering (CF) is a popular technique for personalization in the e-commerce domain to reduce information overload. In general, CF provides a list of recommended items based on the preferences of similar users drawn from the user-item matrix and uses them to predict the focal user's preference for particular items. Many real-world recommender systems use CF techniques because of their excellent accuracy and robustness. However, CF has some limitations, including sparsity problems and the high dimensionality of the user-item matrix. In addition, traditional CF does not consider the emotional interaction between users. In this study, we propose recommender systems using social network information and singular value decomposition (SVD) to alleviate these limitations. The purpose of this study is to reduce the dimensionality of the data set using SVD and to improve the performance of CF by using emotional information from the social network data of the focal user. We test the usability of the hybrid CF, SVD and social network information model using real-world data. The experimental results show that the proposed model outperforms conventional CF models.
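
A minimal sketch of the SVD step on a toy user-item matrix is shown below; the matrix, the imputation scheme, and the rank are illustrative assumptions, and the social-network (emotional) weighting the paper adds on top is omitted.

```python
import numpy as np

R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 0, 5, 4.]])          # toy user-item matrix; 0 = unrated

mask = R > 0
mean = R[mask].mean()
filled = np.where(mask, R, mean)       # naive imputation of missing entries

U, s, Vt = np.linalg.svd(filled, full_matrices=False)
k = 2                                  # reduced dimensionality
R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

user, item = 1, 2                      # predict user 1's rating of item 2
print(f"predicted rating: {R_hat[user, item]:.2f}")
```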

Keywords: recommender systems, collaborative filtering, social network information, singular value decomposition

Procedia PDF Downloads 291
14059 Comparative Assessment of Geocell and Geogrid Reinforcement for Flexible Pavement: Numerical Parametric Study

Authors: Anjana R. Menon, Anjana Bhasi

Abstract:

The development of highways and railways plays a crucial role in a nation's economic growth. While rigid concrete pavements are durable with high load-bearing characteristics, growing economies mostly rely on flexible pavements, which are easier to construct and more economical. The strength of a flexible pavement is based on the strength of the subgrade and the load distribution characteristics of the intermediate granular layers. In this scenario, to simultaneously meet economy and strength criteria, it is imperative to strengthen and stabilize the load-transferring layers, namely the subbase and base. Geosynthetic reinforcement in planar and cellular forms has proven effective in improving soil stiffness and providing a stable load transfer platform. Studies have shown the relative superiority of the cellular form (geocells) over planar geosynthetic forms like geogrids, owing to the additional confinement of the infill material and the pocket effect arising from vertical deformation. Hence, the present study investigates the efficiency of geocells over single/multiple-layer geogrid reinforcements through a series of three-dimensional model analyses of a flexible pavement section under a standard repetitive wheel load. The stress transfer mechanism and deformation profiles under various reinforcement configurations are also studied. Geocell reinforcement is observed to take up a higher proportion of the stress caused by traffic loads compared to single- and double-layer geogrid reinforcements. The efficiency of single geogrid reinforcement reduces with an increase in embedment depth, and the contribution of the lower geogrid is insignificant in the case of the double-geogrid reinforced system.

Keywords: geocell, geogrid, flexible pavement, repetitive wheel load, numerical analysis

Procedia PDF Downloads 75
14058 Evaluation of Fusion Sonar and Stereo Camera System for 3D Reconstruction of Underwater Archaeological Object

Authors: Yadpiroon Onmek, Jean Triboulet, Sebastien Druon, Bruno Jouvencel

Abstract:

The objective of this paper is to develop a 3D underwater reconstruction of archaeological objects based on the fusion of a sonar system and a stereo camera system. The underwater images are obtained from a calibrated camera system. Multiple image pairs are input, and we first address the image processing problem by applying well-known filters to improve the quality of the underwater images. Features of interest between image pairs are selected by well-known methods: a FAST detector and FLANN-based descriptor matching. Subsequently, the RANSAC method is applied to reject outlier points. The putative inliers are then triangulated to produce local sparse point clouds in 3D space, using a pinhole camera model and Euclidean distance estimation. The SfM technique is used to build the global sparse point cloud. Finally, the ICP method is used to fuse the sonar information with the stereo model. The final 3D models were assessed by comparing measurements against the real object.
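
A sketch of the feature pipeline described above, under stated assumptions: ORB stands in for the unspecified descriptor, the file names are placeholders, and the projection matrices P1/P2 would in practice come from the calibrated stereo rig.

```python
import cv2
import numpy as np

img1 = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

fast = cv2.FastFeatureDetector_create()
orb = cv2.ORB_create()                 # stand-in descriptor for FAST keypoints
kp1, des1 = orb.compute(img1, fast.detect(img1))
kp2, des2 = orb.compute(img2, fast.detect(img2))

# FLANN with LSH index parameters for binary (ORB) descriptors
flann = cv2.FlannBasedMatcher(
    dict(algorithm=6, table_number=6, key_size=12, multi_probe_level=1), {})
good = []
for pair in flann.knnMatch(des1, des2, k=2):
    if len(pair) == 2 and pair[0].distance < 0.7 * pair[1].distance:
        good.append(pair[0])           # Lowe's ratio test

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
F, inlier_mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.99)
pts1, pts2 = pts1[inlier_mask.ravel() == 1], pts2[inlier_mask.ravel() == 1]

P1 = np.hstack([np.eye(3), np.zeros((3, 1))])             # assumed projections
P2 = np.hstack([np.eye(3), np.array([[-0.1], [0], [0]])])
X = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
cloud = (X[:3] / X[3]).T                                  # N x 3 sparse cloud
print(f"{len(cloud)} triangulated points")
```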

Keywords: 3D reconstruction, archaeology, fusion, stereo system, sonar system, underwater

Procedia PDF Downloads 299
14057 Information Security Risk Management in IT-Based Process Virtualization: A Methodological Design Based on Action Research

Authors: Jefferson Camacho Mejía, Jenny Paola Forero Pachón, Luis Carlos Gómez Flórez

Abstract:

Action research is a qualitative research methodology that leads the researcher to delve into the problems of a community in order to understand its needs in depth and, finally, to propose actions that lead to a change of social paradigm. Although this methodology had its beginnings in the human sciences, it has attracted increasing interest and acceptance in the field of information systems research since the 1990s. The countless possibilities offered nowadays by the use of Information Technologies (IT) in the development of different socio-economic activities have meant a change of social paradigm and the emergence of the so-called information and knowledge society. Accordingly, governments, large corporations, small entrepreneurs and, in general, organizations of all kinds are using IT to virtualize their processes, taking them from the physical environment to the digital environment. However, there is a potential risk for organizations related to exposing valuable information without an appropriate framework for protecting it. This paper shows progress in the development of a methodological design to manage the information security risks associated with IT-based process virtualization by applying the principles of the action research methodology; it is the result of a systematic review of the scientific literature. The design consists of seven fundamental stages, distributed across the three stages described in the action research methodology: 1) observe, 2) analyze and 3) take action. Finally, this paper aims to offer an alternative tool to traditional information security management methodologies, with a view to being applied specifically in the planning stage of IT-based process virtualization in order to foresee risks and to establish security controls before formulating IT solutions in any type of organization.

Keywords: action research, information security, information technology, methodological design, process virtualization, risk management

Procedia PDF Downloads 165
14056 Implications on Informed Consent of Information Available to Patients on the Internet Regarding Hip and Knee Osteoarthritis

Authors: R. W. Walker, J. M. Lynch, K. Anderson, R. G. Middleton

Abstract:

Hip and knee arthritis are two of the commonest conditions that result in elective orthopaedic outpatient referral. At clinic appointments, advice given regarding lifestyle modifications or treatment options may not be fully understood by patients. The majority of patients now use the internet to research their condition and use this to inform their decisions about treatments. This study assessed the quality of patient information regarding hip and knee arthritis available on the internet. Two internet searches were carried out one month apart using the search terms "knee arthritis" and "hip arthritis" on Google, a search engine that accounts for over 90% of internet searches in the UK. Sites were evaluated using the DISCERN instrument, a validated tool for measuring the quality of consumer health information. The first 50 results for each search were analysed by two different observers, and discrepancies in scores were reviewed by both observers together until a score was agreed upon. In total, 200 search result websites were assessed, of which 84 fulfilled the inclusion criteria. 53% (n=44) were funded directly by commercial healthcare businesses, and of these, 70% (n=31) were funded by a surgeon/hospital promoting end-user purchase of surgical intervention. Overall, 35% (n=29) of websites were "for-profit" information websites whose funding came from advertising revenues from pharmaceutical and prosthesis companies. 81% (n=67) offered information about surgical treatments; however, only 43% (n=36) mentioned the risk of complications of surgery. 67% (n=56) did not reference sources for the information they detailed, and 57% (n=47) had no apparent date for the production of the information they offered. Overall, 17% (n=14) of websites were judged to be of high quality, 29% (n=24) of moderate quality and 54% (n=45) of low quality. The quality of health information regarding hip and knee arthritis on the internet is highly variable, and the majority of websites assessed were of poor quality. A preponderance of websites were funded by a commercial surgical service offering arthroplasty at consumer cost, with a further third funded indirectly via advertising revenues from commercial businesses. The vast majority of websites mentioned only surgery as a treatment, and nearly half of all websites did not mention the risks or complications of surgical intervention at all. This has implications for the consent process. As such, clinicians should be aware of the heterogeneous nature of patient information on the internet and be prepared to advise their patients about good-quality websites where further reliable information can be sought.

Keywords: hip osteoarthritis, informed consent, knee osteoarthritis, patient information

Procedia PDF Downloads 93
14055 Wave State of Self: Findings of Synchronistic Patterns in the Collective Unconscious

Authors: R. Dimitri Halley

Abstract:

The research within Jungian psychology presented here is on the wave state of Self. What has been discovered via shared dreaming, by independently correlating dreams across dreamers, goes beyond the Self stage into the deepest layer, the wave state of Self: the very quantum ocean the Self archetype is embedded in. A quantum wave, or rhyming of meaning constituting synergy across several dreamers, was discovered in dreams and in extensive shared dream work with small groups at a post-therapy stage. Within the format of shared dreaming, we find synergy patterns beyond what Jung called the Self archetype. Jung led us up to the phase of Individuation and delivered the baton to Von Franz to work out the next, synchronistic stage, here proposed as the finding of the quantum patterns making up the wave state of Self. These enfolded synchronistic patterns have been found in the group format of shared dreaming among individuals approximating individuation, and their unfolding is carried by belief and faith. The reason for this format and operating system is that beyond therapy and in living reality, we find no science, no thinking or even awareness in the therapeutic sense, but rather a state of mental processing resembling that of a spiritual attitude. Thinking as such is linear and cannot contain the deepest layer of Self, the quantum core of the human being. It is self-reflection that is the container for the process at the wave state of Self. Observation locks us into an outside-in reactive flow from a first-person perspective, and hence toward the surface we see to believe, whereas here the direction of focus shifts to inside-out/intrinsic. The operating system, or language, at the wave level of Self is thus belief and synchronicity. Belief has up to now been almost the sole province of organized religions but was viewed by Jung as an inherent property of the process of Individuation. The shared dreaming stage of the synchronistic patterns forms a larger story constituting a deep connectivity unfolding around individual Selves. Dreams of independent dreamers form larger patterns that come together like puzzle pieces into a larger story, and in this sense this group-work level builds on Jung as a post-individuation collective stage. Shared dream correlations will be presented, illustrating a larger story in terms of trails of shared synchronicity.

Keywords: belief, shared dreaming, synchronistic patterns, wave state of self

Procedia PDF Downloads 196
14054 Volunteered Geographic Information Coupled with Wildfire Fire Progression Maps: A Spatial and Temporal Tool for Incident Storytelling

Authors: Cassandra Hansen, Paul Doherty, Chris Ferner, German Whitley, Holly Torpey

Abstract:

Wildfire is a natural and inevitable occurrence, yet changing climatic conditions have increased the severity, frequency, and risk to human populations in the wildland/urban interface (WUI) of the Western United States. Rapid dissemination of accurate wildfire information is critical to both the Incident Management Team (IMT) and the affected community. With the advent of increasingly sophisticated information systems, GIS can now be used as a web platform for sharing geographic information in new and innovative ways, such as virtual story map applications. Crowdsourced information can be extraordinarily useful when coupled with authoritative information. Information abounds in the form of social media, emergency alerts, radio, and news outlets, yet many of these resources lack a spatial component when first distributed. In this study, we describe how twenty-eight volunteer GIS professionals across nine Geographic Area Coordination Centers (GACC) sourced, curated, and distributed Volunteered Geographic Information (VGI) from authoritative social media accounts focused on disseminating information about wildfires and public safety. The combination of fire progression maps with VGI incident information helps answer three critical questions about an incident: where the fire started, how and why the fire behaved in an extreme manner, and how we can learn from the incident's story to respond to and prepare for future fires in the area. By adding a spatial component to that shared information, this team has been able to visualize shared information about wildfire starts in an interactive map that answers these three critical questions in a more intuitive way. Additionally, long-term social and technical impacts on communities are examined in relation to situational awareness of the disaster through map layers and agency links, the number of views in a particular region of a disaster, and community involvement in and sharing of this critical resource. Combined with a GIS platform and disaster VGI applications, this workflow and information become invaluable to communities within the WUI and bring spatial awareness for disaster preparedness, response, mitigation, and recovery. This study highlights progression maps as the ultimate storytelling mechanism through incident case studies and demonstrates how VGI and sophisticated applied cartographic methodology make them an indispensable resource for authoritative information sharing.

Keywords: storytelling, wildfire progression maps, volunteered geographic information, spatial and temporal

Procedia PDF Downloads 176
14053 Education Quality Development for Excellence Performance with Higher Education by Using COBIT 5

Authors: Kemkanit Sanyanunthana

Abstract:

The purpose of this research is to study the information technology management systems that support the education of five private universities in Thailand, based on case studies of institutions that have been developing the quality and standards of their management and education through the provision of information technology services to support excellent performance. The concept of connecting information technology with a suitable system has been created by information technology administrators so that a system can be used throughout the organization to help reach the utmost benefit from all resources. Hence, the researcher, as a person who has been performing these duties within higher education, undertook this research by selecting the Control Objectives for Information and Related Technology 5 (COBIT 5) framework alongside the Malcolm Baldrige National Quality Award (MBNQA) of America, the national award that applies the concept of Total Quality Management (TQM) to organizational evaluation; such evaluation is called the Education Criteria for Performance Excellence (EdPEx). The study focuses on comparing education quality development for excellent performance using COBIT 5 in terms of information technology, on studying the problems and obstacles of the investigation process for an information technology system, which is considered an instrument to drive organizations towards excellent performance in information technology, and on providing a model for evaluating and analyzing the process in accordance with the strategic information technology plans of the universities. This research is conducted in the form of descriptive and survey research based on the case studies. Data collection was carried out using questionnaires administered to staff working in the information technology field, with research documents related to change management as the main study. The research concludes that performance based on the APO (Align, Plan and Organise) domain processes of the COBIT 5 framework, which emphasizes concordant governance and management of strategic plans for the organizations, could reach only 95%. This might be because of restrictions such as organizational cultures; therefore, the researcher has studied and analyzed the management of information technology in the universities as a whole, under the organizational structures, to reach performance in accordance with the overall APO domain, which would affect the determined strategic plans so that they can develop towards the excellent performance of information technology, and to apply a risk management system at the organizational level to every performance process, which would improve work effectiveness in the management of information technology resources to reach the utmost benefit.

Keywords: COBIT5, APO, EdPEx Criteria, MBNQA

Procedia PDF Downloads 326
14052 The Covid-19 Pandemic: Transmission, Misinformation, and Implications on Public Health

Authors: Jonathan De Rothewelle

Abstract:

A pandemic, such as that of COVID-19, can be a time of panic and stress; concerns about health supersede others, such as work and leisure. With such concern comes the seeking of crucial information, information that, during a global health crisis, could mean the difference between life and death. Whether through newspapers, cable news, or radio, media plays an important role in the transmission of medical information to the general public. Moreover, the news media in particular must uphold its obligation to the public to disseminate only factual, useful information. The circulation of misinformation, whether explicit or implicit, may profoundly impact global health. Using a discursive analytic framework founded in linguistics, the images and headlines of top coverage of COVID-19 from the most influential media outlets will be examined. Micro-analyses reveal what may be interpreted as evidence of sensationalism, which may be argued to be a form of misinformation and, ultimately, a departure from ethical media. Withdrawal from responsible reporting and publishing, expressly in times of epidemic, may cause further confusion and panic.

Keywords: public health, pandemic, public education, media

Procedia PDF Downloads 151
14051 Privacy Preservation Concerns and Information Disclosure on Social Networks: An Ongoing Research

Authors: Aria Teimourzadeh, Marc Favier, Samaneh Kakavand

Abstract:

The emergence of social networks has revolutionized the exchange of information. Every behavior on these platforms contributes to the generation of data, known as social network data, that are processed, stored and published by the social network service providers. Hence, it is vital to investigate the role of these platforms in handling user data by considering privacy measures, especially as we observe an increasing number of individuals and organizations engaging with current virtual platforms without being aware that the data related to their positioning, connections and behavior is uncovered and used by third parties. Performing analytics on social network datasets may result in the disclosure of confidential information about the individuals or organizations that are members of these virtual environments. Analyzing separate datasets can reveal private information about relationships, interests and more, especially when the datasets are analyzed jointly; intentional breaches of privacy are the result of such analysis. Addressing these privacy concerns requires an understanding of the nature of the data being accumulated and the relevant data privacy regulations, as well as the motivations for the disclosure of personal information on social network platforms. Some significant points about how users' online information is controlled by the influence of social factors, and to what extent users are concerned about the future use of their personal information by organizations, are highlighted in this paper. Firstly, this research presents a short literature review of the structure of a network and the concept of privacy in online social networks. Secondly, the factors of user behavior related to privacy protection and self-disclosure in these virtual communities are presented. In other words, we seek to demonstrate the impact of the identified variables on user information disclosure, which could be taken into account to explain the privacy preservation of individuals on social networking platforms. Thirdly, a few research directions are discussed to address this topic for new researchers.

Keywords: information disclosure, privacy measures, privacy preservation, social network analysis, user experience

Procedia PDF Downloads 282
14050 Technoscience in the Information Society

Authors: A. P. Moiseeva, Z. S. Zavyalova

Abstract:

This paper focuses on the Technoscience phenomenon and its role in modern society. It gives a review of the latest research on Technoscience. Based on the works of Paul Forman, Bernadette Bensaude-Vincent, Bruno Latour, Maria Caramez Carlotto and others, the authors consider the concept of Technoscience, its specific character and prospects of its development.

Keywords: technoscience, information society, transdisciplinarity, European Technology Platforms

Procedia PDF Downloads 664
14049 An Intelligent Nondestructive Testing System of Ultrasonic Infrared Thermal Imaging Based on Embedded Linux

Authors: Hao Mi, Ming Yang, Tian-yue Yang

Abstract:

Ultrasonic infrared nondestructive testing is a testing method offering high speed, accuracy and localization. However, some problems remain: detection requires manual real-time field judgment, and the methods of storing and viewing results are still primitive. An intelligent nondestructive detection system based on embedded Linux is put forward in this paper. The hardware part of the detection system is based on an ARM (Advanced RISC Machine) core, and an embedded Linux system is built to realize image processing and defect detection in thermal images. The CLAHE algorithm and the Butterworth filter are used to process the thermal image, and then the Boa server and CGI (Common Gateway Interface) technology are used to transmit the test results to the display terminal through the network for real-time and remote monitoring. The system also reduces manual labor and removes the bottleneck of manual judgment. According to the experimental results, the system provides a convenient and quick solution for industrial nondestructive testing.
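
The enhancement chain (CLAHE followed by a frequency-domain Butterworth low-pass) can be sketched as follows; the clip limit, tile size, cutoff frequency, order, and file names are assumed values, not the paper's.

```python
import cv2
import numpy as np

thermal = cv2.imread("thermal.png", cv2.IMREAD_GRAYSCALE)

# Contrast-limited adaptive histogram equalization
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(thermal)

# Butterworth low-pass filter applied in the frequency domain
rows, cols = enhanced.shape
u = np.arange(rows) - rows / 2
v = np.arange(cols) - cols / 2
D = np.sqrt(u[:, None] ** 2 + v[None, :] ** 2)   # distance from DC component
D0, n = 30.0, 2                                  # cutoff frequency and order
H = 1.0 / (1.0 + (D / D0) ** (2 * n))

spectrum = np.fft.fftshift(np.fft.fft2(enhanced.astype(np.float64)))
filtered = np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * H)))
cv2.imwrite("filtered.png", np.clip(filtered, 0, 255).astype(np.uint8))
```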

Keywords: remote monitoring, non-destructive testing, embedded Linux system, image processing

Procedia PDF Downloads 224
14048 Digital Forensics Compute Cluster: A High Speed Distributed Computing Capability for Digital Forensics

Authors: Daniel Gonzales, Zev Winkelman, Trung Tran, Ricardo Sanchez, Dulani Woods, John Hollywood

Abstract:

We have developed a distributed computing capability, Digital Forensics Compute Cluster (DFORC2) to speed up the ingestion and processing of digital evidence that is resident on computer hard drives. DFORC2 parallelizes evidence ingestion and file processing steps. It can be run on a standalone computer cluster or in the Amazon Web Services (AWS) cloud. When running in a virtualized computing environment, its cluster resources can be dynamically scaled up or down using Kubernetes. DFORC2 is an open source project that uses Autopsy, Apache Spark and Kafka, and other open source software packages. It extends the proven open source digital forensics capabilities of Autopsy to compute clusters and cloud architectures, so digital forensics tasks can be accomplished efficiently by a scalable array of cluster compute nodes. In this paper, we describe DFORC2 and compare it with a standalone version of Autopsy when both are used to process evidence from hard drives of different sizes.
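
As a minimal illustration of the parallelization idea (not DFORC2 itself, which orchestrates Autopsy, Spark, and Kafka together), a PySpark job can hash every file extracted from an evidence store across worker nodes; the input path is a placeholder assumption.

```python
import hashlib
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("evidence-hashing").getOrCreate()

def sha256(record):
    path, data = record
    return path, hashlib.sha256(bytes(data)).hexdigest()

# binaryFiles yields (path, contents) pairs, distributed across the cluster
files = spark.sparkContext.binaryFiles("hdfs:///evidence/extracted/*")
digests = files.map(sha256)

for path, digest in digests.take(10):   # spot-check the first few digests
    print(digest, path)
spark.stop()
```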

Keywords: digital forensics, cloud computing, cyber security, spark, Kubernetes, Kafka

Procedia PDF Downloads 394
14047 Closed-Form Sharma-Mittal Entropy Rate for Gaussian Processes

Authors: Septimia Sarbu

Abstract:

The entropy rate of a stochastic process is a fundamental concept in information theory. It provides a limit to the amount of information that can be transmitted reliably over a communication channel, as stated by Shannon's coding theorems. Recently, researchers have focused on developing new measures of information that generalize Shannon's classical theory, the aim being to design more efficient information encoding and transmission schemes. This paper continues the study of generalized entropy rates by deriving a closed-form solution to the Sharma-Mittal entropy rate for Gaussian processes. Using the squeeze theorem, we solve the limit in the definition of the entropy rate for different values of alpha and beta, the parameters of the Sharma-Mittal entropy. In the end, we compare it with the Shannon and Rényi entropy rates for Gaussian processes.
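
For orientation, a commonly used form of the Sharma-Mittal entropy and the Gaussian ingredients of the limit are sketched below; the notation is ours and may differ from the paper's (recall that beta -> 1 recovers Rényi and alpha, beta -> 1 recovers Shannon):

```latex
% Sharma-Mittal entropy of a density p, parameters \alpha,\beta > 0, \neq 1:
H_{\alpha,\beta}(p)
  = \frac{1}{1-\beta}\left[\left(\int p(x)^{\alpha}\,dx\right)^{\frac{1-\beta}{1-\alpha}} - 1\right].
% For an n-dimensional Gaussian with covariance \Sigma_n:
\int p(x)^{\alpha}\,dx
  = \alpha^{-n/2}\left((2\pi)^{n}\,\lvert\Sigma_n\rvert\right)^{(1-\alpha)/2},
% and the entropy rate is the limit evaluated via the squeeze theorem:
\bar{H}_{\alpha,\beta} = \lim_{n\to\infty}\frac{1}{n}\,H_{\alpha,\beta}(X_1,\dots,X_n),
% where, for a stationary Gaussian process with spectral density S(\omega),
\lim_{n\to\infty}\frac{1}{n}\log\lvert\Sigma_n\rvert
  = \frac{1}{2\pi}\int_{-\pi}^{\pi}\log S(\omega)\,d\omega \quad \text{(Szeg\H{o}'s theorem)}.
```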

Keywords: generalized entropies, Sharma-Mittal entropy rate, Gaussian processes, eigenvalues of the covariance matrix, squeeze theorem

Procedia PDF Downloads 519
14046 A Geometric Based Hybrid Approach for Facial Feature Localization

Authors: Priya Saha, Sourav Dey Roy Jr., Debotosh Bhattacharjee, Mita Nasipuri, Barin Kumar De, Mrinal Kanti Bhowmik

Abstract:

Biometric face recognition technology (FRT) has gained a lot of attention due to its extensive variety of applications in both security and non-security contexts. It has emerged as a secure solution for the identification and verification of personal identity. Although other biometric methods, such as fingerprint and iris scans, are available, FRT is regarded as an efficient technology because of its user-friendliness and contact-free operation. Accurate facial feature localization plays an important role in many facial analysis applications, including biometrics and emotion recognition, but certain factors make facial feature localization a challenging task. On the human face, expressions can be seen in the subtle movements of facial muscles and are influenced by internal emotional states. These non-rigid facial movements cause noticeable alterations in the locations of facial landmarks and their usual shapes, which sometimes create occlusions in facial feature areas, making face recognition a difficult problem. This paper proposes a new hybrid technique for automatic landmark detection in both neutral and expressive frontal and near-frontal face images. The method uses the concepts of thresholding, sequential searching and other image processing techniques for locating the landmark points on the face. Also, Graphical User Interface (GUI) based software was designed that can automatically detect 16 landmark points around the eyes, nose and mouth that are most affected by changes in the facial muscles. The proposed system has been tested on the widely used JAFFE and Cohn-Kanade databases. The system was also tested on the DeitY-TU face database, which was created in the Biometrics Laboratory of Tripura University under a research project funded by the Department of Electronics & Information Technology, Govt. of India. The performance of the proposed method has been evaluated in terms of error measure and accuracy. The method has a detection rate of 98.82% on the JAFFE database, 91.27% on the Cohn-Kanade database and 93.05% on the DeitY-TU database. We have also carried out a comparative study of our proposed method with techniques developed by other researchers. Future work will bring emotion-oriented systems into focus through AU (action unit) detection based on the located features.
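
The thresholding-plus-geometric-search idea can be sketched as follows; the region proportions and the darkness percentile are illustrative assumptions, not the paper's tuned values.

```python
import cv2
import numpy as np

def locate_eyes(gray_face):
    """Find eye centres in a cropped grayscale face via dark-pixel centroids."""
    h, w = gray_face.shape
    # Fixed sub-regions where the eyes are expected (assumed proportions)
    boxes = {"left_eye":  (int(.2*h), int(.5*h), int(.1*w), int(.45*w)),
             "right_eye": (int(.2*h), int(.5*h), int(.55*w), int(.9*w))}
    points = {}
    for name, (y0, y1, x0, x1) in boxes.items():
        roi = gray_face[y0:y1, x0:x1]
        t = np.percentile(roi, 10)          # darkest 10% of the region
        ys, xs = np.nonzero(roi <= t)
        points[name] = (x0 + int(xs.mean()), y0 + int(ys.mean()))
    return points

face = cv2.imread("face.png", cv2.IMREAD_GRAYSCALE)   # assumed cropped face
print(locate_eyes(face))
```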

Keywords: biometrics, face recognition, facial landmarks, image processing

Procedia PDF Downloads 412
14045 Tenants Use Less Input on Rented Plots: Evidence from Northern Ethiopia

Authors: Desta Brhanu Gebrehiwot

Abstract:

The study aims to investigate the impact of land tenure arrangements on fertilizer use per hectare in Northern Ethiopia. Household- and plot-level data are used for the analysis. Land tenure contracts such as sharecropping and fixed-rent arrangements are endogenous: different unobservable characteristics may affect renting-out decisions. Thus, the appropriate method of analysis was the instrumental variable estimation technique. Therefore, a family of instrumental variable estimation methods was used: two-stage least-squares regression (2SLS), the generalized method of moments (GMM), limited information maximum likelihood (LIML), and instrumental variable Tobit (IV-Tobit). In addition, a method to handle a binary endogenous variable was applied, which uses a two-step estimation: in the first step, a probit model includes the instruments, and in the second step, maximum likelihood estimation (MLE) is used (the "etregress" command in Stata 14). Fertilizer use per hectare was lower on sharecropped and fixed-rented plots relative to owner-operated plots. This result supports the Marshallian inefficiency principle for sharecropping. The difference in fertilizer use per hectare could be explained by the lack of incentivized, detailed contract forms: giving a larger proportion of the output to the tenant under sharecropping contracts would motivate the use of more fertilizer on rented plots to maximize production, because most sharecropping arrangements share output equally between tenants and landlords.
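
A sketch of the 2SLS setup on synthetic plot-level data is shown below; the variable names, the instrument, and the data-generating process are illustrative assumptions only, not the study's data.

```python
import numpy as np
import pandas as pd
from linearmodels.iv import IV2SLS

# Synthetic data: renting out is endogenous (driven by the same unobserved
# heterogeneity u that also shifts fertilizer use); z is the instrument.
rng = np.random.default_rng(0)
n = 2000
z = rng.normal(size=n)
u = rng.normal(size=n)
rented = (0.8 * z + u + rng.normal(size=n) > 0).astype(float)
fertilizer = 50 - 10 * rented + 5 * u + rng.normal(size=n)  # kg per hectare

df = pd.DataFrame({"fertilizer": fertilizer, "rented": rented,
                   "z": z, "const": 1.0})
# IV2SLS(dependent, exogenous, endogenous, instruments)
model = IV2SLS(df["fertilizer"], df[["const"]], df["rented"], df[["z"]])
print(model.fit().summary)
```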

Keywords: tenure-contracts, endogeneity, plot-level data, Ethiopia, fertilizer

Procedia PDF Downloads 86
14044 An Efficient FPGA Realization of FIR Filter Using Distributed Arithmetic

Authors: M. Iruleswari, A. Jeyapaul Murugan

Abstract:

The most fundamental building block in many Digital Signal Processing (DSP) applications is the Finite Impulse Response (FIR) filter, because of its linear phase, stability and regular structure. Designing a high-speed and hardware-efficient FIR filter is a very challenging task, as the complexity increases with the filter order. Most applications require higher-order filters, but the memory usage of the filter increases exponentially with its order, and multipliers occupy a large chip area and need high computation time. Multiplier-less, memory-based techniques have gained popularity over the past two decades due to their high-throughput processing capability and reduced dynamic power consumption. This paper describes the design and implementation of a highly efficient Look-Up Table (LUT) based circuit for the implementation of a multiplier-less FIR filter using the distributed arithmetic algorithm. The LUT can be subdivided into a number of smaller LUTs to reduce the memory usage for higher-order filters. Analysis of the performance of various filter orders with different address lengths is done using the Xilinx 14.5 synthesis tool. The proposed design provides lower latency, less memory usage and high throughput.
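
A bit-level software sketch of the distributed-arithmetic idea follows: the inner product is assembled from a LUT of partial coefficient sums, addressed by one bit-slice of the input samples per clock, with no multiplier. The 4-tap coefficients, word length, and sample values are illustrative assumptions.

```python
COEFFS = [3, -1, 4, 2]       # integer filter taps h0..h3 (assumed)
WORDLEN = 8                  # two's-complement input word length
TAPS = len(COEFFS)

# Precompute the 2^TAPS-entry LUT: entry k holds the sum of the coefficients
# whose corresponding address bit is set.
LUT = [sum(c for i, c in enumerate(COEFFS) if k >> i & 1)
       for k in range(1 << TAPS)]

def da_fir(window):
    """One output sample from the last TAPS inputs, processed bit-serially."""
    acc = 0
    for b in range(WORDLEN):
        # Address = bit b of every input sample, one bit per tap
        addr = sum(((x >> b) & 1) << i for i, x in enumerate(window))
        partial = LUT[addr]
        # The MSB of a two's-complement word carries negative weight
        acc += (-partial if b == WORDLEN - 1 else partial) << b
    return acc

window = [10, -3, 7, 1]      # x[n], x[n-1], x[n-2], x[n-3]
direct = sum(c * x for c, x in zip(COEFFS, window))
print(da_fir(window), direct)   # both print 63: the same inner product
```

Subdividing the LUT, as the abstract describes, corresponds to splitting the taps into groups, each with its own small LUT, and adding the partial results, which trades one large 2^N table for several 2^(N/k) tables.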

Keywords: finite impulse response, distributed arithmetic, field programmable gate array, look-up table

Procedia PDF Downloads 458