Search results for: minority forms of information processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15703

15403 Legal Means for Access to Information Management

Authors: Sameut Bouhaik Mostafa

Abstract:

The Access to Information Act is the Canadian law that gives the public a right of access to information held by government institutions. It declares that government information should be available to the public, that exceptions to this right of access should be limited and specific, and that decisions on the disclosure of government information should be reviewed independently of government. By 1982, a dozen countries, including France, Denmark, Finland, Sweden, the Netherlands and the United States (1966), had enacted modern access-to-information legislation. Canada's Access to Information Act came into force in 1983, under the government of Pierre Trudeau, allowing Canadians to retrieve information from government files, defining what information can be accessed, and imposing timetables for responses. The Act is administered by the Information Commissioner of Canada.

Keywords: law, information, management, legal

Procedia PDF Downloads 397
15402 Application of Improved Semantic Communication Technology in Remote Sensing Data Transmission

Authors: Tingwei Shu, Dong Zhou, Chengjun Guo

Abstract:

Semantic communication is an emerging form of communication that realizes intelligent communication by extracting the semantic information of data at the source, transmitting it, and recovering the data at the receiving end. It can effectively solve the problem of data transmission in situations of large data volume, low SNR and restricted bandwidth. With the development of deep learning, semantic communication has further matured and is gradually being applied in the fields of the Internet of Things, Unmanned Aerial Vehicle cluster communication, remote sensing scenarios, etc. We propose an improved semantic communication system for situations where the data volume is huge and spectrum resources are limited during the transmission of remote sensing images. At the transmitting end, we need to extract the semantic information of remote sensing images, but there are some problems. A traditional semantic communication system based on Convolutional Neural Networks cannot take into account both the global and local semantic information of the image, which results in less-than-ideal image recovery at the receiving end. Therefore, we adopt an improved Vision-Transformer-based structure as the semantic encoder, instead of the mainstream CNN-based one, to extract image semantic features. In this paper, we first perform pre-processing operations on remote sensing images to improve their resolution in order to obtain images with more semantic information. We use the wavelet transform to decompose the image into high-frequency and low-frequency components, perform bilinear interpolation on the high-frequency components and bicubic interpolation on the low-frequency components, and finally perform the inverse wavelet transform to obtain the preprocessed image. We adopt the improved Vision-Transformer structure as the semantic encoder to extract and transmit the semantic information of remote sensing images. The Vision-Transformer structure can better handle the huge data volume and extract better image semantic features, and it adopts a multi-layer self-attention mechanism to better capture the correlation between semantic features and reduce redundant features. Secondly, to improve the coding efficiency, we reduce the quadratic complexity of the self-attention mechanism to linear so as to improve the image data processing speed of the model. We conducted experimental simulations on the RSOD dataset and compared the designed system with a CNN-based semantic communication system and image coding methods such as BPG and JPEG to verify that the method can effectively alleviate the problem of excessive data volume and improve the performance of image data communication.
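
The wavelet-based pre-processing step can be illustrated with a short sketch. The following is a minimal example using PyWavelets and OpenCV rather than the authors' code; the 'haar' wavelet, the 2x upscaling target, and the file name are assumptions.

```python
# Hedged sketch of wavelet-domain upscaling: decompose, interpolate the sub-bands
# (bilinear for high-frequency, bicubic for low-frequency), then inverse-transform.
import cv2
import numpy as np
import pywt

def wavelet_upscale(img: np.ndarray) -> np.ndarray:
    """Roughly double the resolution of a single-channel image via its wavelet sub-bands."""
    cA, (cH, cV, cD) = pywt.dwt2(img.astype(np.float32), "haar")
    target = (img.shape[1], img.shape[0])                          # cv2.resize expects (width, height)
    cA_up = cv2.resize(cA, target, interpolation=cv2.INTER_CUBIC)  # low-frequency: bicubic
    highs = tuple(cv2.resize(c, target, interpolation=cv2.INTER_LINEAR)  # high-frequency: bilinear
                  for c in (cH, cV, cD))
    return pywt.idwt2((cA_up, highs), "haar")                      # about 2x the original size

img = cv2.imread("scene.tif", cv2.IMREAD_GRAYSCALE)                # hypothetical remote sensing tile
upscaled = wavelet_upscale(img)
```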

Keywords: semantic communication, transformer, wavelet transform, data processing

Procedia PDF Downloads 64
15401 A Review of Research on Pre-training Technology for Natural Language Processing

Authors: Moquan Gong

Abstract:

In recent years, with the rapid development of deep learning, pre-training technology for natural language processing has made great progress. The field of natural language processing long used word vector methods such as Word2Vec to encode text; these word vector methods can be regarded as static pre-training techniques. However, this context-free text representation brings very limited improvement to subsequent natural language processing tasks and cannot solve the problem of word polysemy. ELMo proposed a context-sensitive text representation method that can effectively handle polysemy. Since then, pre-trained language models such as GPT and BERT have been proposed one after another. Among them, the BERT model significantly improved performance on many typical downstream tasks, greatly promoting technological development in natural language processing, which has since entered the era of dynamic pre-training technology. A large number of pre-trained language models based on BERT and XLNet have continued to emerge, and pre-training has become an indispensable mainstream technology in the field. This article first gives an overview of pre-training technology and its development history, and introduces in detail the classic pre-training technologies of natural language processing, including early static pre-training techniques and classic dynamic pre-training techniques; it then briefly surveys a series of subsequent, enlightening pre-training technologies, including improved models based on BERT and XLNet; on this basis, it analyzes the problems faced by current pre-training research; finally, it looks forward to future development trends of pre-training technology.
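
The gap between static word vectors and dynamic (contextual) pre-training can be made concrete with a small sketch. The example below uses the Hugging Face Transformers library; the "bert-base-uncased" checkpoint and the two sentences are illustrative assumptions, not part of the reviewed work.

```python
# Hedged sketch: a pre-trained BERT model gives the polysemous word "bank" different
# vectors in different contexts, which a static Word2Vec-style lookup cannot do.
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

sentences = ["He sat on the bank of the river.", "She deposited cash at the bank."]
vectors = []
with torch.no_grad():
    for s in sentences:
        enc = tok(s, return_tensors="pt")
        hidden = model(**enc).last_hidden_state[0]                    # (tokens, 768)
        idx = enc.input_ids[0].tolist().index(tok.convert_tokens_to_ids("bank"))
        vectors.append(hidden[idx])                                   # contextual vector of "bank"

cos = torch.nn.functional.cosine_similarity(vectors[0], vectors[1], dim=0)
print(f"'bank' similarity across contexts: {cos.item():.2f}")         # < 1.0, i.e. context-sensitive
```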

Keywords: natural language processing, pre-training, language model, word vectors

Procedia PDF Downloads 33
15400 Changing from Crude (Rudimentary) to Modern Method of Cassava Processing in the Ngwo Village of Njikwa Sub Division of North West Region of Cameroon

Authors: Loveline Ambo Angwah

Abstract:

The processing of cassava tubers or roots into food using crude and rudimentary methods (hand peeling, grating, frying and sun drying) is a very cumbersome and difficult process. The crude methods are time consuming and labour intensive. On the other hand, the modern processing method, that is, using machines to perform the various processes such as washing, peeling, grinding, oven drying, fermentation and frying, is easier, less time consuming, and less labour intensive. Rudimentarily, cassava roots are processed into numerous products and utilized in various ways according to local customs and preferences. For the people of Ngwo village, cassava is transformed locally into a flour or powder form called ‘cumcum’. It is also soaked in water to give a kind of food called ‘water fufu’, and fried to give ‘garri’. The leaves are consumed as vegetables. In addition, its relatively high yields and its ability to stay underground after maturity for long periods give cassava a considerable advantage as a commodity used by poor rural folk in the community to fight poverty. It plays a major role in efforts to alleviate the food crisis because of its efficient production of food energy, year-round availability, tolerance to extreme stress conditions, and suitability to present farming and food systems in Africa. Improvement of cassava processing and utilization techniques would greatly increase labour efficiency, incomes, and living standards of cassava farmers and the rural poor, as well as enhance the shelf life of products, facilitate their transportation, increase marketing opportunities, and help improve human and livestock nutrition. This paper presents a general overview of the crude cassava processing and utilization methods now used by subsistence and small-scale farmers in Ngwo village of the North West region of Cameroon, and examines the opportunities for improving processing technologies. Cassava needs processing because the roots cannot be stored for long; they rot within 3-4 days of harvest. They are bulky, with about 70% moisture content, and therefore transportation of the tubers to markets is difficult and expensive. The roots and leaves contain varying amounts of cyanide, which is toxic to humans and animals, and raw cassava roots and uncooked leaves are not palatable. Therefore, cassava must be processed into various forms in order to increase the shelf life of the products, facilitate transportation and marketing, reduce cyanide content and improve palatability.

Keywords: cassava roots, crude ways, food system, poverty

Procedia PDF Downloads 150
15399 Metal(loid)s Speciation Using HPLC-ICP-MS Technique in Klodnica River, Upper Silesia, Poland

Authors: Magdalena Jabłońska-Czapla

Abstract:

This work provides knowledge about redox and speciation changes of As, Cr, and Sb ionic forms in Klodnica River water. Studies of this kind have never been conducted in this region of Poland. In this study, previously optimized and validated HPLC-ICP-MS methods for the determination of As, Sb and Cr were used. The separation step was carried out using a high-performance liquid chromatograph equipped with an ion-exchange column, followed by an ICP-MS spectrometric detector. Preliminary studies included determination of the total concentrations of As, Sb and Cr, as well as the pH, Eh, temperature and conductivity of the water samples. The study was conducted monthly from March to August 2014 at six points on the Klodnica River. The results indicate that acceptable concentrations of total Cr and Sb were exceeded in the Klodnica River, and its waters should be qualified as below the second purity class. Oxidized antimony and arsenic forms dominate in Klodnica River waters, together with the two chromium forms Cr(VI) and Cr(III). The studies also showed the presence of methyl derivatives of arsenic.

Keywords: antimony, arsenic, chromium, HPLC-ICP-MS, river water, speciation

Procedia PDF Downloads 398
15398 Understanding Children’s Visual Attention to Personal Protective Equipment Using Eye-Tracking

Authors: Vanessa Cho, Janet Hsiao, Nigel King, Robert Anthonappa

Abstract:

Background: The personal protective equipment (PPE) requirements for health care workers (HCWs) have changed significantly during the COVID-19 pandemic. Aim: To ascertain, using eye-tracking technology, what children notice the most when seeing HCWs in various PPE. Design: A Tobii Nano Pro eye-tracking camera tracked 156 children's visual attention while they viewed photographs of HCWs in various PPE. Eye Movement analysis with Hidden Markov Models (EMHMM) was employed to analyse 624 recordings using two approaches, namely (i) data-driven, where children's fixations determined the regions of interest (ROIs), and (ii) fixed ROIs, where the investigators predefined the ROIs. Results: Two significant eye movement patterns, namely distributed (85.2%) and selective (14.7%), were identified (P<0.05). Most children fixated primarily on the face regardless of the different PPE. Children fixated equally on all PPE images in the distributed pattern, while a strong preference for unmasked faces was evident in the selective pattern (P<0.01). Conclusion: Children as young as 2.5 years used a top-down visual search behaviour and demonstrated their face-processing ability. Most children did not show a strong visual preference for a specific PPE, while a minority preferred PPE with distinct facial features, namely without masks and loupes.
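
The idea behind the data-driven approach can be sketched with a generic Gaussian hidden Markov model; the snippet below is a simplified stand-in for the EMHMM toolbox, and the number of states, the synthetic fixation coordinates, and the choice of the hmmlearn library are assumptions.

```python
# Hedged sketch: fit an HMM to one viewer's fixation coordinates so the hidden states act
# as data-driven regions of interest and the transition matrix summarizes the scan pattern.
import numpy as np
from hmmlearn import hmm

fixations = np.array([[310, 190], [305, 200], [420, 260], [300, 195],
                      [430, 255], [308, 192], [425, 258], [302, 198]], dtype=float)

model = hmm.GaussianHMM(n_components=2, covariance_type="full", n_iter=100, random_state=0)
model.fit(fixations)                       # one viewing; pass `lengths=` for several recordings
print(model.predict(fixations))            # ROI label assigned to each fixation
print(model.means_)                        # ROI centres (e.g. eye region vs. mask region)
print(model.transmat_)                     # transition pattern between ROIs
```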

Keywords: COVID-19, PPE, dentistry, pediatric

Procedia PDF Downloads 67
15397 Retrospective Insight on the Changing Status of the Romanian Language Spoken in the Republic of Moldova

Authors: Gina Aurora Necula

Abstract:

From its transformation into a taboo and its concealment under the so-called “Moldovan language” or under the euphemistic expression “state language” to the renewed recognition of its status as an official language, the Romanian language spoken in the Republic of Moldova has undergone impressive reforms in the last 60 years. Meant to erase citizens’ awareness of their ethnic identity and turn a majority language into a minority one, the laws and regulations issued in this field succeeded in setting numerous barriers for speakers of Romanian. Whether manifested as social constraints or materialized as an assumed rejection of mother-tongue usage, these laws have demonstrated their effectiveness and their major impact on the Romanian-speaking population. This article is the result of our research carried out over 10 years with the support of students and Moldovan citizens from the master's degree program "Romanian language - identity and cultural awareness." We present here a retrospective insight into the reforms, laws, and regulations that contributed to the shift in the status of the Romanian language from an official language, seen as the language of common use in both the public and private spheres, to a minority language that surrendered its privileged place to the Russian language, first in the public sphere and then, slowly but surely, in the private sphere. Our main goal here is to identify, and to help speakers understand, what the barriers to learning Romanian are nowadays, when the social pressure to use Russian no longer exists.

Keywords: linguistic barriers, lingua franca, private sphere, public sphere, reformation

Procedia PDF Downloads 99
15396 Critical Review of Oceanic and Geological Storage of Carbon Sequestration

Authors: Milad Nooshadi, Alessandro Manzardo

Abstract:

CO₂ emissions in the atmosphere continue to rise, mostly as a result of the combustion of fossil fuels. CO₂ injection into the oceans and into geological formations, as processes of physical carbon capture, are two of the most promising emerging strategies for mitigating climate change and global warming. The purpose of this research is to evaluate these two methods of CO₂ sequestration and to assess information on previous and current advancements, limitations, and uncertainties associated with carbon sequestration, in order to identify possible prospects for ensuring the timely implementation of the technology, such as determining how governments and companies can gain a better understanding of CO₂ storage in terms of which media have the most applicable capacity, which type of injection has the smaller environmental impact, and how much carbon sequestration and storage will cost. In ocean storage, the behavior of several forms is characterized as near-field, far-field, and sea-floor; in geological formations, three media are considered: oil and gas reservoirs, saline aquifers, and coal beds. To determine the capacity of the various media, analysis of models and practical experiments is necessary. Additionally, as a major component of sequestration, the various injection methods into diverse media, and their monitoring, are associated with a variety of environmental impacts and financial consequences.

Keywords: carbon sequestration, ocean storage, geologic storage, carbon transportation

Procedia PDF Downloads 87
15395 General Architecture for Automation of Machine Learning Practices

Authors: U. Borasi, Amit Kr. Jain, Rakesh, Piyush Jain

Abstract:

Data collection, data preparation, model training, model evaluation, and deployment are all processes in a typical machine learning workflow. Training data needs to be gathered and organised. This often entails collecting a sizable dataset and cleaning it to remove or correct any inaccurate or missing information. Preparing the data for use in the machine learning model requires pre-processing it after it has been acquired. This often entails actions like scaling or normalising the data, handling outliers, selecting appropriate features, reducing dimensionality, etc. This pre-processed data is then used to train a model with some machine learning algorithm. After the model has been trained, it needs to be assessed by computing metrics like accuracy, precision, and recall on a test dataset. Every time a new model is built, both data pre-processing and model training, two crucial processes in the machine learning (ML) workflow, must be carried out. There are many machine learning algorithms that can be employed with every single approach to data pre-processing, generating a large set of combinations to choose from. For example, for every method of handling missing values (dropping records, replacing with the mean, etc.), for every scaling technique, and for every combination of selected features, a different algorithm can be used. As a result, in order to obtain the optimum outcome, these tasks are frequently repeated in different combinations. This paper suggests a simple architecture for organizing this large "combination set of pre-processing steps and algorithms" into an automated workflow which simplifies the task of carrying out all possibilities.
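
A minimal sketch of this idea is shown below using scikit-learn, where one grid search enumerates a small operator pool of imputation strategies, scalers, and models; the specific operators and the toy dataset are assumptions, not the proposed architecture itself.

```python
# Hedged sketch: every combination of pre-processing steps and algorithms in a small
# operator pool is evaluated by a single automated workflow instead of hand-written runs.
from sklearn.datasets import load_breast_cancer
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler, StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

pipe = Pipeline([("impute", SimpleImputer()),
                 ("scale", StandardScaler()),
                 ("model", LogisticRegression(max_iter=1000))])

param_grid = {"impute__strategy": ["mean", "median"],               # missing-value handling
              "scale": [StandardScaler(), MinMaxScaler()],          # scaling technique
              "model": [LogisticRegression(max_iter=1000), SVC()]}  # learning algorithm

search = GridSearchCV(pipe, param_grid, cv=5, scoring="accuracy")
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```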

Keywords: machine learning, automation, AUTOML, architecture, operator pool, configuration, scheduler

Procedia PDF Downloads 37
15394 Employees’ Perception of Organizational Communication in Oyo State Agricultural Development Programme (ADP), Nigeria

Authors: Michael Tunde Ajayi, Oluwakemi Enitan Fapojuwo

Abstract:

The study assessed employees’ perception of organizational communication in the Oyo State Agricultural Development Programme and its effect on their job performance. A simple random sampling technique was used to select 120 employees, using a structured questionnaire for data collection. Findings showed that 66.7% of the respondents were males and 60.4% were between the ages of 31-40 years. Most (87.5%) of the respondents had tertiary education, and the majority of the respondents (73.9%) had working experience of 5 years or less. The major perceived leadership styles used in communicating with the employees were that employees were not allowed to send feedback (X=3.23), information was usually inadequately passed across to the employees (X=2.52), information was given with explanation (X=2.04), leaders rarely gave information on innovation (X=1.91), and information was usually passed in the form of orders (X=1.89). However, the majority (61.5%) of the respondents perceived that the common communication flow used is a downward communication system. Respondents perceived that the effects of organizational communication on their job performance were that they were able to know the constraints within the organization (X=4.89), solve problems occurring in the organization (X=4.70) and achieve organizational objectives (X=4.40). However, the major constraints affecting organizational communication were that there was no cordial relationship among workers (X=3.33), receivers had poor listening skills (X=3.32) and information was not in simple forms (X=3.29). There was a significant relationship between organizational communication (r=0.984, p<0.05) and employees’ job performance. The study suggested that managers should encourage cordial relationships among workers in order to ease communication flow in organizations and also use adequate media of communication in order to make information common within organizations.

Keywords: employees’ perception, organizational communication, effects, job performance

Procedia PDF Downloads 505
15393 Dynamic Store Procedures in Database

Authors: Muhammet Dursun Kaya, Hasan Asil

Abstract:

In recent years, different methods have been proposed to optimize query processing in databases. Although different methods have been proposed to optimize queries, the problem is that most of these methods destroy the query execution plan after executing the query. This research attempts to solve this problem by combining methods of communicating with the database (the queries present in the programming code and the use of stored procedures), making query processing in the database adaptive, and proposing a new approach for the optimization of query processing by introducing the idea of dynamic stored procedures. This research creates dynamic stored procedures in the database according to the proposed algorithm. The method has been tested on applied software, and the results show a significant improvement in reducing query processing time and also in reducing the workload of the DBMS. Other advantages of this algorithm include: making the programming environment a single environment, eliminating the parametric limitations of stored procedures in the database, making the stored procedures in the database dynamic, etc.
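
As a rough illustration of the idea (not the paper's algorithm), the snippet below moves a query embedded in application code into a stored procedure created at run time; the PostgreSQL/psycopg2 stack, the connection string, and the table names are assumptions.

```python
# Hedged sketch: the application creates a stored procedure dynamically from one of its
# own embedded queries and then issues a short CALL instead of shipping the query text.
import psycopg2

conn = psycopg2.connect("dbname=shop user=app")        # hypothetical connection settings
cur = conn.cursor()

app_query = ("INSERT INTO daily_totals (order_day, total) "
             "SELECT order_day, SUM(amount) FROM orders GROUP BY order_day")

cur.execute(f"CREATE OR REPLACE PROCEDURE refresh_daily_totals() LANGUAGE SQL AS $$ {app_query} $$;")
cur.execute("CALL refresh_daily_totals();")
conn.commit()
```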

Keywords: relational database, agent, query processing, adaptable, communication with the database

Procedia PDF Downloads 354
15392 The Influence of Positive and Negative Affect on Perception and Judgement

Authors: Annamarija Paula

Abstract:

Modern psychology is divided into three distinct domains: cognition, affect, and conation. Historically, psychology devalued the importance of studying affect in explaining human behavior, as it supposedly lacked both rational thought and a scientific foundation. As a result, affect remained the least studied domain for years. However, the last 30 years have marked a significant change in perspective, with the claim that not only is affect highly adaptive, but it also plays a crucial role in cognitive processes. Affective states have a crucial impact on human behavior, which has led to fundamental advances in the study of affective states on perception and judgment. Positive affect and negative affect are distinct entities and have different effects on social information processing. In addition, emotions of the same valence are manifested in distinct and unique physiological reactions, indicating that not all forms of positive or negative affect are the same or serve the same purpose. Affect plays a vital role in perception and judgment, which impacts the validity and reliability of memory retrieval. This paper analyzes key findings from the past three decades of observational and empirical research on affective states and cognition. The paper also addresses the limitations connected to these findings and proposes suggestions for possible future research.

Keywords: memory, affect, perception, judgement, mood congruency effect

Procedia PDF Downloads 108
15391 “Octopub”: Geographical Sentiment Analysis Using Named Entity Recognition from Social Networks for Geo-Targeted Billboard Advertising

Authors: Oussama Hafferssas, Hiba Benyahia, Amina Madani, Nassima Zeriri

Abstract:

Although data nowadays takes multiple forms, from text to images and from audio to video, text is still the most used form at the public level. At an academic and research level, and unlike other forms, text can be considered the easiest form to process. Therefore, a branch of Data Mining research has always been under its shadow, called "Text Mining". Its concept is just like data mining's: finding valuable patterns in data, from large collections and tremendous volumes of data, in this case text. Named entity recognition (NER) is one of Text Mining's disciplines; it aims to extract and classify references such as proper names, locations, expressions of time and dates, organizations and more in a given text. Our approach "Octopub" does not aim to find new ways to improve the named entity recognition process; rather, it is about finding a new, and yet smart, way to use NER so that we can extract the sentiments of millions of people using social networks as a limitless information source, with marketing for product promotion as the main domain of application.
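
A compact sketch of this pipeline is given below: named entities of type GPE (locations) are extracted from posts and paired with a crude sentiment score. The spaCy model name, the tiny sentiment lexicon, and the sample posts are assumptions used only for illustration.

```python
# Hedged sketch: NER pulls candidate billboard locations out of social posts, and a simple
# lexicon-based score attaches a sentiment to each extracted location.
from collections import defaultdict
import spacy

nlp = spacy.load("en_core_web_sm")                      # pre-trained NER pipeline (assumed installed)
POSITIVE, NEGATIVE = {"love", "great", "amazing"}, {"hate", "awful", "terrible"}

posts = ["I love the new coffee shop in Algiers!",
         "Traffic in Oran is terrible today."]

location_sentiment = defaultdict(list)
for text in posts:
    doc = nlp(text)
    tokens = {t.lower_ for t in doc}
    score = len(tokens & POSITIVE) - len(tokens & NEGATIVE)
    for ent in doc.ents:
        if ent.label_ == "GPE":                         # geopolitical entity = candidate area
            location_sentiment[ent.text].append(score)

print(dict(location_sentiment))                         # e.g. {'Algiers': [1], 'Oran': [-1]}
```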

Keywords: text mining, named entity recognition (NER), sentiment analysis, social media networks (SN, SMN), business intelligence (BI), marketing

Procedia PDF Downloads 567
15390 Optimized Approach for Secure Data Sharing in Distributed Database

Authors: Ahmed Mateen, Zhu Qingsheng, Ahmad Bilal

Abstract:

In the current age of technology, information is the most precious asset of a company. Today, companies have large amounts of data. As the data become larger, access to the data for particular information is becoming slower day by day. Faster processing of data to shape it into information is the biggest issue. The major problems in distributed databases are the efficiency of data distribution and the response time of data distribution. The security of data distribution is also a big issue. For these problems, we propose a strategy that can maximize the efficiency of data distribution and also improve its response time. This technique gives better results for secure data distribution from multiple heterogeneous sources. The newly proposed technique enables companies to share data securely, efficiently and quickly.

Keywords: ER-schema, electronic record, P2P framework, API, query formulation

Procedia PDF Downloads 314
15389 Data Integrity between Ministry of Education and Private Schools in the United Arab Emirates

Authors: Rima Shishakly, Mervyn Misajon

Abstract:

Education is similar to other businesses and industries. Achieving data integrity is essential in order to provide significant support for all stakeholders in the educational sector. Efficient data collection, flow, processing, storage and retrieval are vital in order to deliver successful solutions to the different stakeholders. The Ministry of Education (MOE) in the United Arab Emirates (UAE) has adopted ‘Education 2020’, a series of five-year plans designed to introduce advanced education management information systems. As part of this program, in 2010 the MOE implemented Student Information Systems (SIS) to manage and monitor the students’ data and information flow between the MOE and international private schools in the UAE. This paper discusses data integrity concerns between the MOE and private schools. The paper clarifies the data integrity issues and indicates the challenges that private schools in the UAE face.

Keywords: education management information systems (EMIS), student information system (SIS), United Arab Emirates (UAE), ministry of education (MOE), (KHDA) the knowledge and human development authority, Abu Dhabi educational counsel (ADEC)

Procedia PDF Downloads 208
15388 Enhancing Plant Throughput in Mineral Processing Through Multimodal Artificial Intelligence

Authors: Muhammad Bilal Shaikh

Abstract:

Mineral processing plants play a pivotal role in extracting valuable minerals from raw ores, contributing significantly to various industries. However, the optimization of plant throughput remains a complex challenge, necessitating innovative approaches for increased efficiency and productivity. This research paper investigates the application of Multimodal Artificial Intelligence (MAI) techniques to address this challenge, aiming to improve overall plant throughput in mineral processing operations. The integration of multimodal AI leverages a combination of diverse data sources, including sensor data, images, and textual information, to provide a holistic understanding of the complex processes involved in mineral extraction. The paper explores the synergies between various AI modalities, such as machine learning, computer vision, and natural language processing, to create a comprehensive and adaptive system for optimizing mineral processing plants. The primary focus of the research is on developing advanced predictive models that can accurately forecast various parameters affecting plant throughput. Utilizing historical process data, machine learning algorithms are trained to identify patterns, correlations, and dependencies within the intricate network of mineral processing operations. This enables real-time decision-making and process optimization, ultimately leading to enhanced plant throughput. Incorporating computer vision into the multimodal AI framework allows for the analysis of visual data from sensors and cameras positioned throughout the plant. This visual input aids in monitoring equipment conditions, identifying anomalies, and optimizing the flow of raw materials. The combination of machine learning and computer vision enables the creation of predictive maintenance strategies, reducing downtime and improving the overall reliability of mineral processing plants. Furthermore, the integration of natural language processing facilitates the extraction of valuable insights from unstructured textual data, such as maintenance logs, research papers, and operator reports. By understanding and analyzing this textual information, the multimodal AI system can identify trends, potential bottlenecks, and areas for improvement in plant operations. This comprehensive approach enables a more nuanced understanding of the factors influencing throughput and allows for targeted interventions. The research also explores the challenges associated with implementing multimodal AI in mineral processing plants, including data integration, model interpretability, and scalability. Addressing these challenges is crucial for the successful deployment of AI solutions in real-world industrial settings. To validate the effectiveness of the proposed multimodal AI framework, the research conducts case studies in collaboration with mineral processing plants. The results demonstrate tangible improvements in plant throughput, efficiency, and cost-effectiveness. The paper concludes with insights into the broader implications of implementing multimodal AI in mineral processing and its potential to revolutionize the industry by providing a robust, adaptive, and data-driven approach to optimizing plant operations. In summary, this research contributes to the evolving field of mineral processing by showcasing the transformative potential of multimodal artificial intelligence in enhancing plant throughput. 
The proposed framework offers a holistic solution that integrates machine learning, computer vision, and natural language processing to address the intricacies of mineral extraction processes, paving the way for a more efficient and sustainable future in the mineral processing industry.
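
A highly simplified, hedged sketch of the late-fusion idea follows: sensor readings, an image-derived feature, and a text-derived flag are concatenated into one feature vector for a throughput regressor. All variable names and the synthetic data are assumptions, not results from the case studies.

```python
# Hedged sketch: fuse sensor, vision, and text features into one vector and learn a
# predictive model of plant throughput from historical records.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
sensor = rng.normal(size=(200, 3))           # e.g. feed rate, mill power, slurry density
vision = rng.normal(size=(200, 1))           # e.g. froth-texture score from camera frames
text = rng.integers(0, 2, size=(200, 1))     # e.g. "blockage reported" flag mined from shift logs

X = np.hstack([sensor, vision, text])
throughput = 50 + 5 * sensor[:, 0] - 3 * text[:, 0] + rng.normal(scale=0.5, size=200)

model = GradientBoostingRegressor().fit(X, throughput)
print(model.predict(X[:3]))                  # forecast throughput for new operating states
```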

Keywords: multimodal AI, computer vision, NLP, mineral processing, mining

Procedia PDF Downloads 53
15387 Improved Safety Science: Utilizing a Design Hierarchy

Authors: Ulrica Pettersson

Abstract:

Collection of information on incidents is regularly done through pre-printed incident report forms. These tend to be incomplete and frequently lack essential information. One consequence is that reports with inadequate information, which do not fulfil analysts’ requirements, are transferred into the analysis process. To improve an incident reporting form, theory from design science, witness psychology, and interview and questionnaire research has been used. Three experiments have previously been conducted to evaluate the form and have shown significantly improved results. The form has proved to capture knowledge regardless of the incidents’ character or context. The aim of this paper is to describe how design science, and in more detail a design hierarchy, can be used to construct a collection form for improvements in safety science.

Keywords: data collection, design science, incident reports, safety science

Procedia PDF Downloads 207
15386 Effect of Sub Supercritical CO2 Processing on Microflora and Shelf Life Tempe

Authors: M. Kustyawati, F. Pratama, D. Saputra, A. Wijaya

Abstract:

Tempe is composed not only of molds but also of bacteria and yeasts. These microorganisms need to be present in balanced numbers in order for the tempe to be of acceptable quality for an extended time. Sub-supercritical carbon dioxide can be a promising preservation method for tempe, as it induces microbial inactivation while avoiding alterations of its quality attributes. Fresh tempe was processed using supercritical and sub-supercritical CO2 for defined holding times, and then the growth ability of the molds and bacteria was analyzed. The results showed that supercritical CO2 processing for 5 minutes reduced the numbers of bacteria and molds by 0.30 log cycles and 1.17 log cycles, respectively. In addition, sub-supercritical CO2 processing for 20 minutes had a fungicidal effect against the tempe mold, whereas sub-supercritical CO2 processing for 10 minutes had a reducing effect against the tempe bacteria and a fungistatic effect against the tempe mold. This suggests that sub-supercritical CO2 processing for 10 minutes could be a useful alternative technique for the preservation of tempe.

Keywords: tempe, sub supercritical CO2, fungistatic effect, preservation

Procedia PDF Downloads 254
15385 Instructional Information Resources

Authors: Parveen Kumar

Abstract:

This article discusses institutional information resources. Information, in its most restricted technical sense, is a sequence of symbols that can be interpreted as a message; information can be recorded as signs, or transmitted as signals. Information is any kind of event that affects the state of a dynamic system. Conceptually, information is the message being conveyed. This concept has numerous other meanings in different contexts. Moreover, the concept of information is closely related to notions of constraint, communication, control, data, form, instruction, knowledge, meaning, mental stimulus, pattern, perception, representation, and especially entropy.

Keywords: institutions, information institutions, information services for mission-oriented institute, pattern

Procedia PDF Downloads 361
15384 Automatic Music Score Recognition System Using Digital Image Processing

Authors: Yuan-Hsiang Chang, Zhong-Xian Peng, Li-Der Jeng

Abstract:

Music has always been an integral part of humans’ daily lives. However, for most people, reading a musical score and turning it into a melody is not easy. This study aims to develop an automatic music score recognition system using digital image processing, which can be used to read and analyze musical score images automatically. The technical approaches included: (1) staff region segmentation; (2) image preprocessing; (3) note recognition; and (4) accidental and rest recognition. Digital image processing techniques (e.g., horizontal/vertical projections, connected component labeling, morphological processing, template matching, etc.) were applied according to the musical notes, accidentals, and rests in staff notation. Preliminary results showed that our system could achieve detection and recognition rates of 96.3% and 91.7%, respectively. In conclusion, we present an effective automated musical score recognition system that could be integrated with a media player to play music/songs given input images of a musical score. Ultimately, this system could also be incorporated into applications for mobile devices as a learning tool, such that a music player could learn to play music/songs.
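
The staff-region segmentation step can be sketched with a horizontal projection, as below; the file name, the 50% threshold, and the use of OpenCV are assumptions rather than the system's exact parameters.

```python
# Hedged sketch: rows whose black-pixel count exceeds a threshold are treated as staff
# lines; the remaining symbols are then isolated with connected component labeling.
import cv2
import numpy as np

img = cv2.imread("score.png", cv2.IMREAD_GRAYSCALE)
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

projection = binary.sum(axis=1) / 255                    # black pixels per row
staff_rows = np.where(projection > 0.5 * binary.shape[1])[0]
print("candidate staff-line rows:", staff_rows)

no_staff = binary.copy()
no_staff[staff_rows, :] = 0                              # erase staff lines
n_labels, labels = cv2.connectedComponents(no_staff)
print("symbol candidates (notes, rests, accidentals):", n_labels - 1)
```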

Keywords: connected component labeling, image processing, morphological processing, optical musical recognition

Procedia PDF Downloads 397
15383 'Pacta Sunt Servanda': Which Form of Contract to Use in the Construction Industry

Authors: Ahmed Stifi, Sascha Gentes

Abstract:

A contract, in its simplest definition, is an agreement involving parties and a number of documents, which may be as small as a marriage contract involving two parties or as large as a contract for the construction and operation of a nuclear power plant involving companies and stakeholders with hundreds or even thousands of documents. All parties in the construction industry, not only contract experts, agree that the success of a project is linked primarily to the form of contract regulating the relationship between the stakeholders of the project. It is therefore essential for the construction industry to study, analyze and improve its contract forms continuously. It should be mentioned that different contract forms have been developed to suit the evolution of construction in terms of its machinery, materials and construction processes. There exist some similarities in some clauses, and variations in many others, among these forms, depending upon the type of project, the kind of client and, more importantly, the laws and regulations governing the transaction in the country where the project is carried out. This paper discusses the most important forms of construction contracts, starting at the national level with the contract form used in Germany, and moving on to the international level by introducing the FIDIC contracts and their different forms, as well as some newly developed contract forms, namely the integrated form of agreement, the new engineering contract and the project alliance agreement. The result of the study shows that many of the contract paragraphs are similar, and the main difference lies in the approach to the relationship between the parties: whether it is based on co-operation and mutual trust, or, in some cases, loads responsibility onto a particular party, which increases the problems and disputes that negatively affect the success of the project. Thus we can say that the form of contract plays an essential role in the approach to project management, which is ultimately the key factor for the success of the project. We therefore advise using a form of contract which enhances mutual trust between the project parties, contributes to supporting cooperation between them, distributes responsibility and risks on an equitable basis, and builds on the principle of “win-win”. In addition to the conventional role of the contract, it should integrate all parties into one team to achieve the target value of the project.

Keywords: contract, FIDIC, integrated form of agreement, new engineering contract, project alliance agreement

Procedia PDF Downloads 351
15382 Data Integration in a GIS Geographic Information System Mapping of Agriculture in Semi-Arid Region of Setif, Algeria

Authors: W. Riahi, M. L. Mansour

Abstract:

The use of data processing tools such as geographic information systems (GIS) to support spatial management is becoming more and more frequent. A GIS makes it possible to collect and analyze diverse natural information relating to the same territory. Space technologies play a crucial role in the analysis of agricultural phenomena. To this end, satellite image processing was used to classify vegetation density, and particularly agricultural areas, in Setif province by means of the Normalized Difference Vegetation Index (NDVI). This step was completed by mapping the agricultural activities of the province using ArcGIS 10 software, in order to display an overall view and to carry out spatial analysis of various themes, combined with one another and chosen according to their strategic importance, in different thematic maps. The resulting synthesis map showed that a geographic information system can contribute significantly to agricultural management by describing the potentialities and development opportunities of production systems and agricultural sectors.
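
The NDVI computation itself is straightforward; the short sketch below assumes the red and near-infrared bands have already been read into arrays (for instance from a satellite scene), and the sample values are illustrative only.

```python
# Hedged sketch: NDVI = (NIR - Red) / (NIR + Red); dense vegetation approaches +1,
# while bare soil and built-up areas stay near 0.
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    red, nir = red.astype(np.float32), nir.astype(np.float32)
    return (nir - red) / (nir + red + 1e-6)      # small epsilon avoids division by zero

red = np.array([[0.10, 0.30]], dtype=np.float32)
nir = np.array([[0.60, 0.32]], dtype=np.float32)
print(ndvi(red, nir))                            # approx. [[0.71, 0.03]]
```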

Keywords: GIS, satellite image, agriculture, NDVI, thematic map

Procedia PDF Downloads 409
15381 The Threat Posed by Dominant Languages to Minor Languages or Dialects: The Case of isiZulu and isiBhaca in Umzimkhulu, KwaZulu-Natal

Authors: Yanga Lusanda Praiseworth Majola

Abstract:

The small town of Umzimkhulu is situated in the KwaZulu-Natal province of South Africa and was once part of the Bantustan of Transkei. Citizens of Umzimkhulu are called amaBhaca because they speak isiBhaca, a non-standard language that is mutually intelligible with three standard official languages: isiXhosa, isiZulu, and siSwati. Since Umzimkhulu fell under the Eastern Cape Province prior to 2006, isiXhosa is used for official purposes, particularly in schools, while isiZulu is used in other sectors; this is despite the fact that the majority of Umzimkhulu citizens regard themselves as amaBhaca. This poses a threat both to isiBhaca as a language and to the identity of amaBhaca, because Umzimkhulu is situated in KZN, where isiZulu is the dominant language spoken by the majority in the province. The primary objective of this study is to unveil, using language dominance theory, how dominant languages pose a threat to minority and developing languages or dialects. The study employed a mixed-methods approach. Data was obtained from key community members and leaders who were identified as amaBhaca and who have lived in Umzimkhulu their whole lives. The main findings of the study are that although isiBhaca is classified as a dialect of isiXhosa, linguistically it is closer to isiZulu, and thus isiZulu poses a great threat to the existence of isiBhaca, since it becomes easy for amaBhaca to switch from isiBhaca to isiZulu and end up losing interest in isiBhaca. Respondents revealed that, in their view, isiBhaca is a language in its own right, and the continuous use and empowerment of isiZulu in Umzimkhulu, particularly in professional settings, is detrimental to isiBhaca; this subsequently has the potential of endangering the existence of isiBhaca and might lead to its attrition.

Keywords: language dominance, dominant languages, minority languages, language attrition

Procedia PDF Downloads 69
15380 Effects of Non-Diagnostic Haptic Information on Consumers' Product Judgments and Decisions

Authors: Eun Young Park, Jongwon Park

Abstract:

A physical touch of a product can provide ample diagnostic information about the product's attributes and quality. However, consumers' product judgments and purchases can be erroneously influenced by non-diagnostic haptic information. For example, consumers' evaluations of the coffee they drink could be affected by the heaviness of the cup that is used merely for serving the coffee. This important issue has received little attention in prior research. The present research contributes to the literature by identifying when and how non-diagnostic haptic information can have an influence and why such influence occurs. Specifically, five studies experimentally varied the content of non-diagnostic haptic information, such as the weight of a cup (heavy vs. light) and the texture of a cup holder (smooth vs. rough), and then assessed the impact of the manipulation on product judgments and decisions. Results show that non-diagnostic haptic information has a biasing impact on consumer judgments. For example, a heavy (vs. light) cup increases consumers' perception of the richness of the coffee in it, and a rough (vs. smooth) texture of a cup holder increases the perception of the healthfulness of the fruit juice in it, which in turn increases consumers' purchase intentions for the product. When consumers are cognitively distracted during the touch experience, the impact of the content of the haptic information is no longer evident, but the valence (positive vs. negative) of the haptic experience influences product judgments. However, consumers are able to avoid the impact of non-diagnostic haptic information if, and only if, they are both knowledgeable about the product category and undistracted while processing the touch experience. In sum, the nature of the influence of non-diagnostic haptic information (i.e., assimilation effect vs. contrast effect vs. null effect) is determined by the content and valence of the haptic information, the relative impact of which depends on whether consumers can identify the content and source of the haptic information. Theoretically, to the best of our knowledge, this research is the first to document empirical evidence of the interplay between cognitive and affective processes that determines the impact of non-diagnostic haptic information. Managerial implications are discussed.

Keywords: consumer behavior, haptic information, product judgments, touch effect

Procedia PDF Downloads 150
15379 Scheduling in Cloud Networks Using Chakoos Algorithm

Authors: Masoumeh Ali Pouri, Hamid Haj Seyyed Javadi

Abstract:

Nowadays, cloud processing is one of the important issues in information technology. Since the scheduling of task graphs is an NP-hard problem, approaches based on non-deterministic methods such as evolutionary processing, mostly genetic and cuckoo algorithms, are effective. Therefore, an efficient algorithm has been proposed for the scheduling of task graphs to obtain an appropriate schedule with minimum time. In this algorithm, the new approach is based on making the length of the critical path shorter and reducing the cost of communication. The results obtained from the implementation of the presented method show that this algorithm behaves the same as other algorithms when faced with graphs without communication costs, and performs quicker and better than some algorithms, such as the DSC and MCP algorithms, when faced with graphs involving communication costs.
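
The quantity the algorithm tries to shrink, the critical path of a weighted task graph including communication costs, can be illustrated with a short sketch; the toy graph below and the use of networkx are assumptions, and the Chakoos heuristic itself is not reproduced.

```python
# Hedged sketch: the critical path of a task graph when every edge incurs its full
# communication cost; a good schedule tries to shorten exactly this path.
import networkx as nx

tasks = {"A": 2, "B": 3, "C": 1, "D": 4}                   # computation times
G = nx.DiGraph()
G.add_nodes_from(tasks)
G.add_weighted_edges_from([("A", "B", 5), ("A", "C", 2),   # edge weight = communication cost
                           ("B", "D", 1), ("C", "D", 3)])

def path_cost(path):
    comp = sum(tasks[t] for t in path)
    comm = sum(G[u][v]["weight"] for u, v in zip(path, path[1:]))
    return comp + comm

critical = max(nx.all_simple_paths(G, "A", "D"), key=path_cost)
print(critical, path_cost(critical))                        # ['A', 'B', 'D'] 15
```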

Keywords: cloud computing, scheduling, tasks graph, chakoos algorithm

Procedia PDF Downloads 45
15378 Information Extraction for Short-Answer Question for the University of the Cordilleras

Authors: Thelma Palaoag, Melanie Basa, Jezreel Mark Panilo

Abstract:

Checking short-answer questions and essays, whether in paper or electronic form, is a tiring and tedious task for teachers. Evaluating a student's output requires knowledge across a wide array of domains, and scoring the work is often a critical task. Several attempts have been made in the past few years to create automated writing assessment software, but they have received negative feedback from teachers and students alike due to unreliable scoring, the absence of feedback, and other shortcomings. This study aims to create an application that is able to check short-answer questions by incorporating information extraction. Information extraction is a subfield of Natural Language Processing (NLP) in which a chunk of text (technically known as unstructured text) is broken down to gather necessary bits of data and/or keywords (structured text) to be further analyzed or utilized by query tools. The proposed system extracts keywords or phrases from an individual's answers and matches them against a corpus of words (as defined by the instructor), which is the basis for evaluating the individual's answer. The proposed system also enables the teacher to provide feedback and re-evaluate the student's output for writing elements that the computer cannot fully evaluate, such as creativity and logic. Teachers can formulate, design, and check short-answer questions efficiently by defining keywords or phrases as parameters and assigning weights for checking answers. With the proposed system, the teacher's time spent checking and evaluating students' output is lessened, making the teacher more productive and the task easier.
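
The keyword-matching core of such a checker can be sketched in a few lines; the rubric contents and weights below are hypothetical examples of instructor-defined parameters.

```python
# Hedged sketch: the instructor defines weighted keywords, the student's answer is
# tokenized, and the score is the sum of the weights of the keywords found.
import re

rubric = {"photosynthesis": 0.4, "chlorophyll": 0.3, "sunlight": 0.2, "glucose": 0.1}

def score_answer(answer: str, rubric: dict) -> float:
    tokens = set(re.findall(r"[a-z]+", answer.lower()))
    return sum(weight for keyword, weight in rubric.items() if keyword in tokens)

answer = "Plants use sunlight and chlorophyll to make glucose during photosynthesis."
print(score_answer(answer, rubric))      # 1.0 -> full marks; the teacher can still re-evaluate
```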

Keywords: information extraction, short-answer question, natural language processing, application

Procedia PDF Downloads 412
15377 A Retrospective Study of the Effects of Xenophobia on South Africa-Nigeria Relations

Authors: O. Fayomi, F. Chidozie, C. Ayo

Abstract:

The underlying causes of xenophobia are complex and varied. Xenophobia has to do with contempt for that which is foreign, especially for strangers or for people from different countries or cultures. Unemployment and mounting poverty among South Africans at the bottom of the economic ladder have provoked fears of the competition that better educated and more experienced migrants can represent. South Africa's long track record of violence as a means of protest, the targeting of foreigners in particular, and the documented tensions over migration policy and the scale of repatriation serve as a good explanation for its xenophobia. It is clear that while most of the attacks were directed against foreign, primarily African, migrants, this was not the rule. Attacks were also noted against Chinese speakers and Pakistani migrants, as well as against South Africans from minority language groups (in the conflict areas). Settlements that have recently experienced ‘xenophobic’ violence have also been the site of violent and other forms of protest around other issues, most notably service delivery; failures in government service delivery have been blamed for this form of xenophobia. Due to the increase in migration, this conflict is certainly not temporary in nature. Xenophobia manifests in different regions and communities with devastating effects on the affected nationals. Nigerians living in South Africa have been the objects of severe attacks and assault as a result of this xenophobic attitude. It is against this background that this study investigates the xenophobic attacks against Nigerians in South Africa. The methodology is basically qualitative, with the use of secondary sources such as books, journals, newspapers and internet sources.

Keywords: xenophobia, unemployment, poverty, Nigeria, South Africa

Procedia PDF Downloads 458
15376 Air–Water Two-Phase Flow Patterns in PEMFC Microchannels

Authors: Ibrahim Rassoul, A. Serir, E-K. Si Ahmed, J. Legrand

Abstract:

The acronym PEM refers to Proton Exchange Membrane or, alternatively, Polymer Electrolyte Membrane. Due to their high efficiency, low operating temperature (30–80 °C), and rapid evolution over the past decade, PEMFCs are increasingly emerging as a viable alternative clean power source for automobile and stationary applications. Before PEMFCs can be employed to power automobiles and homes, several key technical challenges must be properly addressed. One technical challenge is elucidating the mechanisms underlying water transport in, and removal from, PEMFCs. On one hand, sufficient water is needed in the polymer electrolyte membrane (PEM) to maintain sufficiently high proton conductivity. On the other hand, too much liquid water in the cathode can cause “flooding” (that is, pore space filled with excessive liquid water) and hinder the transport of the oxygen reactant from the gas flow channel (GFC) to the three-phase reaction sites. The experimental transparent fuel cell used in this work was designed to represent actual full-scale fuel cell geometry. Depending on the operating conditions, a number of flow regimes may appear in the microchannel: droplet flow, blocking liquid water bridges/plugs (concave and convex forms), slug/plug flow and film flow. Some of these flow patterns are new, while others have already been observed in PEMFC microchannels. An algorithm in MATLAB was developed to automatically determine the flow structure (e.g. slug, droplet, plug, film) of detected liquid water in the test microchannels and to yield information on the distribution of water among the different flow structures. A video processing algorithm was developed to automatically detect dynamic and static liquid water present in the gas channels and to generate relevant quantitative information. This software allows the user to obtain measurements from images of small objects in a more precise and systematic way. The void fractions are also determined based on image analysis. The aim of this work is to provide a comprehensive characterization of two-phase flow in an operating fuel cell, which can be used towards the optimization of water management and informs design guidelines for gas delivery microchannels for fuel cells; this is essential in the design and control of diverse applications. The approach combines numerical modeling with experimental visualization and measurements.
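
One of the quantitative outputs mentioned above, the fraction of the channel occupied by liquid water, can be approximated from a single frame by thresholding; the snippet below is a hedged, much-simplified stand-in for the MATLAB video processing algorithm, and the file name and the assumption that liquid water appears darker than the dry channel are illustrative only.

```python
# Hedged sketch: binarize one channel frame and report the share of pixels detected as
# liquid water; the gas void fraction follows as its complement.
import cv2

frame = cv2.imread("channel_frame.png", cv2.IMREAD_GRAYSCALE)
_, liquid = cv2.threshold(frame, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

liquid_coverage = (liquid > 0).mean()          # fraction of channel area covered by water
print(f"liquid coverage: {liquid_coverage:.2%}, gas void fraction: {1 - liquid_coverage:.2%}")
```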

Keywords: polymer electrolyte fuel cell, air-water two phase flow, gas diffusion layer, microchannels, advancing contact angle, receding contact angle, void fraction, surface tension, image processing

Procedia PDF Downloads 292
15375 Autism Disease Detection Using Transfer Learning Techniques: Performance Comparison between Central Processing Unit vs. Graphics Processing Unit Functions for Neural Networks

Authors: Mst Shapna Akter, Hossain Shahriar

Abstract:

Neural network approaches are machine learning methods used in many domains, such as healthcare and cyber security. Neural networks are best known for dealing with image datasets. While training on images, several fundamental mathematical operations are carried out in the neural network, including a number of algebraic functions such as derivatives, convolutions, and matrix inversion and transposition. Such operations require higher processing power than is typically needed for ordinary computer usage. A Central Processing Unit (CPU) is not well suited to datasets of large images, as it is built for serial processing, whereas a Graphics Processing Unit (GPU) has parallel processing capabilities and therefore higher speed. This paper uses advanced neural network techniques such as VGG16, ResNet50, DenseNet, InceptionV3, Xception, MobileNet, XGBoost-VGG16, and our proposed models to compare CPU and GPU resources. A system for classifying autism disease using face images of autistic and non-autistic children was used to compare performance during testing. We used evaluation metrics such as accuracy, F1 score, precision, recall, and execution time. It was observed that the GPU ran faster than the CPU in all tests performed. Moreover, the performance of the neural network models in terms of accuracy increased on the GPU compared to the CPU.
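
The CPU-versus-GPU comparison can be sketched with a minimal timing script; the ResNet50 backbone, the batch size, and the use of PyTorch below are assumptions standing in for the models and framework actually benchmarked.

```python
# Hedged sketch: time one forward pass of a transfer-learning backbone on CPU and,
# if available, on GPU.
import time
import torch
from torchvision.models import resnet50

def forward_time(device: str, batch: int = 16) -> float:
    model = resnet50(weights=None).to(device).eval()
    x = torch.randn(batch, 3, 224, 224, device=device)
    with torch.no_grad():
        start = time.perf_counter()
        model(x)
        if device == "cuda":
            torch.cuda.synchronize()           # wait for the asynchronous GPU kernels
    return time.perf_counter() - start

print("CPU seconds:", forward_time("cpu"))
if torch.cuda.is_available():
    print("GPU seconds:", forward_time("cuda"))
```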

Keywords: autism disease, neural network, CPU, GPU, transfer learning

Procedia PDF Downloads 96
15374 Performance of Hybrid Image Fusion: Implementation of Dual-Tree Complex Wavelet Transform Technique

Authors: Manoj Gupta, Nirmendra Singh Bhadauria

Abstract:

Most applications in image processing require both high spatial and high spectral resolution in a single image. For example, satellite imaging systems, traffic monitoring systems, and long-range sensor fusion systems all use image processing. However, most of the available equipment is not capable of providing this type of data. The sensor in a surveillance system can only cover the view of a small area for a particular focus, yet the demanding applications of such a system require a view with high coverage of the field. Image fusion provides the possibility of combining different sources of information. In this paper, we decompose the images using the DT-CWT and then fuse them using average and hybrid (maxima and average) pixel-level techniques, and then compare the quality of the fused images using PSNR.
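
A hedged sketch of the fusion and PSNR steps is given below, using the standard DWT from PyWavelets as a stand-in for the dual-tree complex wavelet transform; the file names, the 'db2' wavelet, and the specific fusion rules are assumptions.

```python
# Hedged sketch: average rule on the approximation sub-band, maxima rule on the detail
# sub-bands, inverse transform, then a PSNR check of the fused result.
import cv2
import numpy as np
import pywt

a = cv2.imread("source_a.png", cv2.IMREAD_GRAYSCALE).astype(np.float64)
b = cv2.imread("source_b.png", cv2.IMREAD_GRAYSCALE).astype(np.float64)

cA1, det1 = pywt.dwt2(a, "db2")
cA2, det2 = pywt.dwt2(b, "db2")
fused_low = (cA1 + cA2) / 2                                    # average rule (approximation)
fused_high = tuple(np.where(np.abs(h1) > np.abs(h2), h1, h2)   # maxima rule (details)
                   for h1, h2 in zip(det1, det2))
fused = pywt.idwt2((fused_low, fused_high), "db2")[:a.shape[0], :a.shape[1]]

def psnr(ref, img, peak=255.0):
    mse = np.mean((ref - img) ** 2)
    return 10 * np.log10(peak ** 2 / mse)

print("PSNR vs. source A:", psnr(a, fused))
```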

Keywords: image fusion, DWT, DT-CWT, PSNR, average image fusion, hybrid image fusion

Procedia PDF Downloads 585