Search results for: secret information
9668 Analyzing Apposition and the Typology of Specific Reference in Newspaper Discourse in Nigeria
Authors: Monday Agbonica Bello Eje
Abstract:
The language of the print media is characterized by the use of apposition. This linguistic element functions strategically in journalistic discourse, where it is communicatively necessary to name individuals and provide information about them. Linguistic studies of print-media language that focus on apposition have largely dwelt on other areas and have neglected the typology of appositive reference in newspaper discourse. Yet such an examination can reveal how writers communicate and supply the information readers need to follow and understand the message. This study therefore analyses the patterns of appositional occurrence and the typology of reference in newspaper articles. The data were obtained from The Punch and Daily Trust newspapers: a total of six editions, selected at random and spread over three months, with news and feature articles used in the analysis. Guided by the referential theory of meaning in discourse, the appositions identified were subjected to analysis. The findings show that the semantic relations of coreference and speaker coreference have the highest percentage and frequency of occurrence in the data. This is because news reports and feature articles focus on people and the events around them; readers therefore need some detail and background information in order to identify the referents and follow the discourse. The non-referential relations of absolute synonymy and speaker synonymy occur far less frequently, which is tied to a major feature of the language of the media: simplicity. The paper concludes that apposition is mainly used to provide the reader with detail. In this way, the writer transmits information that allows detailed yet concise descriptions and, at the same time, helps the reader follow the discourse. Keywords: apposition, discourse, newspaper, Nigeria, reference
Procedia PDF Downloads 174
9667 Knowledge Management in Academic: A Perspective of Academic Research Contribution to Economic Development of a Nation
Authors: Hilary J. Watsilla, Narasimha R. Vajjhala
Abstract:
Information and Communication Technology (ICT) has made information access easier and more affordable. Academic research has also benefited from this, with online journals and academic resources readily available at the click of a button. However, there are limited ways of assessing and controlling the quality of academic research, especially in public institutions. Nigeria is the most populous country in Africa, with a significant number of universities and a young population. The quality of knowledge created by academic researchers, however, needs to be evaluated due to the high number of predatory journals in which academics publish. The purpose of this qualitative study is to look at the knowledge creation, acquisition, and assimilation process of academic researchers in public universities in Nigeria. Qualitative research will be carried out using in-depth interviews and observations. Academic researchers will be interviewed, and absorptive capacity theory will be used as the theoretical framework to guide the research. The findings from this study should help in understanding the impact of ICT on the knowledge creation process in academic research and how ICT can affect the quality of knowledge produced by researchers. The findings should also add value to the existing body of knowledge on the quality of academic research, especially in Africa, where quality academic research is in limited supply. As this study is limited to Nigerian universities, the outcome may not be generalizable to other developing countries. Keywords: knowledge creation, academic research, university, information and communication technology
Procedia PDF Downloads 154
9666 Information Extraction for Short-Answer Question for the University of the Cordilleras
Authors: Thelma Palaoag, Melanie Basa, Jezreel Mark Panilo
Abstract:
Checking short-answer questions and essays, whether on paper or in electronic form, is a tiring and tedious task for teachers. Evaluating a student’s output requires knowledge across a wide array of domains, and scoring the work is often a critical task. Several attempts have been made in the past few years to create automated writing assessment software, but these have received negative reactions from teachers and students alike due to unreliable scoring, lack of feedback, and other shortcomings. This study aims to create an application that can check short-answer questions by incorporating information extraction. Information extraction is a subfield of Natural Language Processing (NLP) in which a chunk of text (technically known as unstructured text) is broken down to gather the necessary bits of data and/or keywords (structured text) to be further analyzed or utilized by query tools. The proposed system extracts keywords or phrases from an individual’s answer and matches them against a corpus of words (as defined by the instructor), which forms the basis for evaluating the answer. The proposed system also enables the teacher to provide feedback and re-evaluate the student’s output for writing elements that the computer cannot fully evaluate, such as creativity and logic. Teachers can formulate, design, and check short-answer questions efficiently by defining keywords or phrases as parameters and assigning weights for checking answers. With the proposed system, the time teachers spend checking and evaluating student output is reduced, making the teacher more productive and the process easier. Keywords: information extraction, short-answer question, natural language processing, application
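A minimal sketch of the keyword-matching step described above, assuming the instructor supplies a weighted keyword list per question; the keyword set, weights, and sample answer are illustrative, not taken from the study.

```python
import re

# Instructor-defined keywords/phrases and weights for one question (illustrative values).
ANSWER_KEY = {
    "photosynthesis": 2.0,
    "chlorophyll": 1.5,
    "sunlight": 1.0,
    "carbon dioxide": 1.0,
}

def normalize(text: str) -> str:
    """Lowercase and strip punctuation so phrase matching is consistent."""
    return re.sub(r"[^a-z0-9\s]", " ", text.lower())

def score_answer(answer: str, key: dict[str, float]) -> float:
    """Return the fraction of total keyword weight found in the student's answer."""
    cleaned = normalize(answer)
    earned = sum(w for phrase, w in key.items() if phrase in cleaned)
    return earned / sum(key.values())

if __name__ == "__main__":
    student = "Plants use sunlight and carbon dioxide; chlorophyll absorbs the light."
    print(f"score = {score_answer(student, ANSWER_KEY):.2f}")  # 0.64 for this toy answer
```

The teacher-assigned weights play the role of the checking parameters mentioned in the abstract; low-scoring answers would then be routed to the teacher for the manual re-evaluation step.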
Procedia PDF Downloads 428
9665 Detecting and Thwarting Interest Flooding Attack in Information Centric Network
Authors: Vimala Rani P, Narasimha Malikarjunan, Mercy Shalinie S
Abstract:
Named Data Networking was brought forth as an instantiation of information-centric networking. In an Interest Flooding Attack (IFA), attackers send a colossal number of spoofed Interests to take hold of the Pending Interest Table (PIT), since Interests are recorded in the PITs of the intermediate routers until the corresponding Data packets arrive or the entries exceed their time limit. These attacks can be detrimental to network performance. Traditional IFA detection techniques rely on criteria such as the PIT expiration rate or the Interest satisfaction rate, which cannot reliably differentiate an IFA from other traffic conditions, and threshold-based methods are easily affected by the choice of threshold values. This article proposes an accurate IFA detection mechanism based on a Multiple Feature-based Extreme Learning Machine (MF-ELM). Detection accuracy is increased by presenting the entropy of Interest names, the Interest satisfaction rate, and PIT usage as features extracted for the MF-ELM classifier. Furthermore, we deploy a queue-based hostile Interest prefix mitigation mechanism. The inference from a real-time test bed is that the mechanism can help the network resist IFA with higher accuracy and efficiency. Keywords: information-centric network, pending interest table, interest flooding attack, MF-ELM classifier, queue-based mitigation strategy
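A compact sketch of the classification stage, assuming the three features named in the abstract (name entropy, Interest satisfaction rate, PIT usage) have already been extracted per time window. The basic Extreme Learning Machine shown (random hidden layer, least-squares output weights) is the standard ELM formulation and stands in for the paper's MF-ELM, whose exact configuration is not given; the synthetic data and labels are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy feature matrix: [name entropy, Interest satisfaction rate, PIT usage] per window.
X = rng.random((200, 3))
# Toy labels: flag windows with low satisfaction and high PIT usage as attack (1).
y = ((X[:, 1] < 0.3) & (X[:, 2] > 0.7)).astype(float)

def elm_train(X, y, hidden=50):
    """Basic ELM: random input weights, tanh hidden layer, least-squares output weights."""
    W = rng.normal(size=(X.shape[1], hidden))
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)
    beta = np.linalg.pinv(H) @ y          # Moore-Penrose solution for the output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return (np.tanh(X @ W + b) @ beta > 0.5).astype(int)

W, b, beta = elm_train(X, y)
print("training accuracy:", (elm_predict(X, W, b, beta) == y).mean())
```

Windows classified as attacks would then feed the queue-based prefix mitigation step mentioned in the abstract.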
Procedia PDF Downloads 206
9664 Text Data Preprocessing Library: Bilingual Approach
Authors: Kabil Boukhari
Abstract:
In the context of information retrieval, the selection of the most relevant words is a very important step. In fact, text cleaning allows keeping only the most representative words for better use. In this paper, we propose a library for text preprocessing, within an implemented application, to facilitate this task. This study has two purposes. The first is to review the related work on the various steps involved in text preprocessing, presenting the segmentation, stemming, and lemmatization algorithms that could be efficient in the rest of the study. The second is to implement a tool for text preprocessing in French and English. This library accepts unstructured text as input and provides the preprocessed text as output, based on a set of rules and on a base of stop words for both languages. The proposed library has been tested on different corpora and gave interesting results. Keywords: text preprocessing, segmentation, knowledge extraction, normalization, text generation, information retrieval
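A small sketch of the kind of pipeline the abstract describes (segmentation, stop-word removal, stemming for French and English), here built on NLTK rather than the authors' own library, whose API is not published in the abstract; the nltk package and its 'punkt' and 'stopwords' data must be installed first.

```python
import nltk
from nltk.corpus import stopwords
from nltk.stem.snowball import SnowballStemmer
from nltk.tokenize import word_tokenize

# One-time downloads: nltk.download("punkt"); nltk.download("stopwords")

def preprocess(text: str, lang: str = "english") -> list[str]:
    """Segment, drop stop words and punctuation, then stem, for 'english' or 'french'."""
    stops = set(stopwords.words(lang))
    stemmer = SnowballStemmer(lang)
    tokens = word_tokenize(text, language=lang)
    return [stemmer.stem(t) for t in tokens if t.isalpha() and t.lower() not in stops]

print(preprocess("Information retrieval selects the most relevant words.", "english"))
print(preprocess("Le nettoyage du texte garde les mots les plus représentatifs.", "french"))
```

The same structure (unstructured text in, reduced token list out) mirrors the input/output contract described for the proposed library.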
Procedia PDF Downloads 94
9663 Challenges of Effective Management in Tertiary Institutions in Nigeria
Authors: Simon Oga Egboja, Agi Sunday
Abstract:
The government of Nigeria has invested heavily in tertiary education, but the desired qualitative goals and objectives are yet to be achieved because management at all levels is not efficient and effective in implementing the desired educational policies and programmes, owing to a number of management challenges. This paper investigates some of the major challenges to the effective management of tertiary institutions in Nigeria. Variables important to effective management include political stability, adequate funding, the establishment of information systems, the recruitment and appointment of qualified teachers, and conditions of service. Keywords: effective management, political stability, adequate funding, establishment of information system, recruitment and appointment of qualified teachers
Procedia PDF Downloads 306
9662 Deep Learning to Improve the 5G NR Uplink Control Channel
Authors: Ahmed Krobba, Meriem Touzene, Mohamed Debeyche
Abstract:
The fifth-generation wireless communications system (5G) will provide more diverse applications and higher-quality services for users than long-term evolution 4G (LTE). 5G uses a higher carrier frequency, which makes it more vulnerable to information loss within its coverage area, and many 5G users cannot obtain high-quality communications due to transmission channel noise and channel complexity. The Physical Uplink Control Channel for New Radio (PUCCH-NR) plays a crucial role in 5G NR, as it is mainly used to transmit Uplink Control Information (UCI). This study evaluates the performance of the PUCCH-NR under low signal-to-noise ratios and various numbers of receive antennas. We propose an artificial intelligence approach based on deep neural networks (deep learning) to estimate the PUCCH-NR channel and compare it with conventional methods such as least squares (LS) and minimum mean square error (MMSE). To evaluate channel performance, we use the block error rate (BLER) as the evaluation criterion for the communication system. The results show that the deep neural network method gives the best performance compared with MMSE and LS. Keywords: 5G network, uplink, PUCCH channel, NR-PUCCH channel, deep learning
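To make the two conventional baselines concrete, here is a toy single-tap, pilot-based comparison of LS and scalar MMSE channel estimation under additive white Gaussian noise; it is a generic textbook illustration under assumed unit-power pilots and a unit-variance Rayleigh channel, not the PUCCH-NR simulation used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n, snr_db = 10_000, 0                      # number of pilots and SNR in dB
noise_var = 10 ** (-snr_db / 10)

x = np.ones(n, dtype=complex)              # known unit-power pilot symbols
h = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)   # Rayleigh channel, unit variance
w = np.sqrt(noise_var / 2) * (rng.normal(size=n) + 1j * rng.normal(size=n))
y = h * x + w                              # received pilot observations

h_ls = y / x                               # least-squares estimate
h_mmse = (1.0 / (1.0 + noise_var)) * h_ls  # scalar MMSE shrinkage (unit channel variance assumed)

mse = lambda est: np.mean(np.abs(est - h) ** 2)
print(f"LS MSE   = {mse(h_ls):.3f}")       # roughly equal to noise_var
print(f"MMSE MSE = {mse(h_mmse):.3f}")     # noise_var / (1 + noise_var), always lower
```

A learned estimator of the kind the abstract proposes would replace the fixed shrinkage factor with a network trained on pilot observations, which is where the reported BLER gains over LS and MMSE would come from.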
Procedia PDF Downloads 82
9661 Comparative Study of Greenhouse Locations through Satellite Images and Geographic Information System: Methodological Evaluation in Venezuela
Authors: Maria A. Castillo H., Andrés R. Leandro C.
Abstract:
During the last decades, agricultural productivity in Latin America has increased with precision agriculture and more efficient agricultural technologies. The use of automated systems, satellite images, geographic information systems, tools for data analysis, and artificial intelligence has contributed to more effective strategic decisions. Twenty years ago, the state of Mérida, located in the Venezuelan Andes, reported the largest area covered by greenhouses in the country, where certified seeds of potatoes, vegetables, ornamentals, and flowers were produced for export and for consumption in the central region of the country. In recent years, it is estimated that production under greenhouses has changed and the area covered has decreased due to different factors, but there are few historical statistical data of sufficient quantity and quality to support this estimate or to be used for analysis and decision making. The objective of this study is to compare data collected in 2007 on the geoposition, use, and covered area of the greenhouses with data available in 2021, as support for the analysis of the current situation of horticultural production in the main municipalities of the state of Mérida. The document presents the development of the work in its diagnosis, integration of geographic coordinates into GIS, and data analysis phases. As a result, an evaluation of the process is made, a dashboard is presented with the most relevant data along with the geographical coordinates integrated into GIS, and the information obtained is analyzed. Finally, some recommendations for action are added, and work that would expand the information obtained and its geographical traceability over time is proposed. This study helps provide greater certainty in the supporting data for the evaluation of social, environmental, and economic sustainability indicators and supports better decisions in line with the sustainable development goals in the area under review. At the same time, the methodology provides improvements to the agricultural data collection process that can be extended to other study areas and crops. Keywords: greenhouses, geographic information system, protected agriculture, data analysis, Venezuela
Procedia PDF Downloads 93
9660 Interruption Overload in an Office Environment: Hungarian Survey Focusing on the Factors that Affect Job Satisfaction and Work Efficiency
Authors: Fruzsina Pataki-Bittó, Edit Németh
Abstract:
On the one hand, new technologies and communication tools improve employee productivity and accelerate information and knowledge transfer, while on the other hand, information overload and continuous interruptions make it even harder to concentrate at work. It is a great challenge for companies to find the right balance, while there is also an ongoing demand to recruit and retain talented employees who are able to adopt the modern work style and effectively use modern communication tools. For this reason, this research does not focus on objective measures of office interruptions but aims to find the disruption factors that influence the comfort and job satisfaction of employees and the way they generally feel at work. The focus of this research is on how employees feel about different types of interruptions: which ones they themselves identify as hindering factors, and which they experience as stress factors. By identifying and then reducing these destructive factors, job satisfaction can reach a higher level and employee turnover can be reduced. During the research, we collected information from in-depth interviews and questionnaires asking about the work environment, communication channels used in the workplace, individual communication preferences, factors considered as disruptions, and individual steps taken to avoid interruptions. The questionnaire was completed by 141 office workers from several types of workplaces based in Hungary. Even though 66 respondents work at Hungarian offices of multinational companies, the research concerns the characteristics of the Hungarian labor force. The most important result shows that while more than one third of the respondents consider office noise a disturbing factor, personal inquiries are welcome and considered useful, even if in such cases the work environment will not be convenient for solving tasks requiring concentration. When office sizes are analyzed, the share of those who consider office noise a disturbing factor is, surprisingly, lower in open-space environments than in smaller office rooms. Opinions are more diverse regarding information and communication technologies. In addition to the interruption factors affecting employees' job satisfaction, the research also focuses on the role of the office in the 21st century. Keywords: information overload, interruption, job satisfaction, office environment, work efficiency
Procedia PDF Downloads 227
9659 European Commission Radioactivity Environmental Monitoring Database REMdb: A Law (Art. 36 Euratom Treaty) Transformed in Environmental Science Opportunities
Authors: M. Marín-Ferrer, M. A. Hernández, T. Tollefsen, S. Vanzo, E. Nweke, P. V. Tognoli, M. De Cort
Abstract:
Under the terms of Article 36 of the Euratom Treaty, European Union Member States (MSs) shall periodically communicate to the European Commission (EC) information on environmental radioactivity levels. Compilations of the information received have been published by the EC as a series of reports beginning in the early 1960s. The environmental radioactivity results received from the MSs have been introduced into the Radioactivity Environmental Monitoring database (REMdb) of the Institute for Transuranium Elements of the EC Joint Research Centre (JRC), sited in Ispra (Italy), as part of its Directorate-General for Energy (DG ENER) support programme. The REMdb offers the scientific community dealing with environmental radioactivity endless research opportunities to exploit the nearly 200 million records received from MSs, containing information on radioactivity levels in milk, water, air, and mixed diet. The REM action was created shortly after the Chernobyl crisis to support the EC in its responsibility to provide qualified information to the European Parliament and the MSs on the levels of radioactive contamination of the various compartments of the environment (air, water, soil). Hence, the main line of REM's activities concerns the improvement of procedures for the collection of environmental radioactivity concentrations for routine and emergency conditions, as well as making this information available to the general public. In this way, REM ensures the availability of tools for inter-communication and for access by users from the Member States and the other European countries to this information. Specific attention is given to further integrating the new MSs into the existing information exchange systems and to assisting Candidate Countries in fulfilling these obligations in view of their membership of the EU. Article 36 of the Euratom Treaty requires the competent authorities of each MS to regularly provide the environmental radioactivity monitoring data resulting from their Article 35 obligations to the EC, in order to keep the EC informed of the levels of radioactivity in the environment (air, water, milk, and mixed diet) that could affect the population. The REMdb has two main objectives: to keep a historical record of radiological accidents for further scientific study, and to collect the environmental radioactivity data gathered through the national environmental monitoring programmes of the MSs in order to prepare the comprehensive annual monitoring reports (MR). The JRC continues its activity of collecting, assembling, analyzing, and providing this information to the public and the MSs, even during emergency situations. In addition, there is growing concern among the general public about radioactivity levels in the terrestrial and marine environment, as well as about the potential risk of future nuclear accidents. In this context, clear and transparent communication with the public is needed. EURDEP (European Radiological Data Exchange Platform) is both a standard format for radiological data and a network for the exchange of automatic monitoring data. The latest release of the format is version 2.0, which has been in use since the beginning of 2002. Keywords: environmental radioactivity, Euratom, monitoring report, REMdb
Procedia PDF Downloads 443
9658 Application of Hyperbinomial Distribution in Developing a Modified p-Chart
Authors: Shourav Ahmed, M. Gulam Kibria, Kais Zaman
Abstract:
Control charts graphically verify variation in quality parameters. Attribute-type control charts deal with quality parameters that can only hold two states, e.g., good or bad, yes or no, etc. At present, the p-control chart is most commonly used to deal with attribute-type data. In the construction of a p-control chart using the binomial distribution, the value of the proportion non-conforming must be known or estimated from limited sample information. Because the hyperbinomial distribution treats the fraction non-conforming (p) as a random variable with its own probability distribution, rather than as a constant value as in the binomial case, it reduces the risk of false detection. In this study, a statistical control chart based on the hyperbinomial distribution is proposed for the case where a prior estimate of the proportion non-conforming is unavailable and must be estimated from limited sample information. We developed the control limits of the proposed modified p-chart using the mean and variance of the hyperbinomial distribution. The proposed modified p-chart can also utilize additional sample information when it is available. The study also validates the use of the modified p-chart by comparing it with the result obtained using the cumulative distribution function of the hyperbinomial distribution. The study clearly indicates that using the hyperbinomial distribution in the construction of a p-control chart yields a much more accurate estimate of the quality parameters than using the binomial distribution. Keywords: binomial distribution, control charts, cumulative distribution function, hyperbinomial distribution
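A sketch of how the chart construction described above can be organized. The 3-sigma structure is the standard Shewhart form and the binomial-based limits are shown for comparison; the hyperbinomial mean and variance are left as inputs because the abstract does not reproduce their closed-form expressions, and the numbers below are illustrative.

```python
import numpy as np

def binomial_limits(p_bar: float, n: int):
    """Classical p-chart limits: p_bar +/- 3 * sqrt(p_bar * (1 - p_bar) / n)."""
    sigma = np.sqrt(p_bar * (1.0 - p_bar) / n)
    return max(0.0, p_bar - 3.0 * sigma), p_bar + 3.0 * sigma

def modified_limits(mean_p: float, var_p: float):
    """Modified p-chart limits built from the hyperbinomial mean and variance.

    mean_p and var_p are assumed to be supplied by the hyperbinomial model;
    their closed-form expressions are not given in the abstract."""
    sigma = np.sqrt(var_p)
    return max(0.0, mean_p - 3.0 * sigma), mean_p + 3.0 * sigma

# Example with a fraction non-conforming estimated from limited samples (illustrative numbers).
lcl, ucl = binomial_limits(p_bar=0.05, n=100)
print(f"binomial p-chart:  LCL={lcl:.4f}, UCL={ucl:.4f}")
lcl_m, ucl_m = modified_limits(mean_p=0.05, var_p=0.0007)
print(f"modified p-chart:  LCL={lcl_m:.4f}, UCL={ucl_m:.4f}")
```

Plotting subgroup fractions non-conforming against these limits is the monitoring step; the claimed advantage of the hyperbinomial construction is that its variance term reflects the uncertainty in p itself.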
Procedia PDF Downloads 279
9657 Impact of E-Resources and Its Accessibility by Faculty and Research Scholars of Academic Libraries: A Case Study
Authors: M. Jaculine Mary
Abstract:
Today, electronic resources are considered an integral part of information sources for providing efficient services to people aspiring to acquire knowledge in different fields. E-resources are resources that include documents in electronic format that can be accessed via the Internet in a digital library environment. The present study focuses on the accessibility and use of e-resources by faculty and research scholars of academic libraries in Coimbatore, Tamil Nadu, India. The main objectives are to identify their purposes for using e-resources, assess the users' Information and Communication Technology (ICT) skills, identify the level of satisfaction with the availability of e-resources, examine the use of different e-resources and overall user satisfaction with them, assess the impact of e-resources on their research, and identify the problems they face in accessing e-resources. The research methodology adopted to collect data for this study involves the analysis of survey responses gathered by distributing questionnaires to the users. The findings of the research are based on the responses received from questionnaires distributed to a sample population of 200 users. Among the 200 respondents, 55 percent were research scholars and 45 percent were faculty members who use e-resources. It was found that a majority of the users agreed that relevant, up-to-date information available at a fast pace had influenced them to use e-resources. Most of the respondents were of the view that a larger number of computers in the library would facilitate quick learning. Academic libraries have to take steps to arrange training and orientation programmes for research scholars and faculty members on the available e-resources. This study helps librarians in planning and developing e-resources to provide modern services to library users. The study recommends that measures be taken to raise the level of accessibility of e-resource services among information seekers in order to maximize the use of the electronic resources available in academic libraries. Keywords: academic libraries, accessibility, electronic resources, satisfaction level, survey
Procedia PDF Downloads 142
9656 Second Order Journalism: A Study of Selected Niche Authorities on Facebook and Twitter
Authors: Yvonne Dedzo
Abstract:
Social media has become a powerful tool for bridging the distance between individuals regardless of their location. It has become a convenient platform for public discussion and, consequently, has generated the phenomenon of citizen journalists, who have become both proactive and reactive participants in the dissemination of news, information, and other epochal and historical events. This phenomenon has fueled the growth of niche authorities who deliver exceptional, democratically consequential information online. This study, therefore, investigates how selected niche authorities maintain their status on social media. Using the selective processes theory, the study further interrogates the information shared by niche authorities and analyses the extent to which a public-interest (altruistic) motive or a personal-interest (self-serving) motive drives their agenda of news sharing and usage. Through cyber-ethnography, qualitative content analysis, and semi-structured interviews, data were gathered and analysed from the posts of two purposively selected niche authorities on Facebook and Twitter. The findings indicate that niche authorities maintain their status by being consistent, prompt, informative, resourceful, and interactive in their postings on social media platforms. The study also found that even though niche authorities are motivated by both public-interest altruism and self-interest, the latter showed a higher degree of motivation than the former. Keywords: social media, citizen journalist, niche authorities, selective processes theory
Procedia PDF Downloads 68
9655 Destination Management Organization in the Digital Era: A Data Framework to Leverage Collective Intelligence
Authors: Alfredo Fortunato, Carmelofrancesco Origlia, Sara Laurita, Rossella Nicoletti
Abstract:
In the post-pandemic recovery phase of tourism, the role of a Destination Management Organization (DMO) as a coordinated management system for all the elements that make up a destination (attractions, access, marketing, human resources, brand, pricing, etc.) is becoming relevant for local territories as well. The objective of a DMO is to maximize the visitor's perception of value and quality while ensuring the competitiveness and sustainability of the destination, as well as the long-term preservation of its natural and cultural assets, and to catalyze benefits for the local economy and residents. In carrying out the multiple functions to which it is called, the DMO can leverage a collective intelligence that comes from the ability to pool the information, explicit and tacit knowledge, and relationships of the various stakeholders: policymakers, public managers and officials, entrepreneurs in the tourism supply chain, researchers, data journalists, schools, associations and committees, citizens, etc. The DMO potentially has at its disposal large volumes of data, much of it at low cost, that need to be properly processed to produce value. Based on these assumptions, the paper presents a conceptual framework for building an information system to support the DMO in the intelligent management of a tourist destination, tested in an area of southern Italy. The approach adopted is data-informed and consists of four phases: (1) formulation of the knowledge problem (analysis of policy documents and industry reports; focus groups and co-design with stakeholders; definition of information needs and key questions); (2) research and metadatation of relevant sources (reconnaissance of official sources, administrative archives, and internal DMO sources); (3) gap analysis and identification of unconventional information sources (evaluation of traditional sources with respect to their consistency with information needs, the freshness of the information, and the granularity of the data; enrichment of the information base by identifying and studying web sources such as Wikipedia, Google Trends, Booking.com, Tripadvisor, websites of accommodation facilities, and online newspapers); (4) definition of the set of indicators and construction of the information base (specific definition of indicators and of procedures for data acquisition, transformation, and analysis). The resulting framework consists of 6 thematic areas (accommodation supply, cultural heritage, flows, value, sustainability, and enabling factors), each of which is divided into three domains that capture a specific information need, represented by a scheme of questions to be answered through the analysis of available indicators. The framework is characterized by a high degree of flexibility in the European context, given that it can be customized for each destination by adapting the part related to internal sources. Application to the case study led to the creation of a decision support system that allows: (i) integration of data from heterogeneous sources, including through automated web crawling procedures for the ingestion of social and web information; (ii) reading and interpretation of data and metadata through guided navigation paths in the manner of digital storytelling; and (iii) implementation of complex analysis capabilities through the use of data mining algorithms, for example for the prediction of tourist flows. Keywords: collective intelligence, data framework, destination management, smart tourism
Procedia PDF Downloads 121
9654 Self-Disclosure and Privacy Management Behavior in Social Media: Privacy Calculus Perspective
Authors: Chien-Wen Chen, Nguyen Duong Thuy Trang, Yu-Hsuan Chang
Abstract:
With the development of information technology, social networking sites have become inseparable from daily life and an important way for people to communicate. Nonetheless, the presence of personal information on social networking sites raises privacy issues. Users benefit from the functions of social networking sites, yet they worry about the leakage of personal information while not adopting corresponding privacy protection behaviors; this is called the privacy paradox. However, previous studies have questioned the privacy paradox, arguing that users are not so naive and that people with privacy concerns do conduct privacy management. Consequently, this study adopts the privacy calculus perspective to investigate the privacy behavior of users on social networking sites. Social benefits and privacy concerns are taken as the expected benefits and costs in the privacy calculus. At the same time, this study also explores the antecedents, including positive feedback, self-presentation, privacy policy, and information sensitivity, and the consequences of privacy behavior resulting from weighing benefits and costs, including self-disclosure and three privacy management strategies based on interpersonal boundaries (preventive, censorship, and corrective). The respondents' characteristics and prior experience with social networking sites were analyzed. A survey of 596 social network users was conducted online to validate the research framework. The results show that social benefit has the greatest influence on privacy behavior. The most important external factors affecting privacy behavior are positive feedback, followed by privacy policy and information sensitivity. An important finding of this study is that social benefits positively affect privacy management. This shows that users gain satisfaction from interacting with others through social networking sites: after weighing social benefits against privacy management, they not only disclose themselves but also manage their privacy on these sites. The study thereby extends the adoption of the privacy calculus perspective framework from prior research. Therefore, it is suggested that as the functions of social networking sites increase and the sites continue to develop, users' needs should be understood and kept up to date in order to ensure the sustainable operation of social networking. Keywords: privacy calculus perspective, self-disclosure, privacy management, social benefit, privacy concern
Procedia PDF Downloads 89
9653 On Musical Information Geometry with Applications to Sonified Image Analysis
Authors: Shannon Steinmetz, Ellen Gethner
Abstract:
In this paper, a theoretical foundation is developed for patterned segmentation of audio using the geometry of music and a statistical manifold. We demonstrate image content clustering using conic space sonification. The algorithm takes a geodesic curve as a model estimator of the three-parameter Gamma distribution. The random variable is parameterized by musical centricity and centric velocity. Model parameters predict audio segmentation in the form of duration and frame count based on the likelihood of a musical geometry transition. We provide an example using a database of randomly selected images, resulting in statistically significant clusters of similar image content. Keywords: sonification, musical information geometry, image, content extraction, automated quantification, audio segmentation, pattern recognition
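For reference, a standard density for the three-parameter (shifted) Gamma distribution that the segmentation model estimates; this is the usual textbook parameterization, since the abstract does not state which one the authors use.

```latex
% Shifted (three-parameter) Gamma density with shape $\alpha$, scale $\beta$, location $\mu$:
\[
  f(x;\alpha,\beta,\mu) \;=\;
  \frac{(x-\mu)^{\alpha-1}\, e^{-(x-\mu)/\beta}}{\beta^{\alpha}\,\Gamma(\alpha)},
  \qquad x > \mu,\; \alpha,\beta > 0 .
\]
```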
Procedia PDF Downloads 237
9652 Terraria AI: YOLO Interface for Decision-Making Algorithms
Authors: Emmanuel Barrantes Chaves, Ernesto Rivera Alvarado
Abstract:
This paper presents a method for enabling agents for the game Terraria to evaluate algorithms commonly used in general video game artificial intelligence competitions. A 'You Only Look Once' (YOLO) model in the first layer of the process obtains information from the screen and translates it into the Video Game Description Language (VGDL); the agents take that description as input to make decisions. On this basis, state-of-the-art algorithms were tested and compared, namely Monte Carlo Tree Search and Rolling Horizon Evolutionary; in this case, Rolling Horizon Evolutionary shows better performance. The main advantage of this approach is that a VGDL prepared beforehand is unnecessary: it is built on the fly, which opens the road to using more games as a framework for AI. Keywords: AI, MCTS, RHEA, Terraria, VGDL, YOLOv5
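A rough sketch of the first layer described above: running a YOLOv5 model over a screenshot and flattening the detections into a coarse grid of symbols from which a VGDL-style level description could be built. The grid size, the file name, and the use of the stock pretrained weights are illustrative assumptions; the authors presumably trained YOLOv5 on Terraria sprites, which is not reproduced here.

```python
import torch
from PIL import Image

# Stock YOLOv5-small model from the ultralytics hub; a Terraria-specific
# checkpoint would be loaded the same way with custom weights.
model = torch.hub.load("ultralytics/yolov5", "yolov5s")

def screen_to_grid(image_path: str, cols: int = 32, rows: int = 18):
    """Map each detection's center to a cell in a cols x rows grid of class labels."""
    img = Image.open(image_path)
    w, h = img.size
    det = model(img).xyxy[0]          # one row per detection: [x1, y1, x2, y2, conf, class]
    grid = [["empty"] * cols for _ in range(rows)]
    for x1, y1, x2, y2, conf, cls in det.tolist():
        cx, cy = (x1 + x2) / 2, (y1 + y2) / 2
        r = min(rows - 1, int(cy / h * rows))
        c = min(cols - 1, int(cx / w * cols))
        grid[r][c] = model.names[int(cls)]
    return grid  # a grid like this can be serialized into a VGDL level for MCTS/RHEA agents

grid = screen_to_grid("terraria_screenshot.png")
print(grid[0][:8])
```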
Procedia PDF Downloads 96
9651 The Foundation Binary-Signals Mechanics and Actual-Information Model of Universe
Authors: Elsadig Naseraddeen Ahmed Mohamed
Abstract:
In contrast to the uncertainty and complementarity principles, it will be shown in the present paper that the probability of the simultaneous occupation of any definite values of coordinates by any definite values of momentum and energy at any definite instant of time can be described by a binary definite function. This function is equivalent to the difference between the numbers of occupation and evacuation epochs up to that time, and also equivalent to the number of exchanges between those occupation and evacuation epochs up to that time, modulo two. These binary definite quantities can be defined at every point on the real line of time, so they form a binary signal that represents a complete mechanical description of physical reality. The times of these exchanges represent the boundaries of the occupation and evacuation epochs, from which the binary signals can be calculated, using the fact that the universe's events actually extend along the positive and negative parts of the time real line in one direction of extension as the number of exchanges increases. There thus exists a noninvertible transformation matrix, defined as the matrix product of an invertible rotation matrix and a noninvertible scaling matrix, which changes the direction and the magnitude of the exchange-event vector, respectively. These noninvertible transformations will be called actual transformations, in contrast to information transformations, by which the universe's events transformed by actual transformations can be navigated backward and forward along the time real line; the information transformations are derived as elements of a group that can be associated with their corresponding actual transformations. The actual-information model of the universe is derived by assuming the existence of a time instant zero, before and at which no coordinate is occupied by any definite values of momentum and energy, after which the universe begins its expansion in spacetime. This assumption makes the need for Laplace's demon, who at one moment could measure the positions and momenta of all constituent particles of the universe and then use the laws of classical mechanics to predict the universe's entire future and past, superfluous. We only need to establish analog-to-digital converters that sense the binary signals determining the boundaries of the occupation and evacuation epochs of the definite values of coordinates, relative to their origin, by the definite values of momentum and energy, as the present events of the universe; from these, its past and future events can be predicted approximately and with high precision. Keywords: binary-signal mechanics, actual-information model of the universe, actual-transformation, information-transformation, uncertainty principle, Laplace's demon
Procedia PDF Downloads 175
9650 Research on Internet Attention of Tourism and Marketing Strategy in Northeast Sichuan Economic Zone in China Based on Baidu Index
Authors: Chuanqiao Zheng, Wei Zeng, Haozhen Lin
Abstract:
As of March 2020, the number of Chinese netizens had reached 904 million, and the proportion of Internet users accessing the Internet through mobile phones was as high as 99.3%. Against the background of 'Internet +', tourists have a stronger sense of independence in their choice of tourism destinations and tourism products, and they are more inclined to learn about destinations and other tourists' evaluations of tourist products through the Internet. Search engines, as integrated platforms containing a wealth of information, are highly valuable for analyzing the characteristics of the Internet attention given to various tourism destinations through big data mining and analysis. This article uses the Baidu Index, one of the products of Baidu Search, as the data source. The Baidu Index is based on big data and collects and shares the search results of a large number of Internet users on the Baidu search engine. The big data used in this article include the search index, the demand map, population profiles, etc. The main research methods are: (1) based on the search index, analyzing the Internet attention given to tourism in five cities in Northeast Sichuan at different times, so as to obtain the overall trend and individual characteristics of tourism development in the region; (2) based on the demand map and the population profile, analyzing the demographic characteristics and market positioning of the tourist groups in these cities to understand the characteristics and needs of the target groups; (3) correlating the Internet attention data with the permanent population of each province in China for the corresponding period to construct the Boston matrix of the Internet attention rate for Northeast Sichuan tourism, obtaining the tourism target markets, and then proposing development strategies for the different markets. The study found that: a) the Internet attention given to tourism in the region can be divided into a tourist off-season and a peak season, and the Internet attention given to tourism differs considerably between cities; b) the information tourists look for includes tour guide information, ticket information, traffic information, weather information, and information on competing tourism cities; with regard to the population profile, the main group of potential tourists searching for tourism keywords for the five prefecture-level cities in Northeast Sichuan is young people, with a male-to-female ratio of about 6 to 4, males being predominant; c) through the construction of the Boston matrix, it is concluded that the star market for tourism in the Northeast Sichuan Economic Zone includes Sichuan and Shaanxi; the cash-cow market includes Hainan and Ningxia; the question market includes Jiangsu and Shanghai; and the dog market includes Hubei and Jiangxi. The study concludes with the following planning strategies and recommendations: i) create a diversified business format that integrates culture and tourism; ii) create a brand image of niche tourism; iii) focus on the development of tourism products; iv) innovate composite, three-dimensional marketing channels. Keywords: Baidu Index, big data, internet attention, tourism
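A sketch of the Boston-matrix step described in point (3), assuming per-province search attention and permanent population are available as columns. The two axes used here (total attention as market size, per-capita attention rate as penetration) and the median split are simplifying assumptions, and the figures are toy numbers, so the resulting quadrants will not reproduce the study's classification.

```python
import pandas as pd

# Toy figures: total Baidu search attention and permanent population (millions) per source province.
df = pd.DataFrame(
    {"province": ["Sichuan", "Shaanxi", "Hainan", "Jiangsu"],
     "attention": [120_000, 60_000, 9_000, 70_000],
     "population_m": [83.7, 39.5, 10.1, 84.8]}
)
df["attention_rate"] = df["attention"] / df["population_m"]   # per-capita attention

# Split each axis at its median to form the four Boston-matrix quadrants.
hi_att = df["attention"] >= df["attention"].median()
hi_rate = df["attention_rate"] >= df["attention_rate"].median()
df["quadrant"] = "dog"
df.loc[hi_att & hi_rate, "quadrant"] = "star"
df.loc[~hi_att & hi_rate, "quadrant"] = "cash cow"
df.loc[hi_att & ~hi_rate, "quadrant"] = "question"
print(df[["province", "attention_rate", "quadrant"]])
```

Each quadrant then maps to a marketing strategy, as in the recommendations listed at the end of the abstract.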
Procedia PDF Downloads 123
9649 Application of Molecular Markers for Crop Improvement
Authors: Monisha Isaac
Abstract:
The use of molecular markers for selecting plants with desired traits began long ago. Because of their heritable nature, markers are useful for the identification and characterization of specific genotypes. The study covers the various types of molecular markers used to select for multiple desired characters in plants, their properties, and their advantages for improving crop productivity under adverse climatological conditions, with the aim of providing food security to a fast-growing global population. The study shows that genetic similarities obtained from molecular markers provide more accurate information, and that genetic diversity can be better estimated from the genetic relationships depicted in the dendrogram. The information obtained from marker-assisted characterization is particularly suitable for crops of economic importance such as sugarcane. Keywords: molecular markers, crop productivity, genetic diversity, genotype
Procedia PDF Downloads 517
9648 InfoMiracles in the Qur’an and a Mathematical Proof to the Existence of God
Authors: Mohammad Mahmoud Mandurah
Abstract:
The existence of InfoMiracles in scripture is evidence that the scripture has a divine origin; it is also evidence of the existence of God. An InfoMiracle is an information-based miracle. The basic component of an InfoMiracle is a piece of information that could not have been obtained by a human except through a divine channel. The existence of a sufficient number of convincing InfoMiracles in a scripture necessitates the existence of a divine source for these InfoMiracles. A mathematical equation is developed to prove that the Qur’an has a divine origin and, hence, to prove the existence of God. The equation depends on a single variable only: the number of InfoMiracles in the Qur’an. The Qur’an is rich in InfoMiracles. It is shown that the existence of fewer than 30 InfoMiracles in the Qur’an is sufficient proof of the existence of God and that the Qur’an is a revelation from God. Keywords: InfoMiracle, God, mathematical proof, miracle, probability
Procedia PDF Downloads 218
9647 Partial Differential Equation-Based Modeling of Brain Response to Stimuli
Authors: Razieh Khalafi
Abstract:
The brain is the information-processing centre of the human body. Stimuli, in the form of information, are transferred to the brain, and the brain then decides how to respond to them. In this research, we propose a new partial differential equation which analyses EEG signals and establishes a relationship between incoming stimuli and the brain's response to them. In order to test the proposed model, a set of external stimuli was applied to the model and the model's outputs were checked against real EEG data. The results show that this model can model the EEG signal well. The proposed model is useful not only for modelling the EEG signal in the case of external stimuli but can also be used for modelling the brain response in the case of internal stimuli. Keywords: brain, stimuli, partial differential equation, response, EEG signal
Procedia PDF Downloads 554
9646 A Balance Sheet on the Value of Aid Funding and Delivery: INGO to NGO Pathways in Nigeria
Authors: Glory Okereke
Abstract:
Several studies on the value of aid funding and delivery have emphasized the importance of partnership and accountability in the implementation of development projects between INGOs and NGOs. Despite challenges in accessing detailed information on their impact, owing to the extent of the information these organizations are willing to provide, this pathway has been seen as an alternative approach that is more beneficial than aid funding and delivery through the state. This paper analyzes this relationship using liberal and international relations theories to understand the positive and negative aspects of the INGO-to-NGO pathway as a better alternative for economic development. Looking across a broad spectrum of economic development, the paper focuses on Nigeria and analyzes the existing empirical literature on INGO and local NGO partnerships, alongside a comparative analysis of bilateral aid relations with the Nigerian government. Keywords: NGOs, development, Nigeria, liberal theories, aid
Procedia PDF Downloads 36
9645 Electronic Government around the World: Key Information and Communication Technology Indicators
Authors: Isaac Kofi Mensah
Abstract:
Governments around the world are adopting Information and Communication Technologies (ICTs) because of the important opportunities they provide, through e-government (EG), to modernize government public administration processes and the delivery of quality and efficient public services. Almost every country in the world is adopting ICT in its public-sector administration (EG) to modernize and change the traditional processes of government, increase citizen engagement and participation in governance, and provide timely information to citizens. This paper, therefore, presents the adoption, development, and implementation of EG in regions globally, as well as the ICT indicators around the world that are making EG initiatives successful. Europe leads the world in its EG adoption and development index, followed by the Americas, Asia, Oceania, and Africa. There is gradual growth in ICT indicators in terms of increased Internet access and usage, increased broadband penetration, an increase in individuals using the Internet at home, and a decline in fixed telephone use, while mobile cellular phone use has increased year on year. Although the lack of ICT infrastructure is a major challenge to EG adoption and implementation around the world, in Africa it is especially pervasive, hampering the expansion of Internet access and the provision of broadband, and hence is a barrier to the successful adoption, development, and implementation of EG initiatives in countries on the continent. With the general improvement and increase in ICT indicators around the world, however, countries in Europe, the Americas, Asia, the Arab States, Oceania, and Africa have a huge opportunity to enhance public service delivery through the adoption of EG. Countries within these regions cannot fail their citizens, who desire to enjoy enhanced and efficient public service delivery from government and its many state institutions. Keywords: e-government development index, e-government, indicators, information and communication technologies (ICTs)
Procedia PDF Downloads 302
9644 Binocular Heterogeneity in Saccadic Suppression
Authors: Evgeny Kozubenko, Dmitry Shaposhnikov, Mikhail Petrushan
Abstract:
This work is focused on the study of the binocular characteristics of the phenomenon of perisaccadic suppression in humans when perceiving visual objects. This phenomenon manifests in a decrease in the subject's ability to perceive visual information during saccades, which play an important role in purpose-driven behavior and visual perception. It was shown that the impairment of perception of visual information in the post-saccadic time window is stronger (p < 0.05) in the ipsilateral eye (the eye towards which the saccade occurs). In addition, the observed heterogeneity of post-saccadic suppression in the contralateral and ipsilateral eyes may relate to depth perception. Taking the studied phenomenon into account is important when developing ergonomic control panels in modern operator systems. Keywords: eye movement, natural vision, saccadic suppression, visual perception
Procedia PDF Downloads 156
9643 Remote Sensing and GIS Integration for Paddy Production Estimation in Bali Province, Indonesia
Authors: Sarono, Hamim Zaky Hadibasyir, and Ridho Kurniawan
Abstract:
Estimation of paddy production is one of the areas in agriculture that can be examined using remote sensing and geographic information system (GIS) techniques. The purpose of this research is to estimate paddy production and to show how remote sensing and GIS can be used to perform such an estimation in Tegallalang and Payangan Sub-districts, Bali Province, Indonesia. The method used is the land suitability method. This method associates the physical parameters embodied in the smallest mapping unit, which represents a particular field, with that field's productivity. The production estimate uses the standard FAO land suitability classes with a matching technique. The parameters used to create the land units are slope (FAO), climate classification (Oldeman), landform (Prapto Suharsono), and soil type. The land use map, consisting of paddy and non-paddy field information, was obtained from GeoEye-1 imagery using visual interpretation. Landsat imagery was used for the interpretation of landform, the slope classification was obtained from high-point identification with spline interpolation, and the climate and soil data are secondary data originating from the relevant institutions. The results of this research indicate that the known wetland suitability in Tegallalang and Payangan Districts consists of S1 (very suitable), covering an area of 2884.7 ha with a productivity of 5 tons/ha, and S2 (suitable), covering an area of 482.9 ha with a productivity of 3 tons/ha. The resulting estimate of paddy production in both districts is 31,744.3 tons in one year. Keywords: production estimation, paddy, remote sensing, geography information system, land suitability
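A worked check of the production figure, under the assumption (not stated explicitly in the abstract) that the per-class productivities apply to two paddy cropping seasons per year:

```latex
\[
  \bigl(2884.7\ \text{ha} \times 5\ \text{t/ha} \;+\; 482.9\ \text{ha} \times 3\ \text{t/ha}\bigr)\times 2
  \;=\; (14423.5 + 1448.7)\times 2
  \;=\; 31744.4\ \text{t} \;\approx\; 31{,}744.3\ \text{t per year.}
\]
```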
Procedia PDF Downloads 342
9642 Use of Machine Learning in Data Quality Assessment
Authors: Bruno Pinto Vieira, Marco Antonio Calijorne Soares, Armando Sérgio de Aguiar Filho
Abstract:
Nowadays, a massive amount of information is produced by different data sources, including mobile devices and transactional systems. In this scenario, concerns arise about how to establish and maintain data quality, which is now treated as a product to be defined, measured, analyzed, and improved to meet the needs of the consumers who use these data in decision making and company strategy. Information of low quality can lead to issues that consume time and money, such as missed business opportunities, inadequate decisions, and poor risk management actions. The step of identifying, evaluating, and selecting data sources of adequate quality for a given need has become a costly task for users, since the sources do not provide information about their own quality. Traditional data quality control methods are based on user experience or business rules, which limits performance and slows down the process while yielding less than desirable accuracy. Using advanced machine learning algorithms, it is possible to take advantage of computational resources to overcome these challenges and add value for companies and users. In this study, machine learning is applied to data quality analysis on different datasets, seeking to compare the performance of the techniques across the dimensions of quality assessment. As a result, we were able to create a ranking of the approaches used, as well as a system that is able to carry out data quality assessment automatically. Keywords: machine learning, data quality, quality dimension, quality assessment
Procedia PDF Downloads 148
9641 Space Telemetry Anomaly Detection Based On Statistical PCA Algorithm
Authors: Bassem Nassar, Wessam Hussein, Medhat Mokhtar
Abstract:
The crucial concern of satellite operations is to ensure the health and safety of satellites. The worst case in this respect is probably the loss of a mission, but the more common interruptions of satellite functionality can also compromise mission objectives. All the data acquired from a spacecraft are known as telemetry (TM), which contains a wealth of information related to the health of all its subsystems. Each single item of information is contained in a telemetry parameter, which represents a time-variant property (i.e., a status or a measurement) to be checked. As a consequence, TM monitoring systems are continuously improved in order to reduce the time required to respond to changes in a satellite's state of health. A fast assessment of the current state of the satellite is thus very important in order to respond to occurring failures. Statistical multivariate latent techniques are among the vital learning tools used to tackle this problem coherently. Extracting information from such rich data sources using advanced statistical methodologies is a challenging task due to the massive volume of data. To address this problem, this paper presents a proposed unsupervised learning algorithm based on the Principal Component Analysis (PCA) technique. The algorithm is applied to data from an actual remote sensing spacecraft. Data from the Attitude Determination and Control System (ADCS) were acquired under two operating conditions: normal and faulty states. The models were built and tested under these conditions, and the results show that the algorithm could successfully differentiate between the two operating conditions. Furthermore, the algorithm provides useful information for prediction and adds insight and physical interpretation to ADCS operation. Keywords: space telemetry monitoring, multivariate analysis, PCA algorithm, space operations
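A compact sketch of the kind of PCA-based monitoring the abstract describes: fit PCA on telemetry from normal operation, then flag samples whose reconstruction error (the Q / squared prediction error statistic) exceeds a threshold. The number of components, the percentile threshold, and the synthetic ADCS-like data are illustrative choices, not the paper's settings.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)

# Synthetic telemetry: 500 samples x 8 correlated channels under normal operation.
latent = rng.normal(size=(500, 3))
mixing = rng.normal(size=(3, 8))
X_normal = latent @ mixing + 0.1 * rng.normal(size=(500, 8))

pca = PCA(n_components=3).fit(X_normal)

def spe(X):
    """Squared prediction error (Q statistic): residual after projecting onto the PCA subspace."""
    X_hat = pca.inverse_transform(pca.transform(X))
    return np.sum((X - X_hat) ** 2, axis=1)

threshold = np.percentile(spe(X_normal), 99)           # control limit from normal data

X_faulty = X_normal[:50] + rng.normal(3.0, 1.0, size=(50, 8))   # injected fault
print("flagged normal :", np.mean(spe(X_normal) > threshold))   # about 0.01 by construction
print("flagged faulty :", np.mean(spe(X_faulty) > threshold))   # close to 1.0
```

The projection onto the retained components also indicates which channels drive an anomaly, which is the kind of physical interpretation the abstract mentions.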
Procedia PDF Downloads 415
9640 Exploring the Dark Side of IT Security: Delphi Study on Business’ Influencing Factors
Authors: Tizian Matschak, Ilja Nastjuk, Stephan Kühnel, Simon Trang
Abstract:
We argue that besides well-known primary effects of information security controls (ISCs), namely confidentiality, integrity, and availability, ISCs can also have secondary effects. For example, while IT can add business value through impacts on business processes, ISCs can be a barrier and distort the relationship between IT and organizational value through the impact on business processes. By applying the Delphi method with 28 experts, we derived 27 business process influence dimensions of ISCs. Defining and understanding these mechanisms can change the common understanding of the cost-benefit valuation of IT security investments and support managers' effective and efficient decision-making. Keywords: business process dimensions, dark side of information security, Delphi study, IT security controls
Procedia PDF Downloads 112
9639 The Interleaving Effect of Subject Matter and Perceptual Modality on Students’ Attention and Learning: A Portable EEG Study
Authors: Wen Chen
Abstract:
To investigate the interleaving effect of subject matter (mathematics vs. history) and perceptual modality (visual vs. auditory materials) on students' attention and learning outcomes, the present study collected self-reported data on subjective cognitive load (SCL) and attention level, EEG data, and learning outcomes from micro-lectures. Eighty-one 7th grade students were randomly assigned to four learning conditions: blocked (by subject matter) micro-lectures with auditory textual information (B-A condition), blocked (by subject matter) micro-lectures with visual textual information (B-V condition), interleaved (by subject matter) micro-lectures with auditory textual information (I-A condition), and micro-lectures interleaved by both perceptual modality and subject matter (I-all condition). The results showed that although the interleaved conditions may show advantages in certain indices, the I-all condition showed the best overall outcomes (best performance, low SCL, and high attention). This study suggests that interleaving by both subject matter and perceptual modality should be preferred in scheduling and planning classes. Keywords: cognitive load, interleaving effect, micro-lectures, sustained attention
Procedia PDF Downloads 137