Search results for: information centric network
12434 Modelling the Art Historical Canon: The Use of Dynamic Computer Models in Deconstructing the Canon
Authors: Laura M. F. Bertens
Abstract:
There is a long tradition of visually representing the art historical canon, in schematic overviews and diagrams. This is indicative of the desire for scientific, ‘objective’ knowledge of the kind (seemingly) produced in the natural sciences. These diagrams will, however, always retain an element of subjectivity and the modelling methods colour our perception of the represented information. In recent decades visualisations of art historical data, such as hand-drawn diagrams in textbooks, have been extended to include digital, computational tools. These tools significantly increase modelling strength and functionality. As such, they might be used to deconstruct and amend the very problem caused by traditional visualisations of the canon. In this paper, the use of digital tools for modelling the art historical canon is studied, in order to draw attention to the artificial nature of the static models that art historians are presented with in textbooks and lectures, as well as to explore the potential of digital, dynamic tools in creating new models. To study the way diagrams of the canon mediate the represented information, two modelling methods have been used on two case studies of existing diagrams. The tree diagram Stammbaum der neudeutschen Kunst (1823) by Ferdinand Olivier has been translated to a social network using the program Visone, and the famous flow chart Cubism and Abstract Art (1936) by Alfred Barr has been translated to an ontological model using Protégé Ontology Editor. The implications of the modelling decisions have been analysed in an art historical context. The aim of this project has been twofold. On the one hand the translation process makes explicit the design choices in the original diagrams, which reflect hidden assumptions about the Western canon. Ways of organizing data (for instance ordering art according to artist) have come to feel natural and neutral and implicit biases and the historically uneven distribution of power have resulted in underrepresentation of groups of artists. Over the last decades, scholars from fields such as Feminist Studies, Postcolonial Studies and Gender Studies have considered this problem and tried to remedy it. The translation presented here adds to this deconstruction by defamiliarizing the traditional models and analysing the process of reconstructing new models, step by step, taking into account theoretical critiques of the canon, such as the feminist perspective discussed by Griselda Pollock, amongst others. On the other hand, the project has served as a pilot study for the use of digital modelling tools in creating dynamic visualisations of the canon for education and museum purposes. Dynamic computer models introduce functionalities that allow new ways of ordering and visualising the artworks in the canon. As such, they could form a powerful tool in the training of new art historians, introducing a broader and more diverse view on the traditional canon. Although modelling will always imply a simplification and therefore a distortion of reality, new modelling techniques can help us get a better sense of the limitations of earlier models and can provide new perspectives on already established knowledge.Keywords: canon, ontological modelling, Protege Ontology Editor, social network modelling, Visone
Procedia PDF Downloads 127
12433 Deciphering Orangutan Drawing Behavior Using Artificial Intelligence
Authors: Benjamin Beltzung, Marie Pelé, Julien P. Renoult, Cédric Sueur
Abstract:
To this day, it is not known whether drawing is a specifically human behavior or whether it finds its origins in ancestor species. An interesting window onto this question is the drawing behavior of species genetically close to humans, such as non-human primates. A good candidate for this approach is the orangutan, which shares 97% of our genes and exhibits multiple human-like behaviors. Focusing on figurative aspects may not be suitable for orangutans’ drawings, which may appear as scribbles yet still carry meaning, and manual feature selection would introduce an anthropocentric bias, as features selected by humans may not match those relevant for orangutans. In the present study, we used deep learning to analyze the drawings of a female orangutan named Molly († 2011), who produced 1,299 drawings during her last five years as part of a behavioral enrichment program at the Tama Zoo in Japan. We investigate multiple ways to decipher Molly’s drawings. First, we demonstrate the existence of differences between seasons by training a deep learning model to classify the drawings according to season. Then, to understand and interpret these seasonal differences, we analyze how information spreads within the network from shallow to deep layers, where early layers encode simple local features and deep layers encode more complex and global information. More precisely, we investigate the impact of feature complexity on classification accuracy by extracting features at different depths and feeding them to a Support Vector Machine. Last, we leverage style transfer to dissociate features associated with drawing style from those describing representational content and analyze the relative importance of these two types of features in explaining seasonal variation. Content features were relevant for the classification, showing the presence of meaning in these non-figurative drawings and the ability of deep learning to decipher it. Style features were also relevant, encoding enough information to classify better than chance, and their accuracy was higher for deeper layers, highlighting the variation of style between seasons in Molly’s drawings. Through this study, we demonstrate how deep learning can help find meaning in non-figurative drawings and interpret such differences.
Keywords: cognition, deep learning, drawing behavior, interpretability
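As a hedged illustration of the deep-feature-to-classifier step described above, the sketch below extracts activations from a pretrained convolutional network and trains a season classifier on them; the backbone choice (VGG16), the image size, and the random placeholder data are assumptions for illustration, not details from the paper.

```python
# Minimal sketch: deep features from a pretrained CNN fed to an SVM season classifier.
# The backbone, image size, and synthetic data are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from tensorflow.keras.applications import VGG16
from tensorflow.keras.applications.vgg16 import preprocess_input

backbone = VGG16(weights="imagenet", include_top=False, pooling="avg")  # 512-d global features

def extract_features(images):
    """images: array of shape (n, 224, 224, 3) with pixel values in [0, 255]."""
    return backbone.predict(preprocess_input(images.astype("float32")), verbose=0)

# Placeholder stand-ins for scanned drawings and balanced season labels (0-3).
drawings = np.random.randint(0, 256, size=(40, 224, 224, 3))
seasons = np.tile(np.arange(4), 10)

features = extract_features(drawings)
clf = SVC(kernel="rbf", C=1.0)
print("CV accuracy:", cross_val_score(clf, features, seasons, cv=5).mean())
```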
Procedia PDF Downloads 165
12432 Cognitive Dysfunctioning and the Fronto-Limbic Network in Bipolar Disorder Patients: A Fmri Meta-Analysis
Authors: Rahele Mesbah, Nic Van Der Wee, Manja Koenders, Erik Giltay, Albert Van Hemert, Max De Leeuw
Abstract:
Introduction: Patients with bipolar disorder (BD), characterized by depressive and manic episodes, often suffer from cognitive dysfunction. An up-to-date meta-analysis of functional magnetic resonance imaging (fMRI) studies examining cognitive function in BD is lacking. Objective: The aim of the current fMRI meta-analysis is to investigate brain functioning of bipolar patients compared with healthy controls (HC) within three domains: emotion processing, reward processing, and working memory. Method: Differences in brain region activation were tested in whole-brain analyses using the activation likelihood estimation (ALE) method. Separate analyses were performed for each cognitive domain. Results: A total of 50 fMRI studies were included: 20 studies used an emotion processing task (316 BD and 369 HC), 9 studies a reward processing task (215 BD and 213 HC), and 21 studies a working memory task (503 BD and 445 HC). During emotion processing, BD patients hyperactivated parts of the left amygdala and hippocampus compared to HCs but showed hypoactivation in the inferior frontal gyrus (IFG). Regarding reward processing, BD patients showed hyperactivation in part of the orbitofrontal cortex (OFC). During working memory, BD patients showed increased activity in the prefrontal cortex (PFC) and anterior cingulate cortex (ACC). Conclusions: This meta-analysis revealed evidence for activity disturbances in several brain areas involved in the cognitive functioning of BD patients. Furthermore, most of the identified regions are part of the so-called fronto-limbic network, which is hypothesized to be affected by the expression of BD candidate genes.
Keywords: cognitive functioning, fMRI analysis, bipolar disorder, fronto-limbic network
Procedia PDF Downloads 462
12431 An Effective Noise Resistant Frequency Modulation Continuous-Wave Radar Vital Sign Signal Detection Method
Authors: Lu Yang, Meiyang Song, Xiang Yu, Wenhao Zhou, Chuntao Feng
Abstract:
To address the problem that human vital sign signals extracted by frequency-modulated continuous-wave (FMCW) radar are susceptible to noise interference and suffer from low reconstruction accuracy, a new detection scheme for these signals is proposed. Firstly, an improved complete ensemble empirical mode decomposition with adaptive noise (ICEEMDAN) algorithm is applied to decompose the radar-extracted thoracic signals into several intrinsic mode functions (IMFs) at different scales; the IMF components are then optimized by a BP neural network improved by an immune genetic algorithm (IGA). Simulation results show that this scheme can effectively separate the noise, accurately extract the respiratory and heartbeat signals, and improve the reconstruction accuracy and signal-to-noise ratio of the vital sign signals.
Keywords: frequency modulated continuous wave radar, ICEEMDAN, BP neural network, vital signs signal
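The scheme itself relies on ICEEMDAN and an IGA-tuned BP network, which are not reproduced here; as a much simpler hedged sketch of the underlying idea (separating respiration and heartbeat from a noisy chest-displacement signal), the following uses plain band-pass filtering, with the sampling rate, frequency bands, and synthetic signal all assumed.

```python
# Simplified sketch: separate respiration and heartbeat components from a noisy
# chest-displacement signal with band-pass filters. The sampling rate, frequency
# bands, and synthetic signal are assumptions for illustration only; the paper's
# method uses ICEEMDAN plus an IGA-optimized BP neural network instead.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 20.0                                         # assumed sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)
respiration = 4.0 * np.sin(2 * np.pi * 0.3 * t)   # ~18 breaths/min
heartbeat = 0.5 * np.sin(2 * np.pi * 1.2 * t)     # ~72 beats/min
chest = respiration + heartbeat + 0.8 * np.random.randn(t.size)

def bandpass(x, low, high, fs, order=4):
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

resp_est = bandpass(chest, 0.1, 0.5, fs)   # respiration band (assumed)
heart_est = bandpass(chest, 0.8, 2.0, fs)  # heartbeat band (assumed)
print("respiration correlation:", np.corrcoef(resp_est, respiration)[0, 1].round(3))
print("heartbeat correlation:  ", np.corrcoef(heart_est, heartbeat)[0, 1].round(3))
```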
Procedia PDF Downloads 165
12430 The Effect of Critical Activity on Critical Path and Project Duration in Precedence Diagram Method
Abstract:
The additional relationships between activities in the Precedence Diagram Method (PDM), i.e., start-to-start, finish-to-finish, and start-to-finish, provide a more flexible schedule than the traditional Critical Path Method (CPM). However, changing the duration of critical activities in a PDM network can have an anomalous effect on the critical path and the project completion date. In this study, we classified critical activities into two groups, i.e., (1) activities on a single critical path and (2) activities on multiple critical paths, and into six classes, i.e., normal, reverse, neutral, perverse, decrease-reverse, and increase-normal, based on their effects on project duration in PDM. Furthermore, we determined the maximum float by which the duration of each type of critical activity can be changed without affecting the project duration. This study helps the project manager clearly understand the behavior of each critical activity on the critical path and change the project duration by shortening or lengthening activities based on the project budget and deadline.
Keywords: construction management, critical path method, project scheduling network, precedence diagram method
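As a hedged illustration of the float computation underlying this analysis, the sketch below performs a forward and backward pass over a small network using only finish-to-start links (a simplification of full PDM, which also allows SS, FF, and SF relationships); the activities and durations are invented.

```python
# Minimal CPM-style sketch: forward/backward pass and total float on a toy network.
# Only finish-to-start links are modeled; full PDM also supports SS, FF, and SF
# relationships. Activities and durations are invented for illustration.
durations = {"A": 3, "B": 5, "C": 2, "D": 4, "E": 3}
predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"], "E": ["D"]}

es, ef = {}, {}
for act in ["A", "B", "C", "D", "E"]:              # forward pass in topological order
    es[act] = max((ef[p] for p in predecessors[act]), default=0)
    ef[act] = es[act] + durations[act]

project_duration = max(ef.values())
ls, lf = {}, {}
for act in reversed(["A", "B", "C", "D", "E"]):    # backward pass
    successors = [s for s, preds in predecessors.items() if act in preds]
    lf[act] = min((ls[s] for s in successors), default=project_duration)
    ls[act] = lf[act] - durations[act]

for act in durations:
    total_float = ls[act] - es[act]
    print(act, "critical" if total_float == 0 else f"float={total_float}")
print("project duration:", project_duration)
```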
Procedia PDF Downloads 222
12429 Gender Analysis of the Influence of Sources of Information on the Adoption of Tenera Oil Palm Technology among Smallholder Farmers in Edo State, Nigeria
Authors: Cornelius Michael Ekenta
Abstract:
This research presents a gender-comparative analysis of the influence of sources of information on the adoption of the tenera improved oil palm technology. Purposive, stratified, and random sampling techniques were used to sample a total of 292 farmers (155 males and 137 females) for the study. A structured questionnaire was used to obtain the primary data, which were analyzed with descriptive statistics and logit regression analysis. Findings revealed that radio, the extension office, television, and farmers’ groups were the information sources most preferred by both male and female farmers. Males perceived information from radio (92%) and farmers’ groups (84%) to be available and information from research institutes to be credible (95%). Similarly, females perceived information from research institutes to be reliable (70%). The study showed that 38% of men adopted the variety, 25% of women adopted the variety, and 32% of men and women combined adopted the variety in the study area. Regression analysis indicated that radio, the extension office, television, farmers’ groups, and research institutes were significant at the 0.5% level of probability for both male and female farmers. The study concluded that adoption of the tenera improved oil palm technology was low among both male and female farmers, although men adopted more than women. It was therefore recommended that the Agricultural Development Programme (ADP) in other states of the country partner with their state radio and television stations to broadcast agricultural programmes periodically to ensure efficient dissemination of agricultural information to farmers.
Keywords: analysis, Edo, gender, influence, information, sources, tenera
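As a hedged sketch of the logit-regression step, the example below regresses a binary adoption outcome on access to information sources, fitted separately by gender; the data are synthetic placeholders, not the survey data from the study.

```python
# Hedged sketch of logit regression of adoption on information-source access,
# estimated separately for male and female respondents. Synthetic data only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 292
df = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "radio": rng.integers(0, 2, n),
    "extension": rng.integers(0, 2, n),
    "television": rng.integers(0, 2, n),
    "farmers_group": rng.integers(0, 2, n),
})
logit_score = -1.0 + 1.2 * df["radio"] + 0.8 * df["extension"]   # assumed relation
df["adopted"] = (rng.random(n) < 1 / (1 + np.exp(-logit_score))).astype(int)

for sex, group in df.groupby("female"):
    X = sm.add_constant(group[["radio", "extension", "television", "farmers_group"]])
    result = sm.Logit(group["adopted"], X).fit(disp=0)
    print("female" if sex else "male")
    print(result.params.round(3))
```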
Procedia PDF Downloads 111
12428 Similar Script Character Recognition on Kannada and Telugu
Authors: Gurukiran Veerapur, Nytik Birudavolu, Seetharam U. N., Chandravva Hebbi, R. Praneeth Reddy
Abstract:
This work presents a robust approach for the recognition of characters in Telugu and Kannada, two South Indian scripts with structural similarities between their characters. Exhaustive datasets are required to recognize the characters, but only a few are publicly available. As a result, we decided to create a dataset for one language (the source language), train the model with it, and then test it on the target language. Telugu is the target language in this work, whereas Kannada is the source language. The suggested method makes use of Canny edge features to increase character identification accuracy on images with noise and varying lighting. A dataset of 45,150 images containing printed Kannada characters was created. The Nudi software was used to automatically generate printed Kannada characters with different writing styles and variations, and manual labelling was employed to ensure the accuracy of the character labels. Deep learning models such as a Convolutional Neural Network (CNN) and a Visual Attention Network (VAN) were used in experiments with the dataset. A VAN architecture incorporating additional channels for Canny edge features was adopted, as this approach gave good results. The model’s accuracy on the combined Telugu and Kannada test dataset was an outstanding 97.3%. Performance was better with Canny edge features applied than with a model that used only the original grayscale images. When tested on each language separately, the model’s accuracy was 80.11% for Telugu characters and 98.01% for Kannada words. This model, which makes use of modern machine learning techniques, shows excellent accuracy in identifying and categorizing characters from these scripts.
Keywords: base characters, modifiers, guninthalu, aksharas, vattakshara, VAN
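A hedged sketch of the edge-channel idea follows: a Canny edge map is stacked with the grayscale character image so a network receives both as input channels. The Canny thresholds, image size, class count, and the tiny placeholder network are assumptions, not the paper's VAN architecture.

```python
# Stack a Canny edge map with the grayscale character image as a two-channel input.
# Thresholds, image size, and the toy CNN are illustrative assumptions.
import cv2
import numpy as np
import tensorflow as tf

def to_two_channel(gray_img):
    """gray_img: uint8 array (H, W). Returns float32 array (H, W, 2) in [0, 1]."""
    edges = cv2.Canny(gray_img, 100, 200)          # assumed Canny thresholds
    stacked = np.stack([gray_img, edges], axis=-1)
    return stacked.astype("float32") / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 2)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(49, activation="softmax"),  # 49 character classes (assumed)
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

sample = to_two_channel(np.random.randint(0, 256, (64, 64), dtype=np.uint8))
print(model.predict(sample[np.newaxis], verbose=0).shape)  # (1, 49)
```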
Procedia PDF Downloads 53
12427 Personalization of Context Information Retrieval Model via User Search Behaviours for Ranking Document Relevance
Authors: Kehinde Agbele, Longe Olumide, Daniel Ekong, Dele Seluwa, Akintoye Onamade
Abstract:
One major problem of most existing information retrieval systems (IRS) is that they return the same retrieval results to every user, based solely on the query terms issued to the system. When using an IRS, users often present search queries made of ad hoc keywords, and it is then up to the IRS to obtain a precise representation of the user’s information need and its context. Meanwhile, the volume and range of Internet documents are growing exponentially, making it difficult for a user to obtain information that precisely matches his or her interests. Diverse combination techniques are used to address this, firstly because users often do not present queries that optimally represent the information they want, and secondly because the relevance of a document is highly subjective and varies between users. In this paper, we address the problem by optimizing the IRS for individual information needs, ranked in order of relevance, and by developing algorithms that optimize the ranking of documents retrieved from the IRS. The approach is two-fold and aims at retrieving domain-specific documents. Firstly, the context of information is designed: the context of a query determines the relevance of retrieved information through personalization and context-awareness, so executing the same query in different contexts often leads to different result rankings based on user preferences. Secondly, the relevant context aspects are incorporated in a way that supports the knowledge domain representing the user’s interests. Evolutionary algorithms are used to improve the effectiveness of the IRS. A context-based information retrieval system that learns individual needs from user-provided relevance feedback is developed, and its retrieval effectiveness is evaluated using precision and recall metrics. The results demonstrate how attributes of user interaction behavior can be used to improve IR effectiveness.
Keywords: context, document relevance, information retrieval, personalization, user search behaviors
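As a hedged sketch of relevance-feedback re-ranking with TF-IDF vectors, the example below uses a Rocchio-style query update as a simple stand-in for the paper's evolutionary optimization; the documents and feedback judgments are invented.

```python
# Rocchio-style relevance feedback over TF-IDF vectors (a stand-in for the paper's
# evolutionary approach). Documents, query, and judgments are invented.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "neural networks for image retrieval",
    "personalized search based on user behaviour",
    "cooking recipes for busy students",
    "context aware information retrieval systems",
]
vectorizer = TfidfVectorizer()
doc_vecs = vectorizer.fit_transform(docs)

query_vec = vectorizer.transform(["information retrieval"]).toarray()
relevant, non_relevant = [3], [2]          # indices judged by the user (invented)

# Move the query toward relevant documents and away from non-relevant ones.
alpha, beta, gamma = 1.0, 0.75, 0.15
new_query = (alpha * query_vec
             + beta * doc_vecs[relevant].toarray().mean(axis=0)
             - gamma * doc_vecs[non_relevant].toarray().mean(axis=0))

scores = cosine_similarity(new_query, doc_vecs).ravel()
print("ranking after feedback:", np.argsort(-scores))
```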
Procedia PDF Downloads 463
12426 Re-Defining Academic Literacy: An Information Literacy Approach to Helping Chinese International Students Succeed in American Colleges
Authors: Yi Ding
Abstract:
With the upsurge of Chinese international students in American higher education, the serious academic problems these students suffer from have become striking. While most practices and research in higher education focus on the role of professors, writing centers, and tutoring centers in helping international students succeed in college, this research study focuses on a more fundamental skill that is neglected in most conversations: information literacy, which is usually addressed by academic librarians. Transitioning from an East Asian, developing educational system that values authority and set knowledge more than independent thinking and scholarly conversation, Chinese international students need support from academic librarians to acquire information literacy, which is crucial for understanding the expectations of a Western academic setting and thus for succeeding in college. This research study illustrates how academic librarians can play an integral role in helping Chinese international students acclimate to the expectations of American higher education by teaching information literacy as an academic literacy unique to the Western academic setting. The six keys of information literacy put forward by the Association of College and Research Libraries, namely 'Authority Is Constructed and Contextual', 'Information Creation as a Process', 'Information Has Value', 'Research as Inquiry', 'Scholarship as Conversation', and 'Searching as Strategic Exploration', are analyzed through the lens of the Chinese educational system and students’ backgrounds. Based on this analysis, as well as results from surveys and interviews among academic librarians, professors, and international students, the research further examines current practices from a wide range of academic libraries and finally provides evidence-based recommendations for academic librarians to use information literacy instruction to help Chinese international students succeed in American higher education.
Keywords: academic librarians, Chinese international students, information literacy, student success
Procedia PDF Downloads 244
12425 Artificial Bee Colony Based Modified Energy Efficient Predictive Routing in MANET
Authors: Akhil Dubey, Rajnesh Singh
Abstract:
The field of ad hoc networks is undergoing rapid change, and these changes are driving revolutionary developments in routing. Predictive energy-efficient routing is inspired by the foraging behavior of bees, a form of swarm intelligence, and improves routing efficiency from an energy point of view. Its main aims are minimum energy consumption during communication and maximized remaining battery power at intermediate nodes. The routing uses two types of bees: scout bees for the exploration phase and forager bees for the evolution phase. The routing algorithm computes the energy consumption, the fitness ratio, and the goodness of each path. In this paper, we review the literature related to predictive routing, present a modified routing scheme, and report simulation results comparing this algorithm with artificial bee colony based routing schemes in MANETs, examining path fitness and the probability of fitness.
Keywords: mobile ad hoc network, artificial bee colony, PEEBR, modified predictive routing
Procedia PDF Downloads 416
12424 Developing an Advanced Algorithm Capable of Classifying News, Articles and Other Textual Documents Using Text Mining Techniques
Authors: R. B. Knudsen, O. T. Rasmussen, R. A. Alphinas
Abstract:
The purpose of this research is to develop an algorithm capable of classifying news articles from the automobile industry according to the competitive actions they entail, using Text Mining (TM) methods. This requires testing how to properly preprocess the data by preparing the pipelines that fit each algorithm best. The pipelines are tested together with nine different classification algorithms from the realms of regression, support vector machines, and neural networks. Preliminary testing to identify the optimal pipelines and algorithms resulted in the selection of two algorithms with two different pipelines: Logistic Regression (LR) and an Artificial Neural Network (ANN). These algorithms are optimized further by testing several parameters of each. The best result is achieved with the ANN: the final model yields an accuracy of 0.79, a precision of 0.80, a recall of 0.78, and an F1 score of 0.76. By removing three of the classes that created noise, the final algorithm reaches an accuracy of 94%.
Keywords: artificial neural network, competitive dynamics, logistic regression, text classification, text mining
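A hedged sketch of one candidate pipeline follows: TF-IDF features into logistic regression, evaluated with accuracy, precision, recall, and F1. The toy articles and labels are invented; the paper's dataset and tuned pipelines are not reproduced.

```python
# TF-IDF + logistic regression text-classification pipeline with the usual metrics.
# Articles and labels below are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, precision_recall_fscore_support
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

articles = [
    "automaker cuts prices on electric models",
    "new SUV launched with advanced driver assistance",
    "dealer network expanded into new markets",
    "price war intensifies among compact car makers",
    "brand unveils concept vehicle at auto show",
    "company opens additional factories abroad",
] * 10
labels = ["pricing", "product", "expansion", "pricing", "product", "expansion"] * 10

X_train, X_test, y_train, y_test = train_test_split(articles, labels, test_size=0.3, random_state=42)
pipeline = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=1)),
    ("clf", LogisticRegression(max_iter=1000)),
])
pipeline.fit(X_train, y_train)
pred = pipeline.predict(X_test)
precision, recall, f1, _ = precision_recall_fscore_support(y_test, pred, average="macro")
print(f"accuracy={accuracy_score(y_test, pred):.2f} precision={precision:.2f} recall={recall:.2f} f1={f1:.2f}")
```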
Procedia PDF Downloads 121
12423 AI and the Future of Misinformation: Opportunities and Challenges
Authors: Noor Azwa Azreen Binti Abd. Aziz, Muhamad Zaim Bin Mohd Rozi
Abstract:
Moving towards the Fourth Industrial Revolution, artificial intelligence (AI) is now more popular than ever. The subject is gaining significance every day and is continually expanding, often merging with other fields. Rather than remaining passive observers, we benefit from understanding modern technology by delving into its inner workings. In a world teeming with digital information, however, the impact of AI on the spread of disinformation has garnered significant attention. The dissemination of inaccurate or misleading information, referred to as misinformation, poses a serious threat to democratic society, public debate, and individual decision-making. This article delves into the connection between AI and the dissemination of false information, exploring its potential, risks, and ethical issues as AI technology advances. The rise of AI has ushered in a new era in the dissemination of misinformation, as AI-driven technologies are increasingly responsible for curating, recommending, and amplifying information on online platforms. While AI holds the potential to enhance the detection and mitigation of misinformation through natural language processing and machine learning, it also raises concerns about the amplification and propagation of false information. AI-powered deepfake technology, for instance, can generate hyper-realistic videos and audio recordings, making it increasingly challenging to discern fact from fiction.
Keywords: artificial intelligence, digital information, disinformation, ethical issues, misinformation
Procedia PDF Downloads 92
12422 Saving Energy at a Wastewater Treatment Plant through Electrical and Production Data Analysis
Authors: Adriano Araujo Carvalho, Arturo Alatrista Corrales
Abstract:
This paper shows how the analysis of electrical energy consumption and production data was used to find opportunities to save energy at the Taboada wastewater treatment plant in Callao, Peru. To access the data, independent data networks were used for both electrical and process instruments, and the data were analyzed under an ISO 50001 energy audit that considered Energy Performance Indexes for each process, following the step-by-step guide presented in this text. Thanks to this methodology and to data mining techniques applied to information gathered through electronic multimeters (conveniently placed on substation switchboards connected to a cloud network), it was possible to characterize the performance of each process thoroughly and thus reveal saving opportunities that had previously been hidden. The data analysis brought both cost and energy reductions, allowing the plant to save significant resources and to be certified under ISO 50001.
Keywords: energy and production data analysis, energy management, ISO 50001, wastewater treatment plant energy analysis
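As a hedged sketch of an Energy Performance Index calculation of the kind used in such audits, the example below computes energy per cubic metre of wastewater treated, broken down by process; the column names and sample readings are placeholders, not data from the Taboada plant.

```python
# Energy Performance Index (EnPI) per process: kWh consumed per m3 of wastewater
# treated. Column names and readings are placeholders.
import pandas as pd

readings = pd.DataFrame({
    "process": ["pretreatment", "pretreatment", "pumping", "pumping", "disinfection"],
    "energy_kwh": [1250.0, 1310.0, 4100.0, 3980.0, 760.0],
    "flow_m3": [52000.0, 54000.0, 52000.0, 54000.0, 51000.0],
})

enpi = (readings.groupby("process")[["energy_kwh", "flow_m3"]].sum()
        .assign(enpi_kwh_per_m3=lambda d: d["energy_kwh"] / d["flow_m3"])
        .sort_values("enpi_kwh_per_m3", ascending=False))
print(enpi.round(4))
```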
Procedia PDF Downloads 194
12421 Diagnostic Assessment for Mastery Learning of Engineering Students with a Bayesian Network Model
Authors: Zhidong Zhang, Yingchen Yang
Abstract:
In this study, a diagnostic assessment model for mastery engineering learning was established based on a group of undergraduate students enrolled in an engineering course. A diagnostic assessment model can examine students’ learning processes as well as report achievement results. One unique characteristic is that the model can recognize errors and anything blocking students in their learning processes. Feedback is provided to help students learn how to solve their learning problems with alternative strategies and to help the instructor find alternative pedagogical strategies in the instructional design. Dynamics is a core course shared by several engineering programs, and its problems are very challenging for engineering students to solve; knowledge acquisition and problem-solving skills are therefore crucial for student success. Developing an effective and valid assessment model for student learning is thus of great importance, and diagnostic assessment is such a model, providing effective feedback for both students and the instructor in the mastery of engineering learning.
Keywords: diagnostic assessment, mastery learning, engineering, Bayesian network model, learning processes
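A minimal diagnostic-network sketch follows, assuming the pgmpy library is available: a latent skill node explains performance on two Dynamics items, and evidence on the items updates the estimate of mastery. The node names and probabilities are invented, not taken from the study.

```python
# Toy Bayesian network for diagnostic assessment (pgmpy assumed available).
# Node names and conditional probabilities are invented for illustration.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("skill", "item1"), ("skill", "item2")])
cpd_skill = TabularCPD("skill", 2, [[0.6], [0.4]])               # P(mastered)=0.4 (assumed)
cpd_item1 = TabularCPD("item1", 2, [[0.8, 0.2],                  # P(wrong | skill)
                                    [0.2, 0.8]],                 # P(correct | skill)
                       evidence=["skill"], evidence_card=[2])
cpd_item2 = TabularCPD("item2", 2, [[0.7, 0.1],
                                    [0.3, 0.9]],
                       evidence=["skill"], evidence_card=[2])
model.add_cpds(cpd_skill, cpd_item1, cpd_item2)
assert model.check_model()

# A student answers item1 correctly but item2 incorrectly: infer mastery.
posterior = VariableElimination(model).query(["skill"], evidence={"item1": 1, "item2": 0})
print(posterior)
```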
Procedia PDF Downloads 152
12420 Information Communication Technology (ICT) Using Management in Nursing College under the Praboromarajchanok Institute
Authors: Suphaphon Udomluck, Pannathorn Chachvarat
Abstract:
Information and communication technology (ICT) use management is essential for effective decision making in an organization. The Concerns-Based Adoption Model (CBAM) was employed as the conceptual framework. The purpose of the study was to assess the situation of ICT use management in the colleges of nursing under the Praboromarajchanok Institute. A multi-stage sample of 10 colleges of nursing participated; respondents included directors, vice directors, heads of learning groups, teachers, system administrators, and staff responsible for ICT, for a total of 280 participants. The instrument was a questionnaire with four parts: general information, ICT use management, the Stages of Concern (SoC) questionnaire, and the Levels of Use (LoU) of ICT questionnaire. Reliability was tested, with alpha coefficients of 0.967 for ICT use management, 0.884 for SoC, and 0.945 for LoU. The data were analyzed by frequency, percentage, mean, standard deviation, Pearson product-moment correlation, and multiple regression. The findings were as follows: the overall score of ICT use management was at a high level, and its components were administration, hardware, software, and peopleware. The overall score of SoC of ICT was at a high level, and the overall score of LoU of ICT was at a moderate level. ICT use management had a positive relationship with both SoC of ICT and LoU of ICT (p < .01). Multiple regression revealed that administration, hardware, software, and peopleware could predict SoC of ICT (18.5%) and LoU of ICT (20.8%). The factor that significantly influenced SoC was peopleware, while the factors that significantly influenced LoU of ICT were administration, hardware, and peopleware.
Keywords: information communication technology (ICT), management, the concerns-based adoption model (CBAM), stage of concern (SoC), the levels of use (LoU)
Procedia PDF Downloads 318
12419 The Role of Social Media on Political Behaviour in Malaysia
Authors: Ismail Sualman, Mohd Khairuddin Othman
Abstract:
The general election is the backbone of democracy, permitting people to choose their representatives as they deem fit. Voters’ support preferences differ from one another, particularly in a plural society like Malaysia. The high turnout of young voters during Malaysia’s 14th General Election has been attributed to social media, including Facebook, Twitter, WhatsApp, Instagram, YouTube, Telegram, WeChat, and SMS/MMS. It has been observed that, besides serving as an interaction tool among friends, social media is also an important source of information about issues, politics, and politicians. This paper examines the role of social media in providing political information to young voters before an election and during the election campaign, and how this information is translated into electoral support. A total of 799 young Malay respondents in Selangor were surveyed and interviewed. The study revealed that social media has become a key source of political information among young Malay voters and suggested that it had a significant effect on support during the election. Social media plays an important role in carrying information such as current issues, voting trends, candidate imagery, and other matters that may influence young voters’ views, and the information obtained from social media is translated into voting decisions.
Keywords: social media, political behaviour, voters’ choice, election
Procedia PDF Downloads 146
12418 Climate Change and Tourism: A Scientometric Analysis Using Citespace
Authors: Yan Fang, Jie Yin, Bihu Wu
Abstract:
The interaction between climate change and tourism is one of the most promising research areas of recent decades. In this paper, a scientometric analysis of 976 academic publications between 1990 and 2015 related to climate change and tourism is presented in order to characterize the intellectual landscape by identifying and visualizing the evolution of the collaboration network, the co-citation network, and emerging trends of citation burst and keyword co-occurrence. The results show that the number of publications in this field has increased rapidly and it has become an interdisciplinary and multidisciplinary topic. The research areas are dominated by Australia, USA, Canada, New Zealand, and European countries, which have the most productive authors and institutions. The hot topics of climate change and tourism research in recent years are further identified, including the consequences of climate change for tourism, necessary adaptations, the vulnerability of the tourism industry, tourist behaviour and demand in response to climate change, and emission reductions in the tourism sector. The work includes an in-depth analysis of a major forum of climate change and tourism to help readers to better understand global trends in this field in the past 25 years.Keywords: climate change, tourism, scientometrics, CiteSpace
Procedia PDF Downloads 415
12417 TimeTune: Personalized Study Plans Generation with Google Calendar Integration
Authors: Chevon Fernando, Banuka Athuraliya
Abstract:
The purpose of this research is to provide a solution to students’ time management, which often becomes an issue because students must balance their studies with personal commitments. "TimeTune," an AI-based study planner that makes study timeframes adjustable by combining modern machine learning algorithms with calendar applications, is presented as such a solution. The research focuses on the development of LSTM models that connect to the Google Calendar API to generate learning paths fitted to an individual student’s daily life and study history. A key finding of this research is the successful construction of an LSTM model that predicts optimal study times and, integrated with real-time Google Calendar data, generates timetables automatically in a personalized and customized manner. The methodology encompasses Agile development practices and Object-Oriented Analysis and Design (OOAD) principles, focusing on user-centric design and iterative development. By adopting this method, students can significantly reduce the tension associated with poor study habits and time management. In conclusion, "TimeTune" represents an advanced step in personalized education technology: its combination of machine learning algorithms and calendar integration promises to reduce the stress of maintaining a balanced academic and personal life and to help students manage their studies.
Keywords: personalized learning, study planner, time management, calendar integration
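As a hedged sketch of the prediction step, the example below trains an LSTM over a week of hourly slot features to score how suitable a time slot is for studying; the feature set, window length, and random training data are invented, and the actual Google Calendar API integration is omitted.

```python
# LSTM scoring of candidate study slots from a week of hourly features
# (e.g., free/busy, hour of day, past study success). Features and data are
# placeholders; Google Calendar integration is not shown.
import numpy as np
import tensorflow as tf

window, n_features = 24 * 7, 3          # one week of hourly slots, 3 features per slot
X = np.random.rand(200, window, n_features).astype("float32")   # placeholder histories
y = np.random.randint(0, 2, size=200).astype("float32")         # 1 = good study slot

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, n_features)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

next_slot_score = model.predict(X[:1], verbose=0)[0, 0]
print(f"suitability of the next candidate slot: {next_slot_score:.2f}")
```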
Procedia PDF Downloads 49
12416 Structural Balance and Creative Tensions in New Product Development Teams
Authors: Shankaran Sitarama
Abstract:
New product development (NPD) involves team members coming together and working in teams to devise innovative solutions to problems, resulting in new products. A core attribute of a successful NPD team is therefore its creativity and innovation: the team needs to be creative as a group, generating a breadth of ideas and innovative solutions that address the problem being targeted and meet users’ needs. It also needs to be very efficient in its teamwork as it works through the various stages of developing these ideas, resulting in a proof-of-concept (POC) implementation or a prototype of the product. Teams thus need two distinctive traits: ideational creativity and effective, efficient teamworking. Each of these traits causes several types of tension within teams, and these tensions are reflected in the team dynamics. Ideational conflicts arising from debates and deliberations increase the collective knowledge and affect team creativity positively. However, the same trait of challenging each other’s viewpoints can make team members disruptive, resulting in interpersonal tensions, which in turn lead to less efficient teamwork. Teams that foster and effectively manage these creative tensions are successful, while teams that cannot manage them show poor performance. This paper explores these tensions as they manifest in the team communication social network and proposes a Creative Tension Balance index, along the lines of the degree of balance in social networks, that has the potential to distinguish successful from unsuccessful NPD teams. Team communication reflects the team dynamics among members and is the dataset for analysis. Emails between the members of the NPD teams are processed through a semantic analysis algorithm (LSA) to analyze the content of communication, and a semantic similarity analysis yields a social network graph that depicts the communication among team members based on its content. This social network is subjected to traditional social network analysis methods to obtain established metrics as well as structural balance metrics. Traditional structural balance is extended with team interaction pattern metrics to arrive at a Creative Tension Balance (CTB) metric that effectively captures the creative tensions and the tension balance in teams. This CTB metric captures the signatures of successful and unsuccessful (dissonant) NPD teams. The dataset for this research study includes 23 NPD teams spread over multiple semesters; the CTB metric is computed for each team and used to identify the most successful and unsuccessful teams by classifying them into low-, medium-, and high-performing groups. The results are correlated with the team reflections (for team dynamics and interaction patterns), the team self-evaluation feedback surveys (for teamwork metrics), and team performance as measured by a comprehensive team grade (for high- and low-performing team signatures).
Keywords: team dynamics, social network analysis, new product development teamwork, structural balance, NPD teams
Procedia PDF Downloads 79
12415 Efficient GIS Based Public Health System for Disease Prevention
Authors: K. M. G. T. R. Waidyarathna, S. M. Vidanagamachchi
Abstract:
The public health system that exists in Sri Lanka has a satisfactorily complete information flow when compared to other systems in developing countries. The availability of a good health information system has contributed immensely to achieving health indices in line with those of developed countries like the US and the UK. At the moment, however, the health information flow is completely paper based. In Sri Lanka, fields like banking, accounting, and engineering have incorporated information and communication technology to the same extent as in any other country. The field of medicine has lagged behind those fields throughout the world, mainly because of its complexity, issues such as privacy and confidentiality, and a lack of people with knowledge of both information technology (IT) and medicine. Sri Lanka’s situation is much worse, and the gap is rapidly increasing while other countries launch large IT initiatives through public-private partnerships. The major goal of the proposed framework is to help minimize the spread of diseases. To achieve that, a web-based framework with web mapping should be implemented for this application domain. The aim of this GIS-based public health system is a secure, flexible, easy-to-maintain environment for creating and maintaining public health records that is easy for the relevant parties to interact with.
Keywords: DHIS2, GIS, public health, Sri Lanka
Procedia PDF Downloads 564
12414 Expert System: Debugging Using MD5 Process Firewall
Authors: C. U. Om Kumar, S. Kishore, A. Geetha
Abstract:
An operating system (OS) is software that manages computer hardware and software resources by providing services to computer programs. One important user expectation of the operating system is that it defends information from unauthorized access, disclosure, modification, inspection, recording, or destruction. Operating systems are always vulnerable to attacks by malware such as computer viruses, worms, Trojan horses, backdoors, ransomware, spyware, adware, scareware, and more. Anti-virus software was therefore created to ensure security against prominent computer viruses by applying a dictionary-based approach, but anti-virus programs cannot guarantee security against the new viruses proliferating every day. To address this issue and secure the computer system, our proposed expert system concentrates on authorizing processes for execution as wanted or unwanted by the administrator. The expert system maintains a database of hash codes of the processes that are to be allowed. These hash codes are generated using the MD5 message-digest algorithm, a widely used cryptographic hash function. The administrator approves the wanted processes to be executed on clients in a local area network through a client-server architecture, and only processes that match the entries in the database table are executed, which prevents many malicious processes from infecting the operating system. An additional advantage of the proposed expert system is that it limits CPU usage and minimizes resource utilization. Thus, data and information security is ensured by our system along with improved operating system performance.
Keywords: virus, worm, Trojan horse, backdoors, ransomware, spyware, adware, scareware, sticky software, process table, MD5, CPU usage and resource utilization
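A hedged sketch of the whitelist check follows: hash an executable with MD5 and allow it only if the digest appears in an administrator-approved table. The file path and the approved digests are placeholders; MD5 is used here only because the paper names it, although it is no longer considered collision-resistant.

```python
# Hash an executable and check it against an administrator-approved MD5 whitelist.
# Paths and approved digests are placeholders.
import hashlib
from pathlib import Path

def md5_of(path: Path) -> str:
    digest = hashlib.md5()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Administrator-approved process hashes (placeholder values).
approved = {
    "5d41402abc4b2a76b9719d911017c592",
    "098f6bcd4621d373cade4e832627b4f6",
}

def is_allowed(executable: str) -> bool:
    return md5_of(Path(executable)) in approved

# Hypothetical client-side check before launching a process.
candidate = "/usr/local/bin/some_tool"      # hypothetical path
if Path(candidate).exists():
    print(candidate, "allowed" if is_allowed(candidate) else "blocked")
```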
Procedia PDF Downloads 427
12413 Understanding Social Networks in Community's Coping Capacity with Floods: A Case Study of a Community in Cambodia
Authors: Ourn Vimoil, Kallaya Suntornvongsagul
Abstract:
Cambodia is considered one of the most disaster-prone countries in South East Asia, and most of its natural disasters are related to floods. As a developing country, Cambodia faces significant impacts from floods, including environmental, social, and economic losses. Using data from focus group discussions and field surveys with villagers in Ba Baong commune, Prey Veng province, Cambodia, this research examines the roles of social networks in raising a community’s capacity to cope with floods. The findings indicate that social capital plays a crucial role in the three stages of a flood, namely preparedness, response, and recovery. People shared information and resources and extended assistance to one another in order to adapt to the floods. The study encourages policy makers and national and international agencies working on this issue to pay attention to social networks as one of the factors that accelerate flood coping capacity at the community level.
Keywords: social network, community, coping capacity, flood, Cambodia
Procedia PDF Downloads 365
12412 Studying Relationship between Local Geometry of Decision Boundary with Network Complexity for Robustness Analysis with Adversarial Perturbations
Authors: Tushar K. Routh
Abstract:
Inputs engineered in certain ways can degrade deep neural network (DNN) performance by inducing misclassifications, a phenomenon well known as adversarial attacks, which exposes the networks’ vulnerability. Recent studies have explored the relationship between the vulnerability of such networks and their complexity. In this paper, the distinctive influence of additional convolutional layers on the decision boundaries of several DNN architectures was investigated. To engineer inputs from widely known image datasets such as MNIST, Fashion-MNIST, and CIFAR-10, we applied the One Step Spectral Attack (OSSA) and Fast Gradient Method (FGM) techniques, and the consequences of adding layers for the robustness of the architectures were analyzed. For reasoning, the separation width from linear class partitions and the local geometry (curvature) near the decision boundary were examined. The results reveal that model complexity plays a significant role in adjusting the relative distances from margins as well as the local features of decision boundaries, which in turn affect robustness.
Keywords: DNN robustness, decision boundary, local curvature, network complexity
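As a hedged sketch of a fast-gradient-style attack of the kind referenced above, the example below perturbs an input in the direction of the sign of the loss gradient on a tiny placeholder classifier; the model, epsilon, and random data are assumptions, and the OSSA technique is not shown.

```python
# FGSM-style perturbation on a toy classifier: step the input along the sign of the
# loss gradient. Model, epsilon, and data are placeholders; OSSA is not shown.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

x = tf.convert_to_tensor(np.random.rand(1, 28, 28, 1).astype("float32"))
y = tf.constant([3])                     # placeholder true label
epsilon = 0.1                            # assumed perturbation budget

with tf.GradientTape() as tape:
    tape.watch(x)
    loss = loss_fn(y, model(x))
gradient = tape.gradient(loss, x)
x_adv = tf.clip_by_value(x + epsilon * tf.sign(gradient), 0.0, 1.0)

print("prediction on clean input:    ", int(tf.argmax(model(x), axis=1)[0]))
print("prediction on perturbed input:", int(tf.argmax(model(x_adv), axis=1)[0]))
```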
Procedia PDF Downloads 75
12411 Optimization of the Dam Management to Satisfy the Irrigation Demand: A Case Study in Algeria
Authors: Merouane Boudjerda, Bénina Touaibia, Mustapha K Mihoubi
Abstract:
In Algeria, water resources play a crucial role in economic development, but over the last decades they have become relatively limited and have gradually decreased, to the detriment of agriculture. Agricultural irrigation is the primary water-consuming sector, followed by the domestic and industrial sectors. The research presented in this paper focuses on the optimization of irrigation water demand. A Dynamic Programming-Neural Network (DPNN) method is applied to investigate reservoir optimization, and the optimal operation rule is formulated to minimize the gap between water release and irrigation water demand. As a case study, the reservoir system of the Boukerdane dam in northern Algeria was selected to examine the proposed optimization model. Applying the DPNN method increased the satisfaction rate (SR) from 34% to 60%. In addition, the generated operation rule showed more reliable and resilient operation for the examined case study.
Keywords: water management, agricultural demand, Boukerdane dam, dynamic programming, artificial neural network
Procedia PDF Downloads 131
12410 Prediction of Compressive Strength Using Artificial Neural Network
Authors: Vijay Pal Singh, Yogesh Chandra Kotiyal
Abstract:
Structures are a combination of various load-carrying members that safely transfer the loads from the superstructure to the foundation. At the design stage, the loading of the structure is defined and appropriate material choices are made based on material properties, mainly strength. The strength of materials keeps reducing with time because of many factors, such as environmental exposure and deformation caused by unpredictable external loads. Hence, various techniques are used to predict the strength of materials in structures. Among these, non-destructive techniques (NDT) can predict the strength without damaging the structure. In the present study, the compressive strength of concrete has been predicted using an Artificial Neural Network (ANN). The predicted strength was compared with the experimentally obtained compressive strength of concrete, and equations were developed for different models; a good correlation was obtained between the strength predicted by these models and the experimental values. Further, a correlation for strength prediction was developed from two NDT techniques using regression analysis. It was found that the percentage error between predicted and actual strength is reduced by using the combined techniques rather than single techniques.
Keywords: rebound, ultra-sonic pulse, penetration, ANN, NDT, regression
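A hedged sketch of strength prediction from two NDT measurements (rebound number and ultrasonic pulse velocity) with a small neural network follows; the synthetic relationship and noise level are assumptions, not the paper's calibration data.

```python
# Predict concrete compressive strength from two NDT readings with a small ANN.
# The underlying relation and noise are assumed for illustration.
import numpy as np
from sklearn.metrics import mean_absolute_percentage_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 300
rebound = rng.uniform(20, 50, n)                 # rebound hammer number
upv = rng.uniform(3.0, 5.0, n)                   # ultrasonic pulse velocity, km/s
strength = 1.2 * rebound + 8.0 * upv - 20 + rng.normal(0, 2, n)   # assumed relation, MPa

X = np.column_stack([rebound, upv])
X_train, X_test, y_train, y_test = train_test_split(X, strength, random_state=0)

ann = make_pipeline(StandardScaler(), MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0))
ann.fit(X_train, y_train)
error = mean_absolute_percentage_error(y_test, ann.predict(X_test)) * 100
print(f"mean absolute percentage error: {error:.1f}%")
```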
Procedia PDF Downloads 428
12409 An Optimal Steganalysis Based Approach for Embedding Information in Image Cover Media with Security
Authors: Ahlem Fatnassi, Hamza Gharsellaoui, Sadok Bouamama
Abstract:
This paper studies the fields of steganography and steganalysis. Steganography involves hiding information in a cover medium to obtain the stego medium, in such a way that unintended recipients do not perceive the cover medium as carrying any embedded message. Steganalysis is the mechanism of detecting the presence of hidden information in the stego medium, and it can lead to the prevention of disastrous security incidents. In this paper, we provide a critical review of the steganalysis algorithms available for analyzing the characteristics of an image stego medium against the corresponding cover medium, and for understanding the process of embedding the information and detecting it. We anticipate that this paper can also give a clear picture of current trends in steganography so that appropriate steganalysis algorithms can be developed and improved.
Keywords: optimization, heuristics and metaheuristics algorithms, embedded systems, low-power consumption, steganalysis heuristic approach
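As a hedged sketch of the embedding step that steganalysis targets, the example below writes a bit string into the least significant bits of a grayscale cover image and reads it back; the cover image and message are placeholders, and real steganography adds keying and coding on top of this.

```python
# Minimal LSB embedding/extraction on a grayscale cover image. Cover and message
# are placeholders; this is the baseline technique that steganalysis tries to detect.
import numpy as np

def embed_lsb(cover: np.ndarray, bits: str) -> np.ndarray:
    flat = cover.flatten().copy()
    if len(bits) > flat.size:
        raise ValueError("message too long for this cover image")
    for i, bit in enumerate(bits):
        flat[i] = (flat[i] & 0xFE) | int(bit)       # replace the least significant bit
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bits: int) -> str:
    return "".join(str(pixel & 1) for pixel in stego.flatten()[:n_bits])

cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)   # placeholder cover
message = "".join(format(byte, "08b") for byte in b"hi")
stego = embed_lsb(cover, message)

assert extract_lsb(stego, len(message)) == message
print("pixels changed:", int(np.count_nonzero(cover != stego)))
```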
Procedia PDF Downloads 292
12408 In situ Polymerization and Properties of Biobased Polyurethane/Epoxy Interpenetrating Network Nanocomposites
Authors: Aiswarea Mathew, Smita Mohanty, Jr., S. K. Nayak
Abstract:
Polyurethane networks based on castor oil (CO), a renewable-resource polyol, were synthesized. Polyurethane/epoxy resin interpenetrating network nanocomposites containing modified montmorillonite organoclay (C30B-PU/EP nanocomposites) were prepared by an in situ intercalation method. Spectroscopic characterization of the synthesized samples using FT-IR confirmed the existence of the proposed castor oil based PU structure and also showed strong interactions between C30B and the EP/PU matrix. The degree of dispersion of C30B in the EP/PU matrix was characterized by X-ray diffraction (XRD). Scanning electron microscopy analysis showed that the interpenetration of PU and EP increases the degree of exfoliation of C30B and improves the compatibility and phase structure of the polyurethane/epoxy resin interpenetrating polymer networks (PU/EP IPNs). Thermal stability improves compared to the polyurethane alone when the PU/EP IPN is formed, and mechanical properties including Young’s modulus and tensile strength show marked improvement with the addition of C30B.
Keywords: castor oil, epoxy, montmorillonite, polyurethane
Procedia PDF Downloads 400
12407 Reversible Information Hitting in Encrypted JPEG Bitstream by LSB Based on Inherent Algorithm
Authors: Vaibhav Barve
Abstract:
Reversible information hiding has drawn a lot of interest lately. Because it is reversible, the original digital data can be restored completely. It is a scheme in which secret data is stored in digital media such as images, video, or audio to prevent unauthorized access and for security purposes. In general, a JPEG bitstream is used to store this secret data: the JPEG bitstream is first encrypted into a well-organized structure, and the secret information is then embedded into this encrypted region by slightly modifying the bitstream. Pixels suitable for embedding are identified, and the secret details are embedded accordingly. In our proposed framework, the RC4 algorithm is used to encrypt the JPEG bitstream; the encryption key is supplied by the system user and is also used at the time of decryption. We implement enhanced least-significant-bit (LSB) replacement steganography using a genetic algorithm. The number of bits to be embedded in a given coefficient is adaptive, and with proper parameters, high capacity can be achieved while ensuring high security. A logistic map is used for shuffling the bits, and a genetic algorithm (GA) is used to find the right parameters for the logistic map. A data-embedding key is used at the time of embedding. Using the exact image-encryption and data-embedding keys, the receiver can easily extract the embedded secure data and completely recover the original image as well as the original secret information. When the embedding key is absent, the original image can still be recovered approximately, with sufficient quality, without access to the embedding key.
Keywords: data embedding, decryption, encryption, reversible data hiding, steganography
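A hedged sketch of the logistic-map shuffling idea follows: a chaotic sequence generated from a key (r, x0) permutes the message bits before embedding, and the same key inverts the permutation on extraction. The parameters are illustrative; the GA that tunes them and the RC4/JPEG stages of the scheme are not shown.

```python
# Logistic-map-driven bit shuffling for embedding, and its inversion on extraction.
# (r, x0) act as the shuffling key; parameters are illustrative only.
import numpy as np

def logistic_permutation(n_bits: int, r: float = 3.99, x0: float = 0.3141) -> np.ndarray:
    x, values = x0, []
    for _ in range(n_bits):
        x = r * x * (1.0 - x)          # logistic map iteration
        values.append(x)
    return np.argsort(values)          # permutation driven by the chaotic sequence

message_bits = np.array([int(b) for b in "0110100001101001"])   # "hi" in binary
perm = logistic_permutation(message_bits.size)

shuffled = message_bits[perm]          # bit order as it would be embedded
recovered = np.empty_like(shuffled)
recovered[perm] = shuffled             # a receiver with the same (r, x0) inverts the shuffle

assert np.array_equal(recovered, message_bits)
print("shuffled bits:", "".join(map(str, shuffled)))
```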
Procedia PDF Downloads 288
12406 The Role of Strategic Alliances, Innovation Capability, Cost Reduction in Enhancing Customer Loyalty and Firm’s Competitive Advantage
Authors: Soebowo Musa
Abstract:
Mining industries, particularly coal mining, are known to be very volatile because of their sensitivity to changes in the environment. Heavy equipment distributors and coal mining contractors are among those most heavily affected by such volatility and face growing uncertainty about the sustainability of the coal mining industry. Strategic alliances and organizational capabilities such as innovation capability have long been seen as ways to stay competitive, with the focus mostly on partner-to-partner alliances in serving customers. Given today’s rapidly changing environment, shifting consumer behaviors, and the human-centric business approach, this study examines the strategic alliance partner-to-customer relationship through both industrial organization and resource-based theories. The study was conducted with 250 respondents from partner-to-customer strategic alliances between heavy equipment distributors and coal mining contractors in Indonesia. It finds that strategic alliances have the strongest association with cost reduction, a proxy for operational efficiency, followed by their association with innovation capability. Further, strategic alliances and innovation capability have a positive relationship with customer loyalty, while innovation capability and customer loyalty have no significant relationship with the firm’s competitive advantage. The study also indicates that cost reduction is not a condition for developing customer loyalty in the strategic alliance partner-to-customer relationship. It confirms that strategic alliances are a strategy that creates a firm’s operational efficiency, an innovation capability that develops customer loyalty, and competitive advantage.
Keywords: strategic alliance, innovation capability, cost reduction, customer loyalty, competitive advantage
Procedia PDF Downloads 119
12405 Building Transparent Supply Chains through Digital Tracing
Authors: Penina Orenstein
Abstract:
In today’s world, particularly with COVID-19 a constant worldwide threat, organizations need greater visibility over their supply chains more than ever before, in order to find areas for improvement and greater efficiency, reduce the chances of disruption and stay competitive. The concept of supply chain mapping is one where every process and route is mapped in detail between each vendor and supplier. The simplest method of mapping involves sourcing publicly available data including news and financial information concerning relationships between suppliers. An additional layer of information would be disclosed by large, direct suppliers about their production and logistics sites. While this method has the advantage of not requiring any input from suppliers, it also doesn’t allow for much transparency beyond the first supplier tier and may generate irrelevant data—noise—that must be filtered out to find the actionable data. The primary goal of this research is to build data maps of supply chains by focusing on a layered approach. Using these maps, the secondary goal is to address the question as to whether the supply chain is re-engineered to make improvements, for example, to lower the carbon footprint. Using a drill-down approach, the end result is a comprehensive map detailing the linkages between tier-one, tier-two, and tier-three suppliers super-imposed on a geographical map. The driving force behind this idea is to be able to trace individual parts to the exact site where they’re manufactured. In this way, companies can ensure sustainability practices from the production of raw materials through the finished goods. The approach allows companies to identify and anticipate vulnerabilities in their supply chain. It unlocks predictive analytics capabilities and enables them to act proactively. The research is particularly compelling because it unites network science theory with empirical data and presents the results in a visual, intuitive manner.Keywords: data mining, supply chain, empirical research, data mapping
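As a hedged sketch of the multi-tier mapping described above, the example below builds a directed graph of tier-1/2/3 supplier relationships and traces every upstream site behind an end product; the company and supplier names are invented placeholders.

```python
# Multi-tier supply chain map as a directed graph; trace all upstream suppliers.
# Names and locations are invented placeholders.
import networkx as nx

edges = [
    ("OEM", "Tier1-AssemblyCo"),
    ("Tier1-AssemblyCo", "Tier2-CastingsLtd"),
    ("Tier1-AssemblyCo", "Tier2-ElectronicsInc"),
    ("Tier2-CastingsLtd", "Tier3-OreTrader"),
    ("Tier2-ElectronicsInc", "Tier3-ChipFoundry"),
]
chain = nx.DiGraph(edges)
# Optional attributes, e.g. production site locations for a geographic overlay.
nx.set_node_attributes(chain, {"Tier3-ChipFoundry": "Taiwan", "Tier3-OreTrader": "Brazil"}, "site")

# All upstream suppliers reachable from the OEM (tier 1 through tier 3).
upstream = nx.descendants(chain, "OEM")
print("mapped suppliers:", sorted(upstream))
print("tier (distance from OEM):", nx.shortest_path_length(chain, "OEM"))
```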
Procedia PDF Downloads 175