Search results for: digital tools
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6381

5391 An Integrated Architecture of E-Learning System to Digitize the Learning Method

Authors: M. Touhidul Islam Sarker, Mohammod Abul Kashem

Abstract:

The purpose of this paper is to improve the e-learning system and digitize the learning method in the educational sector. The learner logs into the e-learning platform, easily accesses the digital content, downloads it, and takes assessments for evaluation. Learners can reach these digital resources using a tablet, computer, or smartphone. An e-learning system can be defined as teaching and learning supported by multimedia technologies and the internet, through access to digital content. E-learning is replacing the traditional education system with information and communication technology-based learning. This paper designs and implements an integrated e-learning system architecture combined with a University Management System. Moodle (Modular Object-Oriented Dynamic Learning Environment) is a leading e-learning system, but it includes no school or university management system. In this research, we did not consider school students because they are often without internet facilities; we focused on university students, who have internet access and use these technologies. The University Management System covers activities such as student registration, account management, teacher information, semester registration, and staff information. Integrating these activities or modules with Moodle overcomes this limitation of Moodle and enhances the e-learning system architecture, making effective use of technology. This architecture lets the learner easily access the resources of the e-learning platform anytime and anywhere, digitizing the learning method.

Keywords: database, e-learning, LMS, Moodle

Procedia PDF Downloads 188
5390 Assessing Language Dominance in Mexican Deaf Signers with the Bilingual Language Profile (BLP)

Authors: E. Mendoza, D. Jackson-Maldonado, G. Avecilla-Ramírez, A. Mondaca

Abstract:

Assessing language proficiency is a major issue in psycholinguistic research. There are multiple tools that measure language dominance and proficiency in hearing bilinguals; however, this is not the case for Deaf bilinguals. Specifically, there are few, if any, assessment tools suited to describing the multilingual abilities of Mexican Deaf signers. Because of this, the linguistic characteristics of the Mexican Deaf population have been poorly described. This paper explains the changes necessary to adapt the Bilingual Language Profile (BLP) to Mexican Sign Language (LSM) and written/oral Spanish. The BLP is a self-evaluation tool that has been adapted and translated into several oral languages, but not into sign languages. Lexical, syntactic, cultural, and structural changes were applied to the BLP. Thirty-five Mexican Deaf signers participated in a pilot study; all were enrolled in higher education programs. The BLP was presented online in written Spanish via Google Forms. No additional information in LSM was provided. Results show great heterogeneity, as is expected of Deaf populations, and the BLP seems to be a useful tool for creating a bilingual profile of the Mexican Deaf population. This is a first attempt to adapt a widely tested tool in bilingualism research to a sign language. Further modifications need to be done.

Keywords: deaf bilinguals, assessment tools, bilingual language profile, Mexican sign language

Procedia PDF Downloads 153
5389 How Can Food Retailing Benefit from Neuromarketing Research: The Influence of Traditional and Innovative Tools of In-Store Communication on Consumer Reactions

Authors: Jakub Berčík, Elena Horská, Ľudmila Nagyová

Abstract:

Nowadays, the point of sale remains one of the few communication channels that is not yet oversaturated and has great potential for the future. The fact that purchasing decisions are significantly affected by emotions, and that up to 75% of them are made at the point of sale, only demonstrates its importance. The share of impulsive purchases is about 60-75%, depending on the product category. Nevertheless, habits above all predetermine the content of the shopping cart, and hence the role of in-store communication is to disrupt the routine and prompt the customer to try something new. This is why it is essential to learn to work with this relatively young branch of marketing communication as efficiently as possible. A new global trend in this discipline is evaluating the effectiveness of particular tools of in-store communication. To increase efficiency, it is necessary to become familiar with the factors affecting the customer both consciously and unconsciously, a task for neuromarketing and sensory marketing. It is well known that customers remember negative experiences much longer and more intensely than positive ones; therefore, it is essential for marketers to avoid such experiences. The final effect of POP (Point of Purchase) or POS (Point of Sale) tools depends not only on their quality and design but also on their location at the point of sale, which contributes to the overall positive atmosphere in the store. In-store advertising is therefore increasingly in the center of attention, and companies are willing to spend up to a third of their marketing communication budget on it. The paper presents a comprehensive, interdisciplinary study of the impact of traditional as well as innovative tools of in-store communication on the attention and emotional state (valence and arousal) of consumers in the food market.
The research integrates measurements with an eye camera (eye tracker) and an electroencephalograph (EEG) both in real grocery stores and under laboratory conditions, with the purpose of recording attention and emotional responses of respondents exposed to selected tools of in-store communication. The research covers traditional (e.g., wobblers, stoppers, floor graphics) and innovative (e.g., displays, wobblers with LED elements, interactive floor graphics) tools of in-store communication in the fresh unpackaged food segment. Using a mobile 16-channel electroencephalograph (the EPOC EEG headset), a mobile eye camera (eye tracker) from Tobii, and a stationary eye camera from Gazepoint, we observe attention and emotional state (valence and arousal) to reveal true consumer preferences toward traditional and novel communication tools at the point of sale of the selected foodstuffs. The paper concludes by suggesting possibilities for a rational, effective, and energy-efficient combination of in-store communication tools with which the retailer can achieve not only a captivating and attractive presentation of displayed goods but ultimately also an increase in the store's retail sales.

Keywords: electroencephalograph (EEG), emotion, eye tracker, in-store communication

Procedia PDF Downloads 392
5388 Validating Thermal Performance of Existing Wall Assemblies Using In-Situ Measurements

Authors: Shibei Huang

Abstract:

In deep energy retrofits, the thermal performance of existing building envelopes is often difficult to determine with a high level of accuracy. For older buildings, records of existing assemblies are often incomplete or inaccurate. To obtain a more accurate performance baseline for energy models, in-field measurement tools can be used to gather data on the thermal performance of the existing assemblies. For a known assembly, these field measurements help validate the U-factor estimates; if the field-measured U-factor consistently deviates from the calculated prediction, those measurements prompt further study. For an unknown assembly, successful field measurements can provide an approximate U-factor, validate assumptions, or identify anomalies requiring further investigation. Using case studies, this presentation focuses on non-destructive methods that employ a set of field tools to validate baseline U-factors for a range of existing buildings with various wall assemblies. The lessons learned cover what can be achieved, the limitations of these approaches and tools, and ideas for improving the validity of measurements. Key factors include the weather conditions, the interior conditions, the thermal mass of the measured assemblies, and the thermal profiles of the assemblies in question.

Keywords: existing building, sensor, thermal analysis, retrofit

Procedia PDF Downloads 63
5387 Multifunctional Polydopamine-Silver-Polydopamine Nanofilm With Applications in Digital Microfluidics and SERS

Authors: Yilei Xue, Yat-Hing Ham, Wenting Qiu, Wan Chan, Stefan Nagl

Abstract:

Polydopamine (PDA) is a popular material in biological and medical applications due to its excellent biocompatibility, outstanding physicochemical properties, and facile fabrication. In this project, a new sandwich-structured PDA and silver (Ag) hybrid material named PDA-Ag-PDA, in which silver nanoparticles (Ag NPs) are wrapped in PDA coatings, was synthesized and characterized layer by layer using SEM, AFM, 3D surface metrology, and a contact angle meter. The silver loading capacity is proportional to the roughness of the initial PDA film. The designed film was subsequently integrated into a digital microfluidic (DMF) platform coupled with an oxygen sensor layer for an on-chip antibacterial assay. The concentration of E. coli was quantified on the DMF chip by monitoring oxygen consumption in real time during E. coli growth with the optical oxygen sensor layer. The PDA-Ag-PDA coating achieves a 99.9% reduction in the E. coli population under non-nutritive conditions after 1 hour of treatment and strongly inhibits E. coli growth in nutrient LB broth as well. Furthermore, the PDA-Ag-PDA film maintains low cytotoxicity toward human cells: after 24 hours of treatment with the film, 82% of HEK 293 and 86% of HeLa cells remained viable. The SERS enhancement factor of PDA-Ag-PDA is estimated to be 1.9 × 10⁴ using Rhodamine 6G (R6G). The multifunctional PDA-Ag-PDA coating provides an alternative platform for conjugating biomolecules and performing biological applications on DMF, in particular for adhesive protein and cell studies.
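SERS enhancement factors of the kind quoted above are conventionally estimated by comparing the surface-enhanced and normal Raman signals of the probe molecule (here R6G), normalised by the number of molecules contributing to each measurement; this is the standard definition, and the abstract does not state which variant the authors used:

```latex
\mathrm{EF} = \frac{I_{\mathrm{SERS}}/N_{\mathrm{SERS}}}{I_{\mathrm{RS}}/N_{\mathrm{RS}}}
```

where I denotes the measured Raman intensity and N the number of probed molecules under SERS and normal Raman (RS) conditions, respectively.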

Keywords: polydopamine, silver nanoparticles, digital microfluidic, optical sensor, antimicrobial assay, SERS

Procedia PDF Downloads 93
5386 Voices and Pictures from an Online Course and a Face to Face Course

Authors: Eti Gilad, Shosh Millet

Abstract:

In light of technological developments and their introduction into education, an online course was designed in parallel to the 'conventional' course for teaching 'Qualitative Research Methods'. This course aimed to characterize learning-teaching processes in a 'Qualitative Research Methods' course studied in two different frameworks. Moreover, its objective was to explore the difference between the culture of a physical learning environment and that of online learning. The research monitored four learner groups, a total of 72 students, over two years, with one group from each course framework per year. The courses were obligatory for M.Ed. students at an academic college of education and were given by one female lecturer. The research was conducted with the qualitative method as a case study in order to attain insights about occurrences in the actual contexts and sites in which they transpire. The research tools were an open-ended questionnaire and reflections in the form of vignettes (meaningful short pictures) from all students, as well as an interview with the lecturer. These tools facilitated not only triangulation but also the collection of data consisting of voices and pictures of teaching and learning. The most prominent findings are differences between the two courses in the features of the learning environment culture for the acquisition of content and qualitative research tools, manifested in teaching methods, illustration aids, the lecturer's profile, and the students' profile.

Keywords: face to face course, online course, qualitative research, vignettes

Procedia PDF Downloads 418
5385 Enhancing Academic Writing Through Artificial Intelligence: Opportunities and Challenges

Authors: Abubakar Abdulkareem, Nasir Haruna Soba

Abstract:

Artificial intelligence (AI) is developing at a rapid pace, revolutionizing several industries, including education. This talk looks at how useful AI can be for academic writing, with an emphasis on how it can help researchers be more accurate, productive, and creative. The academic world now relies heavily on AI technologies such as grammar checkers, plagiarism detectors, and content generators to help with the writing, editing, and formatting of scholarly papers. This study explores particular uses of AI in academic writing and assesses how useful these applications may be for both students and scholars. Through an extensive examination of the existing literature and a sequence of empirical case studies, we scrutinize the merits and demerits of artificial intelligence tools used in academic writing. Key findings indicate that although AI greatly increases productivity and lowers human error, issues remain to be resolved, including over-reliance, ethical concerns, and the potential loss of critical thinking skills. The talk ends with suggestions for incorporating AI tools into academic settings so that they enhance, rather than replace, the intellectual rigor that characterizes scholarly work. This study adds to the continuing conversation about AI in higher education by supporting a methodical strategy that uses technology to enhance human abilities in academic writing.

Keywords: artificial intelligence, academic writing, ai tools, productivity, ethics, higher education

Procedia PDF Downloads 27
5384 The Impact of Cryptocurrency Classification on Money Laundering: Analyzing the Preferences of Criminals for Stable Coins, Utility Coins, and Privacy Tokens

Authors: Mohamed Saad, Huda Ismail

Abstract:

The purpose of this research is to examine the impact of cryptocurrency classification on money laundering crimes and to analyze how the preferences of criminals differ according to the type of digital currency used. Specifically, we aim to explore the roles of stablecoins, utility coins, and privacy tokens in facilitating or hindering money laundering activities and to identify the key factors that influence criminals' choices among these cryptocurrencies. To achieve our research objectives, we used a dataset of the most highly traded cryptocurrencies (32 currencies) published on CoinMarketCap for 2022, conducted a comprehensive review of the existing literature on cryptocurrency and money laundering, with a focus on stablecoins, utility coins, and privacy tokens, and performed several multivariate analyses. Our study reveals that the classification of cryptocurrency plays a significant role in money laundering activities, as criminals tend to prefer certain types of digital currencies over others depending on their specific needs and goals. Specifically, we found that stablecoins are more commonly used in money laundering due to their relatively stable value and low volatility, which makes them less risky to hold and transfer. Utility coins, on the other hand, are less frequently used in money laundering due to their lack of anonymity and limited liquidity. Finally, privacy tokens, such as Monero and Zcash, are increasingly becoming a preferred choice among criminals due to their high degree of privacy and untraceability. In summary, our study highlights the importance of understanding the nuances of cryptocurrency classification in the context of money laundering and provides insights into the preferences of criminals in using digital currencies for illegal activities.
Based on our findings, we recommend that policymakers address the potential misuse of cryptocurrencies for money laundering: by implementing measures to regulate stablecoins, strengthening cross-border cooperation, and fostering public-private partnerships, they can help prevent and detect money laundering activities involving digital currencies.

Keywords: crime, cryptocurrency, money laundering, tokens

Procedia PDF Downloads 87
5383 Effects of Empathy Priming on Idea Generation

Authors: Tejas Dhadphale

Abstract:

The user-centered design (UCD) approach has led to an increased interest in empathy within the product development process. Designers have explored several empathetic methods and tools, such as personas, empathy maps, journey maps, user needs statements, and user scenarios, to capture and visualize users' needs. The goal of these tools is not only to generate a deeper, shared understanding of user needs but also to serve as a point of reference for subsequent decision making, brainstorming, and concept evaluation tasks. The purpose of this study is to measure the effect of empathy priming on divergent brainstorming tasks. The study compares the effects of three empathy tools, personas, empathy maps, and user needs statements, on the fluency and originality of ideas during brainstorming tasks. In a three-condition between-subjects experimental design, sixty product design students were randomly assigned to one of three conditions: persona, empathy map, and user needs statement. A one-way between-subjects analysis of variance (ANOVA) revealed a statistically significant effect of empathy priming on the fluency and originality of ideas. Participants in the persona group showed higher ideation fluency and generated a greater number of original ideas than the other groups, while participants in the user needs statement group generated a greater number of feasible and relevant ideas. The study also aims to understand how the formatting and visualization of empathy tools affect divergent brainstorming tasks. Participants were interviewed to understand how the different visualizations of users' needs (personas, empathy maps, and user needs statements) facilitated idea generation during brainstorming. Implications for design education are discussed.
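A one-way between-subjects ANOVA of the kind reported in this abstract can be run in a few lines. The sketch below uses invented fluency scores for the three priming conditions (the study's actual data are not reproduced here), purely to illustrate the test:

```python
from scipy import stats

# Hypothetical ideation-fluency scores (ideas per participant) for the
# three priming conditions; all values are invented for illustration.
persona = [14, 17, 15, 18, 16, 19, 15, 17]
empathy_map = [11, 13, 12, 14, 12, 13, 11, 12]
needs_statement = [12, 14, 13, 12, 15, 13, 14, 12]

# One-way between-subjects ANOVA: does mean fluency differ across groups?
f_stat, p_value = stats.f_oneway(persona, empathy_map, needs_statement)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

A significant omnibus result would then typically be followed by post-hoc pairwise comparisons (e.g., Tukey's HSD) to locate which conditions differ.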

Keywords: empathy, persona, priming, design research

Procedia PDF Downloads 87
5382 Simo-syl: A Computer-Based Tool to Identify Language Fragilities in Italian Pre-Schoolers

Authors: Marinella Majorano, Rachele Ferrari, Tamara Bastianello

Abstract:

Recent technological advances make it possible to apply innovative, multimedia screen-based assessment tools to test children's language and early literacy skills, monitor their growth over the preschool years, and test their readiness for primary school. A computer-based assessment tool offers several advantages over paper-based tools. Firstly, computer-based tools that use games, videos, and audio may be more motivating and engaging for children, especially those with language difficulties. Secondly, computer-based assessments are generally less time-consuming than traditional paper-based assessments: this makes them less demanding for children and gives clinicians, researchers, and teachers the opportunity to test children multiple times over the same school year and thus monitor their language growth more systematically. Finally, while paper-based tools require offline coding, computer-based tools can provide automatically calculated scores, producing less subjective evaluations of the assessed skills and immediate feedback. Nonetheless, using computer-based assessment tools to test meta-phonological and language skills in children is not yet common practice in Italy. The present contribution aims to estimate the internal consistency of a computer-based assessment (the Simo-syl assessment). Sixty-three Italian pre-schoolers aged between 4;10 and 5;9 years were tested at the beginning of the last preschool year with standardised paper-based tools for their lexical (Peabody Picture Vocabulary Test), morpho-syntactic (Grammar Repetition Test for Children), meta-phonological (Meta-Phonological Skills Evaluation test), and phono-articulatory skills (non-word repetition). The same children were tested with the Simo-syl assessment on their phonological and meta-phonological skills (e.g., recognising syllables and vowels and reading syllables and words).
The internal consistency of the computer-based tool was acceptable (Cronbach's alpha = .799). Children's scores obtained in the paper-based assessment were correlated with the scores obtained in each task of the computer-based assessment. Significant positive correlations emerged between all the tasks of the computer-based assessment and the scores obtained in the CMF (r = .287-.311, p < .05) and in the correct sentences of the RCGB (r = .360-.481, p < .01); the non-word repetition standardised test correlated significantly with the reading tasks only (r = .329-.350, p < .05). Further tasks should be included in the current version of Simo-syl to support a comprehensive, multi-dimensional approach to assessing children. Nevertheless, such a tool represents a good opportunity for teachers to identify language-related problems early, even in the school environment.
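The internal-consistency statistic reported above (Cronbach's alpha) is computed directly from a respondents-by-items score matrix. The sketch below uses a small invented matrix purely for illustration; the study's actual item scores are not public:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented scores: 6 children x 4 assessment tasks.
scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 5, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.3f}")
```

Values near 1 indicate that the tasks measure a common construct; perfectly redundant items yield exactly 1.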

Keywords: assessment, computer-based, early identification, language-related skills

Procedia PDF Downloads 183
5381 Phishing Attacks Facilitated by Open Source Intelligence

Authors: Urva Maryam

Abstract:

Information has become an important asset in today's world. Globally, various tactics are being employed to confine the spread of information, as it makes people vulnerable to security attacks. Open Source Intelligence (OSINT) refers to publicly available sources that disseminate information about users, websites, companies, and various organizations. This paper takes a quantitative approach to exploring various OSINT tools that reveal personal public information, which can in turn facilitate phishing attacks. Phishing attacks can be launched against email addresses, open ports, and unsecured web surfing. This study analyzes information retrieved with OSINT tools, i.e., theHarvester and Maltego, that can be used to mount phishing attacks on individuals.

Keywords: OSINT, phishing, spear phishing, email spoofing, the harvester, maltego

Procedia PDF Downloads 81
5380 Research Action Fields at the Nexus of Digital Transformation and Supply Chain Management: Findings from Practitioner Focus Group Workshops

Authors: Brandtner Patrick, Staberhofer Franz

Abstract:

Logistics and Supply Chain Management are of crucial importance for organisational success. In the era of digitalization, several implications and improvement potentials arise for these domains, which, if ignored or neglected, could lead to decreased competitiveness and endanger long-term company success. However, empirical research on digitalization and the benefits practitioners ascribe to it is scarce and mainly focused on single technologies or on separate, isolated supply chain blocks, such as distribution logistics or procurement only. The current paper applies a holistic focus group approach to elaborate practitioner use cases at the nexus of Supply Chain Management (SCM) and digitalization. In the course of three focus group workshops with over 45 participants from more than 20 organisations, a comprehensive set of benefit entitlements and areas for improvement in applying digitalization to SCM was developed. The main results of the paper indicate the relevance of digitalization as realized in practice. In the form of seventeen concrete research action fields, the benefit entitlements are aggregated and transformed into potential starting points for future research projects in this area. The main contribution of this paper is an empirically grounded basis for future research projects and an overview of actual research action fields from the practitioners' point of view.

Keywords: digital supply chain, digital transformation, supply chain management, value networks

Procedia PDF Downloads 177
5379 TDApplied: An R Package for Machine Learning and Inference with Persistence Diagrams

Authors: Shael Brown, Reza Farivar

Abstract:

Persistence diagrams capture valuable topological features of datasets that other methods cannot uncover. Still, their adoption in data pipelines has been limited by the lack of publicly available tools in R (and Python) for analyzing groups of them with machine learning and statistical inference. In an easy-to-use and scalable R package called TDApplied, we implement several applied analysis methods tailored to groups of persistence diagrams. The two main contributions of our package are comprehensiveness (most functions have no implementation elsewhere) and speed (shown through benchmarking against other R packages). We demonstrate the tools on simulated data to illustrate how easily practical analyses of any dataset can be enhanced with topological information.
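Machine learning and inference on groups of persistence diagrams generally rest on a distance between diagrams. As a language-neutral illustration (TDApplied itself is an R package, and this is not its implementation), the 1-Wasserstein distance between two diagrams can be computed by optimal matching, where points may also be matched to their projections onto the diagonal; the sketch below uses the Euclidean ground metric, though the TDA literature often uses L∞:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def wasserstein_1(d1: np.ndarray, d2: np.ndarray) -> float:
    """1-Wasserstein distance between two persistence diagrams.

    Each diagram is an (n, 2) array of (birth, death) points; points may
    be matched to their orthogonal projection onto the diagonal.
    """
    proj = lambda pts: np.column_stack(((pts[:, 0] + pts[:, 1]) / 2,) * 2)
    a = np.vstack([d1, proj(d2)])   # d1 points plus diagonal slots for d2
    b = np.vstack([d2, proj(d1)])   # d2 points plus diagonal slots for d1
    n, m = len(d1), len(d2)
    cost = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    cost[n:, m:] = 0.0              # diagonal-to-diagonal pairs cost nothing
    rows, cols = linear_sum_assignment(cost)
    return cost[rows, cols].sum()

d1 = np.array([[0.0, 1.0], [0.2, 0.5]])
d2 = np.array([[0.0, 1.1]])
print(wasserstein_1(d1, d2))
```

With such a distance in hand, kernel methods, clustering, and permutation tests over groups of diagrams follow the usual recipes.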

Keywords: machine learning, persistence diagrams, R, statistical inference

Procedia PDF Downloads 86
5378 Development of Digital Twin Concept to Detect Abnormal Changes in Structural Behaviour

Authors: Shady Adib, Vladimir Vinogradov, Peter Gosling

Abstract:

Digital Twin (DT) technology is a new technology that appeared in the early 21st century. A DT is defined as the digital representation of a living or non-living physical asset. By connecting the physical and virtual assets, data are transmitted smoothly, allowing the virtual asset to fully represent the physical one. Although many studies have addressed the DT concept, there is still limited information about the ability of DT models to monitor and detect unexpected changes in structural behaviour in real time. This is due to the large computational effort required for the analysis and the excessively large amount of data transferred from sensors. This paper aims to develop the DT concept to detect abnormal changes in structural behaviour in real time using advanced modelling techniques, deep learning algorithms, and data acquisition systems, taking model uncertainties into consideration. Finite element (FE) models were first developed offline to be used with a reduced basis (RB) model order reduction technique for the construction of a low-dimensional space to speed up the analysis during the online stage. The RB model was validated against experimental test results to establish a DT model of a two-dimensional truss. The established DT model and deep learning algorithms were used to identify the location of damage once it appeared during the online stage. Finally, the RB model was used again to identify the damage severity. It was found that using the RB model, constructed offline, speeds up the FE analysis during the online stage. The constructed RB model showed higher accuracy for predicting the damage severity, while deep learning algorithms were found useful for estimating the location of damage of small severity.

Keywords: data acquisition system, deep learning, digital twin, model uncertainties, reduced basis, reduced order model

Procedia PDF Downloads 99
5377 Project Management Tools within SAP S/4 Hana Program Environment

Authors: Jagoda Bruni, Jan Müller-Lucanus, Gernot Stöger-Knes

Abstract:

The purpose of this article is to demonstrate modern project management approaches in an SAP S/4 Hana program environment composed of multiple projects with diverse focuses. We propose innovative, goal-oriented management standards based on the specifics of SAP transformations and customer-driven expectations. Through the regular application of sprint-based controlling and management tools, the data show that extensive analysis of employees' productive hours, together with a thorough review of project progress (per GAP, per business process, and per lot) across the whole program, can have a positive impact on customer satisfaction and on the projects' budget. This has been a collaborative study based on real-life experience and measurements in collaboration with our customers.

Keywords: project management, program management, SAP, controlling

Procedia PDF Downloads 91
5376 Effect of Implementing a Teaching Module about Diet and Exercises on Clinical Outcomes of Patients with Gout

Authors: Wafaa M. El- Kotb, Soheir Mohamed Weheida, Manal E. Fareed

Abstract:

The aim of this study was to determine the effect of implementing a teaching module about diet and exercises on the clinical outcomes of patients with gout. Subjects: A purposive sample of 60 adult gouty patients was selected and randomly and alternately divided into two equal groups of 30 patients each. Setting: The study was conducted in the orthopedic outpatient clinic of Menoufia University. Tools of the study: Three tools were utilized for data collection: a knowledge assessment structured interview questionnaire, clinical manifestation assessment tools, and a nutritional assessment sheet. Results: All patients of both groups (100%) had a poor total knowledge score before teaching, while 90% of the study group had a good total knowledge score three months after teaching, compared to 3.3% of the control group. Moreover, the recovery outcomes were significantly improved in the study group compared to the control group after teaching. Conclusion: Teaching the study group about diet and exercises significantly improved their clinical outcomes. Recommendation: Patient education about diet and exercises should be an ongoing process for patients with gout.

Keywords: clinical outcomes, diet, exercises, teaching module

Procedia PDF Downloads 346
5375 Checking Energy Efficiency by Simulation Tools: The Case of Algerian Ksourian Models

Authors: Khadidja Rahmani, Nahla Bouaziz

Abstract:

Algeria is known for its rich heritage; it owns an immense historical heritage with a universal reputation. Unfortunately, this wealth is withering because of abandonment. This research focuses on the Ksourian model, which constitutes a large portion of this wealth. In fact, the Ksourian model is not just a witness to a great part of history or to a vernacular culture; it also includes a panoply of assets in terms of energy efficiency. In this context, the purpose of our work is to evaluate, using simulation tools, the performance of the old techniques derived from the Ksourian model. The proposed method is decomposed into two steps. The first consists of isolating each device and reintroducing it into a basic model, then running a series of simulations on the resulting models, in order to test the contribution of each of these dialectal processes. At a larger scale of development, the second step consists of aggregating all these processes in a single model and restarting the simulation to see what this mosaic yields on the environmental and energy level. The model chosen for this study is one of the ksar units of the city of Kenadsa, Bechar (Algeria). This study not only shows the ingenuity of our ancestors in their know-how and their ability to adapt to the aridity of the climate, but also proves that their conceptions align with current concerns of energy efficiency and respond to the requirements of sustainable development.

Keywords: dialectal processes, energy efficiency, evaluation, Ksourian model, simulation tools

Procedia PDF Downloads 195
5374 Towards Addressing the Cultural Snapshot Phenomenon in Cultural Mapping Libraries

Authors: Mousouris Spiridon, Kavakli Evangelia

Abstract:

This paper focuses on Digital Libraries (DLs) that contain and geovisualise cultural data, highlighting the need to define them as a separate category, termed Cultural Mapping Libraries, based on their inherent connection of culture with geographic location and their design requirements in support of visual representation of cultural data on the map. An exploratory analysis of DLs that conform to the above definition brought forward the observation that existing Cultural Mapping Libraries fail to geovisualise the entirety of cultural data per point of interest, resulting in a Cultural Snapshot phenomenon. The existence of this phenomenon was reinforced by the results of a systematic bibliographic survey. In order to address the Cultural Snapshot, this paper proposes the use of Semantic Web principles to efficiently interconnect spatial cultural data through time, per geographic location. In this way, points of interest are transformed into sceneries where culture evolves over time. This evolution is expressed as occurrences taking place chronologically, in an event-oriented approach, a conceptualization also endorsed by the CIDOC Conceptual Reference Model (CIDOC CRM). In particular, we posit the use of CIDOC CRM as the baseline for defining the logic of Cultural Mapping Libraries as part of the Culture Domain, in accordance with the Digital Library Reference Model, in order to define the rules of cultural data management by the system. Our future goal is to transform this conceptual definition into inference rules that resolve the Cultural Snapshot and lead to a more complete geovisualisation of cultural data.
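The event-oriented reading of a point of interest can be illustrated with a minimal Python sketch. The sample events are invented, and the CIDOC CRM classes appear only as comments; a production system would store these as Semantic Web triples.

```python
# Minimal sketch of the event-oriented approach: each record is a cultural
# occurrence (CIDOC CRM E5 Event) tied to a place (E53 Place) and a point in
# time (E52 Time-Span). The sample events are invented for illustration.
from collections import defaultdict

events = [
    {"place": "Acropolis", "year": -447, "label": "Construction of the Parthenon"},
    {"place": "Acropolis", "year": 1687, "label": "Venetian bombardment"},
    {"place": "Acropolis", "year": 1975, "label": "Start of modern restoration"},
    {"place": "Delphi", "year": -590, "label": "First Pythian Games"},
]

def timeline_per_place(events):
    """Group events by place and order them chronologically, so a map point
    exposes its full history instead of a single cultural snapshot."""
    by_place = defaultdict(list)
    for ev in events:
        by_place[ev["place"]].append(ev)
    for place in by_place:
        by_place[place].sort(key=lambda ev: ev["year"])
    return dict(by_place)

timelines = timeline_per_place(events)
for place, evs in timelines.items():
    print(place, "->", [ev["label"] for ev in evs])
```
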

Keywords: digital libraries, semantic web, geovisualization, CIDOC-CRM

Procedia PDF Downloads 109
5373 Component Lifecycle and Concurrency Model in Usage Control (UCON) System

Authors: P. Ghann, J. Shiguang, C. Zhou

Abstract:

Access control is one of the most challenging issues facing information security. It is defined as the ability to permit or deny access to a particular computational resource or piece of digital information by a user or subject. The concept of usage control (UCON) has been introduced as a unified approach to capture a number of extensions of access control models and systems. In UCON, an access decision is determined by three factors: authorizations, obligations, and conditions. Attribute mutability and decision continuity are two distinct characteristics introduced by UCON for the first time. An observation of UCON components indicates that the components are predefined and static. In this paper, we propose a new and flexible model of usage control for the creation and elimination of some of these components, for example new objects, subjects, and attributes, and integrate these with the original UCON model. We also propose a model for concurrent usage scenarios in UCON.
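A minimal Python sketch may help fix the three UCON decision factors. The predicates and the quota-based authorization below are simplified assumptions for illustration, not the model proposed in the paper:

```python
# Minimal sketch of a UCON usage decision. Assumptions: simplified predicates;
# real UCON models distinguish pre-, ongoing-, and post-decision phases.

class UconSession:
    """Evaluates the three UCON decision factors (authorizations, obligations,
    conditions) with a mutable subject attribute (a usage counter)."""

    def __init__(self, subject_attrs, max_uses):
        self.attrs = dict(subject_attrs)
        self.max_uses = max_uses

    def authorized(self):
        # Authorization: based on subject/object attributes.
        return self.attrs["role"] == "member" and self.attrs["uses"] < self.max_uses

    def obligations_met(self):
        # Obligation: an action the subject must have performed.
        return self.attrs["accepted_license"]

    def conditions_hold(self, hour):
        # Condition: environmental, independent of subject and object.
        return 8 <= hour < 20

    def try_access(self, hour):
        if self.authorized() and self.obligations_met() and self.conditions_hold(hour):
            self.attrs["uses"] += 1  # attribute mutability: updated on use
            return True
        return False

s = UconSession({"role": "member", "uses": 0, "accepted_license": True}, max_uses=2)
print([s.try_access(hour=10) for _ in range(3)])  # third attempt exceeds the quota
```
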

Keywords: access control, concurrency, digital container, usage control

Procedia PDF Downloads 321
5372 Optimization for Autonomous Robotic Construction by Visual Guidance through Machine Learning

Authors: Yangzhi Li

Abstract:

Network transfer of information and performance customization is now a viable method of digital industrial production in the era of Industry 4.0, and robot platforms and network platforms have grown more important in digital design and construction. The pressing need for novel building techniques is driven by the growing labor scarcity problem and increased awareness of construction safety. Robotic approaches in construction research are regarded as an extension of operational and production tools. Several technological theories related to autonomous robot recognition, including high-performance computing, physical system modeling, extensive sensor coordination, and deep learning on large datasets, have not yet been fully explored in intelligent construction, and relevant transdisciplinary theory and practice research still has specific gaps. Optimizing high-performance computing and autonomous visual guidance technologies improves a robot's grasp of the scene and its capacity for autonomous operation. Intelligent visual guidance for industrial robots, however, faces a serious camera calibration issue, and its use in industrial production carries strict accuracy requirements: precision problems in the visual recognition system directly impact the effectiveness and quality of production, which calls for strengthened study of positioning precision in visual guidance and recognition technology. To best facilitate the handling of complicated components, an approach for the visual recognition of parts using machine learning algorithms is proposed. This study identifies the position of target components by detecting information at the boundary and corners of a dense point cloud and determining the aspect ratio in accordance with guidelines for the modularization of building components. To collect and use components, the operational processing system assigns them to the same coordinate system based on their locations and postures. The component's current posture is determined from inclination detection in the RGB image and verification against the depth image. Finally, a virtual environment model for the robot's obstacle-avoidance route is constructed from the point cloud information.
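The aspect-ratio step can be sketched in plain Python. The coordinates below are synthetic; a real pipeline would operate on the dense point cloud delivered by the depth sensor:

```python
# Sketch of the component-identification step: take the boundary/corner points
# of a detected cluster, compute its axis-aligned bounding box, and classify
# the part by aspect ratio. Coordinates are synthetic illustration data.

def bounding_box(points):
    """Axis-aligned bounding box of 3D points: ((xmin, xmax), (ymin, ymax), (zmin, zmax))."""
    xs, ys, zs = zip(*points)
    return (min(xs), max(xs)), (min(ys), max(ys)), (min(zs), max(zs))

def aspect_ratio(points):
    """Ratio of the longest to the shortest nonzero bounding-box edge."""
    extents = [hi - lo for lo, hi in bounding_box(points)]
    extents = [e for e in extents if e > 0] or [1.0]
    return max(extents) / min(extents)

# Corner points of a beam-like component: 2.0 m x 0.2 m x 0.1 m
beam = [(0, 0, 0), (2.0, 0, 0), (2.0, 0.2, 0), (0, 0.2, 0.1), (2.0, 0.2, 0.1)]
print(f"aspect ratio: {aspect_ratio(beam):.1f}")  # 2.0 / 0.1 = 20.0
```
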

Keywords: robotic construction, robotic assembly, visual guidance, machine learning

Procedia PDF Downloads 86
5371 Examples of RC Design with Eurocode2

Authors: Carla Ferreira, Helena Barros

Abstract:

The paper titled “Design of reinforced concrete with Eurocode 2” presents the theory regarding the design of reinforced concrete sections and the development of tables and abacuses to verify concrete sections at the ultimate and serviceability limit states. The present paper is a complement to it, showing how to use those tools. Different numerical results are shown, proving the capability of the methodology. When the section of a beam is already chosen, the computer program presents the reinforcing steel at many locations along the structure, and it is the engineer's task to choose a layout suitable for construction, aiming for the most regular arrangement of reinforcing bars. There are many computer programs available for this task, but the interest of the present kind of tools lies in the fast and easy way of making the design and choosing the optimal solution. Another application of these design tools is in the definition of the section dimensions, so that when the stresses are evaluated, the final design is acceptable. In design offices, these are considered by engineers a very quick and useful way of designing reinforced concrete sections, employing variable-strength concrete and higher steel classes. Examples of nonlinear analyses and bending moment redistribution will be considered, according to the Eurocode 2 recommendations, for sections under bending moment and axial force. Examples of the evaluation of the serviceability limit state will also be presented.
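As an illustration of the kind of ultimate-limit-state calculation such tables automate, the following sketch designs a singly reinforced rectangular section with the common EC2 rectangular stress block formulation (partial factor γs = 1.15, no moment redistribution). The numerical values are illustrative, not taken from the paper:

```python
# Illustrative ULS flexural design of a rectangular RC section, in the spirit
# of the EC2-based tables described above. Assumptions: rectangular stress
# block, singly reinforced section, no redistribution. Units: N, mm.
import math

def required_steel_area(m_ed, b, d, f_ck, f_yk):
    """Return required tension steel area A_s (mm^2) for design moment m_ed (Nmm)."""
    k = m_ed / (b * d**2 * f_ck)
    if k > 0.167:  # limit for a singly reinforced section without redistribution
        raise ValueError("compression reinforcement needed")
    z = d * (0.5 + math.sqrt(0.25 - k / 1.134))  # lever arm
    z = min(z, 0.95 * d)
    f_yd = f_yk / 1.15  # design yield strength
    return m_ed / (f_yd * z)

# Example: M_Ed = 200 kNm, b = 300 mm, d = 500 mm, C30/37 concrete, B500 steel
a_s = required_steel_area(200e6, b=300, d=500, f_ck=30, f_yk=500)
print(f"A_s = {a_s:.0f} mm^2")  # roughly 1000 mm^2
```
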

Keywords: design examples, eurocode 2, reinforced concrete, section design

Procedia PDF Downloads 72
5370 Automatic Identification of Pectoral Muscle

Authors: Ana L. M. Pavan, Guilherme Giacomini, Allan F. F. Alves, Marcela De Oliveira, Fernando A. B. Neto, Maria E. D. Rosa, Andre P. Trindade, Diana R. De Pina

Abstract:

Mammography is a worldwide imaging modality used to diagnose breast cancer, even in asymptomatic women. Due to its wide availability, mammograms can be used to measure breast density and to predict cancer development. Women with increased mammographic density have a four- to sixfold increase in their risk of developing breast cancer. Therefore, studies have sought to accurately quantify mammographic breast density. In clinical routine, radiologists perform image evaluations through the BIRADS (Breast Imaging Reporting and Data System) assessment. However, this method has inter- and intraindividual variability. An automatic, objective method to measure breast density could relieve radiologists' workload by providing an initial opinion. However, the pectoral muscle is a high-density tissue with characteristics similar to fibroglandular tissue, which makes it hard to automatically quantify mammographic breast density; pre-processing is therefore needed to segment the pectoral muscle, which may otherwise be erroneously quantified as fibroglandular tissue. The aim of this work was to develop an automatic algorithm to segment and extract the pectoral muscle in digital mammograms. The database consisted of thirty mediolateral oblique digital mammograms from São Paulo Medical School. This study was developed with ethical approval from the authors' institutions and national review panels under protocol number 3720-2010. An algorithm was developed on the Matlab® platform for the pre-processing of the images. The algorithm uses image processing tools to automatically segment and extract the pectoral muscle from mammograms. First, a thresholding technique was applied to remove non-biological information from the image. Then, the Hough transform was applied to find the boundary of the pectoral muscle, followed by the active contour method, whose seed was placed on the pectoral muscle boundary found by the Hough transform. An experienced radiologist also manually performed the pectoral muscle segmentation. Both methods, manual and automatic, were compared using the Jaccard index and Bland-Altman statistics. The comparison presented a Jaccard similarity coefficient greater than 90% for all analyzed images, showing the efficiency and accuracy of the proposed segmentation method. The Bland-Altman statistics compared both methods with respect to the area (mm²) of the segmented pectoral muscle, showing data within the 95% confidence interval and confirming the agreement with the manual method. Thus, the method proved accurate and robust, segmenting rapidly and free from intra- and inter-observer variability. It is concluded that the proposed method may be used reliably to segment the pectoral muscle in digital mammography in clinical routine. Segmentation of the pectoral muscle is very important for further quantification of the fibroglandular tissue volume present in the breast.
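The Hough step can be illustrated with a minimal plain-Python line detector (the published pipeline was implemented in Matlab; the binary mask below is synthetic):

```python
# Minimal Hough line transform, used above to locate the straight pectoral
# muscle boundary before seeding the active contour. The 40x40 binary mask
# is synthetic and contains a single diagonal edge (x == y).
import math

def hough_peak(mask):
    """Return (theta_deg, rho) of the strongest line through the 1-pixels."""
    votes = {}
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if not v:
                continue
            for theta in range(180):
                t = math.radians(theta)
                rho = round(x * math.cos(t) + y * math.sin(t))
                votes[(theta, rho)] = votes.get((theta, rho), 0) + 1
    return max(votes, key=votes.get)

mask = [[1 if x == y else 0 for x in range(40)] for y in range(40)]
theta, rho = hough_peak(mask)
print(theta, rho)  # the diagonal x == y corresponds to theta = 135 deg, rho = 0
```
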

Keywords: active contour, fibroglandular tissue, hough transform, pectoral muscle

Procedia PDF Downloads 350
5369 Characterization of Inertial Confinement Fusion Targets Based on Transmission Holographic Mach-Zehnder Interferometer

Authors: B. Zare-Farsani, M. Valieghbal, M. Tarkashvand, A. H. Farahbod

Abstract:

To provide the conditions for nuclear fusion by high-energy, powerful laser beams, a high degree of symmetry and surface uniformity of the spherical capsules is required to reduce Rayleigh-Taylor hydrodynamic instabilities. In this paper, we have used digital microscopic holography based on a Mach-Zehnder interferometer to study the quality of targets for inertial fusion. The interferometric pattern of the target was registered by a CCD camera and analyzed with the Holovision software. The uniformity of the surface and the shell thickness are investigated and measured in the reconstructed image. We measured the shell thickness in different zones, where a nonuniformity of 22.82 percent was obtained.
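The paper does not state the exact definition of nonuniformity used, but a common choice is the peak-to-peak thickness spread relative to the mean; a sketch with invented zone measurements:

```python
# Hedged sketch: percent nonuniformity of shell thickness measured in several
# zones, defined here as (max - min) / mean * 100. The thickness values (in
# micrometers) are invented for illustration; the paper reports 22.82 %.

def nonuniformity_percent(thicknesses):
    mean = sum(thicknesses) / len(thicknesses)
    return (max(thicknesses) - min(thicknesses)) / mean * 100.0

zones_um = [19.2, 21.0, 22.4, 20.1, 23.5]
print(f"{nonuniformity_percent(zones_um):.2f} %")
```
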

Keywords: inertial confinement fusion, mach-zehnder interferometer, digital holographic microscopy, image reconstruction, holovision

Procedia PDF Downloads 304
5368 Analyze Long-Term Shoreline Change at Yi-Lan Coast, Taiwan Using Multiple Sources

Authors: Geng-Gui Wang, Chia-Hao Chang, Jee-Cheng Wu

Abstract:

A shoreline is the line where a body of water and the shore meet. It provides economic and social security to coastal habitations. However, shorelines face multiple threats from both natural processes and man-made effects, such as disasters, rapid urbanization, industrialization, and sand deposition and erosion. In this study, we analyzed multi-temporal satellite images of the Yilan coast, Taiwan, from 1978 to 2016, using the United States Geological Survey (USGS) Digital Shoreline Analysis System (DSAS), weather information (such as rainfall records and typhoon routes), and man-made construction project data to explore the causes of shoreline changes. The results showed that the shoreline of the Yilan coast is greatly influenced by typhoons and anthropogenic interventions.
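DSAS summarizes change along shore-normal transects with statistics such as the End Point Rate (EPR); a minimal sketch with invented transect positions (not Yilan data):

```python
# Sketch of the End Point Rate (EPR), one of the shoreline-change statistics
# computed by DSAS: net shoreline movement divided by elapsed time. The
# transect positions (meters from a baseline) are invented for illustration.

def end_point_rate(pos_start_m, year_start, pos_end_m, year_end):
    """Positive = accretion (seaward movement), negative = erosion (landward)."""
    return (pos_end_m - pos_start_m) / (year_end - year_start)

# One transect: shoreline at 120 m from the baseline in 1978, 101 m in 2016.
epr = end_point_rate(120.0, 1978, 101.0, 2016)
print(f"EPR = {epr:.2f} m/yr")  # a negative rate indicates erosion
```
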

Keywords: shoreline change, multi-temporal satellite, digital shoreline analysis system, DSAS, Yi-Lan coast

Procedia PDF Downloads 163
5367 Next-Viz: A Literature Review and Web-Based Visualization Tool Proposal

Authors: Railly Hugo, Igor Aguilar-Alonso

Abstract:

Software visualization is a powerful tool for understanding complex software systems. However, current visualization tools often lack features or are difficult to use, limiting their effectiveness. In this paper, we present next-viz, a proposed web-based visualization tool that addresses these challenges. We provide a literature review of existing software visualization techniques and tools and describe the architecture of next-viz in detail. Our proposed tool incorporates state-of-the-art visualization techniques and is designed to be user-friendly and intuitive. We believe next-viz has the potential to advance the field of software visualization significantly.

Keywords: software visualization, literature review, tool proposal, next-viz, web-based, architecture, visualization techniques, user-friendly, intuitive

Procedia PDF Downloads 83
5366 Determinants of Artificial Intelligence Capabilities in Healthcare: The Case of Ethiopia

Authors: Dereje Ferede, Solomon Negash

Abstract:

Artificial Intelligence (AI) is a key enabler and driver of transformation in the healthcare industries. However, utilizing AI and achieving these benefits is challenging for many sectors, and all the more so for healthcare in a developing economy. As a result, real-world clinical deployment of AI has not yet matured. We believe that examining its determinants is key to addressing these challenges. Furthermore, the literature does not yet particularize the determinants of AI capabilities, or the ways of empowering the healthcare ecosystem of a developing economy to develop them. Thus, this study aims to position AI as a digital transformation weapon for the healthcare ecosystem by examining the determinants of AI capability and providing insights on how to better empower the healthcare industry to develop AI capabilities. To do so, we build on the technology-organization-environment (TOE) framework and apply a mixed research approach. We conclude with recommendations for future practitioners and researchers based on our findings.

Keywords: artificial intelligence, capability, digital transformation, developing economies, healthcare

Procedia PDF Downloads 242
5365 The Use of Network Tool for Brain Signal Data Analysis: A Case Study with Blind and Sighted Individuals

Authors: Cleiton Pons Ferreira, Diana Francisca Adamatti

Abstract:

Advancements in computer technology have made it possible to obtain information for research in biology and neuroscience. Networks have long been used to represent important biological processes, and the use of this tool has shifted from purely illustrative and didactic to more analytic, including interaction analysis and hypothesis formulation. Many studies have involved this application, but not directly for the interpretation of data obtained from brain functions, which calls for new perspectives in neuroinformatics using existing models of tools already disseminated in bioinformatics. This study includes an analysis of neurological data from electroencephalogram (EEG) signals using Cytoscape, an open-source software tool for visualizing complex networks in biological databases. The data were obtained from a comparative case study developed at the University of Rio Grande (FURG), using EEG signals from a brain-computer interface (BCI) with 32 electrodes recorded from a blind and a sighted individual during the execution of an activity that stimulated spatial ability. This study intends to present results that lead to better ways to use and adapt techniques that support the treatment of brain-signal data, in order to elevate understanding and learning in neuroscience.
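Before a network can be visualized in Cytoscape, EEG channels must be turned into nodes and edges; a common approach (assumed here, not necessarily the one used in the FURG study) is to threshold pairwise channel correlations:

```python
# Sketch: build a channel-connectivity network from EEG-like signals by
# thresholding pairwise Pearson correlations. The signals are synthetic; a
# real study would export the resulting edge list to Cytoscape.
import math

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def correlation_edges(channels, threshold=0.8):
    """Edges between channel names whose |correlation| exceeds the threshold."""
    names = list(channels)
    edges = []
    for i, u in enumerate(names):
        for v in names[i + 1:]:
            r = pearson(channels[u], channels[v])
            if abs(r) >= threshold:
                edges.append((u, v, round(r, 3)))
    return edges

t = range(64)
channels = {
    "O1": [math.sin(0.3 * k) for k in t],
    "O2": [math.sin(0.3 * k) + 0.01 * k for k in t],  # nearly identical to O1
    "Fp1": [math.cos(1.1 * k) for k in t],            # unrelated rhythm
}
print(correlation_edges(channels))
```
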

Keywords: neuroinformatics, bioinformatics, network tools, brain mapping

Procedia PDF Downloads 182
5364 Investigation of Interlayer Shear Effects in Asphalt Overlay on Existing Rigid Airfield Pavement Using Digital Image Correlation

Authors: Yuechao Lei, Lei Zhang

Abstract:

The interface shear between an asphalt overlay and an existing rigid airport pavement arises from differences in the mechanical properties of the materials under aircraft loading, and interlayer contact directly influences the mechanical characteristics of the asphalt overlay. However, accurately obtaining the effective interlayer relative displacement with the displacement sensors of existing loading apparatus remains challenging. This study aims to utilize digital image correlation technology to enhance the accuracy of interfacial contact parameters by obtaining effective interlayer relative displacements. Composite structure specimens were prepared, and fixtures for interlayer shear tests were designed and fabricated. Subsequently, a digital image recognition scheme for the required markers was designed and optimized. Effective interlayer relative displacement values were obtained through image recognition and calculation of surface markers on the specimens. Finite element simulations validated the mechanical response of the composite specimens under interlayer shearing. Results indicated that an optimized marking approach, using a wall mending agent for surface application and color coding, enhanced the image recognition quality of the marker points on the specimen surface. Further image extraction provided effective interlayer relative displacement values during interlayer shear, thereby improving the accuracy of the interface contact parameters. For composite structure specimens utilizing Styrene-Butadiene-Styrene (SBS) modified asphalt as the tack coat, the corresponding maximum interlayer shear strength was 0.6 MPa, and the fracture energy was 2917 J/m². This research provides valuable insights for investigating the impact of interlayer contact in composite pavement structures on the mechanical characteristics of asphalt overlays.
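The marker-tracking core of digital image correlation can be sketched as a zero-mean normalized cross-correlation (ZNCC) search. The images below are tiny synthetic grids, and real DIC software adds subpixel interpolation:

```python
# Sketch of the DIC idea: locate a surface marker in a deformed image by
# maximizing zero-mean normalized cross-correlation (ZNCC) against the
# reference subset. The 10x10 images and the 3x3 marker are synthetic.
import math

def zncc(patch, tmpl):
    vals_p = [v for row in patch for v in row]
    vals_t = [v for row in tmpl for v in row]
    n = len(vals_t)
    mp, mt = sum(vals_p) / n, sum(vals_t) / n
    num = sum((p - mp) * (t - mt) for p, t in zip(vals_p, vals_t))
    den = math.sqrt(sum((p - mp) ** 2 for p in vals_p) *
                    sum((t - mt) ** 2 for t in vals_t))
    return num / den if den else 0.0

def track(image, tmpl):
    """Return (row, col) of the best ZNCC match of tmpl inside image."""
    th, tw = len(tmpl), len(tmpl[0])
    best_score, best_pos = -2.0, (0, 0)
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            patch = [row[c:c + tw] for row in image[r:r + th]]
            score = zncc(patch, tmpl)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

marker = [[9, 1, 9], [1, 9, 1], [9, 1, 9]]

def scene(top, left):
    """Blank 10x10 image with the marker pasted at (top, left)."""
    img = [[0] * 10 for _ in range(10)]
    for i in range(3):
        for j in range(3):
            img[top + i][left + j] = marker[i][j]
    return img

ref_pos = track(scene(2, 3), marker)   # marker before loading
def_pos = track(scene(4, 5), marker)   # marker after interlayer slip
disp = (def_pos[0] - ref_pos[0], def_pos[1] - ref_pos[1])
print(f"relative displacement: {disp} pixels")
```
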

Keywords: interlayer contact, effective relative displacement, digital image correlation technology, composite pavement structure, asphalt overlay

Procedia PDF Downloads 48
5363 A Tool to Measure the Usability Guidelines for Arab E-Government Websites

Authors: Omyma Alosaimi, Asma Alsumait

Abstract:

The website developer and designer should follow usability guidelines to provide a user-friendly interface. Using tools to measure usability, an evaluator can automatically evaluate hundreds of links within a few minutes, with the added advantage of detecting violations that only machines can detect. For that reason, using a usability evaluation tool is important to find as many violations as possible. There are many website usability testing tools, but none has been developed to measure the usability of e-government websites, let alone Arabic e-government websites. To measure the usability of Arabic e-government websites, a tool is developed and tested in this paper. A comparison between using a tool specifically developed for e-government websites and a general usability testing tool is presented.
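The kind of machine-detectable violation such a tool scans for can be illustrated with a stdlib sketch; the guideline checked here (images must carry alt text) is a generic example, not necessarily one of the Arabic e-government guidelines:

```python
# Sketch of one automated usability check: flag <img> elements missing the
# alt attribute, a violation a machine can detect across hundreds of pages.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.violations.append(attrs.get("src", "<no src>"))

page = """
<html><body>
  <img src="logo.png" alt="Ministry logo">
  <img src="banner.jpg">
</body></html>
"""
checker = AltTextChecker()
checker.feed(page)
print(checker.violations)  # images missing alt text
```
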

Keywords: e-government, human computer interaction, usability evaluation, usability guidelines

Procedia PDF Downloads 423
5362 Intellectual Property Law as a Tool to Enhance and Sustain Museums in Digital Era

Authors: Nayira Ahmed Galal Elden Hassan, Amr Mostafa Awad Kassem

Abstract:

The management of Intellectual Property (IP) in museums presents a multifaceted challenge, requiring a balance between granting access to cultural assets and maintaining control over them. In the digital age, IP has emerged as a critical aspect of museum operations, encompassing valuable assets within collections and museum-generated content. Effective IP management enables museums to generate revenue, protect rights, and promote cultural heritage while leveraging digital technologies. Opportunities such as e-commerce and licensing can drive economic growth, but they also introduce complexities related to IP protection and regulation. This study explores the dual nature of IP assets, collection-based and museum-generated, highlighting their implications for sustainability and cultural preservation. The analysis includes examples such as the German State Museum's management of replicas of the Nefertiti bust, showcasing the challenges museums face when navigating IP frameworks. The research underscores the importance of a comprehensive understanding of IP laws to prevent legal disputes, reputational risks, and revenue loss. By adopting an analytical and comparative methodology, this paper examines museums that have effectively implemented IP rules to enhance their operations and sustain their resources. It investigates how IP management can help museums fulfill their mission of community engagement, education, and outreach while ensuring long-term sustainability, focusing on the benefits and challenges associated with adopting IP management strategies. Additionally, the study addresses the question of ownership by investigating who holds the rights to cultural assets and how these rights can be managed effectively to align with both institutional goals and the preservation of cultural heritage. The findings demonstrate that balanced IP strategies are essential for securing financial stability, safeguarding cultural heritage, and adapting to the demands of the digital era, and they underscore the pivotal role of effective IP management in empowering museums to navigate the digital landscape and maximize revenue streams while respecting the ethical and legal considerations of cultural heritage preservation.

Keywords: intellectual property, museums, IP management, digital technologies, sustainability, cultural heritage

Procedia PDF Downloads 5