Search results for: Privacy Preserving Data Publication (PPDP)
25662 A Literature Review on ISO 10014
Authors: Rafael Feldmann Farias, Fernando Tobal Berssaneti
Abstract:
Since its emergence in 1998, ISO 10014 has been developed as a response to the need to demonstrate the economic and financial benefits that an organization can obtain from the implementation of a quality management system. With the publication of the new edition in 2021, this article aims to identify how this standard has been addressed through a literature review. Among the results, it was found that, of the 282 documents identified, only 0.7% of the publications used the standard and 1.4% of the publications cited it. This low adherence seems to be linked to the highly technical nature of the content of the standard.
Keywords: quality management system, ISO 10014, economical benefits, financial benefits
25661 Radio Based Location Detection
Authors: M. Pallikonda Rajasekaran, J. Joshapath, Abhishek Prasad Shaw
Abstract:
Various techniques have been employed to determine location, such as GPS, GLONASS, Galileo, and BeiDou (Compass). This paper deals with finding location using the existing FM signals that operate between 88 and 108 MHz. The location can be determined from the received signal strength of nearby FM stations by mapping the signal strength values using the trilateration concept. This approach provides security for users' data and maintains an eco-friendly environment at zero installation cost, as it relies on FM stations already operating in the commercial FM band (88-108 MHz). Along with the signal-strength-based trilateration, the system also finds the azimuthal angle of the transmitter by employing a directional antenna, such as a Yagi-Uda antenna, at the receiver side.
Keywords: location, existing FM signals, received signal strength, trilateration, security, eco-friendly, direction, privacy, zero installation cost
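As a rough illustration of the trilateration step described above, the sketch below converts hypothetical received-signal-strength readings from three FM stations into distance estimates with a log-distance path-loss model and solves for the receiver position by least squares. The station coordinates, reference power, path-loss exponent, and RSS values are all assumed for the example, not figures from the paper.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical FM transmitter coordinates (km) and measured RSS values (dBm).
stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
rss_dbm = np.array([-60.0, -72.0, -68.0])

# Log-distance path-loss model: RSS = P0 - 10*n*log10(d/d0).
P0, n, d0 = -40.0, 2.5, 1.0                       # assumed reference power, exponent, reference distance
dists = d0 * 10 ** ((P0 - rss_dbm) / (10 * n))    # RSS -> estimated distance (km)

def residuals(p):
    # Difference between geometric distances to each station and the RSS-derived distances.
    return np.linalg.norm(stations - p, axis=1) - dists

fit = least_squares(residuals, x0=np.array([5.0, 5.0]))
print("Estimated receiver position (km):", fit.x)
```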
25660 Abnormality Detection of Persons Living Alone Using Daily Life Patterns Obtained from Sensors
Authors: Ippei Kamihira, Takashi Nakajima, Taiyo Matsumura, Hikaru Miura, Takashi Ono
Abstract:
In this research, the goal was the construction of a system in which multiple sensors are used to observe the daily life behavior of persons living alone (while respecting their privacy). This information is used to judge conditions such as poor physical condition or a fall in the home, so that these abnormal conditions can be made known to relatives and third parties. The daily life patterns of persons living alone are expressed by the number of sensor responses recorded each time a set time period has elapsed. By comparing current data with data from the prior two weeks, it was possible to judge a situation as 'normal' when the person was in good physical condition or as 'abnormal' when the person was in poor physical condition.
Keywords: sensors, elderly living alone, abnormality detection, lifestyle habit
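A minimal sketch of the kind of two-week comparison described above, assuming hourly sensor-response counts and a simple mean-minus-two-standard-deviations rule; the specific judgment rule and the data are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

# Hypothetical hourly sensor-response counts: 14 prior days x 24 one-hour bins.
baseline = np.random.poisson(lam=5, size=(14, 24))
today = np.random.poisson(lam=1, size=24)           # unusually low activity today

# Judge each time bin against the prior two weeks (mean - 2*std as a lower bound).
mean, std = baseline.mean(axis=0), baseline.std(axis=0)
abnormal_bins = today < (mean - 2 * std)

# Flag the day as 'abnormal' if many bins fall below the expected range.
print("abnormal" if abnormal_bins.sum() >= 6 else "normal")
```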
25659 Bibliometric Analysis of the Research Progress on Graphene Inks from 2008 to 2018
Authors: Jean C. A. Sousa, Julio Cesar Maciel Santos, Andressa J. Rubio, Edneia A. S. Paccola, Natália U. Yamaguchi
Abstract:
A bibliometric analysis in the Web of Science database was used to identify the overall scientific output on graphene inks to date (2008 to 2018). The objective of this study was to evaluate the evolutionary tendency of graphene ink research and to identify its aspects, aiming to provide data that can guide future work. The contributions of different studies, languages, thematic categories, periodicals, places of publication, institutes, funding agencies, cited articles, and applications were analyzed. The results revealed a growing number of annual publications; of the 258 papers found, 107 were included because they met the inclusion criteria. Three main applications were identified: synthesis and characterization, electronics, and surfaces. The most relevant research on graphene inks has been summarized in this article, and graphene inks for electronic devices were the most prominent theme according to the research trends during the studied period. It is estimated that this theme will remain in evidence and will contribute to the direction of future research in this area.
Keywords: bibliometric, coating, nanomaterials, scientometrics
25658 Provenance in Scholarly Publications: Introducing the provCite Ontology
Authors: Maria Joseph Israel, Ahmed Amer
Abstract:
Our work aims to broaden the application of provenance technology beyond its traditional domains of scientific workflow management and database systems by offering a general provenance framework to capture richer and extensible metadata in unstructured textual data sources such as literary texts, commentaries, translations, and digital humanities. Specifically, we demonstrate the feasibility of capturing and representing expressive provenance metadata, including more of the context for citing scholarly works (e.g., an author's explicit or inferred intentions at the time of developing their research content for publication), while also supporting subsequent augmentation with similar additional metadata (by third parties, be they human or automated). To better capture the nature and types of possible citations, in our proposed provenance scheme, metaScribe, we extend standard provenance conceptual models to form the provCite ontology. This provides a conceptual framework which can accurately capture and describe more of the functional and rhetorical properties of a citation than can be achieved with current models.
Keywords: knowledge representation, provenance architecture, ontology, metadata, bibliographic citation, semantic web annotation
25657 Social Mentoring: Towards Formal and Informal Deployment in the Structures of the Social and Solidarity Economy
Authors: Vanessa Casadella, Mourad Chouki, Agnès Ceccarelli, Sofiane Tahi
Abstract:
Mentoring is positioned in an interpersonal and intergenerational perspective, serving the transmission of interpersonal skills and organizational culture. It echoes orientation, project, self-actualization, guidance, transmission, and filiation. It can take a formal or an informal approach. The formal dimension refers to a privileged relationship between a senior and a junior. Informal mentoring is unplanned and emerges naturally between two people who choose each other; however, it remains more difficult to understand. To study the link between formal and informal mentoring and to define the notion of "social" mentoring, we conducted a qualitative study of an exploratory nature with around ten SSE organizations located in the southeast region of Tunisia. The wealth of this territory has pushed residents to found SSE organizations with a view to creating jobs but also to preserving traditions and nature. These organizations developed spontaneously to solve various local problems, such as the revitalization of deserted rural areas, environmental degradation, and the reskilling and professional reintegration of people marginalized in the labor market. This research, based on semi-structured interviews in order to obtain exhaustive and sensitive data, uses an interview guide with few questions so as to let the respondents, leaders of the different structures, express themselves freely. The guide includes questions on activities, methods of sharing knowledge, and difficulties in understanding between stakeholders. The interviews, lasting 30 to 60 minutes, were recorded using a dictaphone and then transcribed in full. The results are as follows: 1. We see two iterative mentoring loops. A first loop can be considered a type of formal mentoring: it highlights the support organized (in the form of training) by social enterprises with the aim of developing the autonomy, know-how, and interpersonal skills of members. A second loop concerns informal mentoring: non-formalized support provided by members or by others in their entourage, based mainly on the observation of good practices and learning by doing. 2. We notice an intersection between the two loops: if the first loop does not take place, the second will not occur, and the knowledge acquired in the first loop is used to feed the second. 3. We note a form of reluctance on the part of some members to share their knowledge for reasons of competition. Ultimately, we retain the notion of "social" mentoring as a hybridization of formal and informal mentoring, with the "social" perspective emphasizing reciprocity, solidarity, confidence, and trust between the mentor and the mentee.
Keywords: social innovation, social mentoring, social and solidarity economy, informal mentoring
25656 Secure E-Pay System Using Steganography and Visual Cryptography
Authors: K. Suganya Devi, P. Srinivasan, M. P. Vaishnave, G. Arutperumjothi
Abstract:
Today's internet world is highly prone to various online attacks, of which the most harmful is phishing. Attackers host fake websites that closely resemble the genuine ones. We propose image-based authentication using steganography and visual cryptography to prevent phishing. This paper presents a secure steganographic technique for true color (RGB) images and uses the Discrete Cosine Transform to compress the images. The proposed method hides the secret data inside the cover image. Visual cryptography is used to preserve the privacy of an image by decomposing the original image into two shares. The original image can be identified only when both qualified shares are simultaneously available; an individual share does not reveal the identity of the original image. Thus, the existence of the secret message is hard to detect by RS steganalysis.
Keywords: image security, random LSB, steganography, visual cryptography
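For illustration, the sketch below shows plain least-significant-bit (LSB) embedding and extraction on a toy RGB array. It is a generic LSB example only, not the paper's DCT-compressed, random-LSB scheme; the message, image size, and bit layout are assumptions.

```python
import numpy as np

def embed_lsb(cover: np.ndarray, message: bytes) -> np.ndarray:
    # Write the message bits into the least significant bits of the first pixels.
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = cover.flatten()
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bytes: int) -> bytes:
    # Read back the least significant bits and repack them into bytes.
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)  # toy RGB cover image
stego = embed_lsb(cover, b"secret")
print(extract_lsb(stego, 6))   # b'secret'
```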
25655 Artificial Intelligence in Ethiopian Higher Education: The Impact of Digital Readiness Support, Acceptance, Risk, and Trust on Adoption
Authors: Merih Welay Welesilassie
Abstract:
Understanding educators' readiness to incorporate AI tools into their teaching methods requires comprehensively examining the influencing factors. This understanding is crucial, given the potential of these technologies to personalise learning experiences, improve instructional effectiveness, and foster innovative pedagogical approaches. This study evaluated factors affecting teachers' adoption of AI tools in their English language instruction by extending the Technology Acceptance Model (TAM) to encompass digital readiness support, perceived risk, and trust. A cross-sectional quantitative survey was conducted with 128 English language teachers, supplemented by qualitative data collection from 15 English teachers. The structural model analysis indicated that the implementation of AI tools in Ethiopian higher education was notably influenced by digital readiness support, perceived ease of use, perceived usefulness, perceived risk, and trust. Digital readiness support positively impacted perceived ease of use, usefulness, and trust while reducing safety and privacy risks. Perceived ease of use positively correlated with perceived usefulness but negatively influenced trust. Furthermore, perceived usefulness strengthened trust in AI tools, while perceived safety and privacy risks significantly undermined trust. Trust was crucial in increasing educators' willingness to adopt AI technologies. The qualitative analysis revealed that the teachers exhibited strong content and pedagogical knowledge but needed more technology-related knowledge. Moreover, it was found that the teachers did not utilise digital tools to teach English. The study identified several obstacles to incorporating digital tools into English lessons, such as insufficient digital infrastructure, a shortage of educational resources, inadequate professional development opportunities, and challenging policies and governance. The findings provide valuable guidance for educators, inform policymakers about creating supportive digital environments, and offer a foundation for further investigation into technology adoption in educational settings in Ethiopia and similar contexts.
Keywords: digital readiness support, AI acceptance, perceived risk, AI trust
25654 An Exploration of Gender Differences in Academic Writing in Science
Authors: Gayani Ranawake, Kate Wilson
Abstract:
Underrepresentation of women in academia, particularly in science, has been discussed by many scholars for decades, and the causes of this underrepresentation are debated to this day. Publication is an important aspect of success in academia, and publication and citation rates are significant metrics in performance review, promotion, and employment. It has been established that men's and women's language use in general, both spoken and written, is different. However, no one, to our knowledge, has looked at whether men's and women's writing in science is different. If there are significant differences in the writing of men and women, then these differences may affect women's ability to succeed in science. This study is part of a larger project to explore whether differences can be recognized in the academic science writing of men and women. Mono-authored articles from high-ranking physics, biology, and psychology journals by men and women authors were compared in terms of readability statistics. In particular, the abstract and introduction sections were compared, as these are the first sections encountered by a reviewer and so may have an important effect on their impression of the work. The Flesch Reading Ease, the percentage of passive sentences, and the Flesch-Kincaid Reading Grade Level were calculated for each section of each article, along with counts of the number of sentences, words per sentence, and sentences per paragraph. Significance of differences was tested using the Behrens statistic. It was found that for both physics and biology papers there were no significant differences in the complexity or verbosity of the writing of men and women authors. However, there was a significant difference between the two disciplines, with physics articles being generally more readable (higher readability score) while also more passive (higher number of passive sentences). In contrast, the psychology articles showed a difference between men and women authors which may be significant: the average readability score for introductions in women's articles was 28, higher than the 19 found for men's articles (higher values indicate greater readability). Women's articles in psychology also had a greater proportion of passive sentences. It can be concluded that, at least in the more traditional sciences, men and women have adopted similar ways of writing, and that disciplinary differences are greater than gender differences. This may not be the case in psychology, which many consider to be more closely aligned with the humanities. Whether the lack of differences is because women have adapted to a masculine way of writing, or whether the genre itself is gender-neutral, needs further investigation.
Keywords: academic writing, gender differences, readability, science
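The Flesch Reading Ease score mentioned above can be computed roughly as sketched below; the syllable counter is a crude vowel-group heuristic, so the numbers will differ somewhat from those produced by standard word-processor implementations.

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count groups of vowels (good enough for illustration).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    # FRE = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

print(round(flesch_reading_ease("The cell cultures were incubated overnight. Growth was rapid."), 1))
```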
25653 History of Pediatric Renal Pathology
Authors: Mostafa Elbaba
Abstract:
Because childhood renal diseases differ greatly from adult diseases, pediatric nephrology was founded as a specialty in 1965. The renal pathology specialty was introduced at the London Ciba Symposium in 1961. The history of renal pathology can be divided into two eras: one starting in the 1650s with the invention of the microscope, the second in the 1950s with the implementation of renal biopsy and the advent of electron microscopy and immunofluorescence studies. Prior to the 1950s, the study of diseased human kidneys was restricted to postmortem examination by gross pathology. In 1827, Richard Bright first described his triad of kidney disease, which was confirmed by morbid kidney changes at autopsy. In 1905, Friedrich Mueller coined the term "nephrosis" for the "degenerative" form of kidney disease, and later F. Munk added the term "lipoid nephrosis". The most profound influence on the classification of renal diseases came from the publication of Volhard and Fahr in 1914. In 1899, Carl Max Wilhelm Wilms described Wilms' tumor of the kidneys in children. Chronic pyelonephritis was a popular renal diagnosis and the most common cause of uremia until the 1960s. Although kidney biopsy had been used early in the 1930s for renal tumors, the earliest reports of its use in the diagnosis of medical kidney disease were by Iversen and Brun in 1951, followed by Alwall in 1952, then by Pardo in 1953. The earliest intentional renal biopsies were performed in 1944 by Nils Alwall, but the procedure was abandoned after the death of one of the 13 patients he biopsied. In 1950, Antonino Perez-Ara attempted renal biopsies, but his results went largely unnoticed because they appeared in a little-known journal. In 1951, Claus Brun and Poul Iversen developed the biopsy procedure using an aspiration technique. The popularization of renal biopsy practice is credited to Robert Kark, who published his landmark work in 1954. He perfected the technique of renal biopsy in the prone position using the Vim-Silverman needle and used intravenous pyelography to improve the localization of the kidney.
Keywords: history, medicine, nephrology, pediatrics, pathology
25652 Stress Concentration Trend for Combined Loading Conditions
Authors: Aderet M. Pantierer, Shmuel Pantierer, Raphael Cordina, Yougashwar Budhoo
Abstract:
Stress concentration occurs when there is an abrupt change in the geometry of a mechanical part under loading. These changes in geometry can include holes, notches, or cracks within the component, and they create larger stresses within the part. This maximum stress is difficult to determine, as it occurs directly at the point of minimum area, and strain gauges have yet to be developed that can analyze stresses over such minute areas. Therefore, a stress concentration factor must be utilized. The stress concentration factor is a dimensionless parameter calculated solely from the geometry of a part. The factor is multiplied by the nominal, or average, stress of the component, which can be found analytically or experimentally. Stress concentration graphs exist for common loading conditions and geometrical configurations to aid in the determination of the maximum stress a part can withstand. These graphs were developed from historical data yielded by experimentation. This project seeks to verify a stress concentration graph for combined loading conditions. The aforementioned graph was developed using CATIA Finite Element Analysis software, and the results of this analysis will be validated through further testing. The 3D-modeled parts will be subjected to further finite element analysis using Patran-Nastran software. The finite element models will then be verified by testing physical specimens using a tensile testing machine. Once the data are validated, the unique stress concentration graph will be submitted for publication so it can aid engineers in future projects.
Keywords: stress concentration, finite element analysis, finite element models, combined loading
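As a small worked example of how a stress concentration factor is applied, the sketch below multiplies a net-section nominal stress by an approximate Kt for a circular hole in a finite-width plate under tension. The polynomial fit and all geometry and load values are illustrative assumptions, not results from this project, and should be checked against handbook charts before use.

```python
def kt_plate_with_hole(d: float, w: float) -> float:
    """Approximate stress concentration factor for a circular hole in a
    finite-width plate under axial tension (polynomial fit to handbook
    curves; verify against Peterson's charts before design use)."""
    r = d / w
    return 3.00 - 3.14 * r + 3.667 * r**2 - 1.527 * r**3

# Hypothetical geometry and load.
F, w, d, t = 10e3, 0.05, 0.01, 0.005          # N, m (width), m (hole dia.), m (thickness)
sigma_nom = F / ((w - d) * t)                 # net-section nominal stress, Pa
sigma_max = kt_plate_with_hole(d, w) * sigma_nom
print(f"Kt = {kt_plate_with_hole(d, w):.2f}, sigma_max = {sigma_max / 1e6:.1f} MPa")
```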
25651 Artificial Intelligence in Ethiopian Universities: The Influence of Technological Readiness, Acceptance, Perceived Risk, and Trust on Implementation - An Integrative Research Approach
Authors: Merih Welay Welesilassie
Abstract:
Understanding educators' readiness to incorporate AI tools into their teaching methods requires comprehensively examining the influencing factors. This understanding is crucial, given the potential of these technologies to personalise learning experiences, improve instructional effectiveness, and foster innovative pedagogical approaches. This study evaluated factors affecting teachers' adoption of AI tools in their English language instruction by extending the Technology Acceptance Model (TAM) to encompass digital readiness support, perceived risk, and trust. A cross-sectional quantitative survey was conducted with 128 English language teachers, supplemented by qualitative data collection from 15 English teachers. The structural model analysis indicated that the implementation of AI tools in Ethiopian higher education was notably influenced by digital readiness support, perceived ease of use, perceived usefulness, perceived risk, and trust. Digital readiness support positively impacted perceived ease of use, usefulness, and trust while reducing safety and privacy risks. Perceived ease of use positively correlated with perceived usefulness but negatively influenced trust. Furthermore, perceived usefulness strengthened trust in AI tools, while perceived safety and privacy risks significantly undermined trust. Trust was crucial in increasing educators' willingness to adopt AI technologies. The qualitative analysis revealed that the teachers exhibited strong content and pedagogical knowledge but needed more technology-related knowledge. Moreover, it was found that the teachers did not utilise digital tools to teach English. The study identified several obstacles to incorporating digital tools into English lessons, such as insufficient digital infrastructure, a shortage of educational resources, inadequate professional development opportunities, and challenging policies and governance. The findings provide valuable guidance for educators, inform policymakers about creating supportive digital environments, and offer a foundation for further investigation into technology adoption in educational settings in Ethiopia and similar contexts.
Keywords: digital readiness support, AI acceptance, risk, trust
25650 Formal Development of Electronic Identity Card System Using Event-B
Authors: Tomokazu Nagata, Jawid Ahmad Baktash
Abstract:
The goal of this paper is to explore the use of formal methods for an Electronic Identity Card System. Nowadays, one of the core research directions in a constantly growing distributed environment is the improvement of the communication process, and the responsibility for proper verification becomes crucial. Formal methods can play an essential role in the development and testing of systems. This work presents two different methodologies for assessing correctness. Our first approach employs abstract interpretation techniques to create a trace-based model of the Electronic Identity Card System; the model was used to build a semi-decidable procedure for verifying the system model. We also developed the code for the eID system, covering three parts: login to the system and sending of an acknowledgment from the user side, receiving of all information from the server side, and logout from the system. The new concepts of impasse and spawned sessions that we introduced led our research to original statements about the intruder's knowledge and eID system coding with respect to secrecy. Furthermore, we demonstrated that there is a bound on the number of sessions needed for the analysis of the system. Electronic identity (eID) cards promise to supply a universal, nation-wide mechanism for user authentication, and most European countries have started to deploy eID for government and private sector applications. Are government-issued electronic ID cards the proper way to authenticate users of online services? We use the eID project as a showcase to discuss eID from an application perspective. The new eID card has interesting design features: it is contactless, it aims to protect people's privacy to the extent possible, and it supports cryptographically strong mutual authentication between users and services. Privacy features include support for pseudonymous authentication and per-service controlled access to individual data items. The article discusses key concepts, the eID infrastructure, observed and expected problems, and open questions. The core technology seems ready for prime time, and government projects are deploying it to the masses, but application issues may hamper eID adoption for online applications.
Keywords: eID, Event-B, ProB, formal method, message passing
25649 A Lightweight Blockchain: Enhancing Internet of Things Driven Smart Buildings Scalability and Access Control Using Intelligent Direct Acyclic Graph Architecture and Smart Contracts
Authors: Syed Irfan Raza Naqvi, Zheng Jiangbin, Ahmad Moshin, Pervez Akhter
Abstract:
Currently, IoT systems depend on a centralized client-server architecture that causes various scalability and privacy vulnerabilities. Distributed ledger technology (DLT) introduces a set of opportunities for the IoT, which leads to practical ideas for existing components at all levels of existing architectures. Blockchain Technology (BCT), as in Bitcoin (BTC) and Ethereum, appears to be one approach to solving several IoT problems and offers multiple possibilities. However, IoT devices are resource-constrained, with insufficient capacity and computational headroom to process blockchain consensus mechanisms; the existing challenges of traditional BCT for the IoT are poor scalability, energy efficiency, and transaction fees. IOTA is a distributed ledger based on a Directed Acyclic Graph (DAG) that ensures M2M micro-transactions are free of charge. IOTA has the potential to address existing IoT-related difficulties such as infrastructure scalability, privacy, and access control mechanisms. We propose an architecture, SLDBI (a Scalable, Lightweight DAG-based Blockchain Design for Intelligent IoT Systems), which adapts the DAG-based Tangle and implements a lightweight message data model to address these IoT limitations. It enables the smooth integration of new IoT devices into a variety of apps. SLDBI enables comprehensive access control, energy efficiency, and scalability in IoT ecosystems by utilizing the Masked Authentication Message (MAM) protocol and the IOTA Smart Contract Protocol (ISCP). Furthermore, we suggest proof-of-work (PoW) computation on the full node in an energy-efficient way. Experiments have been carried out to show the capability of the Tangle to achieve better scalability while maintaining energy efficiency. The findings demonstrate user access control management at multiple granularity levels and show that the design scales up to massive networks with thousands of IoT nodes, such as Smart Connected Buildings (SCBDs).
Keywords: blockchain, IoT, directed acyclic graph, scalability, access control, architecture, smart contract, smart connected buildings
25648 Forensic Analysis of Signal Messenger on Android
Authors: Ward Bakker, Shadi Alhakimi
Abstract:
The number of people moving towards more privacy-focused instant messaging applications has grown significantly. Signal is one of these instant messaging applications, which makes it interesting for digital investigators. In this research, we evaluate the artifacts that are generated by the Signal messenger for Android. This evaluation was done by using the features that Signal provides to create artifacts, after which we made an image of the internal storage and the process memory. This image was analysed manually. The manual analysis revealed the content that Signal stores in different locations during its operation. From our research, we were able to identify the artifacts and interpret how they were used. We also examined the source code of Signal. Using the knowledge obtained from the source code, we developed a tool that decrypts some of the artifacts using the key stored in the Android Keystore. In general, we found that most artifacts are encrypted and encoded, even after some of them have been decrypted. During data visualization, further observations were made, such as the fact that Signal does not use relationships between the data. In this research, two interesting groups of artifacts were identified: those related to the database and those stored in the process memory dump. In the database, we found plaintext private and group chats, and in the memory dump, we were able to retrieve the plaintext access code to the application. Nevertheless, we conclude that Signal contains a wealth of artifacts that could be very valuable to a digital forensic investigation.
Keywords: forensics, Signal, Android, digital
25647 The Synopsis of the AI-Powered Therapy Web Platform ‘Free AI Therapist'
Authors: Arwa Alnowaiser, Hala Shoukri
Abstract:
The 'FreeAITherapist' is an artificial intelligence application that uses the power of AI to offer advice and mental health counseling to its users through its chatbot services. The AI therapist is designed to understand users' issues, concerns, and problems and respond appropriately; it provides empathy and guidance and uses evidence-based therapeutic techniques. With its user-friendly platform, it ensures accessibility for individuals in need, regardless of their geographical location. This website was created in direct response to the growing demand for mental health support, aiming to provide a cost-effective and confidential solution. By promising confidentiality, it addresses user privacy and data security. The 'FreeAITherapist' strives to bridge the gap in mental health services, offering a reliable resource for individuals seeking guidance and counseling to improve their overall well-being.
Keywords: artificial intelligence, mental health, AI therapist, website, counseling
25646 Ecological and Historical Components of the Cultural Code of the City of Florence as Part of the Edutainment Project Velonotte International
Authors: Natalia Zhabo, Sergey Nikitin, Marina Avdonina, Mariya Nikitina
Abstract:
This paper analyzes one of the events of the international educational and entertainment project Velonotte: an evening bicycle tour with children around Florence. The aim of the project is to develop methods and techniques for increasing the sensitivity of the cycling participants and of the listeners of the radio broadcasts to the treasures of the national heritage, in this case to the historical layers of the city and the ecology of the Renaissance epoch. The block of educational tasks is considered, and the issues of preserving the identity of the city are discussed. Methods: The Florentine event was prepared over more than a year. First, the creative team selected events in the history of the city that seemed important for revealing its specific character and spirit, from antiquity to the present day, drawing also on Internet forums reflecting broad public opinion. Then a route (seven kilometers) was developed, which was proposed to the authorities and organizations of the city. The selection of speakers was conducted according to several criteria: they should be authors of books, famous scientists, or connoisseurs in a certain sphere (toponymy, history of urban gardens, art history), capable and willing to talk with participants directly at the stopping points, so that a dialogue could take place and performances could be organized with their participation. Music was chosen for each part of the itinerary to prepare the audience emotionally. Coloring cards with images of the main content of each stop were created for children. A website was created to inform the participants and to store photos, videos, and audio files of the speakers' talks afterward. Results: Held in April 2017, the event was dedicated to the 640th anniversary of the Florentine architect Filippo Brunelleschi and to the 190th anniversary of the publication of Stendhal's guide to Florence. It was supported by the City of Florence and the Florence Bike Festival. Florence was explored to transmit traditional elements of culture, some unfairly forgotten, from ancient times to Brunelleschi and Michelangelo, and on to Tchaikovsky and David Bowie, with lectures by university professors. Memorable art boards were installed in public spaces. Elements of the cultural code are deeply internalized in the minds of the townspeople; the perception of the city in everyday life and human communication is comparable to such fundamental concepts of townspeople's self-awareness as mental comfort and the level of happiness. The format of a fun and playful ride with ICT support gives new opportunities for enriching each citizen's cultural code of the city with new components, associations, and connotations.
Keywords: edutainment, cultural code, cycling, sensitization, Florence
25645 Data Mining in Healthcare for Predictive Analytics
Authors: Ruzanna Muradyan
Abstract:
Medical data mining is a crucial field in contemporary healthcare that offers cutting-edge tactics with enormous potential to transform patient care. This abstract examines how sophisticated data mining techniques could transform the healthcare industry, with a special focus on how they might improve patient outcomes. Healthcare data repositories have dynamically evolved, producing a rich tapestry of different, multi-dimensional information that includes genetic profiles, lifestyle markers, electronic health records, and more. By utilizing data mining techniques inside this vast library, a variety of prospects for precision medicine, predictive analytics, and insight generation become visible. Important points of focus include predictive modeling for illness prediction, risk stratification, and therapy efficacy evaluation. Healthcare providers may use this abundance of data to tailor treatment plans, identify high-risk patient populations, and forecast disease trajectories by applying machine learning algorithms and predictive analytics. Better patient outcomes, more efficient use of resources, and early treatments are made possible by this proactive strategy. Furthermore, data mining techniques act as catalysts to reveal complex relationships between apparently unrelated data pieces, providing enhanced insights into the causes of disease, genetic susceptibilities, and environmental factors. Healthcare practitioners can obtain practical insights that guide disease prevention, customized patient counseling, and focused therapies by analyzing these associations. The abstract also explores the problems and ethical issues that come with using data mining techniques in the healthcare industry: in order to use these approaches properly, it is essential to find a balance between data privacy, security issues, and the interpretability of complex models. Finally, this abstract demonstrates the revolutionary power of modern data mining methodologies in transforming the healthcare sector. Healthcare practitioners and researchers can uncover unique insights, enhance clinical decision-making, and ultimately elevate patient care to unprecedented levels of precision and efficacy by employing cutting-edge methodologies.
Keywords: data mining, healthcare, patient care, predictive analytics, precision medicine, electronic health records, machine learning, predictive modeling, disease prognosis, risk stratification, treatment efficacy, genetic profiles, precision health
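As a small, hypothetical illustration of the risk-stratification idea described above, the sketch below trains a logistic-regression model on synthetic stand-in data and flags the top decile of predicted risk; the features, outcome, and threshold are all invented for the example and are not drawn from any clinical dataset.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for de-identified patient records (binary outcome, e.g. readmission).
X, y = make_classification(n_samples=5_000, n_features=20, weights=[0.85], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
risk = model.predict_proba(X_te)[:, 1]                 # predicted risk scores in [0, 1]
print("AUC:", round(roc_auc_score(y_te, risk), 3))

# Simple risk stratification: flag the top decile of predicted risk for follow-up.
high_risk = risk >= np.quantile(risk, 0.9)
print("patients flagged:", int(high_risk.sum()))
```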
25644 PAPR Reduction of FBMC Using Sliding Window Tone Reservation Active Constellation Extension Technique
Authors: S. Anuradha, V. Sandeep Kumar
Abstract:
The high Peak-to-Average Power Ratio (PAPR) in Filter Bank Multicarrier with Offset Quadrature Amplitude Modulation (FBMC-OQAM) can significantly reduce power efficiency and performance. In this paper, we address the problem of PAPR reduction for FBMC-OQAM systems using the Tone Reservation (TR) technique. Due to the overlapping structure of FBMC-OQAM signals, directly applying the TR schemes of OFDM systems to FBMC-OQAM systems is not effective. We improve the TR technique by employing a sliding window with Active Constellation Extension for the PAPR reduction of FBMC-OQAM signals, called the sliding window tone reservation Active Constellation Extension (SW-TRACE) technique. The proposed SW-TRACE technique uses the peak reduction tones (PRTs) of several consecutive data blocks to cancel the peaks of the FBMC-OQAM signal inside a window, while dynamically extending outer constellation points in active (data-carrying) channels, within margin-preserving constraints, in order to minimize the peak magnitude. Analysis and simulation results are compared with those of the existing TR technique for FBMC-OQAM systems. The proposed SW-TRACE method has better PAPR performance and lower computational complexity.
Keywords: FBMC-OQAM, peak-to-average power ratio, sliding window, tone reservation Active Constellation Extension
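For reference, the PAPR metric itself can be computed as sketched below. The example uses a plain zero-padded-IFFT, OFDM-style symbol rather than an FBMC-OQAM waveform, so it only illustrates the metric, not the system or the reduction technique studied in the paper.

```python
import numpy as np

def papr_db(x: np.ndarray) -> float:
    # Peak-to-average power ratio of a (complex) baseband signal, in dB.
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

# Toy multicarrier symbol: random QPSK on 64 subcarriers, zero-padded IFFT.
qpsk = (np.random.choice([-1, 1], 64) + 1j * np.random.choice([-1, 1], 64)) / np.sqrt(2)
signal = np.fft.ifft(qpsk, n=256)             # zero-padding roughly approximates oversampling
print(f"PAPR = {papr_db(signal):.2f} dB")
```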
25643 Improving Cheon-Kim-Kim-Song (CKKS) Performance with Vector Computation and GPU Acceleration
Authors: Smaran Manchala
Abstract:
Homomorphic Encryption (HE) enables computations on encrypted data without requiring decryption, mitigating data vulnerability during processing. Usable Fully Homomorphic Encryption (FHE) could revolutionize secure data operations across cloud computing, AI training, and healthcare, providing both privacy and functionality; however, the computational inefficiency of schemes like Cheon-Kim-Kim-Song (CKKS) hinders their widespread practical use. This study focuses on optimizing CKKS for faster matrix operations through the implementation of vector computation parallelization and GPU acceleration. The variable effects of vector parallelization on GPUs were explored, recognizing that while parallelization typically accelerates operations, it can introduce overhead that results in slower runtimes, especially in smaller, less computationally demanding operations. To assess performance, two neural network models, MLPN and CNN, were tested on the MNIST dataset using both ARM and x86-64 architectures, with the CNN chosen for its higher computational demands. Each test was repeated 1,000 times, and outliers were removed via Z-score analysis to measure the effect of vector parallelization on CKKS performance. Model accuracy was also evaluated under CKKS encryption to ensure optimizations did not compromise results. According to the results of the trial runs, applying vector parallelization yielded a 2.63x efficiency increase overall, with a 1.83x performance increase for x86-64 over the ARM architecture. Overall, these results suggest that the application of vector parallelization in tandem with GPU acceleration significantly improves the efficiency of CKKS even when accounting for vector parallelization overhead, with implications for future zero-trust operations.
Keywords: CKKS scheme, runtime efficiency, fully homomorphic encryption (FHE), GPU acceleration, vector parallelization
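A minimal sketch of the Z-score outlier-removal step mentioned above, with hypothetical runtime measurements and an assumed cut-off of 3 standard deviations (the study's exact threshold is not stated here):

```python
import numpy as np

def remove_outliers_zscore(runtimes: np.ndarray, threshold: float = 3.0) -> np.ndarray:
    # Keep only measurements whose Z-score magnitude is below the threshold.
    z = (runtimes - runtimes.mean()) / runtimes.std()
    return runtimes[np.abs(z) < threshold]

# Hypothetical runtimes (ms) over 1,000 repetitions with a few stragglers mixed in.
runs = np.concatenate([np.random.normal(120, 5, 995), [300, 310, 295, 305, 320]])
clean = remove_outliers_zscore(runs)
print(len(runs) - len(clean), "outliers removed; mean runtime:", round(clean.mean(), 1), "ms")
```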
25642 Data-Driven Strategies for Enhancing Food Security in Vulnerable Regions: A Multi-Dimensional Analysis of Crop Yield Predictions, Supply Chain Optimization, and Food Distribution Networks
Authors: Sulemana Ibrahim
Abstract:
Food security remains a paramount global challenge, with vulnerable regions grappling with issues of hunger and malnutrition. This study embarks on a comprehensive exploration of data-driven strategies aimed at ameliorating food security in such regions. Our research employs a multifaceted approach, integrating data analytics to predict crop yields, optimize supply chains, and enhance food distribution networks. The study unfolds as a multi-dimensional analysis, commencing with the development of robust machine learning models harnessing remote sensing data, historical crop yield records, and meteorological data to foresee crop yields. These predictive models, underpinned by convolutional and recurrent neural networks, furnish critical insights into anticipated harvests, empowering proactive measures to confront food insecurity. Subsequently, the research scrutinizes supply chain optimization to address food security challenges, capitalizing on linear programming and network optimization techniques. These strategies intend to mitigate loss and wastage while streamlining the distribution of agricultural produce from field to fork. In conjunction, the study investigates food distribution networks with a particular focus on network efficiency, accessibility, and equitable food resource allocation. Network analysis tools, complemented by data-driven simulation methodologies, unveil opportunities for augmenting the efficacy of these critical lifelines. This study also considers the ethical implications and privacy concerns associated with the extensive use of data in the realm of food security, and the proposed methodology outlines guidelines for responsible data acquisition, storage, and usage. The ultimate aspiration of this research is to forge a nexus between data science and food security policy, bestowing actionable insights to mitigate the ordeal of food insecurity. The holistic approach converging data-driven crop yield forecasts, optimized supply chains, and improved distribution networks aspires to revitalize food security in the most vulnerable regions, elevating the quality of life for millions worldwide.
Keywords: data-driven strategies, crop yield prediction, supply chain optimization, food distribution networks
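As an illustration of the linear-programming element mentioned above, the sketch below solves a toy transportation problem with scipy.optimize.linprog; the depots, distribution centres, costs, supplies, and demands are all invented for the example and do not reflect the study's data.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical transport problem: 2 depots supplying 3 distribution centres.
cost = np.array([[4.0, 6.0, 9.0],      # cost per tonne, depot 0 -> centres 0..2
                 [5.0, 3.0, 7.0]])     # cost per tonne, depot 1 -> centres 0..2
supply = [60, 80]                      # tonnes available at each depot
demand = [40, 50, 50]                  # tonnes required at each centre

# Decision variables x[i, j] flattened row-wise; minimise total transport cost.
A_eq = np.zeros((3, 6))
A_ub = np.zeros((2, 6))
for j in range(3):
    A_eq[j, [j, 3 + j]] = 1            # each centre's demand must be met exactly
for i in range(2):
    A_ub[i, 3 * i: 3 * i + 3] = 1      # shipments cannot exceed depot supply
res = linprog(cost.flatten(), A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand,
              bounds=(0, None), method="highs")
print(res.x.reshape(2, 3), "total cost:", res.fun)
```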
25641 CIPP Evaluation of Online Broadcasting of Suan Dusit Rajabhat University
Authors: Somkiat Korbuakaew, Winai Mankhatitham, Anchan Chongcharoen, Wichar Kunkum
Abstract:
The objective of this research is to evaluate the online broadcasting of Suan Dusit Rajabhat University using the CIPP model. The evaluation was separated into four parts: context, input, process, and product factors. The sample group in this research comprised 399 participants: university executives, staff, and students. Questionnaires and interviews were the research tools. Data were analyzed with a computer program; the statistics used were percentage, mean, and standard deviation. Findings are as follows: 1. Context factor: the context in this research was the university's executives, staff, and students, who would like to use online broadcasting as an educational tool and for IT development. 2. Input factor: the input was modern IT equipment used to create interesting teaching materials and to develop education in general. 3. Process factor: the process concerned the publicizing of the program, which should be promoted more among students and made more objective. 4. Product factor: the product concerned the purpose of the program, which expands the educational channels available to students.
Keywords: evaluation, project, internet, online broadcasting
25640 Processing Big Data: An Approach Using Feature Selection
Authors: Nikat Parveen, M. Ananthi
Abstract:
Big data is one of the emerging technologies; it collects data from various sensors, and those data are used in many fields. Data retrieval is one of the major issues, as there is a need to extract exactly the data that are required. In this paper, a large data set is processed by using feature selection. Feature selection helps to choose the data that are actually needed to process and execute the task. The key value is what helps to point out the exact data available in the storage space. Here, the available data are streamed, and R-Center is proposed to achieve this task.
Keywords: big data, key value, feature selection, retrieval, performance
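A minimal sketch of filter-based feature selection of the kind discussed above, using scikit-learn's SelectKBest on synthetic data; the key-value and R-Center components of the paper are not modelled here, and the data set, feature count, and k are assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

# Synthetic data standing in for a large sensor feed: 10,000 rows, 50 features.
X, y = make_classification(n_samples=10_000, n_features=50, n_informative=8, random_state=0)

# Keep only the k features most associated with the target (ANOVA F-test).
selector = SelectKBest(score_func=f_classif, k=8)
X_reduced = selector.fit_transform(X, y)
print(X_reduced.shape)                       # (10000, 8)
print(selector.get_support(indices=True))    # indices of the retained features
```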
25639 Preparedness for Microbial Forensics Evidence Collection on Best Practice
Authors: Victor Ananth Paramananth, Rashid Muniginin, Mahaya Abd Rahman, Siti Afifah Ismail
Abstract:
Safety issues, scene protection, and appropriate evidence collection must be handled at any bio-crime scene. In any bio-incident or bio-crime event, there will be a single scene or multiple scenes to be cordoned off for investigation. Evidence collection is critical in determining the type of microbe or toxin, its lethality, and its source. As a consequence, a proper sampling method is required from the start of the investigation. The most significant challenges for the crime scene officer are deciding where to obtain samples, the best sampling method, and the sample sizes needed. Since evidence at a crime scene could be in liquid, viscous, or powder form, crime scene officers have difficulty determining which tools to use for sampling. To maximize sample collection, tools appropriate to the sampling method are necessary. This study aims to assist the crime scene officer in collecting liquid, viscous, and powder biological samples in sufficient quantity while preserving sample quality. In this research, observational tests on the collection of liquid, viscous, and powder samples, assessing adequate quantity and sample quality, were performed using UV light. The density of the light emission varies with the method of collection and the sample type. The best tools for collecting sufficient amounts of liquid, viscous, and powdered samples can be identified by observing UV light. Instead of active microorganisms, an invisible powder is used to assess sample collection during a crime scene investigation using various collection tools. The liquid, powdered, and viscous samples collected using different tools were analyzed using Fourier transform infrared-attenuated total reflection (FTIR-ATR) spectroscopy. FTIR spectroscopy is commonly used for rapid discrimination, classification, and identification of intact microbial cells. The liquid, viscous, and powdered samples collected using various tools were successfully observed using UV light. Furthermore, FTIR-ATR analysis showed that the collected samples were sufficient in quantity while preserving their quality.
Keywords: biological sample, crime scene, collection tool, UV light, forensic
25638 Sharing Personal Information for Connection: The Effect of Social Exclusion on Consumer Self-Disclosure to Brands
Authors: Jiyoung Lee, Andrew D. Gershoff, Jerry Jisang Han
Abstract:
Most extant research on consumer privacy concerns and their willingness to share personal data has focused on contextual factors (e.g., types of information collected, type of compensation) that lead to consumers' personal information disclosure. Unfortunately, the literature lacks a clear understanding of how consumers' incidental psychological needs may influence consumers' decisions to share their personal information with companies or brands. In this research, we investigate how social exclusion, which is an increasing societal problem, especially since the onset of the COVID-19 pandemic, leads to increased information disclosure intentions for consumers. Specifically, we propose and find that when consumers become socially excluded, their desire for social connection increases, and this desire leads to a greater willingness to disclose their personal information to firms. The motivation to form and maintain interpersonal relationships is one of the most fundamental human needs, and many researchers have found that deprivation of belongingness has negative consequences. Given the negative effects of social exclusion and the universal need to affiliate with others, people respond to exclusion with a motivation for social reconnection, resulting in various cognitive and behavioral consequences, such as paying greater attention to social cues and conforming to others. Here, we propose personal information disclosure as another form of behavior that can satisfy such social connection needs. As self-disclosure can serve as a strategic tool in creating and developing social relationships, those who have been socially excluded and thus have greater social connection desires may be more willing to engage in self-disclosure behavior to satisfy such needs. We conducted four experiments to test how feelings of social exclusion can influence the extent to which consumers share their personal information with brands. Various manipulations and measures were used to demonstrate the robustness of our effects. Through the four studies, we confirmed that (1) consumers who have been socially excluded show greater willingness to share their personal information with brands and that (2) such an effect is driven by the excluded individuals' desire for social connection. Our findings shed light on how the desire for social connection arising from exclusion influences consumers' decisions to disclose their personal information to brands. We contribute to the consumer disclosure literature by uncovering a psychological need that influences consumers' disclosure behavior. We also extend the social exclusion literature by demonstrating that exclusion influences not only consumers' choice of products but also their decision to disclose personal information to brands.
Keywords: consumer-brand relationship, consumer information disclosure, consumer privacy, social exclusion
25637 Econometric Analysis of Organic Vegetable Production in Turkey
Authors: Ersin Karakaya, Halit Tutar
Abstract:
For healthy nutrition, reliable food must be consumed. The production and dissemination of organic products in Turkey is rapidly evolving on the basis of preserving ecological balance, ensuring sustainability in agriculture, and offering quality, reliable products to consumers. In this study, data for Turkey for the years 2002-2015 were used to determine values such as the cultivated land devoted to organic vegetable production, production levels, production quantity, number of products, and number of farmers. The aim is to carry out an econometric analysis of the factors affecting organic vegetable production (number of products, number of farmers, and cultivated land). The main material of the study is secondary data on organic vegetable production in Turkey for the 2002-2015 period, and a regression analysis of the factors affecting the value of organic vegetable production is carried out using the Least Squares Method with the EViews statistical software package.
Keywords: number of farmers, cultivated land, Eviews, Turkey
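A hedged sketch of the least-squares regression step, using Python's statsmodels in place of EViews and randomly generated yearly series standing in for the study's actual data; the variable names and values are placeholders only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical yearly series (2002-2015) standing in for the study's variables.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "production_value": rng.uniform(50, 120, 14),   # value of organic vegetable production
    "cultivated_land":  rng.uniform(10, 40, 14),
    "n_products":       rng.integers(20, 60, 14),
    "n_farmers":        rng.uniform(5, 15, 14),
})

# Ordinary least squares: production value regressed on the three factors.
X = sm.add_constant(df[["cultivated_land", "n_products", "n_farmers"]])
model = sm.OLS(df["production_value"], X).fit()
print(model.summary())
```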
25636 The Technophobia among Older Adults in China
Authors: Erhong Sun, Xuchun Ye
Abstract:
Technophobia, namely the fear or dislike of modern advanced technologies, plays a central role in age-related digital divides and is considered a new risk factor for older adults, as it can affect people's daily lives through low adherence to digital living. Indeed, there is considerable heterogeneity in the group of older adults who experience technophobia. Therefore, the aim of this study was to identify different technophobia typologies of older people and to examine their associations with the subjective age factor. A sample of 704 retired adults over the age of 55 was recruited in China. Technophobia and subjective age were each assessed with a questionnaire. Latent profile analysis was used to identify technophobia subgroups, using three dimensions, techno-anxiety, techno-paranoia, and privacy concern, as indicators. The association between the identified technophobia subgroups and subjective age was explored. In summary, four different technophobia typologies were identified among older adults in China. Combined with an investigation of personal background characteristics and subjective age, this draws a more nuanced picture of the technophobia phenomenon among older adults in China. First, not all older adults suffer from technophobia, with about half of the elderly subjects belonging to the profiles of "Low-technophobia" and "Medium-technophobia." Second, privacy concern plays an important role in the classification of technophobia among older adults. Third, subjective age might be a protective factor against technophobia in older adults. Although the causal direction between the identified technophobia typologies and subjective age remains uncertain, our findings suggest that future interventions should focus on subjective age by breaking age stereotypes about technology to reduce the negative effect of technophobia on older adults. Future development of this research will involve extensive investigation of the detailed impact of technophobia in senior populations, measurement of the negative outcomes, and formulation of innovative educational and clinical pathways.
Keywords: technophobia, older adults, latent profile analysis, subjective age
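For illustration, a latent-profile-style analysis can be approximated as sketched below with a Gaussian mixture model, a common stand-in when dedicated LPA software is unavailable; the simulated indicator scores, the diagonal covariance choice, and the candidate profile counts are assumptions, not the study's data or settings.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Simulated scores for the three indicators used as inputs in the study:
# techno-anxiety, techno-paranoia, and privacy concern (1-5 Likert-style scales).
rng = np.random.default_rng(0)
X = rng.normal(loc=3.0, scale=0.8, size=(704, 3)).clip(1, 5)

# Fit candidate models and compare BIC to choose the number of profiles.
for k in range(2, 6):
    gmm = GaussianMixture(n_components=k, covariance_type="diag", random_state=0).fit(X)
    print(k, "profiles, BIC =", round(gmm.bic(X), 1))

# Assign each respondent to the most likely of four profiles.
profiles = GaussianMixture(n_components=4, covariance_type="diag", random_state=0).fit_predict(X)
print(np.bincount(profiles))
```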
25635 Digital Preservation in Nigeria Universities Libraries: A Comparison between University of Nigeria Nsukka and Ahmadu Bello University Zaria
Authors: Suleiman Musa, Shuaibu Sidi Safiyanu
Abstract:
This study examined digital preservation in Nigerian university libraries, comparing the University of Nigeria, Nsukka (UNN) and Ahmadu Bello University, Zaria (ABU, Zaria). The study utilized primary data obtained from librarians of the two selected institutions. Findings revealed varying results in terms of the skills acquired by librarians before and after digitization at the two institutions. The study reports that journal publications, textbooks, CD-ROMs, conference papers and proceedings, theses, dissertations, and seminar papers are among the information resources available for digitization. The study further documents that copyright issues, power failure, and the unavailability of needed materials are among the challenges facing the digitization of the institutions' libraries. On the basis of the findings, the study concluded that library digitization enhances efficiency in the organization and retrieval of information services. The study therefore recommended that software be upgraded with backups, that librarians be trained in digital processes, that antivirus software be installed, and that technical collaboration between the library and MIS be enhanced.
Keywords: digitalization, preservation, libraries, comparison
25634 Effectiveness of Psychosocial Interventions in Preventing Postpartum Depression among Teenage Mothers: Systematic Review and Meta-Analysis of Randomized Controlled Trials
Authors: Lebeza Alemu Tenaw, Fei Wan Ngai
Abstract:
Background: Postpartum depression is the most common mental health disorder that occurs after childbirth, and it is more prevalent among teenage mothers compared to adults. Although there is emerging evidence suggesting psychosocial interventions can decrease postpartum depression, there are no consistent findings regarding the effectiveness of these interventions, especially for teenage mothers. The current review aimed to investigate the effectiveness of psychosocial interventions in preventing postpartum depression among teenage mothers. Methods: The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed to select articles from online databases. The articles were searched using the Population, Intervention, Control, and Outcome (PICO) model. The quality of the articles was assessed using the Cochrane Collaboration Risk of Bias assessment tool. The statistical analyses were performed using Stata 17, and the effect size was estimated using the standardized mean difference in depression scores between the intervention and control groups. Heterogeneity between the studies was assessed through the I2 statistic and the Q statistic, while publication bias was evaluated using the asymmetry of the funnel plot and Egger's test. Results: In this systematic review, a total of nine articles were included. While psychosocial interventions showed a reduction in the risk of postpartum depression compared to usual maternal care, it is important to note that the mean difference in depression scores was significant in only three of the included studies. The overall meta-analysis revealed that psychosocial interventions were effective in preventing postpartum depression, with a pooled effect size of -0.5 (95% CI: -0.95, -0.06) at the final postpartum depression assessment. The heterogeneity level was found to be substantial, with an I2 value of 82.3%. However, no publication bias was observed. Conclusion: The review findings suggest that psychosocial interventions initiated during the late antenatal and early postnatal periods effectively prevent postpartum depression. The interventions were found to be more beneficial during the first three months of the postpartum period. However, this review also highlighted that there is a scarcity of interventional studies conducted in low-income countries, indicating the need for further studies in diverse communities.
Keywords: teenage pregnancy, postpartum depression, review
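For reference, the effect-size and heterogeneity quantities mentioned above are conventionally defined as follows (a Cohen's-d form of the standardized mean difference and the usual I² derived from Cochran's Q; the review's exact estimator variant, e.g. Hedges' g, is not specified here):

```latex
\[
d_i = \frac{\bar{x}_{\text{intervention},i} - \bar{x}_{\text{control},i}}{s_{\text{pooled},i}},
\qquad
I^2 = \max\!\left(0,\; \frac{Q - (k-1)}{Q}\right) \times 100\%,
\]
% where $s_{\text{pooled},i}$ is the pooled standard deviation of the two arms in study $i$,
% $Q$ is Cochran's heterogeneity statistic, and $k$ is the number of included studies.
```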
25633 Knowledge Representation Based on Interval Type-2 CFCM Clustering
Authors: Lee Myung-Won, Kwak Keun-Chang
Abstract:
This paper is concerned with knowledge representation and the extraction of fuzzy if-then rules using Interval Type-2 Context-based Fuzzy C-Means (IT2-CFCM) clustering with the aid of fuzzy granulation. The proposed clustering algorithm is based on information granulation in the form of IT2-based Fuzzy C-Means (IT2-FCM) clustering and estimates the cluster centers by preserving the homogeneity between the clustered patterns from the IT2 contexts produced in the output space. Furthermore, automatic knowledge representation can be obtained in the design of Radial Basis Function Networks (RBFN), Linguistic Models (LM), and Adaptive Neuro-Fuzzy Networks (ANFN) from numerical input-output data pairs. We focus on the design of an ANFN in this paper. The experimental results on an energy-performance estimation problem reveal that the proposed method showed good knowledge representation and performance in comparison with previous works.
Keywords: IT2-FCM, IT2-CFCM, context-based fuzzy clustering, adaptive neuro-fuzzy network, knowledge representation
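As background for the clustering machinery, the sketch below implements plain type-1 fuzzy c-means, the base algorithm that IT2-FCM and IT2-CFCM extend; it does not include the interval type-2 memberships or the context-based mechanism of the proposed method, and the data, fuzzifier, and cluster count are assumed values.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
    """Plain (type-1) fuzzy c-means: alternate membership and centre updates."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))          # fuzzy memberships, rows sum to 1
    for _ in range(n_iter):
        W = U ** m                                      # fuzzified membership weights
        centres = (W.T @ X) / W.sum(axis=0)[:, None]    # weighted cluster centres
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        # Standard FCM membership update: u_ik = d_ik^(-2/(m-1)) / sum_j d_ij^(-2/(m-1)).
        U = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
    return centres, U

X = np.random.rand(200, 2)            # toy two-dimensional data
centres, U = fuzzy_c_means(X)
print(centres)
```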