Search results for: symmetric half-plane digital filter
715 Designing of Content Management Systems (CMS) for Web Development
Authors: Abdul Basit Kiani, Maryam Kiani
Abstract:
Content Management Systems (CMS) have transformed the landscape of web development by providing an accessible and efficient platform for creating and managing digital content. This abstract explores the key features and benefits of CMS in web development, highlighting its impact on website creation and maintenance. CMS offers a user-friendly interface that empowers individuals to create, edit, and publish content without requiring extensive technical knowledge. With customizable templates and themes, users can personalize the design and layout of their websites, ensuring a visually appealing online presence. Furthermore, CMS facilitates efficient content organization through categorization and tagging, enabling visitors to navigate and search for information effortlessly. It also supports version control, allowing users to track and manage revisions effectively. Scalability is a notable advantage of CMS, as it offers a wide range of plugins and extensions to integrate additional features into websites. From e-commerce functionality to social media integration, CMS adapts to evolving business needs. Additionally, CMS enhances collaborative workflows by allowing multiple user roles and permissions. This enables teams to collaborate effectively on content creation and management, streamlining processes and ensuring smooth coordination. In conclusion, CMS serves as a powerful tool in web development, simplifying content creation, customization, organization, scalability, and collaboration. With CMS, individuals and businesses can create dynamic and engaging websites, establishing a strong online presence with ease.
Keywords: web development, content management systems, information technology, programming
Procedia PDF Downloads 857
714 Planning and Management Options for Pastoral Resource: Case of Mecheria Region, Algeria
Authors: Driss Haddouche
Abstract:
The pastoral crisis in Algeria has its origins in rangeland degradation, which is the main factor affecting any activity in the steppe zones. Indeed, faced with growing human and animal populations on an ever-smaller living space, what remains of the steppe rangelands is overused, and biomass production is consequently unsustainable. Given the amount of biomass available, managing grazing options by means of the "use factor" remains an essential method for managing pastoral resources. This factor has three thresholds: at 40%, conservative pasture; at 60%, the beginning of overgrazing; at 80%, destructive grazing. Accessibility of the pasture is based on our field observations of a typical flock along a grazing cycle. The main purpose of these observations is to quantify the walking speed of the grazing herd. Several individuals from the herd were timed, yielding an average duration of about 5 seconds to move between two tufts of grass separated by a distance of one meter. This gives a pace of 5 s/m (0.72 km/h) on flat ground. This speed varies depending on the angle of the slope. Knowing the speed and slope of each pixel of the study area, given by the digital elevation model derived from SPOT imagery with a 15-meter pixel size, a map of pasture accessibility as a function of distance is generated. Knowing the stocking rate and the biomass available, examining the commune of Mécheria at regular distances (8.64 km or 12 hours of grazing, 17.28 km or 24 hours of grazing, and 25.92 km or 36 hours of grazing) offers three different options (conservative use of the grazing resource at 40% utilization, overgrazing at 60% utilization, and destructive grazing at more than 80% utilization) for each distance traveled by sheep from the starting point, the town of Mécheria.
Keywords: pastoral crisis, biomass, stocking rate, use factor, Algeria
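The walking-pace arithmetic above (5 s/m giving 0.72 km/h, and the 12/24/36-hour grazing distances) can be checked in a few lines. This is an illustrative sketch, not part of the study's GIS workflow; the function names are ours:

```python
# Sketch: reproduce the grazing-reach arithmetic from the timed pace.
# Assumes flat ground at the reported 5 s/m pace (slope corrections omitted).

SECONDS_PER_METER = 5.0  # timed pace between two grass tufts one meter apart

def speed_kmh(seconds_per_meter: float) -> float:
    """Convert a pace in s/m to a speed in km/h."""
    return 3600.0 / (seconds_per_meter * 1000.0)

def reach_km(hours: float, seconds_per_meter: float = SECONDS_PER_METER) -> float:
    """Distance a flock covers in `hours` of grazing at the given pace."""
    return speed_kmh(seconds_per_meter) * hours

print(speed_kmh(5.0))                                # 0.72 km/h
print([round(reach_km(h), 2) for h in (12, 24, 36)])  # [8.64, 17.28, 25.92] km
```

The three distances match the 12-, 24-, and 36-hour grazing radii examined around Mécheria.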
Procedia PDF Downloads 531
713 Subthalamic Nucleus in Adult Human Cadaveric Brain: A Morphometric Study
Authors: Mangala Kohli, P. A. Athira, Reeha Mahajan
Abstract:
The subthalamic nucleus (STN) is a biconvex nucleus situated in the diencephalon. Knowledge of the morphometry of the subthalamic nucleus is essential for accurate targeting of the nucleus during deep brain stimulation. The present study aims to record the morphometry of the subthalamic nucleus in both cerebral hemispheres, which will prove to be of great value to radiologists and neurosurgeons. A cross-sectional observational study was conducted in the Departments of Anatomy and Forensic Medicine, Lady Hardinge Medical College & Associated Hospitals, New Delhi, on thirty adult cadaveric brain specimens of unclaimed and donated corpses. The specimens were categorized into three age groups: 20-35, 35-50, and above 50 years. All samples were collected after following the standard protocol for ethical clearance. A morphometric study of 60 subthalamic nuclei was thus conducted. A transverse section of the brain was made in a plane 4 mm ventral to the plane containing the mid-commissural point. The dimensions of the subthalamic nucleus were measured bilaterally with the aid of a digital Vernier caliper and a magnifying glass. In the present study, the mean length, width, and AC-PC length of the subthalamic nucleus were recorded on the right and left sides in Groups A, B, and C. On comparison of the mean subthalamic nucleus dimensions between the right and left sides in Group C, no statistically significant difference was observed. The lengths and widths of the subthalamic nucleus measured in the three age groups were compared with each other and the p values calculated. There was no statistically significant difference between the dimensions of Groups A and B, Groups B and C, or Groups A and C. The present study thus reveals no significant reduction in the size of the nucleus with increasing age.
The values obtained in the present study can therefore be used as a reference for various invasive and non-invasive procedures on the subthalamic nucleus.
Keywords: cerebral hemisphere, deep brain stimulation, morphometry, subthalamic nucleus
Procedia PDF Downloads 184
712 The Direct Deconvolution Model for the Large Eddy Simulation of Turbulence
Authors: Ning Chang, Zelong Yuan, Yunpeng Wang, Jianchun Wang
Abstract:
Large eddy simulation (LES) has been extensively used in the investigation of turbulence. LES computes the grid-resolved large-scale motions and leaves the small scales to be modeled by subfilter-scale (SFS) models. Among the existing SFS models, the deconvolution model has been used successfully in the LES of engineering and geophysical flows. Despite the wide application of deconvolution models, the effects of subfilter-scale dynamics and filter anisotropy on the accuracy of SFS modeling have not been investigated in depth. The results of LES are highly sensitive to the selection of filters and the anisotropy of the grid, which has been overlooked in previous research. In the current study, two critical aspects of LES are investigated. First, we analyze the influence of subfilter-scale (SFS) dynamics on the accuracy of direct deconvolution models (DDM) at varying filter-to-grid ratios (FGR) in isotropic turbulence. An array of invertible filters is employed, encompassing Gaussian, Helmholtz I and II, Butterworth, Chebyshev I and II, Cauchy, Pao, and rapidly decaying filters. The significance of the FGR becomes evident, as it acts as a pivotal factor in error control for precise SFS stress prediction. When the FGR is set to 1, the DDM models cannot accurately reconstruct the SFS stress due to the insufficient resolution of SFS dynamics. Notably, prediction capabilities are enhanced at an FGR of 2, resulting in accurate SFS stress reconstruction, except for cases involving the Helmholtz I and II filters. A remarkable precision close to 100% is achieved at an FGR of 4 for all DDM models. Second, the exploration extends to filter anisotropy to address its impact on SFS dynamics and LES accuracy. Employing the dynamic Smagorinsky model (DSM), the dynamic mixed model (DMM), and the direct deconvolution model (DDM) with anisotropic filters, aspect ratios (AR) ranging from 1 to 16 in LES filters are evaluated.
The findings highlight the DDM's proficiency in accurately predicting SFS stresses under highly anisotropic filtering conditions. High correlation coefficients exceeding 90% are observed in the a priori study for the DDM's reconstructed SFS stresses, surpassing those of the DSM and DMM models. However, these correlations tend to decrease as filter anisotropy increases. In the a posteriori studies, the DDM model consistently outperforms the DSM and DMM models across various turbulence statistics, encompassing velocity spectra, probability density functions related to vorticity, SFS energy flux, velocity increments, strain-rate tensors, and SFS stress. It is observed that as filter anisotropy intensifies, the results of the DSM and DMM deteriorate, while the DDM continues to deliver satisfactory results across all filter-anisotropy scenarios. These findings emphasize the DDM framework's potential as a valuable tool for advancing the development of sophisticated SFS models for the LES of turbulence.
Keywords: deconvolution model, large eddy simulation, subfilter scale modeling, turbulence
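As a rough illustration of the deconvolution idea behind the DDM, the sketch below filters a 1-D periodic field with the classical Gaussian LES filter and approximately inverts the filter with van Cittert iterations (u_{n+1} = u_n + (f - G u_n)). This is a generic toy example, not the authors' implementation; the grid size, filter width, and iteration count are arbitrary choices:

```python
import numpy as np

def gaussian_filter_periodic(u, delta, dx):
    """Apply the Gaussian LES filter of width `delta` to a periodic 1-D field via FFT."""
    k = 2.0 * np.pi * np.fft.fftfreq(u.size, d=dx)   # angular wavenumbers
    G = np.exp(-(k * delta) ** 2 / 24.0)             # Gaussian transfer function
    return np.real(np.fft.ifft(np.fft.fft(u) * G))

def van_cittert_deconvolve(f, delta, dx, n_iter=5):
    """Approximately invert the filter: u_{n+1} = u_n + (f - G u_n)."""
    u = f.copy()
    for _ in range(n_iter):
        u = u + (f - gaussian_filter_periodic(u, delta, dx))
    return u

x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
dx = x[1] - x[0]
u = np.sin(x) + 0.5 * np.sin(8 * x)            # "true" field with a small scale
f = gaussian_filter_periodic(u, 4 * dx, dx)    # filtered (resolved) field
u_star = van_cittert_deconvolve(f, 4 * dx, dx)
# the deconvolved field should be closer to the true field than the filtered one
print(np.abs(u_star - u).max() < np.abs(f - u).max())
```

Since the Gaussian transfer function is strictly positive, the iteration contracts the filtering error mode by mode, which is why a handful of iterations suffices in this invertible-filter setting.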
Procedia PDF Downloads 75
711 Computed Tomography Guided Bone Biopsies: Experience at an Australian Metropolitan Hospital
Authors: K. Hinde, R. Bookun, P. Tran
Abstract:
Percutaneous CT guided biopsies provide a fast, minimally invasive, cost-effective, and safe method for obtaining tissue for histopathology and culture. Standards for diagnostic yield vary depending on whether the tissue is being obtained for histopathology or culture. We present a retrospective audit from Western Health in Melbourne, Australia, over a 12-month period, which aimed to determine the diagnostic yield, technical success, and complication rate for CT guided bone biopsies and to identify factors affecting these results. The digital imaging storage program (Synapse Picture Archiving and Communication System, Fujifilm Australia) was analysed with keyword searches from October 2015 to October 2016. Nineteen CT guided bone biopsies were performed during this time. The most common referring unit was oncology; work-up imaging included CT, MRI, bone scan, and PET scan. The complication rate was 0%, the overall diagnostic yield was 74%, and the technical success rate was 95%. When biopsies were performed for histologic analysis, the diagnostic yield was 85%; when performed for bacterial culture, the diagnostic yield was 60%. No significant relationship was identified between the size of the lesion, distance of the lesion to the skin, lesion appearance on CT, number of samples taken, or gauge of the needle and diagnostic yield or technical success. CT guided bone biopsy at Western Health meets the standard reported at other major clinical centres for technical success and safety. It is a useful investigation in the identification of primary malignancy in distal bone metastases.
Keywords: bone biopsy, computed tomography, core biopsy, histopathology
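The reported rates follow from the raw biopsy counts. In the sketch below, the counts 14/19 (diagnostic) and 18/19 (technically successful) are inferred from the rounded percentages rather than stated in the abstract:

```python
# Hedged reconstruction of the audit percentages from inferred counts.
def rate(successes: int, total: int) -> int:
    """Percentage of successes, rounded to the nearest whole percent."""
    return round(100.0 * successes / total)

biopsies = 19
print(rate(14, biopsies))  # 74 -> "overall diagnostic yield was 74%"
print(rate(18, biopsies))  # 95 -> "technical success of 95%"
```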
Procedia PDF Downloads 200
710 Anyword: A Digital Marketing Tool to Increase Productivity in Newly Launching Businesses
Authors: Jana Atteah, Wid Jan, Yara AlHibshi, Rahaf AlRougi
Abstract:
Anyword is an AI copywriting tool that helps marketers create effective campaigns for specific audiences. It offers a wide range of templates for various platforms, brand voice guidelines, and valuable analytics insights. Anyword is used by top global companies and has been recognized as one of the "Fastest Growing Products" in the 2023 software awards. A recent study examined the utilization and impact of AI-powered writing tools, specifically focusing on the adoption of AI in writing pursuits and the use of the Anyword platform. The results indicate that a majority of respondents (52.17%) had not previously used Anyword, but those who had were generally satisfied with the platform. Notable productivity improvements were observed among 13% of the participants, while an additional 34.8% reported a slight increase in productivity. A majority (47.8%) maintained a neutral stance, suggesting that their productivity remained unaffected. Only a minimal percentage (4.3%) claimed that their productivity did not improve with the usage of Anyword AI. In terms of the quality of written content generated, the participants responded positively. Approximately 91% of participants gave Anyword AI a score of 5 or higher, with roughly 17% giving it a perfect score. A small percentage (approximately 9%) gave a low score between 0-2. The mode result was a score of 7, indicating a generally positive perception of the quality of content generated using Anyword AI. These findings suggest that AI can contribute to increased productivity and positively influence the quality of written content. Further research and exploration of AI tools in writing pursuits are warranted to fully understand their potential and limitations.
Keywords: artificial intelligence, marketing platforms, productivity, user interface
Procedia PDF Downloads 63
709 Managing Data from One Hundred Thousand Internet of Things Devices Globally for Mining Insights
Authors: Julian Wise
Abstract:
Newcrest Mining is one of the world's top five gold and rare earth mining organizations by production, reserves, and market capitalization. This paper elaborates on the data acquisition processes employed by Newcrest, in collaboration with the Fortune 500 listed organization Insight Enterprises, to standardize machine learning solutions which process data from over a hundred thousand distributed Internet of Things (IoT) devices located at mine sites globally. Through the utilization of cloud software architecture and edge computing, these technological developments enable standardized machine learning applications to influence the strategic optimization of mineral processing. Target objectives of the machine learning optimizations include time savings on mineral processing, production efficiencies, risk identification, and increased production throughput. The data acquired and utilized for predictive modelling is processed through edge computing and collectively stored within a data lake. Being involved in the digital transformation has necessitated a standardized software architecture to manage the machine learning models submitted by vendors, to ensure effective automation and continuous improvement of the mineral process models. Operating at scale, the system processes hundreds of gigabytes of data per day from distributed mine sites across the globe for the purposes of improved worker safety and production efficiency through big data applications.
Keywords: mineral technology, big data, machine learning operations, data lake
Procedia PDF Downloads 112
708 Study of the Kinetics of Formation of Carboxylic Acids Using Ion Chromatography during Oxidation Induced by Rancimat of the Oleic Acid, Linoleic Acid, Linolenic Acid, and Biodiesel
Authors: Patrícia T. Souza, Marina Ansolin, Eduardo A. C. Batista, Antonio J. A. Meirelles, Matthieu Tubino
Abstract:
Lipid oxidation is a major cause of the deterioration of the quality of biodiesel, because the waste generated damages engines. Among the main undesirable effects are increases in viscosity and acidity, leading to the formation of insoluble gums and sediments which cause the blockage of fuel filters. Auto-oxidation is defined as the spontaneous reaction of atmospheric oxygen with lipids. Unsaturated fatty acids are usually the components affected by such reactions. They are present as free fatty acids, fatty esters, and glycerides. To determine the oxidative stability of biodiesels through the induction period (IP), the Rancimat method is used, which allows continuous monitoring of the induced oxidation process of the samples. During the oxidation of the lipids, volatile organic acids are produced as byproducts; in addition, other byproducts, including alcohols and carbonyl compounds, may be further oxidized to carboxylic acids. Using the ion chromatography (IC) methodology developed in this work, which analyzes the water contained in the conductimetric vessel, organic anions of carboxylic acids were quantified in samples subjected to Rancimat-induced oxidation. The optimized chromatographic conditions were: eluent water:acetone (80:20 v/v) with 0.5 mM sulfuric acid; flow rate 0.4 mL min-1; injection volume 20 µL; eluent suppressor 20 mM LiCl; analytical curve from 1 to 400 ppm. The samples studied were methyl biodiesel from soybean oil and unsaturated fatty acid standards: oleic, linoleic, and linolenic. The induced oxidation kinetics curves were constructed by analyzing the water contained in the conductimetric vessels, each of which was removed from the Rancimat apparatus at prefixed time intervals. About 3 g of sample were used under the conditions of 110 °C and an air flow rate of 10 L h-1.
The water of each conductimetric Rancimat measuring vessel, where the volatile compounds were collected, was filtered through a 0.45 µm filter and analyzed by IC. From the kinetic data of the formation of the organic anions of carboxylic acids, their formation rates were calculated. The observed order of the rates of formation of the anions was: formate >>> acetate > hexanoate > valerate for oleic acid; formate > hexanoate > acetate > valerate for linoleic acid; formate >>> valerate > acetate > propionate > butyrate for linolenic acid. It is possible to suppose that propionate and butyrate are obtained mainly from linolenic acid and that hexanoate originates from oleic and linoleic acids. For the methyl biodiesel, the order of formation of anions was: formate >>> acetate > valerate > hexanoate > propionate. According to the total rates of formation of these anions during the induced degradation of the fatty acids, the following order of reactivity can be assigned: linolenic acid > linoleic acid >>> oleic acid.
Keywords: anions of carboxylic acids, biodiesel, ion chromatography, oxidation
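A formation rate of this kind can be estimated as the slope of a least-squares line fitted to the anion concentration versus induction time measured across the sequence of conductimetric vessels. The sketch below uses hypothetical data points, not the study's measurements:

```python
# Sketch: estimate an anion formation rate from a kinetics curve by linear
# regression of concentration vs. induction time. The data below are
# hypothetical, not the study's measurements.
import numpy as np

hours = np.array([0.0, 2.0, 4.0, 6.0, 8.0])            # sampling times (h)
formate_ppm = np.array([0.0, 21.0, 39.0, 62.0, 80.0])  # anion concentration

# slope of the least-squares line = formation rate in ppm per hour
slope, intercept = np.polyfit(hours, formate_ppm, 1)
print(round(slope, 2))  # ≈ 10.05 ppm/h
```

Comparing such slopes across anions and substrates gives the formation-rate orderings reported above.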
Procedia PDF Downloads 475
707 Causes and Consequences of Unauthorized Use of Books: Readers, Authors, and Publishers' Perspective
Authors: Arūnas Gudinavičius, Vincas Grigas
Abstract:
Purpose: The current study aims to identify and explore the causes and consequences of unauthorized use of books from readers', publishers', and authors' points of view. The case of Lithuania is also assessed, especially the country's historical background (a banned alphabet, book smuggling, and theft as a social norm in Soviet times). Design/methodology/approach: Aiming for a better understanding of why readers, authors, and publishers use or do not use technology for unauthorized access to books, the technology acceptance model approach was used; a total of 30 respondents (publishers, authors, and readers) were interviewed in semi-structured face-to-face interviews, and thematic analysis of the collected qualitative data was conducted. Interviews were coded in English with coding software for further analysis. Findings: Findings indicate that the main cause of the unauthorized use of books is a lack of legal e-book titles and acquisition options. This mainly points at publishers; however, instead of using unauthorized sources as opportunities for author promotion or marketing, they rather concentrate on the causes of unauthorized use of books which they are not in control of, including access to unauthorized sources, habits, and economic causes. Some publishers believe that the lack of legal e-book titles is a consequence of unauthorized use of books rather than its cause. Originality: This research contributed to the body of knowledge by investigating unauthorized use of books from readers', publishers', and authors' points of view, which provides a better understanding of the causes and consequences of such behavior, as well as the differences between these roles.
We suggest that these causes lead to the intention to use, and actual use of, the technology that is easier to use and offers more perceived advantages: technology for unauthorized downloading and reading of books rather than legal e-book acquisition options.
Keywords: digital piracy, unauthorized access, publishing industry, book reader, intellectual property rights
Procedia PDF Downloads 172
706 Entrepreneur Universal Education System: Future Evolution
Authors: Khaled Elbehiery, Hussam Elbehiery
Abstract:
The success of education depends on evolution and adaptation. While the traditional system has worked before, one type of education that evolved with the digital age is virtual education, which has improved efficiency in today's learning environments. Virtual learning has indeed proved its ability to overcome the drawbacks of the physical environment, such as time, facilities, and location, but despite what it has accomplished, the educational system overall is not yet adequately productive. Earning a degree is no longer enough to obtain a career job; by itself it misses the skills and creativity employers seek. There are always two sides to a coin: a college degree and a specialized certificate each have their own merits, but having both can put you on a successful IT career path. For the many job-seeking individuals across the world to have a clear, meaningful goal for work and education and to contribute positively to the community, productive correlation and cooperation among employers and universities, alongside individual technical skills, is a must for generations to come. Fortunately, the proposed research, "Entrepreneur Universal Education System", is an evolution designed to meet the needs of both employers and students, and it makes gaining vital real-world experience in the chosen fields easier than ever. The new vision is to empower education to meet organizations' needs, with improving the world as its primary goal: adopting universal skills of effective thinking, effective action, and effective relationships; preparing students through real-world accomplishment; and encouraging them to serve their organizations and communities faster and more efficiently.
Keywords: virtual education, academic degree, certificates, internship, Amazon Web Services, Microsoft Azure, Google Cloud Platform, hybrid models
Procedia PDF Downloads 96
705 Socio-Motor Experience between Affectivity and Movement from Harry Potter to Lord of the Rings
Authors: Manuela Gamba, Niki Mandolesi
Abstract:
Teenagers today have little knowledge about how to move or play together. The adults who are part of sports culture must find an effective way to foster this essential ability. Our research in Italy uses a 'holistic model' based on fantasy literature to explore the relationships between the game identities and self-identities of young people and the achievement of psycho-motor, emotional, and social well-being in the realms of sport and education. Physical activity projects were carried out in schools and extra-curricular associations in Rome, combining outdoor activities and distance learning. This holistic and malleable game model is inspired by fantasy accounts of the journeys taken in The Lord of the Rings and the Harry Potter books. We know that many have a lot of resistance to the idea of using fantasy and play as pedagogical tools, but the results obtained in this experience are surprising. Our interventions and investigations focused on promoting self-esteem, awareness, a sense of belonging, social integration, cooperation, well-being, and informed decision making: a basis for healthy and effective citizenship. For teenagers, creative thinking is the right stimulus for comparing the stories of the characters to their own journeys through social and self-reflective identity analysis. We observed how important it is to engage students emotionally as well as cognitively and to enable them to play with identity through relationships with peers. There is a need today for a multidisciplinary synthesis of analog and digital values, especially in response to recent distance-living experiences. There is a need for a global reconceptualization of free time and nature in the human experience.
Keywords: awareness, creativity, identity, play
Procedia PDF Downloads 190
704 Cultural Works Interacting with the Generational Aesthetic Gap between Gen X and Gen Z in China: A Qualitative Study
Authors: Qianyu Zhang
Abstract:
The spread of digital technology in China has worsened the generation gap and intergenerational competition for cultural and aesthetic discourse. Meanwhile, the increased accessibility of cultural works has encouraged the sharing and inheritance of collective cultural memories between generations. However, not every cultural work can engage positively with efforts to bridge intergenerational aesthetic differences. This study argues that in contemporary China, where new media and the Internet are widely available, certain cultural works have more potential to help enhance the cultural aesthetic consensus among different generations, thus becoming an effective countermeasure to narrow the intergenerational aesthetic rift and cultural discontinuity. Specifically, the generational aesthetic gap is expected to be bridged or improved through the shared appreciation or consumption, by several generations, of cultural works that meet certain conditions. In-depth interviews of Gen X and Gen Z in China (N = 15 per generation) uncovered their preferences and commonalities for cultural works and shared experiences in appreciating them. Results demonstrate that both generations' shared appreciation of a cultural work is a necessary but insufficient condition for its effective response to the generational aesthetic gap. Coding analysis yielded six dimensions that cultural works with the potential to bridge the intergenerational aesthetic divide should satisfy simultaneously: genre, theme, content, elements, quality, and accessibility. Cultural works that engage multiple senses; combine realistic, domestic, and contemporary cultural memories; contain narratives of family life and nationalism; include more elements familiar to the previous generation; are superbly produced and unaffected; and are more accessible better promote intergenerational aesthetic exchange and value recognition.
Moreover, compared to the previous generation, which faces a dilemma before the aesthetic gap, the later generation plays a crucial role in bridging the generational aesthetic divide.
Keywords: cultural works, generation gap, generation X, generation Z, cultural memory
Procedia PDF Downloads 153
703 Using the SMT Solver to Minimize the Latency and to Optimize the Number of Cores in an NoC-DSP Architectures
Authors: Imen Amari, Kaouther Gasmi, Asma Rebaya, Salem Hasnaoui
Abstract:
The problem of scheduling and mapping data flow applications on multi-core architectures is notoriously difficult. This difficulty is related to the rapid evolution of telecommunication and multimedia systems, accompanied by a rapid increase in user requirements in terms of latency, execution time, consumption, energy, etc. Achieving an optimal schedule on multi-core DSP (Digital Signal Processor) platforms is a challenging task. In this context, we present a novel technique and algorithm to find a valid schedule that optimizes the key performance metrics, particularly latency. Our contribution is based on Satisfiability Modulo Theories (SMT) solving technologies, which are strongly driven by industrial applications and needs. This paper describes a scheduling module integrated into our proposed workflow, which we advocate as a successful approach for programming applications based on NoC-DSP platforms. This workflow automatically transforms a Simulink model into a synchronous dataflow (SDF) model. The automatic transformation, followed by SMT solver scheduling, aims to minimize the final latency and other software/hardware metrics in terms of an optimal schedule, and to find the optimal number of cores to be used. In fact, our proposed workflow takes as its entry point a Simulink file (.mdl or .slx) derived from embedded Matlab functions. We use an approach based on the synchronous and hierarchical behavior of both Simulink and SDF. Hence, running the scheduler in the workflow mentioned above with our proposed SMT solver algorithm refinements produces the best possible schedule in terms of latency and number of cores.
Keywords: multi-core DSP, scheduling, SMT solver, workflow
Procedia PDF Downloads 286
702 Chemical Analysis of Particulate Matter (PM₂.₅) and Volatile Organic Compound Contaminants
Authors: S. Ebadzadsahraei, H. Kazemian
Abstract:
The main objective of this research was to measure particulate matter (PM₂.₅) and volatile organic compounds (VOCs), two classes of air pollutants, in a Prince George (PG) neighborhood in warm and cold seasons. To fulfill this objective, analytical protocols were developed for accurate sampling and measurement of the targeted air pollutants. PM₂.₅ samples were analyzed for their chemical composition (i.e., toxic trace elements) in order to assess their potential emission sources. The City of Prince George, widely known as the capital of northern British Columbia (BC), Canada, has been dealing with air pollution challenges for a long time. The city has several local industries, including pulp mills, a refinery, and a couple of asphalt plants, that are the primary contributors of industrial VOCs. This research project, the first study of its kind in this region, measures the physical and chemical properties of particulate air pollutants (PM₂.₅) in the city's neighborhoods. Furthermore, this study quantifies the percentage of VOCs in the city's air samples. One of the outcomes of this project is updated data about the PM₂.₅ and VOC inventory in the selected neighborhoods. For examining PM₂.₅ chemical composition, an elemental analysis methodology was developed to measure major trace elements, including but not limited to mercury and lead. The toxicity of inhaled particulates depends on both their physical and chemical properties; thus, an understanding of aerosol properties is essential for the evaluation of such hazards and the treatment of related respiratory diseases. Mixed cellulose ester (MCE) filters were selected as suitable filters for PM₂.₅ air sampling. Chemical analyses were conducted using Inductively Coupled Plasma Mass Spectrometry (ICP-MS) for elemental analysis.
VOC measurement of the air samples was performed using a Gas Chromatography-Flame Ionization Detector (GC-FID) and Gas Chromatography-Mass Spectrometry (GC-MS), allowing for quantitative measurement of VOC molecules at sub-ppb levels. In this study, a sorbent tube (Anasorb CSC, coconut charcoal; 6 x 70-mm size, 2 sections, 50/100 mg sorbent, 20/40 mesh) was used for VOC air sampling, followed by solvent extraction and solid-phase microextraction (SPME) techniques to prepare samples for measurement by a GC-MS/FID instrument. Air sampling for both PM₂.₅ and VOCs was conducted in the summer and winter seasons for comparison. Average PM₂.₅ concentrations differ greatly between wildfire and ordinary days: the average concentration during wildfire periods is 83.0 μg/m³, versus 23.7 μg/m³ for daily samples. Higher concentrations of iron, nickel, and manganese were found in all samples, and mercury was found in some samples; even at low doses, mercury can have negative health effects.
Keywords: air pollutants, chemical analysis, particulate matter (PM₂.₅), volatile organic compounds (VOCs)
Procedia PDF Downloads 142
701 Cybersecurity Assessment of Decentralized Autonomous Organizations in Smart Cities
Authors: Claire Biasco, Thaier Hayajneh
Abstract:
A smart city is the integration of digital technologies in urban environments to enhance the quality of life. Smart cities capture real-time information from devices, sensors, and network data to analyze and improve city functions such as traffic analysis, public safety, and environmental impacts. Current smart cities face controversy due to their reliance on real-time data tracking and surveillance. Internet of Things (IoT) devices and blockchain technology are converging to reshape smart city infrastructure away from its centralized model. Connecting IoT data to blockchain applications would create a peer-to-peer, decentralized model. Furthermore, blockchain technology powers the ability for IoT device data to shift from the ownership and control of centralized entities to individuals or communities with Decentralized Autonomous Organizations (DAOs). In the context of smart cities, DAOs can govern cyber-physical systems to have a greater influence over how urban services are being provided. This paper will explore how the core components of a smart city now apply to DAOs. We will also analyze different definitions of DAOs to determine their most important aspects in relation to smart cities. Both categorizations will provide a solid foundation to conduct a cybersecurity assessment of DAOs in smart cities. It will identify the benefits and risks of adopting DAOs as they currently operate. The paper will then provide several mitigation methods to combat cybersecurity risks of DAO integrations. Finally, we will give several insights into what challenges will be faced by DAO and blockchain spaces in the coming years before achieving a higher level of maturity.
Keywords: blockchain, IoT, smart city, DAO
Procedia PDF Downloads 121
700 Effectiveness of Myofascial Release Technique in Treatment of Sacroiliac Joint Hypo-Mobility in Postnatal Women
Authors: Ahmed A. Abd El Rahim, Mohamed M. M. Essa, Magdy M. A. Shabana, Said A. Mohamed, Mohamed Ibrahim Mabrouk
Abstract:
Background: Sacroiliac joint (SIJ) dysfunction is considered the main cause of pregnancy-related back pain, which may persist postnatally. The myofascial release technique (MFR) is the application of a low-intensity, prolonged stretch to myofascial structures to improve function by increasing the sliding properties of restricted myofascial tissues. Purpose: This study was designed to investigate the effect of MFR on postnatal SIJ hypo-mobility. Materials and Methods: Fifty postnatal women complaining of SIJ hypo-mobility participated in this study. Their ages ranged from 26 to 35 years, and their body mass index (BMI) did not exceed 30 kg/m². They were randomly assigned to two equal groups, group A (Gr. A) and group B (Gr. B). Both groups received three sessions per week for eight successive weeks. Gr. A received a traditional physical therapy program, while Gr. B received the same traditional physical therapy program in addition to MFR. Doppler imaging of vibration was utilized to measure SIJ mobility pre- and post-intervention, and an electronic digital goniometer was used to measure back flexion and extension range of motion (ROM). Results: Findings revealed statistically significant improvement in post-intervention values of SIJ mobility, as well as trunk flexion and extension ROM, in Gr. B compared to Gr. A (P<0.001). Conclusion: Adding MFR to traditional physical therapy programs is highly recommended in the treatment of SIJ hypo-mobility in postnatal women.Keywords: sacroiliac hypo-mobility, sacroiliac dysfunction, myofascial release technique, traditional physical therapy, postnatal
Procedia PDF Downloads 102
699 Laser Registration and Supervisory Control of neuroArm Robotic Surgical System
Authors: Hamidreza Hoshyarmanesh, Hosein Madieh, Sanju Lama, Yaser Maddahi, Garnette R. Sutherland, Kourosh Zareinia
Abstract:
This paper illustrates the concept of an algorithm to register specified markers on the neuroArm surgical manipulators, an image-guided, MR-compatible, tele-operated robot for microsurgery and stereotaxy. Two range-finding approaches, namely time-of-flight and phase-shift, are evaluated for registration and supervisory control. The time-of-flight approach is implemented in a semi-field experiment to determine the precise position of a tiny retro-reflective moving object. The moving object simulates a surgical tool tip; the tool is a target that would be connected to the neuroArm end-effector during surgery inside the magnet bore of the MR imaging system. To implement the time-of-flight approach, a 905-nm pulsed laser diode and an avalanche photodiode are utilized as the transmitter and receiver, respectively. For the experiment, a high-frequency time-to-digital converter was designed using a field-programmable gate array (FPGA). In the phase-shift approach, a continuous green laser beam with a wavelength of 530 nm was used as the transmitter. Results showed a positioning error of 0.1 mm when the scanner-target distance was in the range of 2.5 to 3 meters. The effectiveness of this non-contact approach shows that the method could be employed as an alternative to a conventional mechanical registration arm. Furthermore, the approach is not limited by physical contact or the extension of joint angles.Keywords: 3D laser scanner, intraoperative MR imaging, neuroArm, real time registration, robot-assisted surgery, supervisory control
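The two range-finding principles evaluated above reduce to simple distance relations; a minimal sketch follows (the round-trip time, modulation frequency, and phase values are illustrative examples, not the neuroArm hardware parameters):

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Time-of-flight: the pulse travels out and back, so the
    one-way distance is half the round-trip path."""
    return C * round_trip_time_s / 2.0

def phase_shift_distance(phase_rad: float, mod_freq_hz: float) -> float:
    """Phase-shift: distance from the measured phase offset of a
    continuously modulated beam (unambiguous within c / (2 f))."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

# A 20 ns round trip corresponds to roughly 3 m, i.e. the
# 2.5-3 m scanner-target range mentioned in the abstract.
d_tof = tof_distance(20e-9)

# Hypothetical 10 MHz modulation with a pi/2 phase offset.
d_ps = phase_shift_distance(math.pi / 2, 10e6)
```

Either relation converts the raw detector measurement (a timestamp difference or a phase offset) into the scanner-target distance used for registration.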
Procedia PDF Downloads 286
698 Melt–Electrospun Polypropylene Fabrics Functionalized with TiO2 Nanoparticles for Effective Photocatalytic Decolorization
Authors: Z. Karahaliloğlu, C. Hacker, M. Demirbilek, G. Seide, E. B. Denkbaş, T. Gries
Abstract:
Currently, the textile industry plays an important role in the world economy, especially in developing countries. Dyes and pigments used in the textile industry are significant pollutants; most of these are azo dyes, which contain a chromophore (-N=N-) in their structure. Many methods exist for removing dyes from wastewater, such as chemical coagulation, flocculation, precipitation, and ozonation, but these methods have numerous disadvantages, and alternative methods are needed for wastewater decolorization. Titanium-mediated photodegradation is generally used owing to the non-toxic, insoluble, inexpensive, and highly reactive properties of the titanium dioxide semiconductor (TiO2). Melt electrospinning is an attractive manufacturing process for producing thin fibers from polypropylene (PP). PP fibers have been widely used in filtration due to their unique properties, such as hydrophobicity, good mechanical strength, chemical resistance, and low-cost production. In this study, we aimed to investigate the effect of titanium nanoparticle localization and amine modification on dye degradation, and we evaluated the applicability of the prepared chemically activated composite and pristine fabrics for a novel treatment of dyeing wastewater. A photocatalyzer material was prepared from TiO2 nanoparticles (nTi) and PP by a melt-electrospinning technique, and the electrospinning parameters of pristine PP and PP/nTi nanocomposite fabrics were optimized. Before functionalization with nTi, the surface of the fabrics was activated using glutaraldehyde (GA) and polyethyleneimine to promote dye degradation. Pristine PP and PP/nTi nanocomposite melt-electrospun fabrics were characterized using scanning electron microscopy (SEM) and X-ray photoelectron spectroscopy (XPS). Methyl orange (MO) was used as a model compound for the decolorization experiments. 
Photocatalytic performance of nTi-loaded pristine and nanocomposite melt-electrospun filters was investigated by varying the initial dye concentration (10, 20, 40 mg/L). nTi-PP composite fabrics were successfully processed into a uniform, fibrous network of beadless fibers with diameters of 800±0.4 nm. The process parameters were a voltage of 30 kV, a working distance of 5 cm, thermocouple and hot-coil temperatures of 260–300 ºC, and a flow rate of 0.07 mL/h. SEM results indicated that TiO2 nanoparticles were deposited uniformly on the nanofibers, and XPS results confirmed the presence of titanium nanoparticles and the generation of amine groups after modification. According to the photocatalytic decolorization test results, the nTi-loaded GA-treated pristine and nTi-PP nanocomposite fabric filters have superior properties, with over 90% decolorization efficiency for the GA-treated pristine and nTi-PP composite PP fabrics. In this work, melt-electrospun PP fabrics surface-functionalized with nTi were prepared as a photocatalyzer for wastewater treatment. The results show that melt-electrospun nTi-loaded GA-treated composite or pristine PP fabrics have great potential for use as photocatalytic filters for the decolorization of wastewater and thus warrant further investigation.Keywords: titanium oxide nanoparticles, polypropylene, melt-electrospinning
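The over-90% figure reported above follows from the standard decolorization-efficiency formula; a minimal sketch with hypothetical residual concentrations (the 0.8 mg/L value below is an invented example, not a measured result):

```python
def decolorization_efficiency(c0_mg_l: float, c_mg_l: float) -> float:
    """Percent decolorization from initial (c0) and residual (c) dye
    concentration, typically derived from absorbance readings."""
    return (c0_mg_l - c_mg_l) / c0_mg_l * 100.0

# Hypothetical example: 10 mg/L methyl orange reduced to 0.8 mg/L
# corresponds to 92 % removal, i.e. within the >90 % regime reported.
eff = decolorization_efficiency(10.0, 0.8)
```

The same formula is applied at each initial concentration (10, 20, 40 mg/L) to compare filters on a common percentage scale.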
Procedia PDF Downloads 267
697 Neural Network Based Control Algorithm for Inhabitable Spaces Applying Emotional Domotics
Authors: Sergio A. Navarro Tuch, Martin Rogelio Bustamante Bello, Leopoldo Julian Lechuga Lopez
Abstract:
In recent years, Mexico’s population has seen a rise in various negative physiological and mental states. Two main consequences of this problem are deficient work performance and high levels of stress, generating an important impact on a person’s physical, mental, and emotional health. Several approaches, such as the use of audiovisual stimuli to induce emotions and modify a person’s emotional state, can be applied in an effort to decrease these negative effects. Using different non-invasive sensors, such as EEG, along with luminosity measurement and face recognition, we gather information on the subject’s current emotional state. In a controlled environment, a subject is shown a series of selected images from the International Affective Picture System (IAPS) in order to induce a specific set of emotions and obtain information from the sensors. The raw data obtained are statistically analyzed to filter out only the specific groups of information that relate to the subject’s emotions and the current values of the physical variables in the controlled environment, such as luminosity, RGB light color, temperature, oxygen level, and noise. Finally, a neural-network-based control algorithm is fed the data obtained in order to provide feedback to the system and automate the modification of the environmental variables and the audiovisual content shown, so that these changes can positively alter the subject’s emotional state. During the research, it was found that light color was directly related to the type of impact generated by the audiovisual content on the subject’s emotional state. Red illumination increased the impact of violent images, while green illumination, along with relaxing images, decreased the subject’s levels of anxiety. Specific differences between men and women were found as to which type of images generated a greater impact in either gender. 
The population sample was mainly constituted by college students whose data analysis showed a decreased sensibility to violence towards humans. Despite the early stage of the control algorithm, the results obtained from the population sample give us a better insight into the possibilities of emotional domotics and the applications that can be created towards the improvement of performance in people’s lives. The objective of this research is to create a positive impact with the application of technology to everyday activities; nonetheless, an ethical problem arises since this can also be applied to control a person’s emotions and shift their decision making.Keywords: data analysis, emotional domotics, performance improvement, neural network
Procedia PDF Downloads 140
696 Identifying Biomarker Response Patterns to Vitamin D Supplementation in Type 2 Diabetes Using K-means Clustering: A Meta-Analytic Approach to Glycemic and Lipid Profile Modulation
Authors: Oluwafunmibi Omotayo Fasanya, Augustine Kena Adjei
Abstract:
Background and Aims: This meta-analysis aimed to evaluate the effect of vitamin D supplementation on key metabolic and cardiovascular parameters, such as glycated hemoglobin (HbA1C), fasting blood sugar (FBS), low-density lipoprotein (LDL), high-density lipoprotein (HDL), systolic blood pressure (SBP), and total vitamin D levels in patients with Type 2 diabetes mellitus (T2DM). Methods: A systematic search was performed across databases, including PubMed, Scopus, Embase, Web of Science, Cochrane Library, and ClinicalTrials.gov, from January 1990 to January 2024. A total of 4,177 relevant studies were initially identified. Using an unsupervised K-means clustering algorithm, publications were grouped based on common text features. Maximum entropy classification was then applied to filter studies that matched a pre-identified training set of 139 potentially relevant articles. These selected studies were manually screened for relevance. A parallel manual selection of all initially searched studies was conducted for validation. The final inclusion of studies was based on full-text evaluation, quality assessment, and meta-regression models using random effects. Sensitivity analysis and publication bias assessments were also performed to ensure robustness. Results: The unsupervised K-means clustering algorithm grouped the patients based on their responses to vitamin D supplementation, using key biomarkers such as HbA1C, FBS, LDL, HDL, SBP, and total vitamin D levels. Two primary clusters emerged: one representing patients who experienced significant improvements in these markers and another showing minimal or no change. Patients in the cluster associated with significant improvement exhibited lower HbA1C, FBS, and LDL levels after vitamin D supplementation, while HDL and total vitamin D levels increased. The analysis showed that vitamin D supplementation was particularly effective in reducing HbA1C, FBS, and LDL within this cluster. 
Furthermore, BMI, weight gain, and disease duration were identified as factors that influenced cluster assignment, with patients having lower BMI and shorter disease duration being more likely to belong to the improvement cluster. Conclusion: The findings of this machine learning-assisted meta-analysis confirm that vitamin D supplementation can significantly improve glycemic control and reduce the risk of cardiovascular complications in T2DM patients. The use of automated screening techniques streamlined the process, ensuring the comprehensive evaluation of a large body of evidence while maintaining the validity of traditional manual review processes.Keywords: HbA1C, T2DM, SBP, FBS
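The patient-grouping step described above can be sketched with a toy K-means implementation. This is an illustration only: the biomarker deltas below are synthetic stand-ins for the real per-patient changes in HbA1C, FBS, LDL, HDL, and vitamin D, and the farthest-first initialization is a simplification chosen for determinism:

```python
import numpy as np

rng = np.random.default_rng(0)
# Rows: patients; columns: synthetic change in [HbA1C, FBS, LDL, HDL, vit D].
responders = rng.normal([-1.0, -20.0, -15.0, 5.0, 12.0], 0.5, size=(30, 5))
non_responders = rng.normal([0.0, 0.0, 0.0, 0.0, 2.0], 0.5, size=(30, 5))
X = np.vstack([responders, non_responders])

# Standardize each biomarker so no single one dominates the distance metric.
X = (X - X.mean(axis=0)) / X.std(axis=0)

def kmeans(X, k=2, iters=50):
    """Minimal Lloyd's algorithm with farthest-first initialization."""
    centroids = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centroids], axis=0)
        centroids.append(X[int(np.argmax(d))])
    centroids = np.stack(centroids)
    for _ in range(iters):
        # Assign each patient to the nearest centroid, then recompute.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        centroids = np.stack([X[labels == j].mean(axis=0) if np.any(labels == j)
                              else centroids[j] for j in range(k)])
    return labels

labels = kmeans(X, k=2)
```

With well-separated response profiles, the two recovered clusters align with the "significant improvement" and "minimal change" groups described in the abstract.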
Procedia PDF Downloads 12
695 Facilitating Conditions Mediating SME’s Intention to Use Social Media for Knowledge Sharing
Authors: Stevens Phaphadi Mamorobela
Abstract:
The Covid-19 pandemic has accelerated the use of social media in SMEs to stay abreast with information about the latest news and developments and to predict the future world of business. The national shutdown regulations for curbing the spread of the Covid-19 virus resulted in SMEs having to distribute large volumes of information through social media platforms to collaborate and conduct business remotely. How much of the information shared on social media is used by SMEs as significant knowledge for economic rent is yet to be known. This study aims to investigate the facilitating conditions that enable SMEs’ intention to use social media as a knowledge-sharing platform to create economic rent and to cope with the Covid-19 challenges. A qualitative research approach was applied where semi-structured interviews were conducted with 13 SME owners located in the Gauteng province in South Africa to identify and explain the facilitating conditions of SMEs towards their intention to use social media as a knowledge-sharing tool in the Covid-19 era. The study discovered that the national lockdown regulations towards curbing the spread of the Covid-19 pandemic had compelled SMEs to adopt digital technologies that enabled them to quickly transform their business processes to cope with the challenges of the pandemic. The facilitating conditions, like access to high bandwidth internet coverage in the Gauteng region, enable SMEs to have strong intentions to use social media to distribute content and to reach out to their target market. However, the content is shared informally using diverse social media platforms without any guidelines for transforming content into rent-yielding knowledge.Keywords: facilitating conditions, knowledge sharing, social media, intention to use, SME
Procedia PDF Downloads 106
694 Communicative Competence in French Language for Nigerian Teacher-Trainees in the New-Normal Society Using Mobile Apps as a Lifelong Learning Tool
Authors: Olukemi E. Adetuyi-Olu-Francis
Abstract:
Learning is natural for living. One stops learning when life ends. Hence, there is no negotiating life-long learning. An individual has the innate ability to learn as many languages as he/she desires as long as life exists. French language education to every Nigerian teacher-trainee is a necessity. Nigeria’s geographical location requires that the French language should be upheld for economic and cultural co-operations between Nigeria and the francophone countries sharing borders with her. The French language will enhance the leadership roles of the teacher-trainees and their ability to function across borders. The 21st century learning tools are basically digital, and many apps are complementing the actual classroom interactions. This study examined the communicative competence in the French language to equip Nigerian teacher-trainees in the new-normal society using mobile apps as a lifelong learning tool. Three research questions and hypotheses guided the study, and the researcher adopted a pre-test, a post-test experimental design, using a sample size of 87 teacher-trainees in South-south geopolitical zone of Nigeria. Results showed that the use of mobile apps is effective for learning the French language. One of the recommendations is that the use of mobile apps should be encouraged for all Nigerian youths to learn the French language for enhancing leadership roles in the world of work and for international interactions for socio-economic co-operations with Nigerian neighboring countries.Keywords: communicative competence, french language, life long learning, mobile apps, new normal society, teacher trainees
Procedia PDF Downloads 236
693 Assessing the Impact of Autonomous Vehicles on Supply Chain Performance – A Case Study of Agri-Food Supply Chain
Authors: Nitish Suvarna, Anjali Awasthi
Abstract:
In an era marked by rapid technological advancements, the integration of Autonomous Vehicles into supply chain networks represents a transformative shift, promising to redefine the paradigms of logistics and transportation. This thesis delves into a comprehensive assessment of the impact of autonomous vehicles on supply chain performance, with a particular focus on network design, operational efficiency, and environmental sustainability. Employing the advanced simulation capabilities of anyLogistix (ALX), the study constructs a digital twin of a conventional supply chain network, encompassing suppliers, production facilities, distribution centers, and customer endpoints. The research methodically integrates Autonomous Vehicles into this intricate network, aiming to unravel the multifaceted effects on transportation logistics, including transit times, cost-efficiency, and sustainability. Through simulations and scenario analysis, the study scrutinizes the operational resilience and adaptability of supply chains in the face of dynamic market conditions and disruptive technologies like Autonomous Vehicles. Furthermore, the thesis undertakes a carbon footprint analysis, quantifying the environmental benefits and challenges associated with the adoption of Autonomous Vehicles in supply chain operations. The insights from this research are anticipated to offer a strategic framework for industry stakeholders, guiding the adoption of Autonomous Vehicles to foster a more efficient, responsive, and sustainable supply chain ecosystem. The findings aim to serve as a cornerstone for future research and practical implementations in the realm of intelligent transportation and supply chain management.Keywords: autonomous vehicle, agri-food supply chain, ALX simulation, anyLogistix
Procedia PDF Downloads 75
692 Photomicrograph-Based Neuropathology Consultation in Tanzania; The Utility of Static-Image Neurotelepathology in Low- And Middle-Income Countries
Authors: Francis Zerd, Brian E. Moore, Atuganile E. Malango, Patrick W. Hosokawa, Kevin O. Lillehei, Laurence Lemery Mchome, D. Ryan Ormond
Abstract:
Introduction: Since neuropathologic diagnosis in the developing world is hampered by limitations in technical infrastructure, trained laboratory personnel, and subspecialty-trained pathologists, the use of telepathology for diagnostic support, second-opinion consultations, and ongoing training holds promise as a means of addressing these challenges. This research aims to assess the utility of static teleneuropathology in improving neuropathologic diagnoses in low- and middle-income countries. Methods: Consecutive neurosurgical biopsy and resection specimens obtained at Muhimbili National Hospital in Tanzania between July 1, 2018, and June 30, 2019, were selected for retrospective, blinded static-image neuropathologic review followed by on-site review by an expert neuropathologist. Results: A total of 75 neuropathologic cases were reviewed. The agreement of static images and on-site glass diagnosis was 71% with strict criteria and 88% with less stringent criteria. This represents an overall improvement in diagnostic accuracy from 36% by general pathologists to 71% by a neuropathologist using static telepathology (or 76% to 88% with less stringent criteria). Conclusions: Telepathology offers a suitable means of providing diagnostic support, second-opinion consultations, and ongoing training to pathologists practicing in resource-limited countries. Moreover, static digital teleneuropathology is an uncomplicated, cost-effective, and reliable way to achieve these goals.Keywords: neuropathology, resource-limited settings, static image, Tanzania, teleneuropathology
Procedia PDF Downloads 102
691 Mammographic Multi-View Cancer Identification Using Siamese Neural Networks
Authors: Alisher Ibragimov, Sofya Senotrusova, Aleksandra Beliaeva, Egor Ushakov, Yuri Markin
Abstract:
Mammography plays a critical role in screening for breast cancer in women, and artificial intelligence has enabled the automatic detection of diseases in medical images. Many of the current techniques used for mammogram analysis focus on a single view (mediolateral or craniocaudal), while in clinical practice, radiologists consider multiple views of mammograms from both breasts to make a correct decision. Consequently, computer-aided diagnosis (CAD) systems could benefit from incorporating information gathered from multiple views. In this study, we introduce a method based on a Siamese neural network (SNN) model that simultaneously analyzes three mammographic views: bilateral and ipsilateral. In this way, when a decision is made on a single image of one breast, attention is also paid to two other images: a view of the same breast in a different projection and an image of the other breast. Consequently, the algorithm closely mimics the radiologist's practice of attending to the entire examination of a patient rather than to a single image. Additionally, to the best of our knowledge, this research represents the first experiments conducted using the recently released Vietnamese dataset of digital mammography (VinDr-Mammo). On an independent test set of images from this dataset, the best model achieved an AUC of 0.87 per image. This suggests that the approach can provide a valuable automated second opinion in the interpretation of mammograms and breast cancer diagnosis, which in the future may help alleviate the burden on radiologists and serve as an additional layer of verification.Keywords: breast cancer, computer-aided diagnosis, deep learning, multi-view mammogram, siamese neural network
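The core idea of a Siamese network is that one shared encoder maps every view into a common embedding space, where distances between views can be compared. A toy numpy sketch of this weight-sharing principle follows; the real model uses deep CNNs on mammograms, while the linear encoder and random "views" below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
W = rng.normal(size=(8, 64))  # one weight matrix shared by all branches

def encode(view: np.ndarray) -> np.ndarray:
    """Shared encoder: identical weights for every input view."""
    z = W @ view
    return z / np.linalg.norm(z)  # unit-norm embedding

cc_view = rng.normal(size=64)                      # stand-in craniocaudal view
mlo_view = cc_view + 0.05 * rng.normal(size=64)    # same breast, other projection
other_view = rng.normal(size=64)                   # unrelated image

# Embedding distances: views of the same breast land close together,
# unrelated views land far apart.
same = np.linalg.norm(encode(cc_view) - encode(mlo_view))
diff = np.linalg.norm(encode(cc_view) - encode(other_view))
```

Training then pushes `same`-type distances down and `diff`-type distances up, which is what lets the tri-view model weigh corroborating projections when scoring a single image.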
Procedia PDF Downloads 138
690 Predicting Susceptibility to Coronary Artery Disease using Single Nucleotide Polymorphisms with a Large-Scale Data Extraction from PubMed and Validation in an Asian Population Subset
Authors: K. H. Reeta, Bhavana Prasher, Mitali Mukerji, Dhwani Dholakia, Sangeeta Khanna, Archana Vats, Shivam Pandey, Sandeep Seth, Subir Kumar Maulik
Abstract:
Introduction: Research has demonstrated a connection between coronary artery disease (CAD) and genetics. We performed deep literature mining, using both bioinformatics and manual efforts, to identify polymorphisms associated with susceptibility to coronary artery disease, and we sought to validate these findings in an Asian population. Methodology: In the first phase, we used an automated pipeline that organizes and presents structured information on SNPs, populations, and diseases. The information was obtained by applying natural language processing (NLP) techniques to approximately 28 million PubMed abstracts. To accomplish this, we utilized Python scripts to extract and curate disease-related data, filter out false positives, and categorize them into 24 hierarchical groups using named entity recognition (NER) algorithms. From this extensive search, a total of 466 unique PubMed identifiers (PMIDs) and 694 single nucleotide polymorphisms (SNPs) related to CAD were identified. To refine the selection, a thorough manual examination of all the studies was carried out: SNPs that demonstrated susceptibility to CAD and exhibited a positive odds ratio (OR) were selected, yielding a final pool of 324 SNPs. The next phase involved validating the identified SNPs in DNA samples from 96 CAD patients and 37 healthy controls from the Indian population using a Global Screening Array. Results: Of the 324 SNPs, only 108 were expressed; of these, 4 SNPs showed a significant difference in minor allele frequency between cases and controls: rs187238 of the IL-18 gene, rs731236 of the VDR gene, rs11556218 of the IL16 gene, and rs5882 of the CETP gene. Prior research has reported associations of these SNPs with various pathways, such as endothelial damage, susceptibility conferred by vitamin D receptor (VDR) polymorphisms, and reduction of HDL-cholesterol levels, ultimately leading to the development of CAD. 
Among these, only rs731236 had previously been studied in the Indian population, and only in diabetes and vitamin D deficiency. For the first time, these SNPs are reported to be associated with CAD in the Indian population. Conclusion: This pool of 324 SNPs is a unique resource that can help uncover risk associations in CAD. Here, we validated it in the Indian population. Further validation in different populations may offer valuable insights and contribute to the development of a screening tool, enabling the implementation of primary prevention strategies targeted at vulnerable populations.Keywords: coronary artery disease, single nucleotide polymorphism, susceptible SNP, bioinformatics
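The abstract-screening stage described above (grouping text by shared features, then classifying against a relevant training set) can be illustrated with a much simpler relevance scorer. This bag-of-words cosine sketch is only an illustration of the filtering idea, not the authors' maximum-entropy pipeline, and the training snippets and "rs123" candidate below are invented:

```python
import math
import re
from collections import Counter

def bow(text: str) -> Counter:
    """Bag-of-words features: lowercase alphabetic tokens with counts."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Invented stand-ins for the manually curated "relevant" training set.
training = [
    "SNP polymorphism associated with coronary artery disease risk",
    "coronary artery disease genetic variant odds ratio",
]
centroid = bow(" ".join(training))

candidates = [
    "rs123 polymorphism increases coronary artery disease risk",  # relevant
    "crop yield improved by irrigation scheduling",               # irrelevant
]
scores = [cosine(bow(t), centroid) for t in candidates]
```

Abstracts scoring above a threshold would pass to the manual-review stage, mirroring the automated-then-manual funnel the authors describe.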
Procedia PDF Downloads 76
689 Satellite Solutions for Koshi Floods
Authors: Sujan Tyata, Alison Shilpakar, Nayan Bakhadyo, Kushal K. C., Abhas Maskey
Abstract:
The Koshi River, acknowledged as the "Sorrow of Bihar," poses intricate challenges characterized by recurrent flooding. Within the Koshi Basin, floods have historically inflicted damage on infrastructure, agriculture, and settlements. The Koshi River exhibits a highly braided pattern across a 48 km stretch south of Chatara. The devastating flood from the Koshi River, which began in Nepal's Sunsari District in 2008, led to significant casualties and the destruction of agricultural areas. The catastrophe was exacerbated by a levee breach, underscoring the vulnerability of the region's flood defenses. A comprehensive understanding of environmental changes in the area is obtained through satellite imagery analysis, which facilitates the identification of high-risk zones and their contributing factors. Employing remote sensing, the analysis specifically pinpoints locations vulnerable to levee breaches. Topographical features of the area, along with longitudinal and cross-sectional profiles of the river and levee obtained from a digital elevation model, are used in the hydrological analysis for flood assessment. To mitigate the impact of floods, the strategy involves establishing reservoirs upstream; leveraging satellite data, optimal locations for water storage are identified. This approach presents a dual opportunity not only to alleviate flood risks but also to catalyze the implementation of pumped-storage hydropower initiatives. This holistic approach addresses environmental challenges while championing sustainable energy solutions.Keywords: flood mitigation, levee, remote sensing, satellite imagery analysis, sustainable energy solutions
Procedia PDF Downloads 64
688 Tracing Digital Traces of Phatic Communion in #Mooc
Authors: Judith Enriquez-Gibson
Abstract:
This paper meddles with the notion of phatic communion introduced 90 years ago by Malinowski, a Polish-born British anthropologist. It explores the phatic in Twitter within the contents of tweets related to moocs (massive online open courses) as a topic or trend. It is not about moocs, though; it is about practices that could easily be hidden or neglected if we let big or massive topics take the lead, or if we simply follow the computational or secret codes behind Twitter itself and third-party software analytics. It draws from media and cultural studies. Though at first it appears data-driven, as I entrusted data collection and analytics to a third-party tool, Twitonomy, the aim is to follow how phatic communion might be practised in a social media site such as Twitter. Lurking becomes its research method for analysing mooc-related tweets. A total of 3,000 tweets were collected on 11 October 2013 (UK timezone). The emphasis of lurking is to engage with Twitter as a system of connectivity. One interesting finding is that a click is in fact a phatic practice. A click breaks the silence. A click on one of the mooc websites is actually a tweet: a tweet was posted on behalf of a user who simply chose to click, without formulating the text and perhaps without knowing that it contains #mooc. Surely, this mechanism is not about reciprocity. To break the silence, users did not use words; they just clicked the ‘tweet button’ on a mooc website. A click performs and maintains connectivity, with Twitter as the medium in attendance in our everyday, available when needed to be of service. In conclusion, the phatic culture of breaking silence in Twitter does not have to submit to the power of code and analytics. It is a matter of human code.Keywords: click, Twitter, phatic communion, social media data, mooc
Procedia PDF Downloads 412
687 Exploring the Changes in the Publishing of Scientific Journals in the Age of Digital Transformation as the Main Measure for Scientific Communication
Authors: Arūnas Gudinavičius
Abstract:
The historiography of scholarly journals in Eastern Europe is fragmented, and so far, the development of scholarly journals in Eastern Europe has not been studied from a publishing point of view in the context of scientific communication. There are only a few general articles on the period before World War II; there is also hardly any systematic, publicly available information about the Soviet period or about the state of scientific communication in Eastern Europe at the end of the twentieth century and the beginning of the twenty-first. There is a lack of research data on scholarly journals in Lithuania, and existing research focuses mostly on the specific needs of academic institutions. The publication of scientific journals and papers is analyzed as part of the scientific communication circle. Formal scientific communication is examined from the point of view that it comprises the results formed in the course of communication, which are necessarily characterized by long-term access for a large circle of users. An improved model of scientific communication is used, which supplements the dissemination of research results with formal and informal communication channels; according to this model, the scientific communication system forms the essence of science, in which the social, lasting value of science and the dissemination of scientific results to scientists and the public are most important. The model covers the scientific communication process from research initiation to journal publication and citation. We focus on the publishing and dissemination stages of this model. The paper aims to systematize and analyze the various types of scientific journal publishers from 1907 until 2020 as a means of formal documentary communication in the context of scientific communication as a whole. 
The research analyses the case of a small European country, presenting chronological and geographical characteristics of the publication of scientific periodicals and analyzing the publishers of scientific periodicals, their activities, publishing formats, and dissemination methods.Keywords: scientific communication, scientific periodicals, scientific journals, publishing
Procedia PDF Downloads 77
686 Pain Management Program in Helping Community-Dwelling Older Adults and Their Informal Caregivers to Manage Pain and Related Situations
Authors: Mimi My Tse
Abstract:
The prevalence of chronic non-cancer pain is high among community-dwelling older adults. Pain affects physical and psychosocial abilities: older adults with pain tend to be less mobile and have a higher risk of falls. In addition, older adults with pain are often depressed, anxious, and reluctant to join social activities, which leaves them lonely and socially isolated. Although pain management education and programs should address both older adults and their caregivers, the majority of existing services are offered to older adults only. Given the importance of family members in increasing compliance with health-promoting programs, we proposed offering a pain management program to each older adult together with his or her caregiver as a "dyad." We used the Health Promotion Model and implemented a dyadic pain management program (DPM). The DPM is an 8-week group-based program comprising 4 weeks of center-based, face-to-face activities and 4 weeks of digital activities delivered via a WhatsApp group. There were 30 dyads (15 in the experimental group with the DPM and 15 in the control group with pain education pamphlets). Upon completion of the DPM, pain intensity and pain interference were significantly lower in the intervention group than in the control group. At the same time, the intervention group showed significant improvement in physical function and lower depression scores. In conclusion, the study highlights the potential benefits of involving caregivers in the management of chronic pain for older adults. This approach should be widely promoted in managing chronic pain for community-dwelling older adults and their caregivers.Keywords: pain, older adults, dyadic approach, education
Procedia PDF Downloads 78