Search results for: sustainable forest management
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13603

13 A Study of the Trap of Multi-Homing in Customers: A Comparative Case Study of Digital Payments

Authors: Shari S. C. Shang, Lynn S. L. Chiu

Abstract:

In the digital payment market, some consumers use only one payment wallet, while many others engage in multi-homing across a variety of payment services. With the diffusion of new payment systems, we examined the determinants of the adoption of multi-homing behavior. This study aims to understand how a digital payment provider dynamically expands business touch points with cross-business strategies to enrich the digital ecosystem and avoid the trap of multi-homing in customers. By synthesizing the platform ecosystem literature, we constructed a two-dimensional research framework with one determinant of user digital behavior, from offline to online intentions, and the other determinant of digital payment touch points, from convenient accessibility to cross-business platforms. To explore on a broader scale, we selected 12 digital payment services from five countries: the UK, the US, Japan, Korea, and Taiwan. Based on the interplay of user digital behaviors and payment touch points, we grouped the study cases into four types: (1) Channel Initiated: users originated from retailers with high access to in-store shopping and face-to-face guidance for payment adoption. Providers offer rewards for customer loyalty and secure the retailer’s efficient cash flow management. (2) Social Media Dependent: users are usually digital natives with high access to social media or the internet who shop and pay digitally. Providers might not own physical or online shops but are licensed to aggregate money flows through virtual ecosystems. (3) Early Life Engagement: digital banks race to capture the next generation, from popularity to profitability. This type of payment aims to give children a taste of financial freedom while letting parents track their spending. Providers seek to capitalize on the digital payment and e-commerce boom and hold on to new customers into adulthood. 
(4) Traditional Banking: plastic credit cards are purposely designed as a control group to track the evolution of business strategies in digital payments. Traditional credit card users may follow the bank’s digital strategy and land on different types of digital wallets, or mostly keep using plastic credit cards. This research analyzed the business growth models and inter-firm coopetition strategies of the selected cases. Results of the multiple case analysis reveal that channel initiated payments bundled rewards with the retailer’s business discounts for recurring purchases. They also extended other financial services, such as insurance, to fulfill customers’ new demands. In contrast, social media dependent payments developed new usages and new value creation, such as P2P money transfer through network effects among virtual social ties, while early life engagement payments offer virtual banking products to children who are digital natives but overlooked by incumbents. This has disrupted the banking business domains in preparation for the metaverse economy. Lastly, the control group of traditional plastic credit cards has gradually converted to a BaaS (banking as a service) model, depending on customers’ preferences. Multi-homing behavior is not avoidable in digital payment competition. Payment providers may encounter multiple waves of multi-homing threats after a short period of success. A dynamic cross-business collaboration strategy should be explored to continuously evolve digital ecosystems and allow users a broader shopping experience and continual usage.

Keywords: digital payment, digital ecosystems, multi-homing users, cross-business strategy, user digital behavior intentions

Procedia PDF Downloads 160
12 Amino Acid Based Biodegradable Poly (Ester-Amide)s and Their Potential Biomedical Applications as Drug Delivery Containers and Antibacterial

Authors: Nino Kupatadze, Tamar Memanishvili, Natia Ochkhikidze, David Tugushi, Zaal Kokaia, Ramaz Katsarava

Abstract:

Amino acid-based biodegradable poly(ester-amide)s (PEAs) have gained considerable interest as promising materials for numerous biomedical applications. These polymers exhibit high biocompatibility and easily form small particles suitable for delivering various biologicals, as well as elastic bio-erodible films serving as matrices for constructing antibacterial coatings. In the present work, we have demonstrated the potential of PEAs for two applications: 1. cell therapy for stroke, as vehicles for the delivery and sustained release of growth factors; 2. bactericidal coatings that prevent biofilm formation and are applicable in infected wound management. Stroke remains the main cause of adult disability, with limited treatment options. Although stem cell therapy is a promising strategy, it still requires improvement of cell survival, differentiation, and tissue modulation. Recently, microspheres (MPs) made of biodegradable polymers have gained significant attention for providing necessary support to transplanted cells. To investigate this strategy in the cell therapy of stroke, MPs loaded with the transcription factors Wnt3A/BMP4 were prepared. These proteins have been shown to mediate the maturation of cortical neurons. We suggest that implantation of these materials could create a suitable microenvironment for implanted cells. Particles with a spherical shape, porous surface, and 5-40 μm in size (monitored by scanning electron microscopy) were made on the basis of an original PEA composed of adipic acid, L-phenylalanine, and 1,4-butanediol. Four months after transplantation of the MPs in rodent brain, no inflammation was observed. Additionally, the factors were successfully released from the MPs and affected neuronal cell differentiation in vitro. The in vivo study using loaded MPs is in progress. Another severe problem in biomedicine is the prevention of biofilm formation on surgical devices. 
Antimicrobial polymeric coatings are the most effective “shields” to protect surfaces/devices from biofilm formation. Among matrices for constructing such coatings, preference should be given to bio-erodible polymers. Coatings of this type play the role of “unstable seating” that does not allow bacteria to occupy the surface. In other words, bio-erodible coatings are an uncomfortable shelter for bacteria and, along with releasing bactericidal agents (“killers of bacteria”), should prevent the formation of biofilm. For this purpose, we selected an original biodegradable PEA composed of L-leucine, 1,6-hexanediol, and sebacic acid as a bio-erodible matrix, and nanosilver (AgNPs) as a bactericidal agent. Such a nanocomposite material is also promising in the treatment of superficial wounds and ulcers. The solubility of the PEA in ethanol allows AgNO3 to be reduced to NPs directly in solution, where the solvent serves as the reducing agent and the PEA serves as the NP stabilizer. Photochemical reduction was selected as the basic method to form the NPs. The obtained AgNPs were characterized by UV spectroscopy, transmission electron microscopy (TEM), and dynamic light scattering (DLS). According to the UV and TEM data, the photochemical reduction resulted in spherical AgNPs with a wide particle size distribution and a high contribution of particles below 10 nm, which are known to be responsible for the bactericidal activity of AgNPs. The DLS study showed that the average size of the nanoparticles formed after photo-reduction in ethanol solution was ca. 50 nm.

Keywords: biodegradable polymers, microparticles, nanocomposites, stem cell therapy, stroke

Procedia PDF Downloads 395
11 Climate Change Threats to UNESCO-Designated World Heritage Sites: Empirical Evidence from Konso Cultural Landscape, Ethiopia

Authors: Yimer Mohammed Assen, Abiyot Legesse Kura, Engida Esyas Dube, Asebe Regassa Debelo, Girma Kelboro Mensuro, Lete Bekele Gure

Abstract:

Climate change has recently posed severe threats to many cultural landscapes of UNESCO World Heritage Sites. The UNESCO State of Conservation (SOC) reports categorized flooding, temperature increase, and drought as threats to cultural landscapes. This study aimed to examine variations and trends of rainfall and temperature extreme events and their threats to the UNESCO-designated Konso Cultural Landscape in southern Ethiopia. The study used dense merged satellite-gauge station rainfall data (1981-2020) with a spatial resolution of 4 km by 4 km and observed maximum and minimum temperature data (1987-2020). Qualitative data were also gathered from cultural leaders, local administrators, and religious leaders using structured interview checklists. The spatial patterns, coefficient of variation, standardized anomalies, trends, and magnitude of change of rainfall and temperature extreme events at both annual and seasonal levels were computed using the Mann-Kendall trend test and Sen’s slope estimator under the CDT package. The standardized precipitation index (SPI) was also used to calculate drought severity, frequency, and trend maps. The data gathered from key informant interviews and focus group discussions were coded and analyzed thematically to complement the statistical findings. Thematic areas that explain the impacts of extreme events on the cultural landscape were chosen for coding. The thematic analysis was conducted using NVivo software. The findings revealed that rainfall was highly variable and unpredictable, resulting in extreme droughts and floods. There were significant (P<0.05) increasing trends of heavy rainfall (R10mm and R20mm) and the total amount of rain on wet days (PRCPTOT), which might have resulted in flooding. 
The study also confirmed that the absolute temperature extreme indices (TXx, TXn, and TNx) and the percentile-based temperature extreme indices (TX90p, TN90p, TX10p, and TN10p) showed significant (P<0.05) increasing trends, which are signals of warming in the study area. The results revealed that the frequency as well as the severity of drought at the 3-month (katana/hageya seasons) timescale was more pronounced than at the 12-month (annual) timescale. The highest number of droughts in 100 years is projected at the 3-month timescale across the study area. The findings also showed that frequent drought has led to the loss of grasses used for making traditional individual houses and multipurpose communal houses (pafta), food insecurity, migration, loss of biodiversity, and commodification of stones from terraces. On the other hand, the increasing trends of rainfall extreme indices resulted in the destruction of terraces, soil erosion, loss of life, and damage to property. The study shows that a persistent decline in farmland productivity, due to erratic and extreme rainfall and frequent drought occurrences, forced the local people to participate in non-farm activities and retreat from the daily preservation and management of their landscape. Overall, the increasing rainfall and temperature extremes, coupled with the prevalence of drought, are thought to have an impact on the sustainability of the cultural landscape by disrupting the ecosystem services and livelihood of the community. Therefore, more localized adaptation and mitigation strategies for the changing climate are needed to maintain the sustainability of the Konso cultural landscape as a global cultural treasure and to strengthen the resilience of smallholder farmers.
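The trend statistics above come from the Mann-Kendall test and Sen's slope estimator. The study itself used the CDT package; purely as an illustrative sketch of the two statistics (simplified: the tie correction in the variance is omitted, and serial correlation is assumed absent), they can be computed as:

```python
from itertools import combinations
from statistics import median, NormalDist

def mann_kendall(series):
    """Simplified Mann-Kendall trend test with Sen's slope.

    Returns (S, Z, slope, p). Assumes few ties and no serial
    correlation; the tie correction in Var(S) is omitted.
    """
    n = len(series)
    # S = concordant pairs minus discordant pairs over all i < j
    s = sum((x2 > x1) - (x2 < x1)
            for (i, x1), (j, x2) in combinations(enumerate(series), 2))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / var_s ** 0.5
    elif s < 0:
        z = (s + 1) / var_s ** 0.5
    else:
        z = 0.0
    # Sen's slope: median of all pairwise slopes
    slope = median((x2 - x1) / (j - i)
                   for (i, x1), (j, x2) in combinations(enumerate(series), 2))
    p = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    return s, z, slope, p

# A steadily increasing annual series yields a positive S and slope
# and a small p-value, i.e. a significant increasing trend.
s, z, slope, p = mann_kendall([10, 12, 11, 15, 16, 18, 20, 22])
```

A significant increasing trend in, say, R10mm would appear here as a positive slope with p below 0.05, matching the P<0.05 criterion used in the abstract.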

Keywords: adaptation, cultural landscape, drought, extreme indices

Procedia PDF Downloads 26
10 Development of an Omaha System-Based Remote Intervention Program for Work-Related Musculoskeletal Disorders (WMSDs) Among Front-Line Nurses

Authors: Tianqiao Zhang, Ye Tian, Yanliang Yin, Yichao Tian, Suzhai Tian, Weige Sun, Shuhui Gong, Limei Tang, Ruoliang Tang

Abstract:

Introduction: Healthcare workers, especially nurses all over the world, are highly vulnerable to work-related musculoskeletal disorders (WMSDs), experiencing high rates of neck, shoulder, and low back injuries due to unfavorable working conditions. To reduce WMSDs among nursing personnel, many workplace interventions have been developed and implemented. Unfortunately, the ongoing Covid-19 (SARS-CoV-2) pandemic has posed great challenges to ergonomic practices and interventions in healthcare facilities, particularly hospitals, since current Covid-19 mitigation measures, such as social distancing and working remotely, have substantially reduced in-person gatherings and trainings. On the other hand, hospitals throughout the world have been short-staffed, resulting in disturbed shift scheduling and, more importantly, increased job demands among the available caregivers, particularly doctors and nurses. With the latest developments in communication technology, remote intervention measures have been developed as an alternative that does not require in-person meetings. The Omaha System (OS) is a standardized classification system for nursing practice, comprising a problem classification system, an intervention system, and an outcome evaluation system. This paper describes the development of an OS-based ergonomic intervention program. Methods: First, a comprehensive literature search was performed among worldwide electronic databases, including PubMed, Web of Science, Cochrane Library, and China National Knowledge Infrastructure (CNKI), from journal inception to May 2020, resulting in a total of 1,418 scientific articles. After two independent screening processes, the final knowledge pool included eleven randomized controlled trials, which were used to draft the intervention program with the Omaha intervention subsystem as the framework. 
After determining the sample size needed for statistical power and allowing for potential loss to follow-up, a total of 94 nurses from eight clinical departments provided written informed consent to participate in the study and were subsequently randomly assigned to two groups (i.e., intervention vs. control). A subgroup of twelve nurses was randomly selected to participate in semi-structured interviews, during which their general understanding and awareness of musculoskeletal disorders and potential interventions were assessed. The first draft was then modified to reflect the findings from these interviews. Meanwhile, the tentative program schedule was also assessed. Next, two rounds of consultation were conducted among experts in nursing management, occupational health, psychology, and rehabilitation to further adjust and finalize the intervention program. The control group had access to all the information and exercise modules at baseline, while an interdisciplinary research team was formed to supervise the implementation of the online intervention program through multiple social media groups. Outcome measures of this comparative study included biomechanical load assessed by the Quick Exposure Check and stresses due to awkward body postures. Results and Discussion: Modifications to the draft included (1) supplementing traditional Chinese medicine practices, (2) adding the use of assistive patient handling equipment, and (3) revising the online training method. The information module should run once a week, for about 20 to 30 minutes per session, for a total of 6 weeks, while the exercise module should run 5 times a week, for about 15 to 20 minutes per session, for a total of 6 weeks.

Keywords: ergonomic interventions, musculoskeletal disorders (MSDs), Omaha System, nurses, Covid-19

Procedia PDF Downloads 182
9 Hybrid GNN-Based Machine Learning Forecasting Model for Industrial IoT Applications

Authors: Atish Bagchi, Siva Chandrasekaran

Abstract:

Background: According to World Bank national accounts data, the estimated global manufacturing value-added output in 2020 was 13.74 trillion USD. These manufacturing processes are monitored, modelled, and controlled by advanced, real-time, computer-based systems, e.g., Industrial IoT, PLC, SCADA, etc. These systems measure and manipulate a set of physical variables, e.g., temperature, pressure, etc. Despite the use of IoT, SCADA, etc., in manufacturing, studies suggest that unplanned downtime leads to economic losses of approximately 864 billion USD each year. Therefore, real-time, accurate detection, classification, and prediction of machine behaviour are needed to minimise financial losses. Although vast literature exists on time-series data processing using machine learning, the challenges faced by the industries that lead to unplanned downtimes are: the current algorithms do not efficiently handle the high-volume streaming data from industrial IoT sensors and were tested on static and simulated datasets; while the existing algorithms can detect significant 'point' outliers, most do not handle contextual outliers (e.g., values within the normal range but occurring at an unexpected time of day) or subtle changes in machine behaviour; and machines are revamped periodically as part of planned maintenance programmes, which changes the assumptions on which the original AI models were created and trained. Aim: This research study aims to deliver a Graph Neural Network (GNN) based hybrid forecasting model that interfaces with the real-time machine control system and can detect and predict machine behaviour and behavioural changes (anomalies) in real-time. This research will help manufacturing industries and utilities, e.g., water, electricity, etc., reduce unplanned downtimes and consequential financial losses. 
Method: The data stored within a process control system, e.g., Industrial IoT, Data Historian, is generally sampled during data acquisition from the sensor (source) and when persisting in the Data Historian to optimise storage and query performance. The sampling may inadvertently discard values that might contain subtle aspects of behavioural changes in machines. This research proposed a hybrid forecasting and classification model which combines the expressive and extrapolation capability of a GNN, enhanced with estimates of entropy and spectral changes in the sampled data and additional temporal contexts, to reconstruct the likely temporal trajectory of machine behavioural changes. The proposed real-time model belongs to the deep learning category of machine learning and interfaces with the sensors directly or through a 'Process Data Historian', SCADA, etc., to perform forecasting and classification tasks. Results: The model was interfaced with a Data Historian holding time-series data from 4 flow sensors within a water treatment plant for 45 days. The recorded sampling interval for a sensor varied from 10 sec to 30 min. Approximately 65% of the available data was used for training the model, 20% for validation, and the rest for testing. The model identified the anomalies within the water treatment plant and predicted the plant's performance. These results were compared with the data reported by the plant SCADA-Historian system and the official data reported by the plant authorities. The model's accuracy was much higher (by 20%) than that reported by the SCADA-Historian system and matched the validated results declared by the plant auditors. Conclusions: The research demonstrates that a hybrid GNN-based approach enhanced with entropy calculation and spectral information can effectively detect and predict a machine's behavioural changes. 
The model can interface with a plant's 'process control system' in real-time to perform forecasting and classification tasks, aiding asset management engineers in operating their machines more efficiently and reducing unplanned downtimes. A series of trials of this model is planned in other manufacturing industries in the future.
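The abstract augments the GNN with entropy and spectral-change estimates of the sampled sensor signal, but does not specify the exact feature computation. As a hypothetical illustration only (not the authors' implementation), one such feature is spectral entropy: the Shannon entropy of the normalised power spectrum, which stays near zero for a regular periodic signal and rises as the signal becomes irregular:

```python
import cmath
import math

def spectral_entropy(signal):
    """Shannon entropy of the normalised power spectrum.

    Uses a naive DFT (fine for short windows). Low values indicate
    one dominant frequency; high values indicate broadband or
    irregular behaviour. Illustrative feature only.
    """
    n = len(signal)
    power = []
    for k in range(1, n // 2):  # skip DC, keep positive frequencies
        coeff = sum(x * cmath.exp(-2j * math.pi * k * t / n)
                    for t, x in enumerate(signal))
        power.append(abs(coeff) ** 2)
    total = sum(power)
    if total == 0:
        return 0.0
    probs = [p / total for p in power]
    return -sum(p * math.log(p) for p in probs if p > 0)

# A pure sine concentrates power in one frequency bin (entropy ~ 0);
# an added off-bin component plus deterministic pseudo-noise spreads
# the spectrum, so the entropy rises -- a crude behavioural-change cue.
clean = [math.sin(2 * math.pi * 4 * t / 64) for t in range(64)]
noisy = [x
         + 0.8 * math.sin(2 * math.pi * 9.5 * t / 64 + 1.0)
         + 0.3 * (((t * 7919) % 13) - 6) / 6
         for t, x in enumerate(clean)]
```

In a streaming setting, such a feature would be computed over sliding windows of each sensor's samples and fed to the model alongside the raw values; a jump in windowed entropy flags a candidate behavioural change.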

Keywords: GNN, entropy, anomaly detection, industrial time-series, AI, IoT, Industry 4.0, machine learning

Procedia PDF Downloads 150
8 Development Programmes Requirements for Managing and Supporting the Ever-Dynamic Job Roles of Middle Managers in Higher Education Institutions: The Espousal Demanded from the Human Resources Department; Case Studies of a New University in the United Kingdom

Authors: Mohamed Sameer Mughal, Andrew D. Ross, Damian J. Fearon

Abstract:

Background: The fast-paced, changing landscape of UK Higher Education Institutions (HEIs) is marked by changes and challenges affecting Middle Managers (MM) in their job roles. MM contribute to the success of HEIs by maintaining equilibrium and translating organizational strategies from senior staff into operational directives for junior staff. However, as the data analyzed during the semi-structured interviews in this study show, the MM job role is becoming more complex due to changes and challenges creating colossal pressures and workloads in day-to-day working. Current development programme provision by Human Resources (HR) departments in such HEIs is not feasible or applicable and does not match the true requirements of MM, who suggest that programmes offered by HR are too generic to suit their precise needs and that they require tailor-made support to work effectively in their pertinent job roles. Methodologies: This study aims to capture the demands of MM Development Needs (DN) by means of a conceptual model as the conclusive part of the research, which is divided into 2 phases. Phase 1 was initiated by carrying out 2 pilot interviews with a retired Emeritus-status professor and an HR programmes development coordinator. Key themes from the pilot and the literature review fed into the formulation of a set of 22 questions (Kvale and Brinkmann) in the form of an interview questionnaire for qualitative data collection. The data strategy and collection consisted of purposeful sampling of 12 semi-structured interviews (n=12) lasting approximately an hour for all participants. The MM interviewed were at faculty and departmental levels and included deans (n=2), heads of departments (n=4), subject leaders (n=2), and programme leaders (n=4). Participant recruitment was carried out via emails and a snowballing technique. The interview data were transcribed (verbatim) and managed using Computer Assisted Qualitative Data Analysis with NVivo ver. 11 software. 
Data were meticulously analyzed using Miles and Huberman's inductive approach to positivistic-style grounded theory, whereby key themes and categories emerged from the rich data collected. The data were precisely coded and classified into case studies (Robert Yin), with a main case study, sub-cases (4 classes of MM), and embedded cases (12 individual MMs). Major Findings: An interim conceptual model emerged from analyzing the data, with main concepts that included key performance indicators (KPIs), HEI effectiveness and outlook, practices, processes and procedures, support mechanisms, student events, rules, regulations and policies, career progression, reporting/accountability, changes and challenges, and lastly skills and attributes. Conclusion: The dynamic elements affecting MM include increases in government pressures and student numbers, irrelevant development programmes, bureaucratic structures, transparency and accountability, organization policies, and skill sets; these can only be confronted through structured development programmes originated by HR rather than provided generically. Future Work: Stage 2 (quantitative method) of the study plans to validate the interim conceptual model externally through a fully completed online survey questionnaire (Bram Oppenheim) from external HEIs (n=150). The total sample targeted is 1,500 MM. The authors' contribution focuses on enhancing management theory and narrowing the gap between HR provision and MM development programme needs.

Keywords: development needs (DN), higher education institutions (HEIs), human resources (HR), middle managers (MM)

Procedia PDF Downloads 232
7 Open Science Philosophy, Research and Innovation

Authors: C. Ardil

Abstract:

Open Science translates the understanding and application of various theories and practices in open science philosophy, systems, paradigms and epistemology. Open Science originates with the premise that universal scientific knowledge is a product of a collective scholarly and social collaboration involving all stakeholders and knowledge belongs to the global society. Scientific outputs generated by public research are a public good that should be available to all at no cost and without barriers or restrictions. Open Science has the potential to increase the quality, impact and benefits of science and to accelerate advancement of knowledge by making it more reliable, more efficient and accurate, better understandable by society and responsive to societal challenges, and has the potential to enable growth and innovation through reuse of scientific results by all stakeholders at all levels of society, and ultimately contribute to growth and competitiveness of global society. Open Science is a global movement to improve accessibility to and reusability of research practices and outputs. In its broadest definition, it encompasses open access to publications, open research data and methods, open source, open educational resources, open evaluation, and citizen science. The implementation of open science provides an excellent opportunity to renegotiate the social roles and responsibilities of publicly funded research and to rethink the science system as a whole. Open Science is the practice of science in such a way that others can collaborate and contribute, where research data, lab notes and other research processes are freely available, under terms that enable reuse, redistribution and reproduction of the research and its underlying data and methods. 
Open Science represents a novel systematic approach to the scientific process, shifting from the standard practices of publishing research results in scientific publications towards sharing and using all available knowledge at an earlier stage in the research process, based on cooperative work and diffusing scholarly knowledge with no barriers and restrictions. Open Science refers to efforts to make the primary outputs of publicly funded research (publications and the research data) publicly accessible in digital format with no limitations. Open Science is about extending the principles of openness to the whole research cycle, fostering sharing and collaboration as early as possible, thus entailing a systemic change to the way science and research are done. Open Science is the ongoing transition in how open research is carried out, disseminated, deployed, and transformed to make scholarly research more open, global, collaborative, creative, and closer to society. Open Science involves various movements aiming to remove the barriers to sharing any kind of output, resources, methods, or tools at any stage of the research process. Open Science embraces open access to publications, research data, source software, collaboration, peer review, notebooks, educational resources, monographs, citizen science, and research crowdfunding. The recognition and adoption of open science practices, including open science policies that increase open access to scientific literature and encourage data and code sharing, is increasing in the open science philosophy. Revolutionary open science policies are motivated by ethical, moral, or utilitarian arguments, such as the right to access digital research literature for open source research or science data accumulation, research indicators, transparency in the field of academic practice, and reproducibility. Open science philosophy is adopted primarily to demonstrate the benefits of open science practices. 
Researchers use open science applications to their own advantage in order to gain more offers, increase citations, and attract media attention, potential collaborators, career opportunities, donations, and funding opportunities. In open science philosophy, open data findings are evidence that open science practices provide significant benefits to researchers in scientific research creation, collaboration, communication, and evaluation relative to more traditional closed science practices. Open science also raises concerns such as the rigor of peer review, practical research matters such as financing and career development, and the sacrifice of author rights. Therefore, researchers are recommended to implement open science research within the framework of existing academic evaluation and incentives. As a result, open science research issues are addressed in the areas of publishing, financing, collaboration, resource management and sharing, career development, and the discussion of open science questions and conclusions.

Keywords: Open Science, Open Science Philosophy, Open Science Research, Open Science Data

Procedia PDF Downloads 131
6 Translation of Self-Inject Contraception Training Objectives Into Service Performance Outcomes

Authors: Oluwaseun Adeleke, Samuel O. Ikani, Simeon Christian Chukwu, Fidelis Edet, Anthony Nwala, Mopelola Raji

Abstract:

Background: Health service providers are offered in-service training periodically to strengthen their ability to deliver services that are ethical, quality, timely, and safe. Not all capacity-building courses have successfully resulted in the intended service delivery outcomes because of poor training content, design, approach, and ambiance. The Delivering Innovations in Selfcare (DISC) project developed the Moment of Truth innovation, a proven training model focused on improving consumer/provider interaction that leads to an increase in the voluntary uptake of subcutaneous depot medroxyprogesterone acetate (DMPA-SC) self-injection among women who opt for injectable contraception. Methodology: Six months after training on the Moment of Truth (MoT) training manual, the project conducted two intensive rounds of qualitative data collection and triangulation that included provider, client, and community mobilizer interviews, facility observations, and routine program data collection. Respondents were sampled according to a convenience sampling approach, and the data collected were analyzed using a codebook and Atlas.ti. Providers and clients were interviewed to understand their experience, perspective, attitude, and awareness of DMPA-SC self-injection. Data were collected from 12 health facilities in three states: eight directly trained and four cascade-trained. The research team members came together for a participatory analysis workshop to explore and interpret emergent themes. Findings: Quality of service delivery and performance outcomes were observed to be significantly better in facilities whose providers were directly trained by the DISC project than in sites that received indirect training through master trainers. Facilities that were directly trained recorded self-injection proportions more than twice those of cascade-trained sites. 
Direct training comprised full-day, standalone didactic and interactive sessions constructed to evoke commitment, passion, and conviction, as well as to eliminate provider bias and misconceptions, by utilizing human interest stories and values clarification exercises. Sessions also created compelling arguments using evidence and national guidelines. The training also prioritized demonstration sessions, utilized job aids (particularly videos), strengthened empathetic counseling by allaying client fears and concerns about self-injection, and trained providers on positioning self-injection first and on side effects management. Role plays and practicums were particularly useful in enabling providers to retain and internalize new knowledge. These sessions provided experiential learning and the opportunity to apply one's expertise in a supervised environment where supportive feedback is provided in real-time. Cascade training was often a shorter, abridged form of the MoT training that leveraged existing training already planned by master trainers. This training was held over a four-hour period and was less emotive, focusing more on foundational DMPA-SC knowledge such as a reorientation to DMPA-SC, a comparison of DMPA-SC variants, the counseling framework and skills, and data reporting and commodity tracking/requisition, with no facility practicums. Training on self-injection was not as robust, presumably because it was not directed at methods in the contraceptive mix that align with state/organizational sponsored objectives, in this instance, fostering LARC services. Conclusion: To achieve better performance outcomes, consideration should be given to providing training that prioritizes practice-based and emotive content. Furthermore, a firm understanding of and conviction about the value the training offers improve motivation and commitment to accomplish and surpass service-related performance outcomes.

Keywords: training, performance outcomes, innovation, family planning, contraception, DMPA-SC, self-care, self-injection

Procedia PDF Downloads 85
5 Reassembling a Fragmented Border Landscape at Crossroads: Indigenous Rights, Rural Sustainability, Regional Integration and Post-Colonial Justice in Hong Kong

Authors: Chiu-Yin Leung

Abstract:

This research investigates a complex assemblage among indigenous identities, socio-political organization, and national apparatus in the border landscape of post-colonial Hong Kong. This former British colony maintained a transient mode of governance in its New Territories, and particularly in the northernmost borderland, from 1951 to 2012. With a discriminatory system of land provisions for indigenous villagers, the place inherited a distinctive village-based culture, historic monuments, and agrarian practices up to the return of its sovereignty to the People’s Republic of China. In the latest development imperatives of national strategic planning, the frontier area of Hong Kong has been identified as a strategic site for regional economic integration in South China, with cross-border projects of innovation and technology zones, mega-transport infrastructure, and inter-jurisdictional arrangements. Contemporary literature theorizes borders as the material and discursive production of territoriality, which manifests in the state apparatus and the daily lives of citizens and condenses in contested articulations of power, security, and citizenship. Drawing on the concept of assemblage, this paper attempts to trace how the border regime and infrastructure of Hong Kong are deeply ingrained in the everyday lived spaces of local communities as well as in changing urban and regional strategies across different longitudinal moments. Through intensive ethnographic fieldwork among the borderland villages since 2008 and extensive analysis of colonial archives, new development plans, and spatial planning frameworks, the author navigates the genealogy of the border landscape in the Ta Kwu Ling frontier area and its implications as the milieu for new state space, covering heterogeneous fields particularly in indigenous rights, heritage preservation, rural sustainability, and the regional economy.
Empirical evidence suggests an apparent bias towards indigenous power and colonial representation in classifying landscape values and conserving historical monuments. Squatters and farm tenants are often deprived of property rights, statutory participation, and livelihood options in the planning process. The postcolonial bureaucracies have great difficulty mobilizing resources to catch up with the swift, politics-first approach of their mainland counterparts. Meanwhile, the cultural heritage, lineage networks, and memory landscape are not protected under any holistic view or collaborative effort across the border. The enactment of land resumption and compensation schemes is further disturbed by lineage-based customary law, technocratic bureaucracy, intra-community conflicts, and multi-scalar political mobilization. As many traces of colonial misfortune and tyranny have been whitewashed without proper management, the author argues that postcolonial justice is yet to be reconciled in this fragmented border landscape. The assemblage of the border in mainstream representation has tended to oversimplify local struggles as a collective mist and to set up a wider production of schizophrenic experiences in the discussion of further economic integration between Hong Kong and other mainland cities in the Pearl River Delta region. The research is expected to shed new light on the theorizing of border regions and postcolonialism beyond Eurocentric perspectives. In reassembling the borderland experiences with other arrays of state governance, village organization, and indigenous identities, the author also suggests an alternative epistemology for reconciling socio-spatial differences and opening up imaginaries for positive interventions.

Keywords: heritage conservation, indigenous communities, post-colonial borderland, regional development, rural sustainability

Procedia PDF Downloads 208
4 Revolutionizing Financial Forecasts: Enhancing Predictions with Graph Convolutional Networks (GCN) - Long Short-Term Memory (LSTM) Fusion

Authors: Ali Kazemi

Abstract:

In volatile and interconnected international financial markets, accurately predicting market trends holds substantial value for traders and financial institutions. Traditional machine learning strategies have made significant strides in forecasting market movements; however, the complex and networked nature of financial data calls for more sophisticated approaches. This study offers a groundbreaking method for financial market prediction that leverages the synergistic potential of Graph Convolutional Networks (GCNs) and Long Short-Term Memory (LSTM) networks. Our proposed algorithm is meticulously designed to forecast the trends of stock market indices and cryptocurrency prices, utilizing a comprehensive dataset spanning from January 1, 2015, to December 31, 2023. This era, marked by significant volatility and transformation in financial markets, affords a solid basis for training and testing our predictive model. Our algorithm integrates diverse data to construct a dynamic financial graph that accurately reflects market intricacies. We collect daily opening, closing, high, and low prices for key stock market indices (e.g., S&P 500, NASDAQ) and major cryptocurrencies (e.g., Bitcoin, Ethereum), ensuring a holistic view of market trends. Daily trading volumes are also incorporated to capture market activity and liquidity, providing critical insights into the market's buying and selling dynamics. Furthermore, recognizing the profound influence of the macroeconomic environment on financial markets, we integrate critical macroeconomic indicators such as interest rates, inflation rates, GDP growth, and unemployment rates into our model. Our GCN algorithm is adept at learning the relational patterns among financial instruments represented as nodes in a comprehensive market graph.
Edges in this graph encapsulate relationships based on co-movement patterns and sentiment correlations, enabling our model to grasp the complex network of influences governing market movements. Complementing this, our LSTM network is trained on sequences of the spatial-temporal representations learned by the GCN, enriched with historical price and volume data. This lets the LSTM capture and predict temporal market trends accurately. In a comprehensive assessment of our GCN-LSTM algorithm across the stock market and cryptocurrency datasets, the model demonstrated superior predictive accuracy and profitability compared to conventional and alternative machine learning benchmarks. Specifically, the model achieved a Mean Absolute Error (MAE) of 0.85%, indicating high precision in predicting daily price movements. The RMSE was recorded at 1.2%, underscoring the model's effectiveness in minimizing large prediction errors, which is vital in volatile markets. Furthermore, when assessing the model's predictive performance on directional market movements, it achieved an accuracy rate of 78%, significantly outperforming the benchmark models, which averaged an accuracy of 65%. This high degree of accuracy is instrumental for strategies that predict the direction of price moves. This study showcases the efficacy of combining graph-based and sequential deep learning in financial market prediction and highlights the value of a comprehensive, data-driven evaluation framework. Our findings promise to revolutionize investment strategies and risk management practices, offering investors and financial analysts a powerful tool for navigating the complexities of modern financial markets.
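The abstract does not publish the model's implementation, but the graph-convolution step it describes (propagating node features over the market graph) follows a standard form. The sketch below, with a purely hypothetical three-asset toy graph and illustrative feature/weight values, shows one such layer using the common symmetric normalization:

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph-convolution step: ReLU(D^-1/2 (A + I) D^-1/2 · X · W).

    A: (n, n) adjacency of the asset graph (e.g. co-movement edges),
    X: (n, f) node features (price/volume indicators), W: (f, h) weights.
    All names and values here are illustrative, not taken from the paper.
    """
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ X @ W, 0.0)  # ReLU activation

# toy market graph: 3 assets, 2 features each, hidden size 4
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
X = np.array([[1.0, 0.5], [0.2, 0.1], [0.4, 0.9]])
W = np.full((2, 4), 0.1)
H = gcn_layer(A, X, W)
print(H.shape)  # (3, 4)
```

In the architecture described above, the sequence of such layer outputs over successive trading days would then be fed to the LSTM as its spatial-temporal input.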

Keywords: financial market prediction, graph convolutional networks (GCNs), long short-term memory (LSTM), cryptocurrency forecasting

Procedia PDF Downloads 66
3 Acute Severe Hyponatremia in Patient with Psychogenic Polydipsia, Learning Disability and Epilepsy

Authors: Anisa Suraya Ab Razak, Izza Hayat

Abstract:

Introduction: The diagnosis and management of severe hyponatremia in neuropsychiatric patients present a significant challenge to physicians. Several factors contribute, including diagnostic overshadowing and attributing abnormal behavior to intellectual disability or psychiatric conditions. Hyponatremia is the commonest electrolyte abnormality in the inpatient population, ranging from mild/asymptomatic and moderate to severe levels with life-threatening symptoms such as seizures, coma, and death. There are several documented fatal case reports in the literature of severe hyponatremia secondary to psychogenic polydipsia, often diagnosed only at autopsy. This paper presents a case study of acute severe hyponatremia in a neuropsychiatric patient with early diagnosis and admission to intensive care. Case study: A 21-year-old Caucasian male with known epilepsy and learning disability was admitted from residential living with generalized tonic-clonic self-terminating seizures after refusing medications for several weeks. Evidence of superficial head injury was detected on physical examination. His laboratory data demonstrated mild hyponatremia (125 mmol/L). Computed tomography imaging of his brain demonstrated no acute bleed or space-occupying lesion. He exhibited abnormal behavior: restlessness, drinking water from bathroom taps, inability to engage, paranoia, and hypersexuality. No collateral history was available to establish his baseline behavior. He was loaded with intravenous sodium valproate and levetiracetam. Three hours later, he developed vomiting and a generalized tonic-clonic seizure lasting forty seconds. He remained drowsy for several hours and regained minimal consciousness. A repeat set of blood tests demonstrated profound hyponatremia (117 mmol/L). Outcomes: He was referred to intensive care for peripheral intravenous infusion of 2.7% sodium chloride solution with two-hourly laboratory monitoring of sodium concentration.
Laboratory monitoring identified dangerously rapid correction of serum sodium concentration, and the hypertonic saline was switched to a 5% dextrose solution to reduce the risk of acute large-volume fluid shifts from the cerebral intracellular compartment to the extracellular compartment. He underwent urethral catheterization and produced 8 liters of urine over 24 hours. Serum sodium concentration remained stable after 24 hours of correction fluids. His GCS recovered to baseline after 48 hours with improvement in behavior: he engaged with healthcare professionals, understood the importance of taking medications, and admitted to illicit drug use and drinking massive amounts of water. He was transferred from high-dependency care to ward level and was initiated on multiple trials of anti-epileptics before achieving seizure-free days two weeks after resolution of the acute hyponatremia. Conclusion: Psychogenic polydipsia is often found in young patients with intellectual disability or psychiatric disorders. Patients drink large volumes of water daily, ranging from ten to forty liters, resulting in acute severe hyponatremia with mortality rates as high as 20%. Poor outcomes are due to the challenges physicians face in making an early diagnosis and treating acute hyponatremia safely. A high index of suspicion of water intoxication is required in this population, including patients with known epilepsy. Monitoring urine output proved clinically effective in aiding diagnosis. Early referral and admission to intensive care should be considered for safe correction of sodium concentration while minimizing the risk of fatal complications, e.g., central pontine myelinolysis.
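The need for two-hourly monitoring during hypertonic saline infusion can be illustrated with the widely used Adrogué-Madias estimate of the serum sodium rise per litre of infusate (this formula is from the general hyponatremia literature, not from the case report itself, and the 70 kg body weight below is an assumed figure):

```python
def adrogue_madias_delta(infusate_na, serum_na, weight_kg, tbw_fraction=0.6):
    """Estimated rise in serum sodium (mmol/L) from 1 L of infusate:
    (infusate [Na] - serum [Na]) / (total body water + 1).
    tbw_fraction 0.6 is the usual estimate for a young adult male."""
    tbw = tbw_fraction * weight_kg
    return (infusate_na - serum_na) / (tbw + 1)

# 2.7% NaCl contains ~462 mmol/L sodium; serum Na 117 mmol/L as in the case;
# patient weight of 70 kg is assumed for illustration
delta = adrogue_madias_delta(462, 117, 70)
print(round(delta, 1))  # ~8 mmol/L per litre infused
```

A single litre of 2.7% saline could thus approach the commonly cited 24-hour correction ceiling on its own, which is consistent with the rapid correction detected on monitoring in this case.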

Keywords: epilepsy, psychogenic polydipsia, seizure, severe hyponatremia

Procedia PDF Downloads 122
2 Evaluation of Academic Research Projects Using the AHP and TOPSIS Methods

Authors: Murat Arıbaş, Uğur Özcan

Abstract:

Due to the increasing number of universities and academics, university funds for research activities and the grants/supports given by government institutions have increased the number and quality of academic research projects. Although every academic research project has a specific purpose and importance, limited resources (money, time, manpower, etc.) require choosing the best ones from among them all (Amiri, 2010). It is a difficult process to compare projects and determine which is better when the projects serve different purposes. In addition, the evaluation process has become complicated since there is more than one evaluator and there are multiple criteria for the evaluation (Dodangeh, Mojahed and Yusuff, 2009). Mehrez and Sinuany-Stern (1983) defined the project selection problem as a Multi Criteria Decision Making (MCDM) problem. If a decision problem involves multiple criteria and objectives, it is called a Multi Attribute Decision Making problem (Ömürbek & Kınay, 2013). There are many MCDM methods in the literature for the solution of such problems, including AHP (Analytic Hierarchy Process), ANP (Analytic Network Process), TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), PROMETHEE (Preference Ranking Organization Method for Enrichment Evaluation), UTADIS (Utilités Additives Discriminantes), ELECTRE (Elimination et Choix Traduisant la Réalité), MAUT (Multiattribute Utility Theory), GRA (Grey Relational Analysis), etc. Each method has some advantages compared with the others (Ömürbek, Demirci & Akalın, 2013). Hence, to decide which MCDM method will be used for the solution of a problem, factors like the nature of the problem, types of choices, measurement scales, type of uncertainty, dependency among the attributes, expectations of the decision maker, and quantity and quality of the data should be considered (Tavana & Hatami-Marbini, 2011).
This study aims to develop a systematic decision process for grant support applications that are expected to be evaluated, according to their scientific adequacy, by multiple evaluators under certain criteria. In this context, the project evaluation process applied by The Scientific and Technological Research Council of Turkey (TÜBİTAK), one of the leading institutions in our country, was investigated. First, the criteria to be used in the project evaluation were decided. The main criteria were selected from among the TÜBİTAK evaluation criteria: originality of the project, methodology, project management/team and research opportunities, and the extensive impact of the project. Moreover, for each main criterion, 2-4 sub-criteria were defined; hence, it was decided to evaluate projects over 13 sub-criteria in total. Because the AHP method is superior for determining criteria weights, and the TOPSIS method provides the opportunity to rank a great number of alternatives, the two methods are used together. The AHP method, developed by Saaty (1977), is based on selection by pairwise comparisons. Because of its simple structure and ease of understanding, AHP is a very popular method in the literature for determining criteria weights in MCDM problems. The TOPSIS method, developed by Hwang and Yoon (1981) as an MCDM technique, is an alternative to the ELECTRE method and is used in many areas. In this method, the distance from each decision point to the ideal and to the negative-ideal solution point is calculated using the Euclidean distance. In the study, the main criteria and sub-criteria were compared on their own merits using questionnaires, developed on an importance scale, administered to four groups of people (i.e., TÜBİTAK specialists, TÜBİTAK managers, academics, and individuals from the business world). After these pairwise comparisons, the weight of each main criterion and sub-criterion was calculated using the AHP method.
These calculated criteria weights were then used as input to the TOPSIS method, and a sample of 200 projects was ranked on its merits. This new system made it possible to incorporate the views of the people who take part in the project process, including preparation, evaluation, and implementation, into the evaluation of academic research projects. Moreover, instead of using four main criteria with equal weights to evaluate projects, a systematic decision-making process was developed using the 13 weighted sub-criteria and each decision point's distance from the ideal solution. Through this evaluation process, a new approach was created to determine the importance of academic research projects.
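The AHP-then-TOPSIS pipeline described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it uses the column-normalization shortcut for the AHP priority vector (a common approximation to Saaty's principal-eigenvector method) and a two-criterion toy matrix in place of the study's 13 sub-criteria and 200 projects:

```python
import numpy as np

def ahp_weights(pairwise):
    """Approximate AHP priority vector: normalize each column of the
    pairwise-comparison matrix, then average across rows."""
    norm = pairwise / pairwise.sum(axis=0)
    return norm.mean(axis=1)

def topsis_scores(decision, weights, benefit):
    """TOPSIS closeness coefficient for each alternative (higher = better).
    decision: (alternatives x criteria); benefit[j] True if more is better."""
    norm = decision / np.linalg.norm(decision, axis=0)   # vector normalization
    v = norm * weights                                   # weighted matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)            # Euclidean distances
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# toy example: 3 projects scored on 2 benefit criteria
pc = np.array([[1.0, 3.0], [1/3, 1.0]])   # criterion 1 moderately preferred
w = ahp_weights(pc)                       # -> [0.75, 0.25]
scores = topsis_scores(np.array([[7.0, 9.0], [8.0, 7.0], [5.0, 6.0]]),
                       w, np.array([True, True]))
ranking = np.argsort(-scores)             # best project first
print(ranking)
```

With a perfectly consistent comparison matrix, as here, the shortcut weights coincide with the eigenvector solution; for real questionnaire data a consistency-ratio check would also be applied.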

Keywords: academic projects, AHP method, research project evaluation, TOPSIS method

Procedia PDF Downloads 590
1 An Intelligent Search and Retrieval System for Mining Clinical Data Repositories Based on Computational Imaging Markers and Genomic Expression Signatures for Investigative Research and Decision Support

Authors: David J. Foran, Nhan Do, Samuel Ajjarapu, Wenjin Chen, Tahsin Kurc, Joel H. Saltz

Abstract:

The large-scale data and computational requirements of investigators throughout the clinical and research communities demand an informatics infrastructure that supports both existing and new investigative and translational projects in a robust, secure environment. In some subspecialties of medicine and research, the capacity to generate data has outpaced the methods and technology used to aggregate, organize, access, and reliably retrieve this information. Leading health care centers now recognize the utility of establishing an enterprise-wide clinical data warehouse. The primary benefits that can be realized through such efforts include cost savings, efficient tracking of outcomes, advanced clinical decision support, improved prognostic accuracy, and more reliable clinical trial matching. The overarching objective of the work presented here is the development and implementation of a flexible Intelligent Retrieval and Interrogation System (IRIS) that exploits the combined use of computational imaging, genomics, and data-mining capabilities to facilitate clinical assessments and translational research in oncology. The proposed System includes a multi-modal Clinical & Research Data Warehouse (CRDW) that is tightly integrated with a suite of computational and machine-learning tools to provide insight into underlying tumor characteristics that are not apparent by human inspection alone. A key distinguishing feature of the System is a configurable Extract, Transform and Load (ETL) interface that enables it to adapt to different clinical and research data environments. This project is motivated by the growing emphasis on establishing Learning Health Systems, in which cyclical hypothesis generation and evidence evaluation become integral to improving the quality of patient care.
To facilitate iterative prototyping and optimization of the algorithms and workflows for the System, the team has already implemented a fully functional Warehouse that can reliably aggregate information originating from multiple data sources including EHRs, Clinical Trial Management Systems, Tumor Registries, Biospecimen Repositories, Radiology PACS, Digital Pathology archives, Unstructured Clinical Documents, and Next Generation Sequencing services. The System enables physicians to systematically mine and review the molecular, genomic, image-based, and correlated clinical information about patient tumors, individually or as part of large cohorts, to identify patterns that may influence treatment decisions and outcomes. The CRDW core system has facilitated peer-reviewed publications and funded projects, including an NIH-sponsored collaboration to enhance the cancer registries in Georgia, Kentucky, New Jersey, and New York with machine-learning-based classifications and quantitative pathomics feature sets. The CRDW has also resulted in a collaboration with the Massachusetts Veterans Epidemiology Research and Information Center (MAVERIC) at the U.S. Department of Veterans Affairs to develop algorithms and workflows to automate the analysis of lung adenocarcinoma. Those studies showed that combining computational nuclear signatures with traditional WHO criteria through the use of deep convolutional neural networks (CNNs) led to improved discrimination among tumor growth patterns. The team has also leveraged the Warehouse to support studies investigating the potential of utilizing a combination of genomic and computational imaging signatures to characterize prostate cancer. The results of those studies show that integrating image biomarkers with genomic pathway scores is more strongly correlated with disease recurrence than using standard clinical markers.
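The configurable ETL interface described above is not published, but the general pattern of a mapping-driven loader can be sketched as follows. Every field name, source label, and date format here is hypothetical, chosen only to illustrate how a per-source configuration lets one loader adapt to different clinical data environments:

```python
from datetime import datetime

# Hypothetical per-source configuration: rename raw registry fields to the
# warehouse schema, then normalize values field by field.
ETL_CONFIG = {
    "source": "tumor_registry",
    "field_map": {"pt_id": "patient_id",
                  "dx_date": "diagnosis_date",
                  "icd": "icd10_code"},
    "transforms": {
        # registry exports dates as MM/DD/YYYY; warehouse stores ISO 8601
        "diagnosis_date": lambda v: datetime.strptime(v, "%m/%d/%Y")
                                            .date().isoformat(),
    },
}

def etl_load(record, config):
    """Keep only mapped fields, rename them, then apply transforms."""
    out = {config["field_map"][k]: v
           for k, v in record.items() if k in config["field_map"]}
    for field, fn in config["transforms"].items():
        if field in out:
            out[field] = fn(out[field])
    return out

row = {"pt_id": "A123", "dx_date": "03/15/2021", "icd": "C61", "extra": "x"}
loaded = etl_load(row, ETL_CONFIG)
print(loaded)  # {'patient_id': 'A123', 'diagnosis_date': '2021-03-15', 'icd10_code': 'C61'}
```

Adapting the warehouse to a new source (e.g., a PACS metadata feed) would then mean supplying a new configuration rather than new loader code, which is the essence of the configurable-ETL design choice.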

Keywords: clinical data warehouse, decision support, data-mining, intelligent databases, machine-learning

Procedia PDF Downloads 127