Search results for: functions of English language
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6494

314 Integrating Data Mining within a Strategic Knowledge Management Framework: A Platform for Sustainable Competitive Advantage within the Australian Minerals and Metals Mining Sector

Authors: Sanaz Moayer, Fang Huang, Scott Gardner

Abstract:

In today's highly leveraged business world, an organisation's success depends on how well it manages and organizes its traditional and intangible assets. In the knowledge-based economy, knowledge as a valuable asset gives enduring capability to firms competing in rapidly shifting global markets. It can be argued that the ability to create unique knowledge assets by configuring ICT and human capabilities will be a defining factor for international competitive advantage in the mid-21st century. The concept of knowledge management (KM) is recognized in the strategy literature, and increasingly by senior decision-makers (particularly in large firms, which can achieve scalable benefits), as an important vehicle for stimulating innovation and organisational performance in the knowledge economy. This thinking has been evident in professional services and other knowledge-intensive industries for over a decade. It highlights the importance of social capital and the value of the intellectual capital embedded in social and professional networks, complementing the traditional focus on the creation of intellectual property assets. Despite the growing interest in KM within professional services, there has been limited discussion in relation to multinational resource-based industries such as mining and petroleum, where the focus has been principally on global portfolio optimization with economies of scale, process efficiencies and cost reduction. The Australian minerals and metals mining industry, although traditionally viewed as capital intensive, employs a significant number of knowledge workers, notably engineers, geologists, highly skilled technicians, and legal, finance, accounting, ICT and contracts specialists working in projects or functions, representing potential knowledge silos within the organisation. This silo effect arguably inhibits knowledge sharing and retention by disaggregating corporate memory, with increased operational and project-continuity risk. 
It may also limit the potential for process, product, and service innovation. In this paper, the strategic application of knowledge management incorporating contemporary ICT platforms and data mining practices is explored as an important enabler for knowledge discovery, risk reduction, and retention of corporate knowledge in resource-based industries. With reference to the relevant strategy, management, and information systems literature, this paper highlights possible connections (currently undergoing empirical testing) between a Strategic Knowledge Management (SKM) framework incorporating supportive Data Mining (DM) practices and competitive advantage for multinational firms operating within the Australian resource sector. Based on a review of the relevant literature, we also propose that more effective management of soft- and hard-systems knowledge is crucial for major Australian firms in all sectors seeking to improve organisational performance through the human and technological capability captured in organisational networks.

Keywords: competitive advantage, data mining, mining organisation, strategic knowledge management

Procedia PDF Downloads 393
313 Bridging Educational Research and Policymaking: The Development of Educational Think Tank in China

Authors: Yumei Han, Ling Li, Naiqing Song, Xiaoping Yang, Yuping Han

Abstract:

Educational think tanks are widely regarded as a significant part of a nation's soft power, raising the scientific and democratic quality of educational policy making, and they play a critical role in bridging educational research in higher institutions and educational policy making. This study explores the concept, functions and significance of educational think tanks in China and conceptualizes a three-dimensional framework to analyze approaches for transforming research-based higher institutions into effective educational think tanks that serve educational policy making nationwide. Since 2014, the Ministry of Education of the P.R. China has promoted the strategy of developing a new type of educational think tank in higher institutions, and this strategy was put on the agenda of the 13th Five-Year Plan for National Education Development released in 2017. In this context, a growing number of scholars have conducted studies proposing strategies for promoting the development and transformation of new educational think tanks to serve the educational policy-making process. Based on literature synthesis, policy text analysis, and analysis of theories about the policy-making process and the relationship between educational research and policy making, this study constructed a three-dimensional conceptual framework to address the following questions: (a) what are the new features of educational think tanks in the new era compared with traditional think tanks; (b) what are the functional objectives of the new educational think tanks; (c) what are the organizational patterns and mechanisms of the new educational think tanks; and (d) through what approaches can traditional research-based higher institutions be developed or transformed into think tanks that effectively serve the educational policy-making process? 
The authors adopted a case study approach, examining five influential education policy study centers affiliated with top higher institutions in China, and applied the three-dimensional conceptual framework to analyze their functional objectives, organizational patterns, and the academic pathways through which researchers contribute to the development of think tanks serving the education policy-making process. Data were mainly collected through interviews with center administrators, leading researchers and academic leaders in the institutions. Findings show that: (a) higher-institution-based think tanks serve multi-level objectives, providing evidence, theoretical foundations, strategies, or evaluation feedback for critical problem solving or policy making at the national, provincial, and city/county levels; (b) higher-institution-based think tanks organize various types of research programs over different time spans to serve different phases of policy planning, decision making, and policy implementation; (c) to transform research-based higher institutions into educational think tanks, the institutions must promote a paradigm shift toward issue-oriented field studies, large-scale data mining and analysis, empirical studies, and trans-disciplinary research collaborations; and (d) while the five cases showed distinctive features in how they construct think tanks, they also exposed obstacles and challenges, such as the independence of the think tanks, the discourse shift from academic papers to consultancy reports for policy makers, weakness in empirical research methods, and lack of experience in trans-disciplinary collaboration. The authors conclude with implications for think tank construction in China and abroad.

Keywords: education policy-making, educational research, educational think tank, higher institution

Procedia PDF Downloads 145
312 Transformers in Gene Expression-Based Classification

Authors: Babak Forouraghi

Abstract:

A genetic circuit is a collection of interacting genes and proteins that enables individual cells to implement and perform vital biological functions such as cell division, growth, death, and signaling. In cell engineering, synthetic gene circuits are engineered networks of genes specifically designed to implement functionalities that have not evolved in nature. These engineered networks enable scientists to tackle complex problems such as engineering cells to produce therapeutics within the patient's body, altering T cells to target cancer-related antigens for treatment, improving antibody production using engineered cells, tissue engineering, and the production of genetically modified plants and livestock. Constructing computational models to realize genetic circuits is an especially challenging task, since it requires discovering the flow of genetic information in complex biological systems. Building synthetic biological models is also a time-consuming process with relatively low prediction accuracy for highly complex genetic circuits. The primary goal of this study was to investigate the utility of a pre-trained bidirectional encoder transformer for accurately predicting gene expression in genetic circuit designs. The main reason for using transformers is their innate ability (the attention mechanism) to take account of the semantic context present in long DNA chains, which depends heavily on the spatial representation of their constituent genes. Previous approaches to gene circuit design, such as CNN and RNN architectures, are unable to capture semantic dependencies in long contexts as required in most real-world applications of synthetic biology. For instance, RNN models (LSTM, GRU), although able to learn long-term dependencies, suffer greatly from vanishing gradients and low efficiency when they sequentially process past states and compress contextual information into a bottleneck for long input sequences. 
In other words, these architectures are not equipped with the attention mechanisms necessary to follow a long chain of genes spanning thousands of tokens. To address the above limitations of previous approaches, a transformer model was built in this work as a variation of the existing DNA Bidirectional Encoder Representations from Transformers (DNABERT) model. It is shown that the proposed transformer is capable of capturing contextual information from long input sequences with its attention mechanism. In previous work on genetic circuit design, traditional approaches to classification and regression, such as Random Forest, Support Vector Machines, and artificial neural networks, achieved reasonably high R² levels of 0.95 to 0.97. However, the transformer model utilized in this work, with its attention-based mechanism, achieved a perfect accuracy of 100%. Further, it is demonstrated that the efficiency of the transformer-based gene expression classifier does not depend on the presence of large amounts of training examples, which may be difficult to compile in many real-world gene circuit designs.
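The attention mechanism the abstract credits for this long-range capability can be illustrated with a minimal sketch. The code below is a generic scaled dot-product self-attention in NumPy, not the DNABERT implementation itself; the sequence length, embedding size, and random inputs are purely illustrative.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core attention operation used by transformer encoders such as DNABERT.

    Each output row is a weighted mix of all value rows, so every token
    (e.g., a gene or k-mer) can attend to every other token in the sequence
    regardless of distance -- the property the abstract contrasts with
    RNN/CNN architectures.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # pairwise token affinities
    scores -= scores.max(axis=-1, keepdims=True)    # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the sequence
    return weights @ V, weights

# Toy "sequence" of 4 tokens with 8-dimensional embeddings
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
out, attn = scaled_dot_product_attention(X, X, X)   # self-attention
```

Because every token attends to every other token in one step, there is no sequential bottleneck of the kind described for LSTMs and GRUs above.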

Keywords: transformers, generative AI, gene expression design, classification

Procedia PDF Downloads 38
311 Destination Management Organization in the Digital Era: A Data Framework to Leverage Collective Intelligence

Authors: Alfredo Fortunato, Carmelofrancesco Origlia, Sara Laurita, Rossella Nicoletti

Abstract:

In the post-pandemic recovery phase of tourism, the role of a Destination Management Organization (DMO) as a coordinated management system for all the elements that make up a destination (attractions, access, marketing, human resources, brand, pricing, etc.) is becoming relevant for local territories as well. The objective of a DMO is to maximize the visitor's perception of value and quality while ensuring the competitiveness and sustainability of the destination, as well as the long-term preservation of its natural and cultural assets, and to catalyze benefits for the local economy and residents. In carrying out its multiple functions, the DMO can leverage a collective intelligence that comes from the ability to pool the information, explicit and tacit knowledge, and relationships of the various stakeholders: policymakers, public managers and officials, entrepreneurs in the tourism supply chain, researchers, data journalists, schools, associations and committees, citizens, etc. The DMO potentially has at its disposal large volumes of data, much of it available at low cost, which need to be properly processed to produce value. Based on these assumptions, the paper presents a conceptual framework, tested in an area of southern Italy, for building an information system to support the DMO in the intelligent management of a tourist destination. 
The approach adopted is data-informed and consists of four phases: (1) formulation of the knowledge problem (analysis of policy documents and industry reports; focus groups and co-design with stakeholders; definition of information needs and key questions); (2) research and metadata annotation of relevant sources (reconnaissance of official sources, administrative archives, and internal DMO sources); (3) gap analysis and identification of unconventional information sources (evaluation of traditional sources for their consistency with information needs, freshness of information, and granularity of data; enrichment of the information base by identifying and studying web sources such as Wikipedia, Google Trends, Booking.com, Tripadvisor, the websites of accommodation facilities, and online newspapers); and (4) definition of the set of indicators and construction of the information base (specification of indicators and of procedures for data acquisition, transformation, and analysis). The resulting framework consists of six thematic areas (accommodation supply, cultural heritage, flows, value, sustainability, and enabling factors), each divided into three domains that capture a specific information need, represented by a set of questions to be answered through the analysis of available indicators. The framework is highly flexible in the European context, since it can be customized for each destination by adapting the part related to internal sources. 
Application to the case study led to the creation of a decision support system that allows: • integration of data from heterogeneous sources, including through automated web crawling procedures for ingesting social and web information; • reading and interpretation of data and metadata through guided navigation paths in the style of digital storytelling; • complex analyses using data mining algorithms, for example to predict tourist flows.
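Phase (4) of the framework, turning raw records from heterogeneous sources into indicators, can be sketched minimally as follows. The record fields, source labels, and the "tourist flow" indicator are hypothetical illustrations, not the framework's actual schema.

```python
from collections import defaultdict

# Hypothetical records harvested from heterogeneous sources (official
# statistics and web-scraped listings); field names are illustrative only.
records = [
    {"source": "official", "municipality": "A", "month": "2021-07", "arrivals": 1200},
    {"source": "official", "municipality": "B", "month": "2021-07", "arrivals": 300},
    {"source": "web",      "municipality": "A", "month": "2021-08", "arrivals": 1500},
]

def tourist_flow_indicator(records):
    """Aggregate arrivals per (municipality, month) -- one possible indicator
    in the 'flows' thematic area of the framework."""
    indicator = defaultdict(int)
    for r in records:
        indicator[(r["municipality"], r["month"])] += r["arrivals"]
    return dict(indicator)

flows = tourist_flow_indicator(records)
```

Normalizing heterogeneous sources into a common record shape before aggregation is what makes the subsequent indicator definitions source-independent.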

Keywords: collective intelligence, data framework, destination management, smart tourism

Procedia PDF Downloads 101
310 Ecological Planning Method of Reclamation Area Based on Ecological Management of Spartina Alterniflora: A Case Study of Xihu Harbor in Xiangshan County

Authors: Dong Yue, Hua Chen

Abstract:

The study region, Xihu Harbor in Xiangshan County, Ningbo City, is located on the central coast of Zhejiang Province. To address wave dissipation, the Ningbo government first introduced Spartina alterniflora in the 1980s. In the 1990s, S. alterniflora spread so rapidly that it has now created a 'grassland' in the sea, and it has become the most important invasive plant of China's coastal tidal flats. Although S. alterniflora has some ecological and economic functions, it has also brought a series of hazards. Ecologically, it affects biomass and biodiversity, hydrodynamic forces and sedimentation processes, nutrient cycling of the tidal flat, and the succession sequence of soils and plants. In engineering terms, it causes poor drainage and channel blocking. Economically, the hazard is mainly the threat to the aquaculture industry. The purpose of this study is to explore an ecological, feasible and economical way to manage S. alterniflora and use the land formed by it, taking Xihu Harbor in Xiangshan County as a case. Comparison, mathematical modeling, and qualitative and quantitative analysis are utilized to carry out the study. The main outcomes are as follows. A series of S. alterniflora management methods, including the combination of mechanical cutting and hydraulic reclamation, waterlogging, herbicide, and biological substitution, were compared from three standpoints: ecology, engineering and economy. It is inferred that the combination of mechanical cutting and hydraulic reclamation ranks among the best S. alterniflora management methods. This combination means using large-scale mechanical equipment, such as a large screw seagoing dredger, to excavate the S. alterniflora together with its roots and the surrounding mud. The mix of mud and grass is then transported by pipeline and blown onto the nearby coastal tidal zone, where it cushions the silt of the tidal zone to form land. 
However, as man-made land on the coast, the reclamation area is ecologically highly sensitive and faces a high possibility of flooding. The reclamation area therefore has many siting and feasibility requirements, concerning its location, specific scope, water surface ratio, direction of the main watercourse, site of the water gate, and the ratio of ecological land to urban construction land. These requirements all became an important basis for the planning. The water system planning, green space system planning, road structure and land use all need to accommodate the ecological requirements. Besides, the profits from the formed land are the management project's source of funding, so how to utilize the land efficiently is another consideration in the planning. It is concluded that, for managing a large area of S. alterniflora, the combination of mechanical cutting and hydraulic reclamation is an ecological, feasible and economical method. The planning of the reclamation area should fully respect the natural environment and possible disasters. Planning that makes land use efficient, reasonable and ecological will then promote the development of the area's city construction.

Keywords: ecological management, ecological planning method, reclamation area, Spartina alterniflora, Xihu Harbor

Procedia PDF Downloads 293
309 Utilizing Experiential Teaching Strategies to Reduce the Incidence of Falls in Patients in Orthopedic Wards

Authors: Yu-Shi Ye, Jia-Min Wu, Jhih-Ci Li

Abstract:

Background: Most orthopedic inpatients and their primary caregivers are elderly, and these patients are at high risk of falls. We set up a quality control team to analyze the root causes and found the following issues: 1. Nursing staff did not conduct cognitive assessments of patients and their primary caregivers to ensure that health education content was understood. 2. Nurses prefer to deliver health education verbally but lack the skills to use diverse teaching materials. 3. Newly recruited nurses have insufficient awareness of fall prevention. Methods: The study subjects were 16 nurses in the orthopedic ward of a teaching hospital in central Taiwan. We implemented the following strategies: 1. Developed a fall simulation teaching plan and conducted teaching courses and assessments at the morning meeting; 2. Designed and used a 'fall prevention awareness card' to improve the prevention awareness of elderly patients; 3. All staff (including new staff) received experiential education training. Results: In 2021, 40% of patients in the orthopedic wards were aged 60-79 years (792/1,979), a group at high risk of falls. According to the data collected, the incidence of falls in hospitalized patients was 0.04% (5/12,651), which exceeded the threshold of 0.02% for our ward. After completing the on-the-job education training in October, the nursing staff reported greater awareness of fall-prevention situations. Through practical sharing and drills, combined with experiential teaching strategies, nurses can rebuild safety awareness of fall prevention and deepen their cognitive memory. Participants scored between 30 and 80 on the pretest (16 students, mean: 72.6) and between 90 and 100 on the post-test (16 students, mean: 92.6), resulting in a 73.8% improvement in overall scores. A total of 4 new employees have all completed the first 3 months of compulsory PGY courses. 
From January to April 2022, the incidence of falls in hospitalized patients was 0.025% (1/3,969). We have achieved a solid improvement and will continue to track the outcome. Discussion: In addition to enhancing awareness of falls among nursing staff, how to guide patients and primary caregivers in preventing falls is also a focus of improvement. Proper health education can be better understood through practical exercises and case sharing.
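As a minimal sketch, the reported incidence figures follow from simple proportions; the counts below are those given in the abstract.

```python
def fall_incidence(falls, patient_count):
    """Incidence of inpatient falls, expressed as a percentage."""
    return 100 * falls / patient_count

before = fall_incidence(5, 12651)  # 2021 baseline: ~0.04%
after = fall_incidence(1, 3969)    # Jan-Apr 2022:  ~0.025%
```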

Keywords: experiential teaching strategies, fall prevention, cognitive card, elderly patients, orthopedic wards

Procedia PDF Downloads 39
308 Students' Performance, Perception and Attitude towards Interactive Online Modules to Improve Undergraduate Quantitative Skills in Biological Science

Authors: C. Suphioglu, V. Simbag, J. Markham, C. Coady, S. Belward, G. Di Trapani, P. Chunduri, J. Chuck, Y. Hodgson, L. Lluka, L. Poladian, D. Watters

Abstract:

Advances in science have made quantitative skills (QS) an essential graduate outcome for undergraduate science programs in Australia and other parts of the world. However, many students entering degrees at Australian universities either lack these skills or have little confidence in their ability to apply them in their biological science units. It has been previously reported that integrating quantitative skills into life science programs appears to have a positive effect on student attitudes towards the importance of mathematics and statistics in the biological sciences. It has also been noted that there is a deficiency in QS resources available and applicable to undergraduate science students in Australia. MathBench (http://mathbench.umd.edu) is a series of online modules built around quantitative biology scenarios, developed by the University of Maryland. Through collaboration with Australian universities, a project was funded by the Australian government through its Office for Learning and Teaching (OLT) to develop customized MathBench biology modules to promote the quantitative skills of undergraduate biology students in Australia. This presentation focuses on the assessment of changes in the performance, perception and attitude of students in a third-year Cellular Physiology unit after using interactive online cellular diffusion modules modified for the Australian context. The modules are designed to integrate QS into the biological science curriculum using familiar scenarios and informal language, giving students the opportunity to review solutions to diffusion-related QS problems with interactive graphics. 
This paper discusses the results of pre- and post-MathBench quizzes, composed of general and module-specific questions, that assessed changes in students' QS after MathBench; and of surveys administered before and after using the MathBench modules to evaluate changes in students' perception of the influence of the modules, their attitude towards QS, their confidence in completing the inquiry-based activity, and their appreciation of the relevance of mathematics to cellular processes. The results are compared to the changes reported by Thompson et al. (2010) at the University of Maryland, and implications for further integration of interactive online activities in the curriculum are explored and discussed.
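A cellular diffusion problem of the kind such modules pose can be sketched as follows. The one-dimensional mean diffusion time t = x²/(2D) is a standard textbook relation, but the specific values are illustrative and not taken from the MathBench modules themselves.

```python
def diffusion_time(distance_m, diffusion_coeff_m2_per_s):
    """Mean time for a molecule to diffuse a given distance in one dimension,
    t = x^2 / (2D) -- the kind of calculation students reason through in
    cellular diffusion exercises."""
    return distance_m ** 2 / (2 * diffusion_coeff_m2_per_s)

# Illustrative values: a small metabolite (D ~ 1e-9 m^2/s) crossing a 10 um cell
t_cell = diffusion_time(10e-6, 1e-9)   # about 0.05 s
```

The quadratic dependence on distance is the usual takeaway: doubling the cell diameter quadruples the diffusion time, which is why diffusion alone suffices inside cells but not over tissue-scale distances.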

Keywords: quantitative skills, MathBench, maths in biology

Procedia PDF Downloads 365
307 Optical Imaging Based Detection of Solder Paste in Printed Circuit Board Jet-Printing Inspection

Authors: D. Heinemann, S. Schramm, S. Knabner, D. Baumgarten

Abstract:

Purpose: Applying solder paste to printed circuit boards (PCBs) with stencils has been the method of choice over the past years. A newer method uses a jet printer to deposit tiny droplets of solder paste onto the board through an ejector mechanism. This allows for more flexible PCB layouts with smaller components. Due to the viscosity of the solder paste, air blisters can become trapped in the cartridge, which can lead to missing solder joints or deviations in the applied solder volume. Therefore, built-in, real-time inspection of the printing process is needed to minimize uncertainties and increase the efficiency of the process through immediate correction. The objective of the current study is the design of an optimal imaging system and the development of an automatic algorithm for detecting applied solder joints from the captured optical images. Methods: In a first approach, a camera module connected to a microcomputer and LED strips were employed to capture images of the printed circuit board under four different illuminations (white, red, green and blue). Subsequently, an improved system including a ring light, an objective lens, and a monochromatic camera was set up to acquire higher-quality images. The obtained images can be divided into three main components: the PCB itself (i.e., the background), the reflections induced by unsoldered positions or screw holes, and the solder joints. Non-uniform illumination is corrected by estimating the background using a morphological opening and subtracting it from the input image. Image sharpening is applied to prevent error pixels in the subsequent segmentation. The intensity thresholds dividing the main components are obtained from the multimodal histogram using three probability density functions; determining their intersections delivers proper thresholds for the segmentation. Remaining edge gradients produce small error areas, which are removed by another morphological opening. 
For quantitative analysis of the segmentation results, the Dice coefficient is used. Results: The obtained PCB images show a significant gradient in all RGB channels, resulting from ambient light. Using the different lightings and color channels, 12 images of a single PCB are available. A visual inspection and the investigation of 27 specific points show the best differentiation between those points using red lighting and the green color channel. Estimating two thresholds from the multimodal histogram of the corrected images and using them for segmentation precisely extracts the solder joints. Comparison of the results to manually segmented images yields high sensitivity and specificity values. The overall result delivers a Dice coefficient of 0.89, which varies for single-object segmentations between 0.96 for well-segmented solder joints and 0.25 for single negative outliers. Conclusion: Our results demonstrate that the presented optical imaging system and the developed algorithm can robustly detect solder joints on printed circuit boards. Future work will comprise a modified lighting system that allows for more precise segmentation results using structure analysis.
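The two-threshold segmentation and the Dice coefficient used to score it can be sketched minimally in NumPy. The toy intensities and threshold values below are illustrative, not the paper's actual data.

```python
import numpy as np

def classify(image, low, high):
    """Apply the two thresholds estimated from the multimodal histogram:
    pixels below `low` are background, between `low` and `high` reflections,
    and at or above `high` solder joints."""
    labels = np.zeros(image.shape, dtype=np.uint8)
    labels[image >= low] = 1    # reflections
    labels[image >= high] = 2   # solder joints
    return labels

def dice(pred, truth):
    """Dice coefficient 2|A∩B| / (|A| + |B|) between two binary masks,
    the metric used to score the segmentation against manual labels."""
    inter = np.logical_and(pred, truth).sum()
    return 2 * inter / (pred.sum() + truth.sum())

# Toy 4x4 'image': background ~20, reflections ~120, solder ~230 (illustrative)
img = np.array([[ 20,  20, 120, 230],
                [ 20, 230, 230, 230],
                [ 20,  20, 120,  20],
                [230, 230,  20,  20]])
pred = (classify(img, low=70, high=180) == 2).astype(np.uint8)

# Hypothetical manual labels where one reflection pixel was marked as solder
truth = np.array([[0, 0, 0, 1],
                  [0, 1, 1, 1],
                  [0, 0, 1, 0],
                  [1, 1, 0, 0]], dtype=np.uint8)
score = dice(pred, truth)   # 2*6 / (6 + 7) ≈ 0.923
```

A Dice score of 1.0 means perfect overlap; small objects are penalized heavily by single-pixel disagreements, which is consistent with the 0.25 outliers reported above.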

Keywords: printed circuit board jet-printing, inspection, segmentation, solder paste detection

Procedia PDF Downloads 311
306 Sociocultural Context of Pain Management in Oncology and Palliative Nursing Care

Authors: Andrea Zielke-Nadkarni

Abstract:

Pain management is a question of quality of life and an indicator of nursing quality. Chronic pain, which predominates in oncology and palliative nursing situations, is understood today as a multifactorial, individual emotional experience with specific characteristics, including a sociocultural dimension when dealing with migrant patients. This dimension of chronic pain is of major importance in the professional nursing of migrant patients in hospices or palliative care units. The objectives of the study are: 1. to learn more about the sociocultural views on pain and nursing care, and on customs and nursing practices connected with pain, of both Turkish Muslim and German Christian women; 2. to improve individual- and family-oriented nursing practice with a view to the sociocultural needs of patients in severe pain in palliative care. In a qualitative-explorative comparative study, four groups of women, Turkish Muslim immigrants (4 from the first generation, 5 from the second generation) and German Christian women of two generations (5 in each age group), of the same age groups as the Turkish women and with similar educational backgrounds, were interviewed (semi-structured ethnographic interviews following Spradley, 1979) on their perceptions and experiences of pain and nursing care within their families. For both target groups, the presentation will demonstrate the following results in detail: The utterance of pain, as well as 'private' and 'public' pain, varies across societies and cultures. Permitted forms of pain utterance are learned in childhood and determine attitudes and expectations in adulthood. Language, especially when metaphors and symbols are used, plays a major role in misunderstandings. The sociocultural context of illness may include specific beliefs that are important to the patients and yet seem far-fetched from a biomedical perspective. 
Pain can be an influential factor in family relationships where respect or hierarchies do not allow the direct utterance of individual needs. Specific resources are often, although not exclusively, linked to religious convictions and are significantly helpful in reducing pain. The discussion will evaluate the results of the study against the relevant literature and present nursing interventions and instruments beyond medication that are helpful when dealing with patients from various sociocultural backgrounds in painful end-of-life situations.

Keywords: pain management, migrants, sociocultural context, palliative care

Procedia PDF Downloads 336
305 Impulsivity Leads to Compromise Effect

Authors: Sana Maidullah, Ankita Sharma

Abstract:

The present study takes a naturalistic decision-making approach to examine the role of personality in information processing during consumer decision making. In the technological era, most information arrives as HTML or similar formats via the internet, and processing it can be ambiguous, laborious and painful. The present study explores the role of impulsivity in creating an extreme effect in consumer decision making. Specifically, it explores the role of impulsivity in extreme effects, i.e., extremeness avoidance (the compromise effect) and extremeness seeking, and the role of demographic variables, i.e., age and gender, in the relation between impulsivity and extreme effects. The study was conducted with the help of a questionnaire and two experiments. The experiments were designed as two shopping websites with two product types: hotel choice and mobile phone choice. Both experimental interfaces were created with the XAMPP stack; the front end used HTML, CSS and JavaScript, and the back end PHP and MySQL. The mobile experiment was designed to measure the extreme effect, and the hotel experiment to measure the extreme effect with attribute alignability. To observe possible combined effects of individual differences and context effects, the price and the numbers of alignable and non-alignable attributes were manipulated. The study was conducted on 100 undergraduate and postgraduate engineering students in the age range of 18-35. Familiarity with and level of use of the internet and shopping websites were assessed and controlled for in the analysis. The analysis was done using t-tests, ANOVA and regression analysis. The results indicated that impulsivity leads to the compromise effect and, at the same time, strengthens the relationship between the alignability of attributes among choices and the compromise effect. 
The demographic variables were found to play a significant role in this relationship. The subcomponents of impulsivity significantly influenced the compromise effect, with cognitive impulsivity significant for women only and motor impulsivity significant for men only. Impulsivity was significantly positively predicted by age, though there were no significant gender differences in impulsivity. The results clearly indicate the importance of individual factors in decision making. The present study, with precise and direct results, provides significant suggestions for market analysts and business providers.

Keywords: impulsivity, extreme effect, personality, alignability, consumer decision making

Procedia PDF Downloads 171
304 Pragmatic Development of Chinese Sentence Final Particles via Computer-Mediated Communication

Authors: Qiong Li

Abstract:

This study investigated under which conditions computer-mediated communication (CMC) can promote pragmatic development. The focal features were four Chinese sentence final particles (SFPs): a, ya, ba, and ne. They occur frequently in Chinese and function as mitigators that soften the tone of speech. However, L2 acquisition of SFPs is difficult, suggesting the need for additional exposure to, or explicit instruction on, Chinese SFPs. This study follows that line and explores two research questions: (1) Is CMC combined with data-driven instruction more effective than CMC alone in promoting L2 Chinese learners’ SFP use? (2) How does L2 Chinese learners’ SFP use change over time, compared to the production of native Chinese speakers? The study involved 19 intermediate-level learners of Chinese enrolled at a private American university. They were randomly assigned to two groups: (1) the control group (N = 10), which was exposed to SFPs through CMC alone, and (2) the treatment group (N = 9), which was exposed to SFPs via CMC and data-driven instruction. Learners interacted with native speakers on given topics through text-based CMC over Skype. Both groups went through six 30-minute CMC sessions on a weekly basis, with a one-week interval after the first two CMC sessions and a two-week interval after the second two CMC sessions (nine weeks in total). The treatment group additionally received data-driven instruction after the first two sessions. Data analysis focused on three indices: token frequency, type frequency, and acceptability of SFP use. Token frequency was operationalized as the raw occurrence of SFPs per clause. Type frequency was the range of SFPs used. Acceptability was rated by two native speakers using a rating rubric. The results showed that the treatment group made noticeable progress over time on all three indices, and their production of SFPs approximated native-like levels. In contrast, the control group improved only slightly on token frequency. 
Only certain SFPs (a and ya) reached native-like use. Potential explanations for the group differences were discussed in two respects: the properties of Chinese SFPs and the roles of CMC and data-driven instruction. Though CMC provided the learners with opportunities to notice and observe SFP use, SFPs, as features with low saliency, were not easily noticed in the input. Data-driven instruction in the treatment group directed the learners’ attention to these particles, which facilitated their development.
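The token- and type-frequency indices above are straightforward to operationalize. A minimal sketch in Python (the pinyin clauses are invented for illustration, not taken from the study's transcripts):

```python
# The four focal sentence final particles
SFPS = {"a", "ya", "ba", "ne"}

def sfp_indices(clauses):
    """Token frequency (SFP tokens per clause) and type frequency (distinct SFPs)."""
    tokens = [w for clause in clauses for w in clause.split() if w in SFPS]
    token_freq = len(tokens) / len(clauses)
    type_freq = len(set(tokens))
    return token_freq, type_freq

# Hypothetical learner output, one clause per string
clauses = ["ni hao a", "zou ba", "chi fan le", "hao ya", "shi ba"]
tok, typ = sfp_indices(clauses)  # 4 SFP tokens over 5 clauses; 3 distinct SFPs
```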

Keywords: computer-mediated communication, data-driven instruction, pragmatic development, second language Chinese, sentence final particles

Procedia PDF Downloads 395
303 A Bioinspired Anti-Fouling Coating for Implantable Medical Devices

Authors: Natalie Riley, Anita Quigley, Robert M. I. Kapsa, George W. Greene

Abstract:

As the fields of medicine and bionics advance rapidly, their future success depends on the ability to interface effectively between the artificial and the biological worlds. The biggest obstacle for implantable electronic medical devices is maintaining a ‘clean’, low-noise electrical connection that allows efficient sharing of electrical information between the artificial and biological systems. Implant fouling occurs as proteins and various cell types adhere and accumulate through the immune response mounted against the foreign object, essentially forming an electrical insulation barrier that often leads to implant failure over time. Lubricin (LUB), a unique glycoprotein with impressive anti-adhesive properties, functions as a major boundary lubricant in articular joints and self-assembles onto virtually any substrate to form a highly ordered, ‘telechelic’ polymer brush. LUB does not passivate electroactive surfaces, which, along with its innate biocompatibility, makes it ideal as a coating for implantable bionic electrodes. The aim of the study is to investigate LUB’s anti-fouling properties and its potential as a safe, bioinspired coating material to enhance the performance and longevity of implantable medical devices and to reduce the frequency of implant replacement surgeries. Native, bovine-derived LUB (N-LUB) and recombinant LUB (R-LUB) were applied to gold-coated mylar surfaces. Fibroblast, chondrocyte, and neural cell types were cultured and grown on the coatings under both passive and electrically stimulated conditions to test the stability and anti-adhesive properties of the LUB coating in the presence of an electric field. Lactate dehydrogenase (LDH) assays were conducted as a directly proportional count of the cell population on each surface, along with immunofluorescent microscopy to visualize cells. 
One-way analysis of variance (ANOVA) with post-hoc Tukey’s test was used to test for statistical significance. Under both passive and electrically stimulated conditions, LUB significantly reduced cell attachment compared to bare gold. Comparing the two coating types, R-LUB reduced cell attachment significantly compared to its native counterpart. Immunofluorescent micrographs visually confirmed LUB’s anti-adhesive property, with R-LUB consistently showing significantly fewer attached cells for both fibroblasts and chondrocytes. Preliminary results investigating neural cells have so far shown that R-LUB has little effect on reducing neural cell attachment; the study is ongoing. Recombinant LUB coatings demonstrated impressive anti-adhesive properties, reducing cell attachment in fibroblasts and chondrocytes. These findings, and the availability of recombinant LUB, call into question the results of previous experiments conducted using native-derived LUB, whose potential may not have been adequately represented or realized owing to unknown factors and impurities that warrant further study. R-LUB is stable and maintains its anti-fouling property under electrical stimulation, making it suitable for electroactive surfaces.
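The comparison of cell attachment across surfaces rests on a one-way ANOVA. As a minimal sketch with invented LDH readings (three surfaces, four replicates each; none of these numbers come from the study), the F statistic can be computed directly:

```python
from statistics import mean

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA across k groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical LDH absorbance (proxy for attached-cell counts) per surface
bare_gold = [1.00, 0.95, 1.05, 0.98]
n_lub = [0.60, 0.65, 0.58, 0.62]
r_lub = [0.30, 0.28, 0.33, 0.29]
f = one_way_anova_f([bare_gold, n_lub, r_lub])
```

A post-hoc Tukey test would then locate which pairs of surfaces differ.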

Keywords: anti-fouling, bioinspired, cell attachment, lubricin

Procedia PDF Downloads 105
302 Comparing Test Equating by Item Response Theory and Raw Score Methods with Small Sample Sizes on a Study of the ARTé: Mecenas Learning Game

Authors: Steven W. Carruthers

Abstract:

The purpose of the present research is to equate two test forms as part of a study evaluating the educational effectiveness of the ARTé: Mecenas art history learning game. The researcher applied Item Response Theory (IRT) procedures to calculate item, test, and mean-sigma equating parameters. With a sample size of n=134, test parameters indicated “good” model fit but low Test Information Functions and more acute than expected equating parameters. The researcher therefore applied equipercentile equating and linear equating to raw scores and compared the equated form parameters and effect sizes from each method. Item scaling in IRT enables the researcher to select a subset of well-discriminating items. The mean-sigma step produces a mean-slope adjustment from the anchor items, which was used to scale scores on the new form (Form R) to the reference form (Form Q) scale. In equipercentile equating, scores are adjusted to align the proportion of scores in each quintile segment. Linear equating produces a mean-slope adjustment, which was applied to all core items on the new form. The study followed a quasi-experimental design with purposeful sampling of students enrolled in a college-level art history course (n=134) and a counterbalancing design to distribute both forms on the pre- and posttests. The Experimental Group (n=82) was asked to play ARTé: Mecenas online and complete Level 4 of the game within a two-week period; 37 participants completed Level 4. Over the same period, the Control Group (n=52) did not play the game. The researcher examined between-group differences in posttest scores on Form Q and Form R by full-factorial two-way ANOVA. The raw score analysis indicated a 1.29% direct effect of form, which was statistically non-significant but may be practically significant. The researcher repeated the between-group differences analysis with all three equating methods. For the IRT mean-sigma adjusted scores, form had a direct effect of 8.39%. 
Mean-sigma equating with a small sample may have produced inaccurate equating parameters. Equipercentile equating aligned test means and standard deviations, but the resultant skewness and kurtosis worsened compared to raw score parameters; form had a 3.18% direct effect. Linear equating produced the lowest form effect, approaching 0%. Using linearly equated scores, the researcher conducted an ANCOVA to examine the effect size in terms of prior knowledge. The between-group effect size for the Control Group versus the Experimental Group participants who completed the game was 14.39%, with a 4.77% effect size attributed to pretest score. Playing and completing the game increased art history knowledge, and individuals with low prior knowledge tended to gain more from pre- to posttest. Ultimately, researchers should approach test equating based on their theoretical stance on Classical Test Theory and IRT and the respective assumptions. Regardless of the approach or method, test equating requires a representative sample of sufficient size. With small sample sizes, applying a range of equating approaches can expose item and test features for review, inform interpretation, and identify paths for improving instruments for future study.
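Of the three methods, linear equating is the simplest to state: a raw score from Form R is placed on the Form Q scale by matching the two forms' means and standard deviations. A sketch with invented score distributions (the study's actual score moments are not reported in this abstract):

```python
from statistics import mean, pstdev

def linear_equate(x, mean_x, sd_x, mean_y, sd_y):
    """Place a raw score x from Form R onto the Form Q scale
    by matching means and standard deviations (a mean-slope adjustment)."""
    return sd_y / sd_x * (x - mean_x) + mean_y

# Hypothetical raw score distributions for the two forms
form_r = [12, 15, 18, 20, 25, 14, 17, 19]
form_q = [14, 18, 21, 23, 28, 16, 20, 22]
mx, sx = mean(form_r), pstdev(form_r)
my, sy = mean(form_q), pstdev(form_q)
equated = linear_equate(18, mx, sx, my, sy)  # Form R score 18 on the Q scale
```

Mean-sigma equating has the same slope-and-intercept form, but derives the slope and intercept from the anchor items' IRT parameters rather than from raw score moments.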

Keywords: effectiveness, equipercentile equating, IRT, learning games, linear equating, mean-sigma equating

Procedia PDF Downloads 173
301 Calpains; Insights Into the Pathogenesis of Heart Failure

Authors: Mohammadjavad Sotoudeheian

Abstract:

Heart failure (HF) prevalence, as a global cardiovascular problem, is increasing gradually. A variety of molecular mechanisms contribute to HF. Proteins involved in regulating cardiac contractility, such as ion channels and calcium handling proteins, are altered. Additionally, epigenetic modifications and changes in gene expression can lead to altered cardiac function. Moreover, inflammation and oxidative stress contribute to HF. The progression of HF can be attributed to mitochondrial dysfunction that impairs energy production and increases apoptosis. Molecular mechanisms such as these contribute to the development of cardiomyocyte defects and HF and can be therapeutically targeted. The heart's contractile function is controlled by cardiomyocytes. Calpain and its related molecules, including Bax, VEGF, and AMPK, are among the proteins involved in regulating cardiomyocyte function. Bax facilitates and regulates cardiomyocyte apoptosis. Furthermore, cardiomyocyte survival, contractility, wound healing, and proliferation are all regulated by VEGF, which is produced by cardiomyocytes during inflammation and cytokine stress. Cardiomyocyte proliferation and survival are also influenced by AMPK, an enzyme that plays an active role in energy metabolism. All of these play key roles in apoptosis, angiogenesis, hypertrophy, and metabolism during myocardial inflammation. The role of calpains has been linked to several molecular pathways. The calpain pathway plays an important role in signal transduction and apoptosis, as well as autophagy, endocytosis, and exocytosis. Cell death and survival are regulated by these calcium-dependent cysteine proteases, which cleave proteins; the resulting protein fragments can serve various cellular functions. By cleaving adhesion and motility proteins, calpains also contribute to cell migration. HF may be brought about by calpain-mediated pathways. 
Many physiological processes are mediated by calpain molecular pathways, which regulate signal transduction, cell death, and cell migration. In the presence of calcium, calmodulin activates calpain. Calpains are stimulated by calcium, which in turn increases matrix metalloproteinases (MMPs). In order to develop novel treatments for these diseases, we must understand how this pathway works. A variety of myocardial remodeling processes involve calpains, including remodeling of the extracellular matrix and hypertrophy of cardiomyocytes. Calpains also play a role in maintaining cardiac homeostasis through apoptosis and autophagy. The development of HF may be due in part to calpain-mediated pathways promoting cardiomyocyte death. Numerous studies have suggested the importance of the Ca2+-dependent protease calpain in cardiac physiology and pathology. Therefore, it is important to consider this pathway when developing and testing therapeutic options in humans that target calpain in HF. Apoptosis, autophagy, endocytosis, exocytosis, signal transduction, and disease progression all involve calpain molecular pathways. It is therefore conceivable that calpain inhibitors have therapeutic potential, as they have been investigated in preclinical models of several conditions in which the enzyme has been implicated. Ca2+-dependent proteases, the calpains, contribute to adverse ventricular remodeling and HF in multiple experimental models. In this manuscript, we discuss the important roles of the calpain molecular pathway in HF development.

Keywords: calpain, heart failure, autophagy, apoptosis, cardiomyocyte

Procedia PDF Downloads 50
300 The Impact of Spirituality on the Voluntary Simplicity Lifestyle Tendency: An Explanatory Study on Turkish Consumers

Authors: Esna B. Buğday, Niray Tunçel

Abstract:

Spirituality has a motivational influence on consumers' psychological states, lifestyles, and behavioral intentions. Spirituality refers to the feeling that there is a divine power greater than ourselves and to a connection among oneself, others, nature, and the sacred. In addition, spirituality concerns the human soul and spirit as opposed to the material and physical world, and consists of three dimensions: self-discovery, relationships, and belief in a higher power. Of these, self-discovery is the exploration of the meaning and purpose of life. Relationships refer to awareness of the connection between human beings and nature, as well as respect for both. Higher power represents the transcendent aspect of spirituality: belief in a holy power that creates all the systems in the universe. Furthermore, a voluntary simplicity lifestyle means (1) adopting a simple lifestyle by minimizing attachment to, and consumption of, material things and possessions; (2) having an ecological awareness that respects all living creatures; and (3) expressing the desire to explore and develop the inner life. Voluntary simplicity is a multi-dimensional construct consisting of a desire for a voluntarily simple life (e.g., avoiding excessive consumption), cautious attitudes in shopping (e.g., not buying unnecessary products), acceptance of self-sufficiency (e.g., being a self-sufficient individual), and rejection of highly developed functions of products (e.g., preference for products with simple functions). One of the main reasons for living simply is to sustain a spiritual life, as voluntary simplicity provides the space for achieving psychological and spiritual growth and for cultivating self-reliance, since voluntary simplifiers free themselves from overwhelming externals and take control of their daily lives. From this point of view, it is expected that people with a strong sense of spirituality will be likely to adopt a simple lifestyle. 
In this respect, the study aims to examine the impact of spirituality on consumers' voluntary simplicity lifestyle tendencies. As consumers' consumption attitudes and behaviors depend on their lifestyles, exploring the factors that lead them to embrace voluntary simplicity helps significantly in predicting their purchase behavior. To this end, this study presents empirical research based on a data set collected from 478 Turkish consumers through an online survey. First, exploratory factor analysis is applied to the data to reveal the dimensions of the spirituality and voluntary simplicity scales. Second, confirmatory factor analysis is conducted to assess the measurement model. Last, the hypotheses are tested using partial least squares structural equation modeling (PLS-SEM). The results confirm that spirituality's self-discovery and relationships dimensions positively impact both the cautious-attitudes-in-shopping and acceptance-of-self-sufficiency dimensions of voluntary simplicity. In contrast, belief in a higher power does not significantly influence consumers' voluntary simplicity tendencies. Even though there has been theoretical support for a positive relationship between spirituality and voluntary simplicity, to the best of the authors' knowledge, it has not previously been tested empirically in the literature. Hence, this study contributes to current knowledge by analyzing the direct influence of spirituality on consumers' voluntary simplicity tendencies. Analyzing this impact among consumers in an emerging market is a further contribution to the literature.

Keywords: spirituality, voluntary simplicity, self-sufficiency, conscious shopping, Turkish consumers

Procedia PDF Downloads 137
299 Adversarial Attacks and Defenses on Deep Neural Networks

Authors: Jonathan Sohn

Abstract:

Deep neural networks (DNNs) have shown state-of-the-art performance for many applications, including computer vision, natural language processing, and speech recognition. Recently, adversarial attacks have been studied in the context of deep neural networks; these attacks aim to alter the results of a DNN by slightly modifying its inputs. For example, an adversarial attack on a DNN used for object detection can cause the DNN to miss certain objects. As a result, the reliability of DNNs is undermined by their lack of robustness against adversarial attacks, raising concerns about their use in safety-critical applications such as autonomous driving. In this paper, we focus on adversarial attacks and defenses on DNNs for image classification. Two types of adversarial attacks are studied: the fast gradient sign method (FGSM) attack and the projected gradient descent (PGD) attack. A DNN forms decision boundaries that separate input images into different categories. An adversarial attack slightly alters an image to move it over a decision boundary, causing the DNN to misclassify it. The FGSM attack obtains the gradient of the loss with respect to the image and updates the image once, based on the sign of the gradient, to cross the decision boundary. The PGD attack, instead of taking one big step, repeatedly modifies the input image with multiple small steps. There is also a targeted attack, which is designed to make the model classify an image into a class chosen by the attacker. We can defend against adversarial attacks by incorporating adversarial examples in training: instead of training the neural network only on clean examples, we explicitly let it learn from adversarial examples. In our experiments, the digit recognition accuracy on the MNIST dataset drops from 97.81% to 39.50% and 34.01% when the DNN is attacked by FGSM and PGD attacks, respectively. 
If we use FGSM training as a defense method, the classification accuracy greatly improves, from 39.50% to 92.31% under FGSM attacks and from 34.01% to 75.63% under PGD attacks. To further improve classification accuracy under adversarial attacks, we can use the stronger PGD training method, which improves accuracy by 2.7% under FGSM attacks and 18.4% under PGD attacks over FGSM training. It is worth mentioning that neither FGSM nor PGD training affects the accuracy on clean images. In summary, we find that PGD attacks can greatly degrade the performance of DNNs, and PGD training is a very effective way to defend against such attacks. PGD attacks and defenses are overall significantly more effective than FGSM methods.
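To make the one-step FGSM update concrete, here is a minimal sketch on a toy logistic "network" rather than a real DNN (the weights and input are invented; for cross-entropy loss on this model, the gradient with respect to the input is (p - y)·w):

```python
import math

# Toy logistic model: p(y=1 | x) = sigmoid(w.x + b)
w, b = [2.0, -1.5, 0.5], 0.1

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(x):
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

def fgsm(x, y, eps):
    """One FGSM step: move each input coordinate by eps in the direction
    that increases the cross-entropy loss, i.e. eps * sign(dL/dx_i)."""
    p = predict(x)
    return [xi + eps * math.copysign(1.0, (p - y) * wi)
            for xi, wi in zip(x, w)]

x = [1.0, 0.2, -0.3]           # clean input, true label y = 1
x_adv = fgsm(x, 1.0, eps=0.3)  # adversarial input; confidence in y=1 drops
```

PGD would repeat a smaller version of this step several times, projecting the perturbed input back into the eps-ball around the clean input after each step.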

Keywords: deep neural network, adversarial attack, adversarial defense, adversarial machine learning

Procedia PDF Downloads 171
298 Cartography through Picasso’s Eyes

Authors: Desiree Di Marco

Abstract:

The aim of this work is to show through the lens of art, first, what kind of reality was represented by fascist maps, and second, the impact of the fascist regime’s cartography (FRC) on observers’ eyes. In this study, it is assumed that the FRC’s representation of reality was simplified, timeless, and even a-spatial because it underrated the concept of territoriality. Cubism and Picasso’s paintings are used as counter-examples to demystify fascist cartography’s ideological assumptions. The difference between the gaze of an observer looking at the surface of a fascist map and the gaze of someone observing a Picasso painting is striking. Because there is always something dark and hidden behind and inside a map, the world of fascist maps was a world built from the observation of a “window” that distorted reality and trapped the eyes of observers. Moving across the map, they seem as if they were hypnotized. Cartohypnosis is the state in which observers find themselves enslaved by the attractive force of the map, which employs a sort of “magic” geography: a geography that, by means of symbolic language, never has as its primary objective the attempt to show us reality in its complexity, but rather performs for its audience. Magical geography and hypnotic cartography blended together in fascism, creating an almost mystical, magical relationship that mystified reality in order to reduce the world to a conquerable space. This reduction offered the observer the possibility of conceiving new dimensions of the limit and the boundary, elements with which the subject felt fully involved and in which the aesthetic force of the images demonstrated all its strength. But in the early 20th century, the combination of art and cartography gave rise to new possibilities. Cubism, which more than all the other artistic currents showed how dangerous it is to observe reality from a single point of view, is an example. 
Cubism was an artistic movement that brought about a profound transformation in pictorial culture. It was not only a revolution of pictorial space but a revolution of our conception of pictorial space. Up until that time, men and women were more inclined to believe in the power of images and their representations. Cubist painters rebelled against this blindness by claiming that art must always offer an alternative. Indeed, the contribution of this work is precisely to show how art can provide alternatives even to the most horrible regimes and the most atrocious human misfortunes. It also enriches the field of cartography, “reassuring” it by showing how much cartography can gain when other disciplines draw close to it. Only in this way can researchers increase the chances of a greater academic diffusion for cartography.

Keywords: cartography, Picasso, fascism, culture

Procedia PDF Downloads 45
297 The Role of Artificial Intelligence in Creating Personalized Health Content for Elderly People: A Systematic Review Study

Authors: Mahnaz Khalafehnilsaz, Rozina Rahnama

Abstract:

Introduction: The elderly population is growing rapidly, and with this growth comes an increased demand for healthcare services. Artificial intelligence (AI) has the potential to revolutionize the delivery of healthcare services to the elderly population. This study explores the various ways in which AI is used to create health content for elderly people and its transformative impact on the healthcare industry. Method: A systematic review of the literature was conducted to identify studies that have investigated the role of AI in creating health content specifically for elderly people. Several databases, including PubMed, Scopus, and Web of Science, were searched for relevant articles published between 2000 and 2022. The search strategy employed a combination of keywords related to AI, personalized health content, and the elderly. Studies that utilized AI to create health content for elderly individuals were included, while those that did not meet the inclusion criteria were excluded. A total of 20 articles that met the inclusion criteria were identified. Findings: The findings of this review highlight the diverse applications of AI in creating health content for elderly people. One significant application is the use of natural language processing (NLP), which involves the creation of chatbots and virtual assistants capable of providing personalized health information and advice to elderly patients. AI is also utilized in the field of medical imaging, where algorithms analyze medical images such as X-rays, CT scans, and MRIs to detect diseases and abnormalities. Additionally, AI enables the development of personalized health content for elderly patients by analyzing large amounts of patient data to identify patterns and trends that can inform healthcare providers in developing tailored treatment plans. 
Conclusion: AI is transforming the healthcare industry by providing a wide range of applications that can improve patient outcomes and reduce healthcare costs. From creating chatbots and virtual assistants to analyzing medical images and developing personalized treatment plans, AI is revolutionizing the way healthcare is delivered to elderly patients. Continued investment in this field is essential to ensure that elderly patients receive the best possible care.

Keywords: artificial intelligence, health content, older adult, healthcare

Procedia PDF Downloads 43
296 AI Predictive Modeling of Excited State Dynamics in OPV Materials

Authors: Pranav Gunhal, Krish Jhurani

Abstract:

This study tackles the significant computational challenge of predicting excited state dynamics in organic photovoltaic (OPV) materials—a pivotal factor in the performance of solar energy solutions. Time-dependent density functional theory (TDDFT), though effective, is computationally prohibitive for larger and more complex molecules. As a solution, the research explores the application of transformer neural networks, a type of artificial intelligence (AI) model known for its superior performance in natural language processing, to predict excited state dynamics in OPV materials. The methodology involves a two-fold process. First, the transformer model is trained on an extensive dataset comprising over 10,000 TDDFT calculations of excited state dynamics from a diverse set of OPV materials. Each training example includes a molecular structure and the corresponding TDDFT-calculated excited state lifetimes and key electronic transitions. Second, the trained model is tested on a separate set of molecules, and its predictions are rigorously compared to independent TDDFT calculations. The results indicate a remarkable degree of predictive accuracy. Specifically, for a test set of 1,000 OPV materials, the transformer model predicted excited state lifetimes with a mean absolute error of 0.15 picoseconds, a negligible deviation from TDDFT-calculated values. The model also correctly identified key electronic transitions contributing to the excited state dynamics in 92% of the test cases, signifying a substantial concordance with the results obtained via conventional quantum chemistry calculations. The practical integration of the transformer model with existing quantum chemistry software was also realized, demonstrating its potential as a powerful tool in the arsenal of materials scientists and chemists. 
The implementation of this AI model is estimated to reduce the computational cost of predicting excited state dynamics by two orders of magnitude compared to conventional TDDFT calculations. The successful utilization of transformer neural networks to accurately predict excited state dynamics provides an efficient computational pathway for the accelerated discovery and design of new OPV materials, potentially catalyzing advancements in the realm of sustainable energy solutions.
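The headline accuracy figure above is a mean absolute error between model-predicted and TDDFT-calculated lifetimes. As a trivial sketch (the lifetimes below are invented for illustration; the study's per-molecule values are not given):

```python
def mean_absolute_error(predicted, reference):
    """MAE between model-predicted and reference lifetimes (in picoseconds)."""
    return sum(abs(p - r) for p, r in zip(predicted, reference)) / len(reference)

# Hypothetical excited-state lifetimes in picoseconds
tddft = [1.20, 0.85, 2.10, 1.45]   # reference quantum chemistry values
model = [1.05, 0.95, 2.25, 1.40]   # transformer predictions
mae = mean_absolute_error(model, tddft)
```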

Keywords: transformer neural networks, organic photovoltaic materials, excited state dynamics, time-dependent density functional theory, predictive modeling

Procedia PDF Downloads 84
295 Quantum Cum Synaptic-Neuronal Paradigm and Schema for Human Speech Output and Autism

Authors: Gobinathan Devathasan, Kezia Devathasan

Abstract:

Objective: To improve the current modified Broca-Wernicke-Lichtheim-Kussmaul speech schema and provide insight into autism. Methods: We reviewed the pertinent literature. Current findings, involving Brodmann areas 22, 46, 9, 44, 45, 6, and 4, are based on neuropathology and functional MRI studies. However, in primary autism there is no lucid explanation, and the changes described, whether from neuropathology or functional MRI, appear consequential. Findings: We put forward an enhanced model that may explain the enigma related to autism. Vowel output is subcortical and does not need cortical representation, whereas consonant speech is cortical in origin. Left lateralization is needed to commence the circuitry spin, as our life has evolved with L-amino acids and the left spin of electrons. A fundamental species difference is that we are capable of three-consonant syllables and bi-syllable expression, whereas cetaceans and songbirds are confined to single or dual consonants. The four key sites for speech are the superior auditory cortex, Broca’s two areas, and the supplementary motor cortex. Using the Argand diagram and Riemann projection, we theorize that the Euclidean three-dimensional synaptic neuronal circuits of speech are quantized to coherent waves, and then decoherence takes place at area 6 (spherical representation). In this quantum state, complex three-consonant languages are instantaneously integrated, and multiple languages can be learned, verbalized, and differentiated. Conclusion: We postulate that evolutionary human speech, unlike that of cetaceans and birds, is elevated to quantum interaction to achieve three-consonant/bi-syllable speech. In classical primary autism, the sudden switching off and on of speech noted in several cases could now be explained not by any anatomical lesion but by failure of coherence. Area 6 projects directly into the prefrontal saccadic area (8); this further explains the second primary feature in autism: lack of eye contact. 
The third feature, repetitive finger gestures, whose control areas lie adjacent to the speech/motor areas, represents actual attempts by the autistic child to communicate, akin to sign language for the deaf.

Keywords: quantum neuronal paradigm, cetaceans and human speech, autism and rapid magnetic stimulation, coherence and decoherence of speech

Procedia PDF Downloads 170
294 Hidro-IA: An Artificial Intelligent Tool Applied to Optimize the Operation Planning of Hydrothermal Systems with Historical Streamflow

Authors: Thiago Ribeiro de Alencar, Jacyro Gramulia Junior, Patricia Teixeira Leite

Abstract:

The area of the electricity sector that coordinates hydroelectric generation to meet energy needs is called Operation Planning of Hydrothermal Power Systems (OPHPS). Its purpose is to find an operating policy that provides electrical power to the system over a given period, with reliability and minimal cost. It is therefore necessary to determine an optimal generation schedule for each hydroelectric plant at each stage, so that the system meets demand reliably, avoiding rationing in years of severe drought, and so that the expected cost of operation over the planning horizon is minimized, defining an appropriate strategy for thermal complementation. Several optimization algorithms specifically applied to this problem have been developed and are in use. Although they provide solutions to various problems encountered, these algorithms have weaknesses: difficulties in convergence, simplification of the original formulation of the problem, or limitations owing to the complexity of the objective function. An alternative to these challenges is the development of more sophisticated and reliable simulation-optimization techniques that can assist operation planning. Thus, this paper presents the development of a computational tool, namely Hydro-IA, to solve the identified optimization problem and to provide the user with easy handling. The intelligent optimization technique adopted is the Genetic Algorithm (GA), and the programming language is Java. First, the chromosomes were modeled; then the problem's fitness function and the operators involved were implemented; finally, the graphical interfaces for user access were drafted. The results with the Genetic Algorithm were compared with the nonlinear programming (NLP) optimization technique. Tests were conducted with seven hydraulically interconnected hydroelectric plants using historical streamflow from 1953 to 1955. 
The comparison between the GA and NLP techniques shows that the operating cost obtained with the GA becomes increasingly smaller than that of NLP as the number of interconnected hydroelectric plants increases. The program achieved coherent performance in solving the problem without simplifying the calculations, together with ease of manipulating the simulation parameters and visualizing the output results.
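As a rough illustration of the GA approach described above, the sketch below evolves a turbined-outflow schedule for a single hypothetical reservoir. All figures (demand, inflow, reservoir size, productivity, quadratic thermal cost) are invented for illustration; this is not the Hydro-IA formulation, which is written in Java and handles seven interconnected plants.

```python
import random

random.seed(42)

# Illustrative constants -- hypothetical figures, not the Hydro-IA data.
DEMAND = [100.0] * 12        # demand per monthly interval (MW average)
INFLOW = [60.0] * 12         # natural inflow per interval
V_MAX, V0 = 500.0, 250.0     # reservoir capacity and initial storage
RHO = 0.9                    # constant productivity (MW per unit of outflow)

def hydro_schedule_cost(turbined):
    """Penalized thermal-complementation cost of one outflow schedule."""
    cost, v = 0.0, V0
    for t, q in enumerate(turbined):
        v = min(v + INFLOW[t] - q, V_MAX)  # water balance; excess spills
        if v < 0:                          # reservoir emptied: infeasible
            return float("inf")
        thermal = max(DEMAND[t] - RHO * q, 0.0)
        cost += thermal ** 2               # convex thermal generation cost
    return cost

def evolve(pop_size=40, generations=200, q_max=120.0):
    """Elitist GA: one-point crossover plus Gaussian mutation."""
    n = len(DEMAND)
    pop = [[random.uniform(0.0, q_max) for _ in range(n)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=hydro_schedule_cost)
        elite = pop[: pop_size // 2]       # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n)   # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(n)        # mutate one gene
            child[i] = min(max(child[i] + random.gauss(0.0, 5.0), 0.0), q_max)
            children.append(child)
        pop = elite + children
    return min(pop, key=hydro_schedule_cost)

best = evolve()
```

The chromosome here is simply the list of turbined outflows per interval, and infeasible schedules are rejected with an infinite cost rather than repaired; a production tool would use constraint handling and plant-coupling terms far beyond this sketch.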

Keywords: energy, optimization, hydrothermal power systems, artificial intelligence and genetic algorithms

Procedia PDF Downloads 396
293 Hand Movements and the Effect of Using Smart Teaching Aids: Quality of Writing Styles Outcomes of Pupils with Dysgraphia

Authors: Sadeq Al Yaari, Muhammad Alkhunayn, Sajedah Al Yaari, Adham Al Yaari, Ayman Al Yaari, Montaha Al Yaari, Ayah Al Yaari, Fatehi Eissa

Abstract:

Dysgraphia is a neurological disorder of written expression that impairs writing ability and fine motor skills, resulting primarily in problems relating not only to handwriting but also to writing coherence and cohesion. We investigate the properties of smart writing technology to highlight some unique features of its effects on the academic performance of pupils with dysgraphia. In Amis, pupils with dysgraphia struggle to express their ideas in writing when ordinary writing aids are used as the default strategy. The Amis data suggest a possible connection between the available writing aids and pupils’ writing improvement and, therefore, the expression and comprehension of their texts. A group of thirteen pupils with dysgraphia was placed in a regular primary-school classroom, with twenty-one pupils recruited as a control group. To ensure the validity, reliability and accountability of the research, both groups studied writing courses for two semesters, of which the first was equipped with smart writing aids while the second took place in an ordinary classroom. Two pre-tests were administered at the beginning of the first two semesters, and two post-tests were administered at the end of both semesters. The tests examined pupils’ ability to write coherent, cohesive and expressive texts. The dysgraphic group, which received the writing course in the first semester in classes with smart technology, produced significantly greater gains in written expression than in the ordinary classroom, and its performance was better than that of the control group in the second semester. The current study concludes that using smart teaching aids is a ‘must’ for both teaching and learning with dysgraphia. Furthermore, it demonstrates that for young pupils with dysgraphia, expressive tasks are more challenging than coherence and cohesion tasks.
The study therefore supports the literature suggesting a role for smart educational aids in writing, and indicates that smart writing techniques may be an efficient addition to regular educational practices, notably in special educational institutions and speech-language therapeutic facilities. However, further research is needed on prompting adults with dysgraphia more often than older adults without dysgraphia in order to get them to complete other productive and/or written-skills tasks.

Keywords: smart technology, writing aids, pupils with dysgraphia, hands’ movement

Procedia PDF Downloads 16
292 A Unified Approach for Digital Forensics Analysis

Authors: Ali Alshumrani, Nathan Clarke, Bogdan Ghite, Stavros Shiaeles

Abstract:

Digital forensics has become an essential tool in the investigation of cyber and computer-assisted crime. Arguably, given the prevalence of technology and the digital footprints it leaves, it could have a significant role across almost all crimes. However, the variety of technology platforms (such as computers, mobiles, Closed-Circuit Television (CCTV), Internet of Things (IoT), databases, drones, and cloud computing services), the heterogeneity and volume of data, forensic tool capability, and investigative cost make investigations both technically challenging and prohibitively expensive. Forensic tools also tend to be siloed into specific technologies, e.g., File System Forensic Analysis Tools (FS-FAT) and Network Forensic Analysis Tools (N-FAT), and many data sources have little to no specialist forensic tooling. It is also increasingly essential to compare and correlate evidence across data sources, and to do so in an efficient and effective manner that enables an investigator to answer high-level questions of the data in a timely fashion without having to trawl through it and perform the correlation manually. This paper proposes a Unified Forensic Analysis Tool (U-FAT), which aims to establish a common language for electronic information and permit multi-source forensic analysis. Core to this approach is the identification and development of forensic analyses that automate complex data correlations, enabling investigators to investigate cases more efficiently. The paper presents a systematic analysis of major crime categories and identifies which forensic analyses could be used.
For example, in a child abduction, an investigation team might have evidence from a range of sources including computing devices (mobile phone, PC), CCTV (potentially a large number of cameras), ISP records, and mobile network cell tower data, in addition to third-party databases such as the National Sex Offender registry and tax records, with the desire to auto-correlate across sources and visualize the results in a cognitively effective manner. U-FAT provides a holistic, flexible, and extensible approach to digital forensics in a technology-, application-, and data-agnostic manner, delivering powerful and automated forensic analysis.
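One way the kind of cross-source correlation U-FAT targets can be sketched is to normalise heterogeneous records onto a common timeline and then cluster events from different sources that fall within a time window. The schema, field names, and records below are hypothetical, not U-FAT’s actual data model:

```python
from datetime import datetime, timedelta

# Hypothetical records from heterogeneous sources, already parsed to a
# minimal common schema (these are illustrative, not a real case).
records = [
    {"source": "mobile", "entity": "suspect_phone", "time": "2024-03-01T14:02:10"},
    {"source": "cctv",   "entity": "camera_17",     "time": "2024-03-01T14:03:40"},
    {"source": "isp",    "entity": "home_router",   "time": "2024-03-01T18:30:00"},
]

def normalise(record):
    """Map a source-specific record onto a common timeline event."""
    return {**record, "time": datetime.fromisoformat(record["time"])}

def correlate(raw_records, window=timedelta(minutes=5)):
    """Group events from different sources that fall within `window`."""
    events = sorted((normalise(r) for r in raw_records), key=lambda e: e["time"])
    clusters, current = [], [events[0]]
    for ev in events[1:]:
        if ev["time"] - current[-1]["time"] <= window:
            current.append(ev)          # close enough in time: same cluster
        else:
            clusters.append(current)
            current = [ev]
    clusters.append(current)
    # Keep only clusters that correlate at least two distinct sources.
    return [c for c in clusters if len({e["source"] for e in c}) > 1]

hits = correlate(records)
```

A real unified tool would correlate on far richer keys than timestamps (identities, locations, device identifiers) and cope with clock skew between sources; the sketch only shows the normalise-then-correlate shape of the idea.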

Keywords: digital forensics, evidence correlation, heterogeneous data, forensics tool

Procedia PDF Downloads 168
291 Sensory Ethnography and Interaction Design in Immersive Higher Education

Authors: Anna-Kaisa Sjolund

Abstract:

The doctoral thesis examines interaction design and sensory ethnography as tools to create immersive education environments. In recent years, there has been increasing interest and discussion among researchers and educators on immersive education, such as augmented-reality tools and virtual glasses, and on the possibilities of utilizing them in education at all levels. Using virtual devices as learning environments, it is possible to create multisensory learning environments. Sensory ethnography in this study refers to the way the senses are considered in terms of their impact on information dynamics in immersive learning environments. The past decade has seen the rapid development of virtual-world research and virtual ethnography. Christine Hine's Virtual Ethnography offers an anthropological explanation of online behavior and communication change. Since her groundbreaking work, time has changed users’ communication styles and brought new ways of doing ethnographic research. Virtual reality, with all its new potential, has come to the fore, engaging all the senses. Film and image have played an important role in cultural research for centuries; only the focus has changed across times and fields of research. According to Karin Becker, the role of the image in our society is information flow, and she identified two meanings of what research on visual culture is. Images and pictures are the artifacts of visual culture. Images can be viewed as a symbolic language that allows digital storytelling. By combining the sense of sight with the other senses, such as hearing, touch, taste, smell and balance, the use of a virtual learning environment offers students a way to absorb large amounts of information more easily. It also offers teachers different ways to produce study material. This article approaches the core question using sensory ethnography as a research tool.
Sensory ethnography is used to describe information dynamics in immersive environments through interaction design. An immersive education environment is understood as a three-dimensional, interactive learning environment in which the audiovisual aspects are central but all senses can be taken into consideration. When designing learning environments, or any digital service, interaction design is always needed. The question of what interaction design is remains justified, because there is no simple or consistent idea of what interaction design is, how it can be used as a research method, or whether it is only a description of practical actions. When discussing immersive learning environments or their construction, consideration should be given to both interaction design and sensory ethnography.

Keywords: immersive education, sensory ethnography, interaction design, information dynamics

Procedia PDF Downloads 110
290 Embodied Communication - Examining Multimodal Actions in a Digital Primary School Project

Authors: Anne Öman

Abstract:

Today in Sweden and in other countries, a variety of digital artefacts, such as laptops, tablets and interactive whiteboards, are being used at all school levels. From an educational perspective, digital artefacts challenge traditional teaching because they provide a range of modes for expression and communication and are not limited to the traditional medium of paper. Digital technologies offer new opportunities for representations and physical interactions with objects, which foreground the role of the body in interaction and learning. From a multimodal perspective, the emphasis is on the use of multiple semiotic resources for meaning-making, and the study presented here has examined the differential use of semiotic resources by pupils interacting in a digitally designed task in a primary school context. The instances analyzed in this paper come from a case study in which the learning task was to create an advertising film in film software. The study involves the analysis of a single case with an emphasis on the examination of the classroom setting. The research design was based on a micro-ethnographic perspective, and the empirical material was collected through video recordings of small-group work in order to explore pupils’ communication within the group activity. The designed task allowed students to build, share, collaborate upon and publish the redesigned products. The analysis illustrates the variety of communicative modes, such as body position, gestures, visualizations and speech, and the interaction between these modes and the representations made by the pupils. The findings point to the importance of embodied communication during the small-group processes from a learning perspective, as well as to a pedagogical understanding of pupils’ representations, which were similar from a cultural literacy perspective.
These findings open up discussions with further implications for school practice concerning the small-group processes as well as the redesigned products. More broadly, the findings point to how multimodal interactions shape the learning experience in meaning-making processes, taking into account that language in a globalized society is more than reading and writing skills.

Keywords: communicative learning, interactive learning environments, pedagogical issues, primary school education

Procedia PDF Downloads 395
289 Posterior Cortical Atrophy Phenotype of Alzheimer’s Dementia: A Case Report

Authors: Joana Beyer

Abstract:

Background: Alzheimer’s disease (AD) is the predominant cause of dementia, characterized by progressive cognitive decline. Posterior cortical atrophy (PCA) is a less common variant of AD, primarily affecting younger individuals and presenting with visual, visuospatial, and visuoperceptual deficits, often leading to delayed diagnosis due to its atypical presentation. Case Presentation: We report the case of a 58-year-old woman referred to psychiatric services with a two-year history of progressive visuospatial decline, mild memory difficulties, and language impairments, notably anomia. Despite undergoing cataract and squint surgeries, her visual symptoms persisted, impacting her professional life as a music educator. The neuropsychological evaluation revealed profound visuoperceptual and visuospatial disturbances, with neuroimaging supporting a diagnosis of PCA. Treatment with donepezil showed symptom improvement, highlighting the challenges and the importance of early intervention in managing this atypical form of AD. Methods: The diagnostic process involved comprehensive physical and neuropsychological assessments and neuroimaging, including MRI and F18 FDG PET CT, which demonstrated severe bilateral posterior cortical involvement. The case underscores the utility of these modalities in diagnosing PCA. Results: The initiation of donepezil, an acetylcholinesterase inhibitor, resulted in symptom improvement, emphasizing the potential for AD treatments to benefit PCA patients. However, challenges in management, including treatment side effects and the necessity of multidisciplinary care, are discussed. Conclusion: This case highlights PCA’s diagnostic challenges due to its atypical presentation and the broader implications for managing younger patients with early-onset dementia.
It underscores the necessity for early recognition, comprehensive assessment, and tailored management strategies, including both pharmacological and non-pharmacological interventions, to improve patients' quality of life. Additionally, the case illustrates the need for expanding community memory services to accommodate younger patients with atypical forms of dementia, advocating for a more inclusive approach to dementia care.

Keywords: Alzheimer’s disease, posterior cortical atrophy, dementia, diagnosis, management, donepezil, early-onset dementia

Procedia PDF Downloads 40
288 Ikat: Undaunted Journey of a Traditional Textile Practice, a Sublime Connect of Traditionality with Modernity and Calibration for Eco-Sustainable Options

Authors: Purva Khurana

Abstract:

Traditional textile crafts have universally been significantly impeded by the rise of innovative technologies, but sustained human endeavor, in sync with dynamic market nuances, holds the key to preserving these otherwise fast-disappearing marvels. The metamorphosis of such art forms into niche markets presupposes sharp concentration on adaptability. The author has concentrated on the ancient handicraft of ikat in Andhra Pradesh (India), a manifestation of its people's cultural heritage and an esoteric cottage industry, intrinsic to the development and support of the local economy and identity. Like any other traditional practice, ikat weaving has been subjected to the challenges of modernization. However, owing to its unique character, personalized production and adaptability of both material and process, ikat weaving has stood the test of time by judiciously blending innovation with contemporary taste. To survive as a living craft, as well as to justify its role as a universal language of aesthetic sensibility, it is imperative that the ikat tradition lend itself to a continuous process of experiment, change and growth. The present paper aims to examine the contours of the ikat production process from its pure form to more fashion- and market-oriented production, with upgraded processes, materials and tools. Over time, the craft has adapted well to new style paradigms, duly matching the latest fashion trends in tandem with market sensitivities. The paper is also an effort to investigate how this craft could respond constructively to the pressure of contemporary technical developments in order to remain at the cutting edge while preserving its integrity. In order to approach these issues, the methodology adopted is a conceptual analysis of the craft practices, their unique strengths, and how they could be used to advance the craft in relation to emerging technical developments.
The paper summarizes the results of the study carried out by the author on the peculiar advantages of suitably calibrated vat dyes over natural dyes, in terms of recycling ability and eco-friendly properties, thus holding a definite edge in terms of both socio-economic and environmental concerns.

Keywords: craft, eco-friendly dyes, ikat, metamorphosis

Procedia PDF Downloads 148
287 The Regulation of the Cancer Epigenetic Landscape Lies in the Realm of the Long Non-coding RNAs

Authors: Ricardo Alberto Chiong Zevallos, Eduardo Moraes Rego Reis

Abstract:

Pancreatic adenocarcinoma (PDAC) patients have a less than 10% five-year survival rate. PDAC has no defined diagnostic and prognostic biomarkers. Gemcitabine is the first-line drug in PDAC and several other cancers. Long non-coding RNAs (lncRNAs) contribute to tumorigenesis and are potential biomarkers for PDAC. Although lncRNAs are not translated into proteins, they have important functions. LncRNAs can decoy or recruit proteins from the epigenetic machinery, act as microRNA sponges, participate in protein translocation through different cellular compartments, and even promote chemoresistance. The chromatin remodeling enzyme EZH2 is a histone methyltransferase that catalyzes the methylation of histone 3 at lysine 27, silencing local expression. EZH2 is ambivalent: it can also activate gene expression independently of its histone methyltransferase activity. EZH2 is overexpressed in several cancers and interacts with lncRNAs, being recruited to a specific locus. EZH2 can be recruited to activate an oncogene or silence a tumor suppressor. The misregulation of lncRNAs in cancer can result in the differential recruitment of EZH2 and in a distinct epigenetic landscape, promoting chemoresistance. The relevance of the EZH2-lncRNA interaction to chemoresistant PDAC was assessed by real-time quantitative PCR (RT-qPCR) and RNA immunoprecipitation (RIP) experiments with naïve and gemcitabine-resistant PDAC cells. The expression of several lncRNAs and EZH2 gene targets was evaluated, contrasting naïve and resistant cells. Candidate genes were selected through bioinformatic analysis and literature curation. Indeed, the resistant cell line showed higher expression of chemoresistance-associated lncRNAs and protein-coding genes. RIP detected lncRNAs interacting with EZH2 at varying intensity levels in the cell lines. During RIP, the nuclear fraction of the cells was incubated with an antibody against EZH2 and with magnetic beads.
The RNA precipitated with the beads-antibody-EZH2 complex was isolated and reverse transcribed. The presence of candidate lncRNAs was detected by RT-qPCR, and the enrichment was calculated relative to INPUT (a total-lysate control sample collected before RIP). The enrichment levels varied across the several lncRNAs and cell lines. The EZH2-lncRNA interaction might be responsible for the regulation of chemoresistance-associated genes in multiple cancers. The relevance of the lncRNA-EZH2 interaction to PDAC was assessed by siRNA knockdown of a lncRNA, followed by analysis of EZH2 target expression by RT-qPCR. Chromatin immunoprecipitation (ChIP) of EZH2 and H3K27me3, followed by RT-qPCR with primers for EZH2 targets, also assesses the specificity of EZH2 recruitment by the lncRNA. This is the first report of the interaction of EZH2 with the lncRNAs HOTTIP and PVT1 in chemoresistant PDAC. HOTTIP and PVT1 have been described as promoting chemoresistance in several cancers, but the role of EZH2 has not been clarified. For the first time, the lncRNA LINC01133 was detected in a chemoresistant cancer. The interactions of EZH2 with LINC02577, LINC00920, LINC00941, and LINC01559 have never been reported in any context. The novel lncRNA-EZH2 interactions regulate chemoresistance-associated genes in PDAC and might be relevant to other cancers. Therapies targeting EZH2 alone have not been successful, and a combinatorial approach also targeting the lncRNAs interacting with it might be key to overcoming chemoresistance in several cancers.
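The enrichment-relative-to-INPUT calculation mentioned above is commonly done with the percent-input method; a minimal sketch, assuming a 1% INPUT aliquot and standard Ct arithmetic (the abstract does not specify the exact normalisation used in this study):

```python
import math

def percent_input(ct_rip, ct_input, input_fraction=0.01):
    """Percent-input enrichment from qPCR Ct values.

    ct_rip:         Ct of the RIP (beads-antibody-EZH2) sample
    ct_input:       Ct of the INPUT aliquot
    input_fraction: fraction of lysate the INPUT represents (assumed 1%)
    """
    # Adjust the INPUT Ct as if 100% of the lysate had been assayed:
    # assaying 100x more template lowers Ct by log2(100) cycles.
    ct_input_adjusted = ct_input - math.log2(1.0 / input_fraction)
    # Each Ct difference corresponds to a two-fold template difference.
    return 100.0 * 2.0 ** (ct_input_adjusted - ct_rip)

# Example: RIP Ct of 28 vs. 1%-INPUT Ct of 25 corresponds to
# 100 * 2**(25 - log2(100) - 28) = 0.125% of input recovered.
enrichment = percent_input(28.0, 25.0)
```

In practice such values are usually compared against an IgG (non-specific antibody) control to report fold enrichment over background, which is the comparison the varying "intensity levels" between cell lines would rest on.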

Keywords: epigenetics, chemoresistance, long non-coding RNAs, pancreatic cancer, histone modification

Procedia PDF Downloads 67
286 Awareness Creation of Benefits of Antitrypsin-Free Nutraceutical Biopowder for Increasing Human Serum Albumin Synthesis as Possible Adjunct for Management of MDRTB or MDRTB-HIV Patients

Authors: Vincent Oghenekevbe Olughor, Olusoji Mayowa Ige

Abstract:

Except for pre-existing liver disease and malnutrition, there are no predilections for low serum albumin (SA) levels in humans. At normal reference levels (4.0-6.0 g/dl), SA is a universal marker for mortality and morbidity risk assessment, where a depletion of 1.0 g/dl increases mortality risk by 137% and morbidity risk by 89%. It has 40 known functions contributing significantly to the sustenance of human life. A depletion of SA to <2.2 g/dl, in most clinical settings worldwide, leads to loss of the oncotic pressure of blood, causing clinical manifestations of bipedal oedema, during which the patients remain conscious. SA also contributes significantly to buffering the blood to a life-sustaining pH of 7.35-7.45. A drop in blood pH to <6.9 will lead to instant coma and death, which can occur if SA continues to deplete after the manifestation of bipedal oedema. In an intervention study conducted in 2014, following the discovery that SA is depleted during malaria fever, a Nutraceutical formulated for use as a treatment adjunct to prevent SA depletion during malaria to <2.4 g/dl was found satisfactory after efficacy testing. There are five known types of malaria caused by apicomplexan parasites of the genus Plasmodium, the most lethal being that caused by Plasmodium falciparum, which produces malignant tertian malaria, in which the fever recurring every 48 hours coincides with the dumping of malaria toxins (hemozoin) into the blood, causing contamination: blood must remain sterile. Other apicomplexan parasites, Toxoplasma and Cryptosporidium, are opportunistic infections in HIV. Separate studies showed SA depletion in MDRTB (multidrug-resistant TB) and MDRTB-HIV patients by the same mechanism discovered with malaria, and such depletion will be further complicated whenever apicomplexan parasitic infections co-exist.
Both apicomplexan parasites and the TB pathogen belong to the obligate group of parasites, which replicate only inside their host, and most of them can over-consume host nutrients during parasitaemia. In MDRTB patients, the body repeatedly attempts to prevent depletion of SA to critical levels in the presence of adequate nutrients, and does so only for a while in MDRTB-HIV patients. These groups of patients will therefore benefit from the Nutraceutical already tested in malaria patients. The Nutraceutical bio-powder was formulated (to BP 1988 specification) from twelve nature-based, food-grade nutrients containing all the nutrients dedicated to ensuring improved synthesis of albumin by the liver. The Nutraceutical was administered daily for 38±2 days to 23 children in a prospective phase-2 clinical trial, and its impact on body weight and core blood parameters was documented at the start and end of the efficacy-testing period. Sixteen children who did not experience malaria-induced depletion of SA had a significant SA increase; seven children who experienced malaria-induced depletion of SA had an insignificant SA decrease. The packed cell volume percentage (PCV%), a measure of the oxygen-carrying capacity of blood and of the amount of nutrients the body can absorb, increased in both groups. Total serum proteins (SA + globulins) increased or decreased within the continuum of normal. In conclusion, MDRTB and MDRTB-HIV patients will benefit from a variant of this Nutraceutical when used as a treatment adjunct.

Keywords: antitrypsin-free Nutraceutical, apicomplexan parasites, no predilections for low serum albumin, toxoplasmosis

Procedia PDF Downloads 267
285 The Return of the Rejected Kings: A Comparative Study of Governance and Procedures of Standards Development Organizations under the Theory of Private Ordering

Authors: Olia Kanevskaia

Abstract:

Standardization has been in the limelight of numerous academic studies. Typically described as ‘any set of technical specifications that either provides or is intended to provide a common design for a product or process’, standards not only set quality benchmarks for products and services but also spur competition and innovation, resulting in advantages for manufacturers and consumers. Their contribution to globalization and technological advancement is especially crucial in the Information and Communication Technology (ICT) and telecommunications sector, which is also characterized by weaker state regulation and expert-based rule-making. Most of the standards developed in this area are interoperability standards, which allow technological devices to establish ‘invisible communications’ and ensure their compatibility and proper functioning. This type of standard supports a large share of our daily activities, ranging from traffic coordination by traffic lights to connecting to Wi-Fi networks, transmitting data via Bluetooth or USB, and building the network architecture for the Internet of Things (IoT). A large share of ICT standards is developed in specialized voluntary platforms, commonly referred to as Standards Development Organizations (SDOs), which gather experts from various industry sectors, private enterprises, governmental agencies and academia. The institutional architecture of these bodies varies from semi-public bodies, such as the European Telecommunications Standards Institute (ETSI), to industry-driven consortia, such as the Internet Engineering Task Force (IETF). The past decades have witnessed a significant shift of standard setting to these institutions: operating independently of state regulation, they offer a rather informal setting that enables fast-paced standardization and places the technical supremacy and flexibility of standards above other considerations.
Although technical norms and specifications developed by such nongovernmental platforms are not binding, they appear to create significant regulatory impact. In the United States (US), private voluntary standards can be used by regulators to achieve their policy objectives; in the European Union (EU), compliance with harmonized standards developed by the voluntary European Standards Organizations (ESOs) can grant a product a free-movement pass. Moreover, standards can de facto manage the functioning of the market when other regulatory alternatives are not available. Hence, by establishing (potentially) mandatory norms, SDOs assume regulatory functions commonly exercised by states and shape their own legal order. The purpose of this paper is threefold. First, it attempts to shed some light on SDOs’ institutional architecture, focusing on private, industry-driven platforms and comparing their regulatory frameworks with those of formal organizations. Drawing upon the relevant scholarship, the paper then discusses the extent to which the formulation of technological standards within SDOs constitutes a private legal order operating in the shadow of governmental regulation. Ultimately, this contribution seeks to advise whether state intervention in industry-driven standard setting is desirable, and whether the increasing regulatory importance of SDOs should be addressed in legislation on standardization.

Keywords: private order, standardization, standard-setting organizations, transnational law

Procedia PDF Downloads 147