Search results for: language learning and teaching
162 Research Project on Learning Rationality in Strategic Behaviors: Interdisciplinary Educational Activities in Italian High Schools
Authors: Giovanna Bimonte, Luigi Senatore, Francesco Saverio Tortoriello, Ilaria Veronesi
Abstract:
The education process treats capabilities not merely as a means to a given end but as an end in themselves. Sen's capability approach challenges human capital theory, which sees education as an ordinary investment undertaken by individuals. A complex reality requires complex thinking capable of interpreting the dynamics of societal change, in order to make decisions that are rational in private, ethical and social contexts. Education is not removed from its cultural and social context; it exists and is structured within it. In Italy, the "Mathematical High School Project" is a didactic research project based on additional laboratory courses in extracurricular hours, in which mathematics enters into a dialectical relationship with other disciplines as a cultural bridge between the two cultures, the humanistic and the scientific, through interdisciplinary educational modules on themes with a strong impact on young people's lives. This interdisciplinary mathematics presents topics related to the most advanced technologies and contemporary socio-economic frameworks to demonstrate that mathematics is not only a key to reading complex problems but also a key to resolving them. Recent developments in mathematics offer the potential for profound and highly beneficial changes in mathematics education at all levels, for instance in socio-economic decision-making. The research project is built to investigate whether repeated interactions can successfully promote cooperation among students as a rational choice, and whether skill, context and school background can influence strategy choices and rationality. A laboratory on game theory as a mathematical theory was conducted in the 4th year of the Mathematical High Schools and in an ordinary high school of the scientific track.
Students played two simultaneous games of repeated Prisoner's Dilemma with an indefinite horizon, facing two different competitors in each round (one per game); the competitors, however, remained the same for the entire duration of the games. The results highlight that most of the students in the two classes used the two games to immunize themselves against the risk of losing: in one game they started by playing Cooperate, and in the other by playing Compete. In the literature, theoretical models and experiments show that in the case of repeated interactions with the same adversary, the optimal cooperation strategy can be achieved by tit-for-tat mechanisms. In higher education, individual capacities cannot be examined independently, as the conceptual framework presupposes a social construction of individuals interacting and competing, making individual and collective choices. The paper will outline all the results of the experimentation and the future development of the research.
Keywords: game theory, interdisciplinarity, mathematics education, mathematical high school
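For readers unfamiliar with it, the tit-for-tat mechanism mentioned in the literature can be sketched in a few lines of Python (the payoff values below are the conventional illustrative ones, not necessarily those used in the laboratory):

```python
# Minimal sketch of a repeated Prisoner's Dilemma where one player uses
# tit-for-tat: cooperate first, then copy the opponent's previous move.
# Payoff matrix (row player, column player); values are illustrative.
PAYOFFS = {
    ("C", "C"): (3, 3),  # mutual cooperation
    ("C", "D"): (0, 5),  # sucker's payoff vs. temptation
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),  # mutual defection
}

def tit_for_tat(opponent_history):
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def play(strategy_a, strategy_b, rounds):
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)  # each strategy sees the rival's past moves
        move_b = strategy_b(history_a)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

# Against a defector, tit-for-tat loses only the first round, then retaliates.
print(play(tit_for_tat, always_defect, 10))  # → (9, 14)
```

Against another tit-for-tat player the same loop yields sustained mutual cooperation, which is the outcome the experiment probes for.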
Procedia PDF Downloads 74
161 Training for Safe Tree Felling in the Forest with Symmetrical Collaborative Virtual Reality
Authors: Irene Capecchi, Tommaso Borghini, Iacopo Bernetti
Abstract:
The chainsaw remains one of the most common pieces of equipment used in forestry for pruning, felling, and processing trees. However, chainsaw use entails serious dangers and one of the highest accident rates in both professional and non-professional work. Felling is proportionally the most dangerous phase, in both severity and frequency, because of the risk of being struck by the tree the operator wants to cut down. To avoid this, a correct sequence of chainsaw cuts must be taught for the different conditions of the tree. Virtual reality (VR) makes it possible to simulate chainsaw use without danger of injury. The limitations of existing applications are as follows. First, existing platforms are not symmetrically collaborative: the trainee is alone in virtual reality while the trainer can only watch the virtual environment on a laptop or PC, which results in an inefficient teacher-learner relationship. Second, most applications involve only a virtual chainsaw, so the trainee cannot feel the weight and inertia of a real one. Finally, existing applications simulate only a few cases of tree felling. The objectives of this research were to implement and test a symmetrical collaborative training application based on VR and mixed reality (MR), with the real and virtual chainsaws overlapped in MR. The research and training platform was developed for the Meta Quest 2 head-mounted display. The application is based on the Unity 3D engine and the Present Platform Interaction SDK (PPI-SDK) developed by Meta. PPI-SDK avoids the use of controllers and enables hand tracking and MR. With the combination of these two technologies, it was possible to overlay a virtual chainsaw on a real chainsaw in MR and synchronize their movements in VR.
This ensures that the user feels the weight of the actual chainsaw, engages the right muscles, and performs the appropriate movements during the exercise, allowing the user to learn the correct body posture. The chainsaw works only if the right sequence of cuts is made to fell the tree. Contact detection is handled by Unity's physics system, which allows interaction between objects that simulate real-world behavior. Each cut of the chainsaw is defined by a so-called collider, and the tree can fall only if the colliders are activated in the right order, simulating a safe felling technique. In this way, the user learns how to use the chainsaw safely. The system is also multiplayer, so the student and the instructor can experience VR together in a symmetrical and collaborative way. The platform simulates the following tree-felling situations with safe techniques: cutting a tree tilted forward, cutting a medium-sized tree tilted backward, cutting a large tree tilted backward, sectioning the trunk on the ground, and cutting branches. The application is being evaluated on a sample of university students through a dedicated questionnaire. The results are expected to assess both the increase in learning compared to a theoretical lecture and the immersiveness and telepresence of the platform.
Keywords: chainsaw, collaborative symmetric virtual reality, mixed reality, operator training
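The collider-ordering logic described above can be sketched outside Unity as well; a minimal Python illustration (cut names and the safe sequence are hypothetical placeholders, not the platform's actual configuration):

```python
# Sketch of the cut-sequence check: the tree falls only if the cut
# colliders are triggered in the safe order. Cut names are illustrative,
# loosely following directional felling (notch first, then back cut).
SAFE_SEQUENCE = ["top_notch_cut", "bottom_notch_cut", "back_cut"]

class FellingExercise:
    def __init__(self, safe_sequence):
        self.safe_sequence = safe_sequence
        self.progress = 0          # index of the next expected cut
        self.tree_felled = False

    def on_collider_triggered(self, cut_name):
        """Called whenever the chainsaw blade enters a cut collider."""
        if self.tree_felled:
            return
        if cut_name == self.safe_sequence[self.progress]:
            self.progress += 1     # correct cut: advance the sequence
            if self.progress == len(self.safe_sequence):
                self.tree_felled = True
        else:
            self.progress = 0      # wrong cut: unsafe attempt, start over

exercise = FellingExercise(SAFE_SEQUENCE)
for cut in ["top_notch_cut", "bottom_notch_cut", "back_cut"]:
    exercise.on_collider_triggered(cut)
print(exercise.tree_felled)  # → True
```

In the actual platform the same state machine would live in a Unity script, with `on_collider_triggered` wired to the engine's collision callbacks.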
Procedia PDF Downloads 107
160 A Top-down vs a Bottom-up Approach on Lower Extremity Motor Recovery and Balance Following Acute Stroke: A Randomized Clinical Trial
Authors: Vijaya Kumar, Vidayasagar Pagilla, Abraham Joshua, Rakshith Kedambadi, Prasanna Mithra
Abstract:
Background: Post-stroke rehabilitation aims to accelerate optimal sensorimotor recovery and functional gain and to reduce long-term dependency. Intensive physical therapy interventions can enhance this recovery, as experience-dependent neural plastic changes act either directly on cortical neural networks or at the distal peripheral level (muscular components). Neuromuscular electrical stimulation (NMES), a traditional bottom-up approach, and mirror therapy (MT), a relatively new top-down approach, have been found to be effective adjuvant treatment methods for lower extremity motor and functional recovery in stroke rehabilitation. However, there is a scarcity of evidence comparing their therapeutic gain in stroke recovery. Aim: To compare the efficacy of NMES and MT in the very early phase of post-stroke rehabilitation, with respect to lower extremity motor recovery and balance. Design: Observer-blinded randomized clinical trial. Setting: Neurorehabilitation Unit, Department of Physical Therapy, tertiary care hospitals. Subjects: 32 acute stroke subjects with a first episode of unilateral stroke with hemiparesis, referred for rehabilitation (onset < 3 weeks), Brunnstrom lower extremity recovery stage ≥3 and MMSE score above 24, were randomized into two groups [Group A, NMES; Group B, MT]. Interventions: Both groups received an eclectic approach to remediate lower extremity recovery, comprising treatment components of the Rood, Bobath and motor learning approaches, for 30 minutes a day for 6 days. In addition, Group A (N=16) received 30 minutes of surface NMES training for six major paretic muscle groups (gluteus maximus and medius, quadriceps, hamstrings, tibialis anterior and gastrocnemius). Group B (N=16) received 30 minutes of mirror therapy sessions to facilitate lower extremity motor recovery.
Outcome measures: Lower extremity motor recovery, balance and activities of daily living (ADLs) were measured by the Fugl-Meyer Assessment (FMA-LE), Berg Balance Scale (BBS) and Barthel Index (BI) before and after the intervention. Results: Pre-post analysis revealed statistically significant improvement (p < 0.001) over time for all outcome variables in both groups. The NMES group had greater change scores than the MT group on all parameters: FMA-LE (25.12±3.01 vs. 23.31±2.38), BBS (35.12±4.61 vs. 34.68±5.42) and BI (40.00±10.32 vs. 37.18±7.73). However, between-group comparison of pre-post values showed no significant differences: FMA-LE (p=0.09), BBS (p=0.80) and BI (p=0.39). Conclusion: Though both groups improved significantly from pre- to post-intervention, neither was superior to the other in lower extremity motor recovery and balance among acute stroke subjects. We conclude that the eclectic approach is an effective treatment irrespective of whether NMES or MT is used as an adjunct.
Keywords: balance, motor recovery, mirror therapy, neuromuscular electrical stimulation, stroke
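As a minimal illustration of the change-score analysis reported above (all scores below are invented for illustration, not trial data):

```python
# Minimal sketch of the pre/post change-score analysis: for each subject
# the change score is post minus pre; groups are then summarized by the
# mean and SD of the change, in the "mean ± SD" form used in the abstract.
from statistics import mean, stdev

def change_scores(pre, post):
    return [after - before for before, after in zip(pre, post)]

# Hypothetical FMA-LE scores for two small groups (invented data)
nmes_pre, nmes_post = [10, 12, 9, 11], [34, 38, 33, 37]
mt_pre, mt_post = [11, 10, 12, 9], [33, 34, 36, 31]

nmes_change = change_scores(nmes_pre, nmes_post)
mt_change = change_scores(mt_pre, mt_post)

print(f"NMES: {mean(nmes_change):.2f} ± {stdev(nmes_change):.2f}")
print(f"MT:   {mean(mt_change):.2f} ± {stdev(mt_change):.2f}")
```

A formal between-group test of these change scores (e.g. an independent-samples t-test) would then supply the p-values quoted in the abstract.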
Procedia PDF Downloads 281
159 Towards Better Integration: Qualitative Study on Perceptions of Russian-Speaking Immigrants in Australia
Authors: Oleg Shovkovyy
Abstract:
This research was conducted in response to one of the most pressing questions on the agenda of public administration offices around the world: “What could be done for better integration and assimilation of immigrants into hosting communities?” In the author’s view, the answer could be suggested by immigrants themselves. Often ‘bogged down in the past’ and snared by their own idols and demons, they perceive things differently, which, in turn, may result in their inability to integrate smoothly into hosting communities. A brief literature review suggests that the perceptions of immigrants are neglected or left unsought in current research on migrants, which is often based on opinion polls of members of the hosting communities themselves or on superficial data from various research organizations. Even those studies that include the voices of immigrants are unlikely to shed additional light on the problem, simply because certain things are not said out loud, especially to those in whose hands immigrants’ fate lies (the authorities). In this regard, this qualitative study, conducted by an insider to a few Russian-speaking communities, represents a unique opportunity for all stakeholders to look at the question of integration through the eyes of immigrants, from a different perspective, which makes the research findings especially valuable for a better understanding of the problem. The case study employed ethnographic methods of gathering data: approximately 200 Russian-speaking immigrants of the first and second generations were closely observed by the Russian-speaking researcher in their usual settings, for eight months, at different venues. A number of informal interviews were conducted with 27 key informants with whom the researcher managed to establish good rapport and who were keen to share their experiences voluntarily. Field notes were taken at 14 locations (study sites) within the Brisbane region of Queensland, Australia.
Moreover, during all this time the researcher lived in the dwelling of one of the immigrants and was an active participant in the social life (worship, picnics, dinners, weekend schools, concerts, cultural events, social gatherings, etc.) of the observed communities, whose members, to a large extent, belong to various religious lines of the Russian and Protestant churches. It was found that the majority of immigrants had experienced some discrimination in matters of hiring, employment and recognition of educational qualifications from their home countries, and simply felt a certain dislike from society in various everyday situations. Many noted completely absent or very limited state assistance in terms of employment, training, education, and housing. For instance, the Australian Government Department of Human Services not only fails to stimulate job search but, on the contrary, encourages refusing short-term work and employment. On the other hand, the free courses offered on adaptation and the English language proved ineffective and unpopular among immigrants. Many interviewees reported overstated requirements for English proficiency and local work experience, even where these were not critical for the given task or job. Based on the results of long-term observation, the researcher also ventures to assert the negative and decelerating role of immigrants’ communities, particularly religious communities, in the processes of integration and assimilation. The findings suggest that governments should either toughen current immigration policies or take a more proactive and responsible role in dealing with immigrant-related issues, for instance by increasing assistance and support to all immigrants and by paying more attention to, and taking a stake in, managing and organizing the life of immigrants’ communities rather than simply leaving it all to chance.
Keywords: Australia, immigration, integration, perceptions
Procedia PDF Downloads 220
158 The Effectiveness of Therapeutic Exercise on Motor Skills and Attention of Male Students with Autism Spectrum Disorder
Authors: Masoume Pourmohamadreza-Tajrishi, Parviz Azadfallah
Abstract:
Autism spectrum disorders (ASD) involve myriad aberrant perceptual, cognitive, linguistic, and social behaviors. The term 'spectrum' emphasizes that the disabilities associated with ASD fall on a continuum from relatively mild to severe. People with ASD may display stereotyped behaviors such as twirling, spinning objects, flapping the hands, and rocking, and may exhibit communication problems due to repetitive/restricted behaviors. Children with ASD who lack the motivation to learn, who do not enjoy physical challenges, or whose sensory perception results in confusing or unpleasant feedback from movement may not become sufficiently motivated to practice motor activities. As a result, they may show a delay in developing certain motor skills. Additionally, attention is an important component of learning. Since children with ASD have problems with joint attention, education-based programs need to address aspects of attention and motor development for students with ASD. These programs focus on the basic movement skills that are crucial for the future development of the more complex skills needed in games, dance, sports, gymnastics, active play, and recreational physical activities. The purpose of the present research was to determine the effectiveness of therapeutic exercise on the motor skills and attention of male students with ASD. This was an experimental study with a control group. The population consisted of 8-10-year-old male students with ASD, and 30 subjects were selected randomly from an available center for children with ASD. They were evaluated with the Basic Motor Ability Test (BMAT) and the Persian version of the computerized Stroop color-word test, and randomly assigned to an experimental and a control group (15 students per group).
The experimental group participated in 16 therapeutic exercise sessions (twice a week, each lasting 45 minutes) designed based on the SPARK motor program, while the control group did not. All subjects were evaluated again with the BMAT and the Stroop color-word test after the last session. The collected data were analyzed using multivariate analysis of covariance (MANCOVA). The results showed that the experimental and control groups differed significantly in motor skills and in at least one of the components of attention (correct responses, incorrect responses, no responses, reaction time for congruent words and reaction time for incongruent words in the Stroop test). The findings showed that the therapeutic exercise had a significant effect on motor skills and all components of attention in students with ASD. We can conclude that the therapeutic exercise promoted the motor skills and attention of students with ASD, so it is necessary to design such programs for students with ASD to prevent communication and academic problems.
Keywords: attention, autism spectrum disorder, motor skills, therapeutic exercise
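The Stroop components analysed above can be scored from raw trial records roughly as follows (a sketch with invented trial data, not the study's scoring software):

```python
# Sketch of scoring the computerized Stroop color-word test into the
# components analysed above. Each trial record is (condition, response
# correct?, reaction time in ms); None marks a missed (no-response) trial.
from statistics import mean

trials = [  # invented example data
    ("congruent", True, 620), ("congruent", True, 650),
    ("incongruent", False, 810), ("incongruent", True, 780),
    ("congruent", None, None),  # no response within the time limit
]

def score_stroop(trials):
    correct = sum(1 for _, ok, _ in trials if ok is True)
    incorrect = sum(1 for _, ok, _ in trials if ok is False)
    missed = sum(1 for _, ok, _ in trials if ok is None)
    # Reaction times are averaged over correct trials of each condition.
    rt = lambda cond: mean(t for c, ok, t in trials if c == cond and ok is True)
    return {
        "correct": correct,
        "incorrect": incorrect,
        "no_response": missed,
        "rt_congruent": rt("congruent"),
        "rt_incongruent": rt("incongruent"),
    }

print(score_stroop(trials))
```

Per-subject scores of this form, taken before and after the intervention, are what feed into the MANCOVA comparing the two groups.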
Procedia PDF Downloads 130
157 Parents as a Determinant for Students' Attitudes and Intentions toward Higher Education
Authors: Anna Öqvist, Malin Malmström
Abstract:
Attaining a higher level of education has become an increasingly important prerequisite for people’s economic and social independence and mobility. Young people who do not pursue higher education are less attractive as potential employees in the modern work environment. Although completing a higher education degree is no guarantee of getting a job, it substantially increases the chances of employment and, consequently, the chances of a better life. Despite this, in several regions of Sweden fewer students are choosing to engage in higher education. Similar trends have been noted in, for instance, the US, where high dropout rates among young people have been observed. This is a threat to future employment and industry development in these regions, because society’s future employment base depends on students’ willingness to invest in higher education. Much prior research has focused on the role of parents’ involvement in their children’s schoolwork and the positive influence this involvement has on children’s school performance. Parental influence on education in general has been a topic of interest among those concerned with optimal developmental and educational outcomes for children and youth in pre-, secondary and high school. Across a range of studies, a strong conclusion has emerged that parental influence on children’s and youths’ education generally benefits their learning and school success. Arguably, then, we could expect parents’ influence on whether or not to pursue higher education to be important for understanding young people’s choice to engage in it. Accordingly, understanding what drives students’ intentions to pursue higher education is an essential component of motivating students to make the most of their potential in their future work life.
Drawing on the theory of planned behavior, this study examines the role of parents’ influence on students’ attitudes about whether higher education can be beneficial to their future work life. We used a qualitative approach, collecting interview data from 18 high school students in Sweden to capture the cognitive and motivational mechanisms (attitudes) that influence intentions to engage in higher education. We found that parents may positively or negatively influence students’ attitudes and, subsequently, a student’s intention to pursue higher education. Accordingly, our results show that parents’ own attitudes toward and expectations of their children are key to influencing students’ attitudes and intentions regarding higher education. Further, our findings illuminate the mechanisms that drive students in one direction or the other. In particular, they show that the same categories of arguments are used to drive students’ attitudes and intentions in two opposite directions, namely financial arguments and work-life benefit arguments. Our results contribute to the existing literature by showing that parents do affect young people’s intentions to engage in higher studies. The findings contribute to the theory of planned behavior, have implications for the literature on higher education and educational psychology, and provide guidance on how to inform students in school about the facts of higher studies.
Keywords: higher studies, intentions, parents’ influence, theory of planned behavior
Procedia PDF Downloads 257
156 Sensor and Sensor System Design, Selection and Data Fusion Using Non-Deterministic Multi-Attribute Tradespace Exploration
Authors: Matthew Yeager, Christopher Willy, John Bischoff
Abstract:
The conceptualization and design phases of a system lifecycle consume a significant amount of the lifecycle budget in the form of direct tasking and capital, as well as the implicit costs associated with unforeseeable design errors that are only realized during downstream phases. Ad hoc or iterative approaches to generating system requirements often fail to consider the full array of feasible system or product designs for a variety of reasons, including, but not limited to: initial conceptualization that incorporates a priori or legacy features; the inability to capture, communicate and accommodate stakeholder preferences; inadequate technical designs and/or feasibility studies; and locally, but not globally, optimized subsystems and components. These design pitfalls can beget unanticipated developmental or system alterations with added costs, risks and support activities, heightening the risk of suboptimal system performance, premature obsolescence or forgone development. Supported by rapid advances in learning algorithms and hardware technology, sensors and sensor systems have become commonplace in both commercial and industrial products. The evolving array of hardware components (i.e. sensors, CPUs, modular/auxiliary access, etc.) as well as recognition, data fusion and communication protocols have all become increasingly complex and critical for design engineers during both conceptualization and implementation. This work seeks to develop and utilize a non-deterministic approach to sensor system design within the multi-attribute tradespace exploration (MATE) paradigm, a technique that incorporates decision theory into model-based techniques in order to explore complex design environments and discover better system designs.
Developed to address the inherent design constraints in complex aerospace systems, MATE techniques enable project engineers to examine all viable system designs, assess attribute utility and system performance, and better align with stakeholder requirements. Whereas previous work has focused on aerospace systems and been conducted in a deterministic fashion, this study addresses a wider array of system design elements by incorporating both traditional tradespace elements (e.g. hardware components) and popular multi-sensor data fusion models and techniques. Furthermore, adding statistical performance features to this model-based MATE approach will enable non-deterministic techniques for various commercial systems that range in application, complexity and system behavior, demonstrating significant utility within the realm of formal systems decision-making.
Keywords: multi-attribute tradespace exploration, data fusion, sensors, systems engineering, system design
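As a toy illustration of the tradespace idea (not the authors' model), a sketch that enumerates a tiny sensor-system tradespace, scores a weighted multi-attribute utility under assumed stakeholder weights, and keeps the cost-utility Pareto set:

```python
# Toy tradespace exploration for a sensor system: enumerate design
# combinations, compute a weighted multi-attribute utility, and keep the
# Pareto-efficient (utility vs. cost) designs. All options, attribute
# values and weights below are invented for illustration.
from itertools import product

sensors = {"lidar": {"accuracy": 0.9, "cost": 5.0},
           "radar": {"accuracy": 0.7, "cost": 2.0}}
cpus = {"fast": {"latency": 0.9, "cost": 3.0},
        "slow": {"latency": 0.5, "cost": 1.0}}
weights = {"accuracy": 0.6, "latency": 0.4}  # assumed stakeholder preferences

def utility(sensor, cpu):
    return (weights["accuracy"] * sensors[sensor]["accuracy"]
            + weights["latency"] * cpus[cpu]["latency"])

designs = [(s, c, utility(s, c), sensors[s]["cost"] + cpus[c]["cost"])
           for s, c in product(sensors, cpus)]

def pareto(designs):
    # A design is kept if no other design has strictly higher utility
    # at equal or lower cost.
    return [d for d in designs
            if not any(o[2] > d[2] and o[3] <= d[3] for o in designs)]

for s, c, u, cost in sorted(pareto(designs), key=lambda d: d[3]):
    print(f"{s}+{c}: utility={u:.2f}, cost={cost:.1f}")
```

The non-deterministic extension the abstract proposes would replace the fixed attribute values with distributions and repeat this evaluation over sampled scenarios.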
Procedia PDF Downloads 183
155 Constructing and Circulating Knowledge in Continuous Education: A Study of Norwegian Educational-Psychological Counsellors' Reflection Logs in Post-Graduate Education
Authors: Moen Torill, Rismark Marit, Astrid M. Solvberg
Abstract:
In Norway, every municipality must provide an educational-psychological service (EPS) to support kindergartens and schools in their work with children and youths with special needs. The EPS focuses its work on individuals, aiming to identify special needs and to give advice to teachers and parents when they ask for it. In addition, the service also gives priority to prevention and system intervention in kindergartens and schools. To master these large tasks, university courses have been established to support EPS counsellors' continuous learning. There is, however, a need for more in-depth and systematic knowledge of how counsellors experience the courses they attend. In this study, EPS counsellors’ reflection logs during a particular course are investigated. The research question is: what are the content and priorities of the reflections communicated in the logs produced by the educational-psychological counsellors during a post-graduate course? The investigated course is a credit course organized over a one-year period in two one-semester modules. The 55 students enrolled in the course all work as EPS counsellors in various municipalities across Norway. At the end of each day throughout the course period, the participants wrote reflection logs about what they had experienced during the day. The data material consists of 165 pages of typed text. The collaborating researchers studied the data material to ascertain, differentiate and understand the meaning of the content in each log. The analysis also involved the search for similarities in content and the development of analytical categories describing the focus and primary concerns of each written log. This involved constant 'critical and sustained discussions' for the mutual construction of meaning between the co-researchers as the categories developed. The process is inspired by Grounded Theory.
This means that the concepts developed during the analysis were derived from the data material rather than chosen prior to the investigation. The analysis revealed that the concept 'Useful' appeared frequently in the participants’ reflections and, as such, serves as a core category. The core category is described through three major categories: (1) knowledge sharing with colleagues (concerning direct and indirect work with students with special needs) is useful; (2) reflection on models and theoretical concepts (concerning students with special needs) is useful; (3) reflection on the role of the EPS counsellor is useful. In all the categories, the notion of usefulness occurs in the participants’ emphasis on and acknowledgement of the immediate and direct link between the university course content and their daily work practice. Even if each category has an importance and value of its own, it is crucial that they are understood in connection with one another and as interwoven. It is this connectedness that gives the core category its overarching explanatory power. The knowledge from this study may be a relevant contribution to designing new courses that support continuing professional development for EPS counsellors, whether post-graduate university courses or local courses at EPS offices, in Norway or in other countries.
Keywords: constructing and circulating knowledge, educational-psychological counsellor, higher education, professional development
Procedia PDF Downloads 115
154 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory
Authors: Xiaochen Mu
Abstract:
Data rights confirmation, a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic model continues the traditional "one object, one right" theory, while the process model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. The design of the data property rights system has a hierarchical character, aiming to decouple raw data from data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for the different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with its mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results.
However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of rights holders with the prosperity and long-term development of society's public learning and of the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This aligns well with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, granting data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms, and it is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the "bundle of rights" theory, it establishes specific three-level data rights. The paper analyzes the cases Google v. Vidal-Hall, Halliday v. Creation Consumer Finance, Douglas v. Hello! Ltd, Campbell v. MGN and Imerman v. Tchenguiz. It concludes that recognizing property rights over personal data and protecting data under the framework of intellectual property would be beneficial for establishing the tort of misuse of personal information.
Keywords: data protection, property rights, intellectual property, big data
Procedia PDF Downloads 39
153 Design of a Small and Medium Enterprise Growth Prediction Model Based on Web Mining
Authors: Yiea Funk Te, Daniel Mueller, Irena Pletikosa Cvijikj
Abstract:
Small and medium enterprises (SMEs) play an important role in the economies of many countries. Considering the overall world economy, SMEs represent 95% of all businesses in the world, accounting for 66% of total employment. Existing studies show that the current business environment is highly turbulent and strongly influenced by modern information and communication technologies, forcing SMEs to face more severe challenges in maintaining their existence and expanding their business. To support SMEs in improving their competitiveness, researchers have recently turned their focus to applying data mining techniques to build risk and growth prediction models. However, the data used to assess risk and growth indicators is primarily obtained via questionnaires, which is laborious and time-consuming, or is provided by financial institutions and thus highly sensitive to privacy issues. Recently, web mining (WM) has emerged as a new approach to obtaining valuable insights into the business world. WM enables automatic and large-scale collection and analysis of potentially valuable data from various online platforms, including companies’ websites. While WM methods have frequently been studied for anticipating the growth of sales volume on e-commerce platforms, their application to the assessment of SME risk and growth indicators is still scarce. Considering that a vast proportion of SMEs own a website, WM holds great potential for revealing valuable information hidden in SME websites, which can further be used to understand SME risk and growth indicators, as well as to enhance current SME risk and growth prediction models. This study aims to develop an automated system that collects business-relevant data from the Web and predicts future growth trends of SMEs by means of WM and data mining techniques. The envisioned system should serve as an 'early recognition system' for future growth opportunities.
In an initial step, we examine how structured and semi-structured Web data from governmental or SME websites can be used to explain the success of SMEs. WM methods are applied to extract Web data in the form of additional input features for the growth prediction model. Data on SMEs provided by a large Swiss insurance company is used as ground truth (i.e., growth-labeled data) to train the growth prediction model. Different machine learning classification algorithms, such as the Support Vector Machine, Random Forest and Artificial Neural Network, are applied and compared, with the goal of optimizing prediction performance. The results are compared to those from previous studies in order to assess the contribution of the Web-derived growth indicators to the predictive power of the model.
Keywords: data mining, SME growth, success factors, web mining
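As an illustration of the kind of website feature extraction this approach relies on, the sketch below derives a few simple features from a page's HTML; the feature names (word_count, link_count, has_contact_page) are hypothetical examples, not the features used in the study.

```python
# Minimal sketch of SME website feature extraction for a growth model.
# Feature names here are illustrative assumptions, not the study's features.
from html.parser import HTMLParser

class SiteFeatureExtractor(HTMLParser):
    """Collects simple text and link statistics from one HTML page."""
    def __init__(self):
        super().__init__()
        self.words = 0
        self.links = []

    def handle_data(self, data):
        self.words += len(data.split())

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href" and v)

def extract_features(html: str) -> dict:
    parser = SiteFeatureExtractor()
    parser.feed(html)
    return {
        "word_count": parser.words,
        "link_count": len(parser.links),
        "has_contact_page": any("contact" in h.lower() for h in parser.links),
    }

page = ("<html><body><p>We ship machine parts worldwide.</p>"
        "<a href='/contact'>Contact us</a></body></html>")
print(extract_features(page))
```

In a full pipeline, feature dictionaries like this one would be assembled per company and joined with the insurer's growth labels before training.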
Procedia PDF Downloads 267
152 Multicultural Education in the National Context: A Study of Peoples' Friendship University of Russia
Authors: Maria V. Mishatkina
Abstract:
The modelling of a dialogical environment is an essential feature of modern education. The dialogue of cultures is a foundation and an important prerequisite for the formation of a person's main moral qualities, such as the ability to understand another person, which is manifested in values such as tolerance, respect, mutual assistance and mercy. The formation of a modern expert occurs in an educational environment that is significantly different from what we had several years ago. Nowadays, university education has qualitatively new characteristics. They may be observed at Peoples’ Friendship University of Russia (RUDN University), a top Russian higher education institution which unites representatives of more than 150 countries. The content of its educational strategies is not an adapted cultural experience but material at the interface of science and innovation. Moreover, RUDN University’s profiles and specializations do not map directly onto professional structures. Students learn not a profession in the strict sense but the basic scientific foundation of activity in different socio-cultural areas (science, business and education). RUDN University also provides a considerable set of professional education components: foreign language skills; economic, political, ethnic, communication and computer culture; information theory; and basic management skills. Moreover, there is a rich social life (festive multicultural events, theme parties, journeys) and prospects concerning the inclusive approach to education (for example, a special course ‘Social Pedagogy: Issues of Tolerance’). In our research, we use such methods as analysis of modern and contemporary scientific literature, opinion polls (involving students, teachers and research workers) and comparative data analysis.
We came to the conclusion that knowledge transfer to RUDN students happens through the setting of goals, problems, issues, tasks and situations that simulate the ambiguous, innovative environments of their future work, preparing them for a dialogical way of life. However, all these factors may not take effect if there is no ‘personal inspiration’ of students by communicative and dialogic values, or no participation in the system of meanings and tools of learning activity represented by cooperation within the framework of a dialogue of scientific and pedagogical schools. We also found that the dominant strategies for ensuring the quality of education are those that put students in the position of subjects of their own education. Today these strategies should involve such methods as task-based, contextual, modelling, specialized, game-imitation and dialogical approaches, the method of practical situations, etc. Therefore, a university in the modern sense is not only an educational institution but also a generator of innovation, cooperation among nations and cultural progress. RUDN University has been performing exactly this mission for many decades.
Keywords: dialogical developing situation, dialogue of cultures, readiness for dialogue, university graduate
Procedia PDF Downloads 220
151 3D Classification Optimization of Low-Density Airborne Light Detection and Ranging Point Cloud by Parameters Selection
Authors: Baha Eddine Aissou, Aichouche Belhadj Aissa
Abstract:
Light detection and ranging (LiDAR) is an active remote sensing technology used for several applications. Airborne LiDAR is becoming an important technology for the acquisition of highly accurate, dense point clouds. Classification of an airborne laser scanning (ALS) point cloud is a very important task that still remains a real challenge for many scientists. The support vector machine (SVM) is one of the most used statistical learning algorithms based on kernels. SVM is a non-parametric method, and it is recommended in cases where the data distribution cannot be well modeled by a standard parametric probability density function. Using a kernel, it performs robust non-linear classification of samples. In practice, the data are rarely linearly separable. Through the kernel, SVMs implicitly map the data into a higher-dimensional space in which they become linearly separable, while performing all computations in the original space. This is one of the main reasons that SVMs are well suited to high-dimensional classification problems. Only a few training samples, called support vectors, are required. SVM has also shown its potential to cope with uncertainty in data caused by noise and fluctuation, and it is computationally efficient compared to several other methods. Such properties are particularly suited to remote sensing classification problems and explain their recent adoption. In this poster, SVM classification of ALS LiDAR data is proposed. Firstly, connected component analysis is applied to cluster the point cloud. Secondly, the resulting clusters are fed into the SVM classifier. The radial basis function (RBF) kernel is used because only two parameters (C and γ) need to be chosen, which decreases computation time. In order to optimize the classification rates, parameter selection is explored: it consists of finding the parameters (C and γ) leading to the best overall accuracy, using grid search and 5-fold cross-validation.
The exploited LiDAR point cloud is provided by the German Society for Photogrammetry, Remote Sensing, and Geoinformation. The ALS data used have a low density (4-6 points/m²) and cover an urban area located in residential parts of the city of Vaihingen in southern Germany. The ground class and three classes belonging to roof superstructures are considered, i.e., a total of 4 classes. The training and test sets were selected randomly several times. The obtained results demonstrated that parameter selection can narrow the search to a restricted interval of (C, γ) that can be explored further, but does not systematically lead to the optimal rates. The SVM classifier with tuned hyper-parameters is compared with the classifiers most used in the LiDAR literature: random forest, AdaBoost, and decision tree. The comparison showed the superiority of the SVM classifier with parameter selection for LiDAR data.
Keywords: classification, airborne LiDAR, parameters selection, support vector machine
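The parameter-selection step described above can be sketched as follows, assuming scikit-learn is available; the parameter grid and the synthetic stand-in data are illustrative, not the study's actual settings.

```python
# Sketch of (C, gamma) selection for an RBF-kernel SVM via grid search
# and 5-fold cross-validation; grid values and data are illustrative.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic stand-in for per-point LiDAR features (height, intensity, ...).
X, y = make_classification(n_samples=200, n_features=5, n_classes=2,
                           random_state=0)

param_grid = {"C": [0.1, 1, 10, 100], "gamma": [0.001, 0.01, 0.1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5,
                      scoring="accuracy")
search.fit(X, y)

print("best (C, gamma):", search.best_params_)
print("cross-validated accuracy:", round(search.best_score_, 3))
```

The same scaffolding applies to the 4-class ALS problem: only the feature matrix, labels, and grid bounds change.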
Procedia PDF Downloads 147
150 An Evidence-Based Laboratory Medicine (EBLM) Test to Help Doctors in the Assessment of the Pancreatic Endocrine Function
Authors: Sergio J. Calleja, Adria Roca, José D. Santotoribio
Abstract:
Pancreatic endocrine diseases include pathologies such as insulin resistance (IR), prediabetes, and type 2 diabetes mellitus (DM2). Some of them are highly prevalent in the U.S. (40% of U.S. adults have IR, 38% have prediabetes, and 12% have DM2), as reported by the National Center for Biotechnology Information (NCBI). Building upon this imperative, the objective of the present study was to develop a non-invasive test for the assessment of the patient’s pancreatic endocrine function and to evaluate its accuracy in detecting pancreatic endocrine diseases such as IR, prediabetes, and DM2. This approach to a routine blood and urine test is based on serum and urine biomarkers. It combines several independent public algorithms, such as the Adult Treatment Panel III (ATP-III), the triglycerides and glucose (TyG) index, the homeostasis model assessment of insulin resistance (HOMA-IR), HOMA-2, and the quantitative insulin-sensitivity check index (QUICKI). Additionally, it incorporates essential measurements such as creatinine clearance, estimated glomerular filtration rate (eGFR), urine albumin-to-creatinine ratio (ACR), and urinalysis, which help to form a full picture of the patient’s pancreatic endocrine disease. To evaluate the estimated accuracy of this test, an iterative process was performed by a machine learning (ML) algorithm with a training set of 9,391 patients. The sensitivity achieved was 97.98% and the specificity was 99.13%. Consequently, the area under the receiver operating characteristic (AUROC) curve, the positive predictive value (PPV), and the negative predictive value (NPV) were 92.48%, 99.12%, and 98.00%, respectively. The algorithm was validated in a randomized controlled trial (RCT) with a target sample size (n) of 314 patients.
However, 50 patients were initially excluded from the study because they had ongoing clinically diagnosed pathologies, symptoms or signs, so n dropped to 264 patients. Then, 110 patients were excluded because they did not show up at the clinical facility for any of the follow-up visits (a critical point to improve in the upcoming RCT, since the cost per patient is very high and almost a third of the patients already tested were lost), so the new n consisted of 154 patients. After that, 2 patients were excluded because some of their laboratory parameters and/or clinical information were erroneous. Thus, a final n of 152 patients was reached. In this validation set, the results obtained were: 100.00% sensitivity, 100.00% specificity, 100.00% AUROC, 100.00% PPV, and 100.00% NPV. These results suggest that this approach to a routine blood and urine test holds promise for providing timely and accurate diagnoses of pancreatic endocrine diseases, particularly among individuals aged 40 and above. Given the current epidemiological state of these diseases, the findings underscore the significance of early detection. Furthermore, they advocate for further exploration, prompting the intention to conduct a clinical trial involving 26,000 participants (from March 2025 to December 2026).
Keywords: algorithm, diabetes, laboratory medicine, non-invasive
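The serum-based indices combined in the test follow standard published formulas; a minimal sketch of three of them is given below, assuming the usual units (glucose and triglycerides in mg/dL, insulin in µU/mL). The example values are illustrative, not patient data.

```python
import math

# Standard published insulin-resistance indices (illustrative sketch).
# Units assumed: glucose and triglycerides in mg/dL, insulin in uU/mL.
def homa_ir(glucose, insulin):
    """HOMA-IR = glucose * insulin / 405 (mass units)."""
    return glucose * insulin / 405.0

def quicki(glucose, insulin):
    """QUICKI = 1 / (log10(insulin) + log10(glucose))."""
    return 1.0 / (math.log10(glucose) + math.log10(insulin))

def tyg_index(triglycerides, glucose):
    """TyG = ln(triglycerides * glucose / 2)."""
    return math.log(triglycerides * glucose / 2.0)

# Example: fasting glucose 100 mg/dL, insulin 10 uU/mL, TG 150 mg/dL.
print(round(homa_ir(100, 10), 2))    # 2.47
print(round(quicki(100, 10), 3))     # 0.333
print(round(tyg_index(150, 100), 2)) # 8.92
```

Diagnostic cutoffs for these indices vary across populations and assays, which is one reason the test combines several of them rather than relying on a single threshold.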
Procedia PDF Downloads 33
149 Data Science/Artificial Intelligence: A Possible Panacea for Refugee Crisis
Authors: Avi Shrivastava
Abstract:
In 2021, two heart-wrenching scenes, shown live on television screens across countries, painted a grim picture of refugees. One was of people clinging to an airplane's wings in a desperate attempt to flee war-torn Afghanistan; they ultimately fell to their deaths. The other was of U.S. government authorities separating children from their parents or guardians to deter migrants and refugees from coming to the U.S. These events show the desperation refugees feel when trying to leave their homes in disaster zones. The data, however, paint a grave picture of the current refugee situation and indicate that a bleak future lies ahead for refugees across the globe. Data and information are the two threads that intertwine to weave the shimmery fabric of modern society. They are often used interchangeably but differ considerably: analysis of information reveals rationale and logic, while analysis of data reveals patterns. Patterns revealed by data can enable us to create the tools needed to combat the huge problems at hand. Data analysis paints a clear picture, so the decision-making process becomes simpler. Geopolitical and economic data can be used to predict future refugee hotspots, and accurately predicting the next hotspots will allow governments and relief agencies to prepare better for future refugee crises. The refugee crisis does not have binary answers. Given the emotionally wrenching nature of the ground realities, experts often shy away from stating things as they are, and this hesitancy can cost lives. When decisions are based solely on data, emotions can be removed from the decision-making process. Data also presents irrefutable evidence of whether a solution exists, responding to a non-binary crisis with a binary answer. Because of all that, the problem becomes easier to tackle. Data science and A.I.
can predict future refugee crises. With the recent explosion of data due to the rise of social media platforms, data, and the insight it provides, have helped solve many social and political problems. Data science can also help address many issues refugees face while staying in refugee camps or in their adopted countries. This paper looks into various ways data science can help solve refugee problems. A.I.-based chatbots can help refugees seek legal assistance to find asylum in the country they want to settle in, and can point them to marketplaces where they can find people willing to help. Data science and technology can also help with many of the problems refugees face, including food, shelter, employment, security, and assimilation. The refugee problem is among the most challenging for social and political reasons. Data science and machine learning can help prevent refugee crises and solve or alleviate some of the problems refugees face on their journey to a better life. With the explosion of data in the last decade, data science has made it possible to address many geopolitical and social issues.
Keywords: refugee crisis, artificial intelligence, data science, refugee camps, Afghanistan, Ukraine
Procedia PDF Downloads 73
148 Developing Computational Thinking in Early Childhood Education
Authors: Kalliopi Kanaki, Michael Kalogiannakis
Abstract:
Nowadays, in the digital era, the early acquisition of basic programming skills and knowledge is encouraged, as it facilitates students’ exposure to computational thinking and empowers their creativity, problem-solving skills, and cognitive development. More and more researchers and educators are investigating the introduction of computational thinking in K-12, since it is expected to be a fundamental skill for everyone by the middle of the 21st century, just as reading, writing and arithmetic are at the moment. In this paper, doctoral research in progress is presented, which investigates the infusion of computational thinking into the science curriculum in early childhood education. The whole attempt aims to develop young children’s computational thinking by introducing them to the fundamental concepts of object-oriented programming in an enjoyable, yet educational, framework. The backbone of the research is the digital environment PhysGramming (an abbreviation of Physical Science Programming), which provides children with the opportunity to create their own digital games, turning them from passive consumers into active creators of technology. PhysGramming deploys an innovative hybrid schema of visual and text-based programming techniques, with an emphasis on object orientation. Through PhysGramming, young students are familiarized with basic object-oriented programming concepts, such as classes, objects, and attributes, while at the same time getting a view of object-oriented programming syntax. Nevertheless, the most noteworthy feature of PhysGramming is that children create their own digital games within the context of physical science courses, in a way that provides familiarization with the basic principles of object-oriented programming and computational thinking, even though no explicit reference is made to these principles. Attuned to the ethical guidelines of educational research, interventions were conducted in two second-grade classes.
The interventions were designed with respect to the thematic units of the physical science curriculum, as part of the learning activities of the class. PhysGramming was integrated into the classroom after short introductory sessions. During the interventions, 6-7 year old children worked in pairs on computers and created their own digital games (group games, matching games, and puzzles). The authors participated in these interventions as observers in order to achieve a realistic evaluation of the proposed educational framework concerning its applicability in the classroom and its educational and pedagogical perspectives. To better examine whether the objectives of the research were met, the investigation focused on six criteria: the educational value of PhysGramming, its engaging and enjoyable characteristics, its child-friendliness, its appropriateness for the purpose proposed, its ability to monitor the user’s progress, and its individualizing features. In this paper, the functionality of PhysGramming and the philosophy of its integration in the classroom are both described in detail. Information about the implemented interventions and the results obtained is also provided. Finally, several limitations of the research that deserve attention are noted.
Keywords: computational thinking, early childhood education, object-oriented programming, physical science courses
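The class, object, and attribute concepts that PhysGramming introduces can be illustrated with a minimal object-oriented sketch; the GameCharacter class below is a hypothetical example in the spirit of the children's games, not part of PhysGramming itself.

```python
# Minimal illustration of the class / object / attribute concepts
# discussed above; this class is a hypothetical example.
class GameCharacter:
    """A class is a blueprint; its attributes describe each object."""
    def __init__(self, name, speed):
        self.name = name    # attribute
        self.speed = speed  # attribute

    def faster_than(self, other):
        """Objects can be compared via their attributes."""
        return self.speed > other.speed

# Objects are concrete instances created from the class blueprint.
rabbit = GameCharacter("rabbit", speed=9)
turtle = GameCharacter("turtle", speed=2)
print(rabbit.faster_than(turtle))  # True
```

In PhysGramming the same ideas appear through game elements the children create, without the textual syntax being foregrounded.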
Procedia PDF Downloads 120
147 Cultural Dynamics in Online Consumer Behavior: Exploring Cross-Country Variances in Review Influence
Authors: Eunjung Lee
Abstract:
This research investigates the connection between cultural differences and online consumer behaviors by integrating Hofstede's Cultural Dimensions theory with analysis methodologies such as text mining, data mining, and topic analysis. Our aim is to provide a comprehensive understanding of how national cultural differences influence individuals' behaviors when engaging with online reviews. To ensure the relevance of our investigation, we systematically analyze and interpret the cultural nuances influencing online consumer behaviors, especially in the context of online reviews. By anchoring our research in Hofstede's Cultural Dimensions theory, we seek to offer valuable insights for marketers to tailor their strategies to the cultural preferences of diverse global consumer bases. In our methodology, we employ advanced text mining techniques to extract insights from a diverse range of online reviews gathered globally for a specific product or service, such as Netflix. This approach allows us to reveal hidden cultural cues in the language used by consumers from various backgrounds. Complementing text mining, data mining techniques are applied to extract meaningful patterns from online review datasets collected from different countries, aiming to unveil underlying structures and gain a deeper understanding of the impact of cultural differences on online consumer behaviors. The study also integrates topic analysis to identify recurring subjects, sentiments, and opinions within online reviews. Marketers can leverage these insights to develop culturally sensitive strategies, enhance target-audience segmentation, and refine messaging approaches aligned with cultural preferences. Applied to specific cultural dimensions, such as individualism vs.
collectivism, masculinity vs. femininity, uncertainty avoidance, and long-term vs. short-term orientation, the study uncovers nuanced insights. For example, in exploring individualism vs. collectivism, we examine how reviewers from individualistic cultures prioritize personal experiences while those from collectivistic cultures emphasize communal opinions. Similarly, within masculinity vs. femininity, we investigate whether distinct topics align with cultural notions, such as robust features in masculine cultures and user-friendliness in feminine cultures. Examining information-seeking behaviors under uncertainty avoidance reveals how cultures differ in seeking detailed information or providing succinct reviews, depending on their comfort with ambiguity. Additionally, in assessing long-term vs. short-term orientation, the research explores how a cultural focus on enduring benefits or immediate gratification influences reviews. These concrete examples contribute to the theoretical enhancement of Hofstede's Cultural Dimensions theory, providing a detailed understanding of cultural impacts on online consumer behaviors. As online reviews become increasingly crucial in decision-making, this research not only contributes to the academic understanding of cultural influences but also proposes practical recommendations for enhancing online review systems. Marketers can leverage these findings to design targeted and culturally relevant strategies, ultimately enhancing their global marketing effectiveness and optimizing online review systems for maximum impact.
Keywords: comparative analysis, cultural dimensions, marketing intelligence, national culture, online consumer behavior, text mining
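A minimal sketch of the kind of cross-country lexical analysis described above; the sample reviews and the individualist/collectivist marker word lists are illustrative assumptions, not data or lexicons from the study.

```python
from collections import Counter

# Toy reviews grouped by country; texts and marker lists are
# illustrative assumptions, not data from the study.
reviews = {
    "US": ["I loved this show, my favorite season yet",
           "I think the plot was weak"],
    "KR": ["Everyone in our family enjoyed watching together",
           "We all agreed the ending was moving"],
}

# Crude lexical markers of individualist vs. collectivist framing.
INDIVIDUAL = {"i", "my", "me"}
COLLECTIVE = {"we", "our", "everyone", "family", "together"}

def framing_counts(texts):
    """Count marker words across a set of reviews."""
    words = Counter(w.strip(".,").lower() for t in texts for w in t.split())
    return {"individual": sum(words[w] for w in INDIVIDUAL),
            "collective": sum(words[w] for w in COLLECTIVE)}

for country, texts in reviews.items():
    print(country, framing_counts(texts))
```

Real analyses would replace the hand-picked marker sets with topic models or validated lexicons, but the comparison logic per country is the same.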
Procedia PDF Downloads 47
146 Artificial Intelligence Models for Detecting Spatiotemporal Crop Water Stress in Automating Irrigation Scheduling: A Review
Authors: Elham Koohi, Silvio Jose Gumiere, Hossein Bonakdari, Saeid Homayouni
Abstract:
Water use by agricultural crops can be managed by irrigation scheduling based on soil moisture levels and plant water stress thresholds. Automated irrigation scheduling limits crop physiological damage and yield reduction. Knowledge of crop water stress monitoring approaches can be effective in optimizing the use of agricultural water. Understanding the physiological mechanisms by which crops respond and adapt to water deficit ensures sustainable agricultural management and food supply. This aim can be achieved by analyzing and diagnosing crop characteristics and their interlinkage with the surrounding environment, through assessments of plant functional types (e.g., leaf area and structure, tree height, rate of evapotranspiration, rate of photosynthesis), change monitoring, and mapping of irrigated areas. Calculating thresholds for soil water content parameters, crop water use efficiency, and nitrogen status makes irrigation scheduling decisions more accurate by preventing water limitations between irrigations. Combining Remote Sensing (RS), the Internet of Things (IoT), Artificial Intelligence (AI), and Machine Learning Algorithms (MLAs) can improve measurement accuracy and automate irrigation scheduling. This review surveys about 100 recent research studies, analyzing the varied approaches in terms of high spatial- and temporal-resolution mapping, sensor-based Variable Rate Application (VRA) mapping, and the relation between spectral and thermal reflectance and different features of crop and soil. A further objective is to assess RS indices formed by choosing specific reflectance bands, to identify the correct spectral band to optimize classification techniques, and to analyze Proximal Optical Sensors (POSs) for change monitoring.
The innovation of this paper lies in categorizing the evaluation methodologies of precision irrigation (applying the right practice, at the right place, at the right time, with the right quantity), controlled by soil moisture levels and the sensitivity of crops to water stress, into pre-processing, processing (retrieval algorithms), and post-processing parts. The main idea of this research is then to analyze the sources and magnitudes of error reported by recent studies for each of these three parts. Additionally, as an overall conclusion, the review decomposes the different approaches into optimized indices, sensor calibration methods, error-prone thresholding and prediction models, and improvements in classification accuracy for change mapping.
Keywords: agricultural crops, crop water stress detection, irrigation scheduling, precision agriculture, remote sensing
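The soil-moisture-threshold logic at the heart of such irrigation scheduling can be sketched as follows; the refill-point and field-capacity values are illustrative assumptions, not agronomic recommendations.

```python
# Sketch of threshold-based irrigation scheduling; the moisture
# thresholds and readings below are illustrative assumptions.
REFILL_POINT = 0.18    # volumetric soil water content that triggers irrigation
FIELD_CAPACITY = 0.30  # near-capacity level at which irrigation is withheld

def irrigation_decision(soil_moisture):
    """Map a soil moisture reading to a scheduling action."""
    if soil_moisture < REFILL_POINT:
        return "irrigate"
    if soil_moisture >= FIELD_CAPACITY:
        return "hold (at capacity)"
    return "hold"

readings = [0.26, 0.21, 0.17, 0.31]
for m in readings:
    print(m, "->", irrigation_decision(m))
```

The reviewed AI approaches refine exactly these two thresholds, replacing fixed constants with crop- and site-specific values estimated from RS and sensor data.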
Procedia PDF Downloads 71
145 A Smart Sensor Network Approach Using Affordable River Water Level Sensors
Authors: Dian Zhang, Brendan Heery, Maria O’Neill, Ciprian Briciu-Burghina, Noel E. O’Connor, Fiona Regan
Abstract:
Recent developments in sensors, wireless data communication and cloud computing have brought the sensor web to a whole new generation. The introduction of the concept of the ‘Internet of Things (IoT)’ has taken sensor research to a new level, which involves developing long-lasting, low-cost, environmentally friendly and smart sensors; new wireless data communication technologies; big data analytics algorithms; and cloud-based solutions tailored to large-scale smart sensor networks. The next generation of smart sensor network consists of several layers: the physical layer, where the smart sensors reside and data pre-processing occurs, either on the sensor itself or on a field gateway; the data transmission layer, where data and instructions are exchanged; and the data processing layer, where meaningful information is extracted and organized from the pre-processed data stream. There are many definitions of a smart sensor; to summarize them, a smart sensor must be intelligent and adaptable. In future large-scale sensor networks, the collected data are far too large for traditional applications to send, store or process, so the sensor unit must be intelligent enough to pre-process collected data locally on board (this may instead occur on the field gateway, depending on the sensor network structure). In this case study, three smart sensing methods, corresponding to simple thresholding, a statistical model and the machine-learning-based MoPBAS method, are introduced, and their strengths and weaknesses are discussed as an introduction to the smart sensing concept. Data fusion, the integration of data and knowledge from multiple sources, is a key component of the next generation smart sensor network.
For example, in a water level monitoring system, weather forecasts can be obtained from external sources, and if heavy rainfall is expected, the server can send instructions to the sensor nodes to, for instance, increase the sampling rate, or conversely to switch to sleep mode. In this paper, we describe the deployment of 11 affordable water level sensors in the Dublin catchment. The objective of this paper is to use the deployed river level sensor network in the Dodder catchment in Dublin, Ireland as a case study to give a vision of the next generation of smart sensor network for flood monitoring, assisting agencies in making decisions about deploying resources in the case of a severe flood event. Some of the deployed sensors are located alongside traditional water level sensors for validation purposes. Each key component of the smart sensor network is discussed, which we hope will inspire researchers working in the sensor domain.
Keywords: smart sensing, internet of things, water level sensor, flooding
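The simple-thresholding and statistical smart-sensing methods introduced in this case study can be sketched on water-level readings as follows; the threshold, the deviation factor, and the readings are illustrative assumptions.

```python
import statistics

# Sketch of two smart-sensing alert methods on water-level readings:
# a fixed threshold and a statistical (mean + k*std) deviation test.
# The threshold, k, and readings are illustrative assumptions.
def fixed_threshold_alert(level, threshold=2.5):
    """Alert when a reading exceeds an absolute level (metres)."""
    return level > threshold

def statistical_alert(history, level, k=3.0):
    """Alert when a reading deviates k standard deviations from recent history."""
    mean = statistics.mean(history)
    std = statistics.stdev(history)
    return abs(level - mean) > k * std

history = [1.00, 1.02, 0.98, 1.01, 0.99, 1.00]
print(fixed_threshold_alert(2.8))        # True: above the fixed threshold
print(statistical_alert(history, 1.30))  # True: anomalous vs. recent history
print(statistical_alert(history, 1.01))  # False: within normal variation
```

The statistical method catches sudden rises well below any fixed flood threshold, which is why the case study layers several methods rather than relying on one.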
Procedia PDF Downloads 381
144 Current Applications of Artificial Intelligence (AI) in Chest Radiology
Authors: Angelis P. Barlampas
Abstract:
Learning objectives: The purpose of this study is to briefly inform the reader about the applications of AI in chest radiology. Background: Currently, there are 190 FDA-approved radiology AI applications, with 42 (22%) pertaining specifically to thoracic radiology. Applications of AI in chest radiology: Detects and segments pulmonary nodules. Subtracts bone to provide an unobstructed view of the underlying lung parenchyma and provides further information on nodule characteristics, such as nodule location, two-dimensional size or three-dimensional (3D) volume, change in nodule size over time, attenuation data (i.e., mean, minimum, and/or maximum Hounsfield units [HU]), morphological assessments, or combinations of the above. Reclassifies indeterminate pulmonary nodules as low or high risk with higher accuracy than conventional risk models. Detects pleural effusion. Differentiates tension pneumothorax from non-tension pneumothorax. Detects cardiomegaly, calcification, consolidation, mediastinal widening, atelectasis, fibrosis and pneumoperitoneum. Automatically localizes vertebral segments, labels ribs and detects rib fractures. Measures the distance from the tube tip to the carina and localizes both endotracheal tubes and central vascular lines. Detects consolidation and progression of parenchymal diseases such as pulmonary fibrosis or chronic obstructive pulmonary disease (COPD). Can evaluate lobar volumes. Identifies and labels pulmonary bronchi and vasculature and quantifies air-trapping. Offers emphysema evaluation. Provides functional respiratory imaging, whereby high-resolution CT images are post-processed to quantify airflow by lung region and may be used to quantify key biomarkers such as airway resistance, air-trapping, ventilation mapping, lung and lobar volume, and blood vessel and airway volume. Assesses the lung parenchyma by way of density evaluation.
Provides percentages of tissues within defined attenuation (HU) ranges, in addition to furnishing automated lung segmentation and lung volume information. Improves image quality for noisy images with a built-in denoising function. Detects emphysema, a common condition seen in patients with a history of smoking, and hyperdense or opacified regions, thereby aiding in the diagnosis of certain pathologies, such as COVID-19 pneumonia. It aids in cardiac segmentation and calcium detection, aorta segmentation and diameter measurements, and vertebral body segmentation and density measurements. Conclusion: The future is yet to come, but AI is already a helpful tool in daily radiology practice. It is expected that the continuing progress of computerized systems and improvements in software algorithms will render AI the radiologist's second pair of hands.
Keywords: artificial intelligence, chest imaging, nodule detection, automated diagnoses
Procedia PDF Downloads 72
143 Harnessing the Power of Artificial Intelligence: Advancements and Ethical Considerations in Psychological and Behavioral Sciences
Authors: Nayer Mofidtabatabaei
Abstract:
Advancements in artificial intelligence (AI) have transformed various fields, including psychology and the behavioral sciences. This paper explores the diverse ways in which AI is applied to enhance research, diagnosis, therapy, and the understanding of human behavior and mental health. We discuss the potential benefits and challenges associated with AI in these fields, emphasizing the ethical considerations and the need for collaboration between AI researchers and psychological and behavioral science experts. Artificial intelligence has gained prominence in recent years, revolutionizing multiple industries, including healthcare, finance, and entertainment. One area where AI holds significant promise is the field of psychology and behavioral sciences. AI applications in this domain range from improving the accuracy of diagnosis and treatment to understanding complex human behavior patterns. This paper aims to provide an overview of the various AI applications in the psychological and behavioral sciences, highlighting their potential impact, challenges, and ethical considerations. Mental health diagnosis: AI-driven tools, such as natural language processing and sentiment analysis, can analyze large datasets of text and speech to detect signs of mental health issues. For example, chatbots and virtual therapists can provide initial assessments and support to individuals suffering from anxiety or depression. Autism spectrum disorder (ASD) diagnosis: AI algorithms can assist in early ASD diagnosis by analyzing video and audio recordings of children's behavior. These tools help identify subtle behavioral markers, enabling earlier intervention and treatment. Personalized therapy: AI-based therapy platforms use personalized algorithms to adapt therapeutic interventions based on an individual's progress and needs. These platforms can provide continuous support and resources for patients, making therapy more accessible and effective.
Virtual Reality Therapy Virtual reality (VR) combined with AI can create immersive therapeutic environments for treating phobias, PTSD, and social anxiety. AI algorithms can adapt VR scenarios in real-time to suit the patient's progress and comfort level. Data Analysis AI aids researchers in processing vast amounts of data, including survey responses, brain imaging, and genetic information. Privacy Concerns Collecting and analyzing personal data for AI applications in psychology and behavioral sciences raises significant privacy concerns. Researchers must ensure the ethical use and protection of sensitive information. Bias and Fairness AI algorithms can inherit biases present in training data, potentially leading to biased assessments or recommendations. Efforts to mitigate bias and ensure fairness in AI applications are crucial. Transparency and Accountability AI-driven decisions in psychology and behavioral sciences should be transparent and subject to accountability. Patients and practitioners should understand how AI algorithms operate and make decisions. AI applications in psychological and behavioral sciences have the potential to transform the field by enhancing diagnosis, therapy, and research. However, these advancements come with ethical challenges that require careful consideration. Collaboration between AI researchers and psychological and behavioral science experts is essential to harness AI's full potential while upholding ethical standards and privacy protections. The future of AI in psychology and behavioral sciences holds great promise, but it must be navigated with caution and responsibility.
Keywords: artificial intelligence, psychological sciences, behavioral sciences, diagnosis and therapy, ethical considerations
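The text-screening idea sketched in this abstract (scoring language for signs of distress and routing flagged cases to a human) can be illustrated with a toy lexicon-based scorer. Everything here — the lexicon, the weights, and the threshold — is an illustrative assumption, not a clinical instrument; real systems use trained NLP models rather than keyword counts.

```python
# Toy lexicon: illustrative distress-related terms and weights (assumptions).
DISTRESS_LEXICON = {"hopeless": 2, "anxious": 1, "worthless": 2, "tired": 1, "alone": 1}

def distress_score(text: str) -> int:
    """Sum lexicon weights for each distress-related token in the text."""
    tokens = text.lower().replace(",", " ").replace(".", " ").split()
    return sum(DISTRESS_LEXICON.get(tok, 0) for tok in tokens)

def flag_for_followup(text: str, threshold: int = 2) -> bool:
    """Flag a response for human review when its score meets the threshold."""
    return distress_score(text) >= threshold

print(flag_for_followup("I feel hopeless and alone"))   # flagged
print(flag_for_followup("Had a pleasant walk today"))   # not flagged
```

The key design point the abstract raises still applies even at this toy scale: the flag only triages for human follow-up, it never makes a clinical decision on its own.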
142 Modeling Visual Memorability Assessment with Autoencoders Reveals Characteristics of Memorable Images
Authors: Elham Bagheri, Yalda Mohsenzadeh
Abstract:
Image memorability refers to the phenomenon where certain images are more likely to be remembered by humans than others. It is a quantifiable and intrinsic attribute of an image. Understanding how visual perception and memory interact is important in both cognitive science and artificial intelligence. It reveals the complex processes that support human cognition and helps to improve machine learning algorithms by mimicking the brain's efficient data processing and storage mechanisms. To explore the computational underpinnings of image memorability, this study examines the relationship between an image's reconstruction error, distinctiveness in latent space, and its memorability score. A trained autoencoder is used to replicate human-like memorability assessment inspired by the visual memory game employed in memorability estimations. This study leverages a VGG-based autoencoder that is pre-trained on the vast ImageNet dataset, enabling it to recognize patterns and features that are common to a wide and diverse range of images. An empirical analysis is conducted using the MemCat dataset, which includes 10,000 images from five broad categories: animals, sports, food, landscapes, and vehicles, along with their corresponding memorability scores. The memorability score assigned to each image represents the probability of that image being remembered by participants after a single exposure. The autoencoder is finetuned for one epoch with a batch size of one, attempting to create a scenario similar to human memorability experiments where memorability is quantified by the likelihood of an image being remembered after being seen only once. The reconstruction error, which is quantified as the difference between the original and reconstructed images, serves as a measure of how well the autoencoder has learned to represent the data. 
The reconstruction error of each image, the error reduction, and its distinctiveness in latent space are calculated and correlated with the memorability score. Distinctiveness is measured as the Euclidean distance between each image's latent representation and its nearest neighbor within the autoencoder's latent space. Different structural and perceptual loss functions are considered to quantify the reconstruction error. The results indicate that there is a strong correlation between the reconstruction error and the distinctiveness of images and their memorability scores. This suggests that images with more unique, distinct features that challenge the autoencoder's compressive capacities are inherently more memorable. There is also a negative correlation between memorability and the reduction in reconstruction error relative to the autoencoder pre-trained on ImageNet, which suggests that highly memorable images are harder to reconstruct, probably because they have features that are more difficult for the autoencoder to learn. These insights suggest a new pathway for evaluating image memorability, which could potentially impact industries reliant on visual content and mark a step forward in merging the fields of artificial intelligence and cognitive science. The current research opens avenues for utilizing neural representations as instruments for understanding and predicting visual memory.
Keywords: autoencoder, computational vision, image memorability, image reconstruction, memory retention, reconstruction error, visual perception
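The two quantities the study correlates with memorability — per-image reconstruction error and nearest-neighbor distinctiveness in latent space — can be computed in a few lines. This is a minimal NumPy sketch on synthetic arrays; the actual study uses a VGG-based autoencoder and the MemCat images, so the shapes, data, and memorability scores below are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 100 "images" flattened to 64 values, 16-dim latents.
latents = rng.normal(size=(100, 16))
originals = rng.normal(size=(100, 64))
reconstructions = originals + rng.normal(scale=0.1, size=originals.shape)

# Reconstruction error: mean squared difference per image (an MSE loss;
# the study also considers structural and perceptual losses).
recon_error = ((originals - reconstructions) ** 2).mean(axis=1)

# Distinctiveness: Euclidean distance from each latent to its nearest neighbor.
dists = np.linalg.norm(latents[:, None, :] - latents[None, :, :], axis=-1)
np.fill_diagonal(dists, np.inf)          # exclude each image's distance to itself
distinctiveness = dists.min(axis=1)

# Pearson correlation against a (here, synthetic) memorability score.
memorability = rng.uniform(size=100)
r = np.corrcoef(recon_error, memorability)[0, 1]
```

With real data, the study reports that both quantities correlate strongly and positively with memorability; on this random data `r` is of course near zero.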
141 Leadership Education for Law Enforcement Mid-Level Managers: The Mediating Role of Effectiveness of Training on Transformational and Authentic Leadership Traits
Authors: Kevin Baxter, Ron Grove, James Pitney, John Harrison, Ozlem Gumus
Abstract:
The purpose of this research is to determine the mediating effect of the effectiveness of the training provided by Northwestern University's School of Police Staff and Command (SPSC) on the ability of law enforcement mid-level managers to learn transformational and authentic leadership traits. This study will also evaluate the leadership styles of course graduates compared to non-attendees using a static group comparison design. The Louisiana State Police pay approximately $40,000 in salary, tuition, housing, and meals for each state police lieutenant attending the 10-week program of the SPSC. This school lists the development of transformational leaders as a central element of its curriculum. Additionally, the SPSC curriculum addresses all four components of authentic leadership - self-awareness, transparency, ethical/moral, and balanced processing. Upon return to law enforcement in roles of mid-level management, there are questions as to whether or not students revert to an "autocratic" leadership style. Insufficient evidence exists to support claims for the effectiveness of management training or leadership development. Though it is widely recognized that transformational styles are beneficial to law enforcement, there is little evidence that police leadership styles are changing. Police organizations continue to hold to a more transactional style (i.e., most senior police leaders remain autocrats). Additionally, research on the application of transformational, transactional, and laissez-faire leadership in police organizations is minimal. The population of the study is law enforcement mid-level managers from various states within the United States who completed leadership training presented by the SPSC. The sample will be composed of 66 active law enforcement mid-level managers (lieutenants and captains) who have graduated from SPSC and 65 active law enforcement mid-level managers (lieutenants and captains) who have not attended SPSC.
Participants will answer demographic questions, the Multifactor Leadership Questionnaire, the Authentic Leadership Questionnaire, and the Kirkpatrick Hybrid Evaluation Survey. Descriptive statistics, group comparison, one-way MANCOVA, and the Kirkpatrick Evaluation Model survey will be used to determine training effectiveness at the four levels of reaction, learning, behavior, and results. Independent variables are SPSC graduates (two groups: upper and lower) and non-SPSC attendees, and dependent variables are transformational and authentic leadership scores. SPSC graduates are expected to have higher MLQ scores for transformational leadership traits and higher ALQ scores for authentic leadership traits than SPSC non-attendees. We also expect the graduates to rate the efficacy of SPSC leadership training as high. This study will validate (or invalidate) the benefits, costs, and resources required for leadership development from a nationally recognized police leadership program, and it will also help fill the gap in the literature that exists between law enforcement professional development and transformational and authentic leadership styles.
Keywords: training effectiveness, transformational leadership, authentic leadership, law enforcement mid-level manager
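The core group comparison in this design — SPSC graduates versus non-attendees on questionnaire scores — can be sketched in its simplest form. The study itself plans a one-way MANCOVA; the Welch's t statistic below, computed on simulated MLQ-like scores with the stated sample sizes (66 vs. 65), is only an illustrative stand-in, and the score distributions are invented.

```python
import numpy as np

def welch_t(a: np.ndarray, b: np.ndarray) -> float:
    """Welch's t statistic for two independent samples with unequal variances."""
    va, vb = a.var(ddof=1), b.var(ddof=1)
    return (a.mean() - b.mean()) / np.sqrt(va / len(a) + vb / len(b))

rng = np.random.default_rng(1)
# Hypothetical MLQ transformational-leadership scores (0-4 scale assumed).
graduates = rng.normal(loc=3.4, scale=0.5, size=66)
non_attendees = rng.normal(loc=3.0, scale=0.5, size=65)

t = welch_t(graduates, non_attendees)   # positive if graduates score higher
```

A positive, large `t` would be consistent with the study's hypothesis that graduates score higher on transformational traits; the full analysis would additionally adjust for covariates, which is what the MANCOVA adds.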
140 Employing Remotely Sensed Soil and Vegetation Indices and Predicting by Long Short-Term Memory to Irrigation Scheduling Analysis
Authors: Elham Koohikerade, Silvio Jose Gumiere
Abstract:
In this research, irrigation is highlighted as crucial for improving both the yield and quality of potatoes due to their high sensitivity to soil moisture changes. The study presents a hybrid Long Short-Term Memory (LSTM) model aimed at optimizing irrigation scheduling in potato fields in Quebec City, Canada. This model integrates model-based and satellite-derived datasets to simulate soil moisture content, addressing the limitations of field data. Developed under the guidance of the Food and Agriculture Organization (FAO), the simulation approach compensates for the lack of direct soil sensor data, enhancing the LSTM model's predictions. The model was calibrated using indices like Surface Soil Moisture (SSM), Normalized Difference Vegetation Index (NDVI), Enhanced Vegetation Index (EVI), and Normalized Multi-band Drought Index (NMDI) to effectively forecast soil moisture reductions. Understanding soil moisture and plant development is crucial for assessing drought conditions and determining irrigation needs. This study validated the spectral characteristics of vegetation and soil using ECMWF Reanalysis v5 (ERA5) and Moderate Resolution Imaging Spectroradiometer (MODIS) data from 2019 to 2023, collected from agricultural areas in Dolbeau and Peribonka, Quebec. Parameters such as surface volumetric soil moisture (0-7 cm), NDVI, EVI, and NMDI were extracted from these images. A regional four-year dataset of soil and vegetation moisture was developed using a machine learning approach combining model-based and satellite-based datasets. The LSTM model predicts soil moisture dynamics hourly across different locations and times, with its accuracy verified through cross-validation and comparison with existing soil moisture datasets. The model effectively captures temporal dynamics, making it valuable for applications requiring soil moisture monitoring over time, such as anomaly detection and memory analysis.
By identifying typical peak soil moisture values and observing distribution shapes, irrigation can be scheduled to maintain soil moisture within Volumetric Soil Moisture (VSM) values of 0.25 to 0.30 m³/m³, avoiding under- and over-watering. The strong correlations between parcels suggest that a uniform irrigation strategy might be effective across multiple parcels, with adjustments based on specific parcel characteristics and historical data trends. The application of the LSTM model to predict soil moisture and vegetation indices yielded mixed results. While the model effectively captures the central tendency and temporal dynamics of soil moisture, it struggles with accurately predicting EVI, NDVI, and NMDI.
Keywords: irrigation scheduling, LSTM neural network, remotely sensed indices, soil and vegetation monitoring
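The stated target band (VSM between 0.25 and 0.30, as a volumetric fraction) implies a simple scheduling rule over the model's hourly predictions: irrigate when the forecast drops below the lower bound, and hold off when it exceeds the upper one. A minimal sketch, with the thresholds taken from the abstract and an invented hourly series:

```python
# Target band from the abstract; the hourly VSM series below is illustrative.
LOWER, UPPER = 0.25, 0.30

def irrigation_flags(vsm_series):
    """Return one flag per hour: 'irrigate', 'drain_risk', or 'ok'."""
    flags = []
    for vsm in vsm_series:
        if vsm < LOWER:
            flags.append("irrigate")      # under-watering risk
        elif vsm > UPPER:
            flags.append("drain_risk")    # over-watering risk
        else:
            flags.append("ok")            # within the target band
    return flags

hourly_vsm = [0.28, 0.26, 0.24, 0.23, 0.31, 0.27]
print(irrigation_flags(hourly_vsm))
```

In practice the decision would be made on the LSTM's forecast rather than on observed values, which is what makes the predictive model useful: irrigation can be triggered before the soil actually leaves the band.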
139 Revolutionizing Accounting: Unleashing the Power of Artificial Intelligence
Authors: Sogand Barghi
Abstract:
The integration of artificial intelligence (AI) in accounting practices is reshaping the landscape of financial management. This paper explores the innovative applications of AI in the realm of accounting, emphasizing its transformative impact on efficiency, accuracy, decision-making, and financial insights. By harnessing AI's capabilities in data analysis, pattern recognition, and automation, accounting professionals can redefine their roles, elevate strategic decision-making, and unlock unparalleled value for businesses. This paper delves into AI-driven solutions such as automated data entry, fraud detection, predictive analytics, and intelligent financial reporting, highlighting their potential to revolutionize the accounting profession. Artificial intelligence has swiftly emerged as a game-changer across industries, and accounting is no exception. This paper seeks to illuminate the profound ways in which AI is reshaping accounting practices, transcending conventional boundaries, and propelling the profession toward a new era of efficiency and insight-driven decision-making. One of the most impactful applications of AI in accounting is automation. Tasks that were once labor-intensive and time-consuming, such as data entry and reconciliation, can now be streamlined through AI-driven algorithms. This not only reduces the risk of errors but also allows accountants to allocate their valuable time to more strategic and analytical tasks. AI's ability to analyze vast amounts of data in real time enables it to detect irregularities and anomalies that might go unnoticed by traditional methods. Fraud detection algorithms can continuously monitor financial transactions, flagging any suspicious patterns and thereby bolstering financial security. AI-driven predictive analytics can forecast future financial trends based on historical data and market variables. 
This empowers organizations to make informed decisions, optimize resource allocation, and develop proactive strategies that enhance profitability and sustainability. Traditional financial reporting often involves extensive manual effort and data manipulation. With AI, reporting becomes more intelligent and intuitive. Automated report generation not only saves time but also ensures accuracy and consistency in financial statements. While the potential benefits of AI in accounting are undeniable, there are challenges to address. Data privacy and security concerns, the need for continuous learning to keep up with evolving AI technologies, and potential biases within algorithms demand careful attention. The convergence of AI and accounting marks a pivotal juncture in the evolution of financial management. By harnessing the capabilities of AI, accounting professionals can transcend routine tasks, becoming strategic advisors and data-driven decision-makers. The applications discussed in this paper underline the transformative power of AI, setting the stage for an accounting landscape that is smarter, more efficient, and more insightful than ever before. The future of accounting is here, and it's driven by artificial intelligence.
Keywords: artificial intelligence, accounting, automation, predictive analytics, financial reporting
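The fraud-detection idea described above — continuously monitoring transactions and flagging patterns that deviate from the norm — can be illustrated in its simplest statistical form as z-score outlier flagging. Real systems use far richer features and models; the transaction data and threshold here are synthetic assumptions.

```python
import numpy as np

def flag_anomalies(amounts, z_threshold=3.0):
    """Flag transactions whose amount deviates strongly from the mean (|z| > threshold)."""
    amounts = np.asarray(amounts, dtype=float)
    z = (amounts - amounts.mean()) / amounts.std()
    return np.abs(z) > z_threshold

# Synthetic ledger: 99 routine transactions around $100 plus one large outlier.
rng = np.random.default_rng(42)
txns = np.append(rng.normal(100, 10, size=99), 10_000)
flags = flag_anomalies(txns)   # only the outlier should be flagged
```

The same pattern generalizes: replace the single z-score with a multivariate anomaly score over amount, counterparty, timing, and so on, and route flagged items to a human reviewer rather than blocking them automatically.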
138 Scaling up Small and Sick Newborn Care Through the Establishment of the First Human Milk Bank in Nepal
Authors: Prajwal Paudel, Shreeprasad Adhikari, Shailendra Bir Karmacharya, Kalpana Upadhyaya
Abstract:
Background: Human milk banks have been recommended by the World Health Organization (WHO) for newborn and child nourishment in the provision of optimum nutrition as an alternative to breastfeeding in circumstances when direct breastfeeding is inaccessible. The vulnerable group of babies, mainly preterm, low birth weight, and sick newborns, are at a greater risk of mortality and possibly benefit from the safe use of donated human milk through milk banks. In this study, we aimed to shed light on the process involved in setting up the nation's first milk bank and its vitality in small and sick newborn nutrition and care. Methods: The study was conducted in Paropakar Maternity and Women's Hospital, where the first human milk bank (HMB) was established. The establishment involved a stepwise process of a need assessment meeting, formation of the HMB committee, a learning visit to an HMB in India, studying the strengths and weaknesses of promoting breastfeeding and HMB system integration, procurement, installation, and setting up the infrastructure, developing technical competency, and launching of the HMB. After the initiation of HMB services, information regarding the recruited donor mothers and the volume of milk pasteurized and consumed by the needy recipient babies was recorded. Descriptive statistics with frequencies and percentages were used to describe the utilization of HMB services. Results: During the study period, a total of 506113 ml of milk was collected, while 49930 ml of milk was pasteurized. Of the pasteurized milk, 381248 ml was dispensed. The milk was received from a total of 883 donor mothers after proper routine screening tests. Similarly, the total number of babies who received the donated human milk (DHM) was 912, with different neonatal conditions. Among the babies who received DHM, 527 (57.7%) were born via CS, and 385 (42.21%) were delivered normally.
In the birth weight category, 9 (1%) of the babies weighed less than 1000 grams, 75 (8.2%) less than 1500 grams, and 405 (44.4%) between 1500 and 2500 grams, whereas 423 (46.4%) of the babies who received DHM were of normal weight. Among the sick newborns, perinatal asphyxia accounted for 166 (18.2%), preterm with other complications 372 (40.7%), preterm 23 (2.02%), respiratory distress 140 (15.35%), neonatal jaundice 150 (16.44%), sepsis 94 (10.30%), meconium aspiration syndrome 9 (1%), seizure disorder 28 (3.07%), congenital anomalies 13 (1.42%), and others 33 (3.61%). The neonatal mortality rate dropped to 6.2/1000 live births from 7.5/1000 live births in the first year of establishment as compared to the previous year. Conclusion: The establishment of the first HMB in Nepal involved a comprehensive approach to integrating a new system with the existing newborn care in the provision of safe DHM. Premature babies with complications, babies born via CS, babies with perinatal asphyxia, and babies with sepsis consumed the greater proportion of DHM. Rigorous research is warranted to assess the impact of DHM on small and sick newborns who otherwise would be fed formula milk.
Keywords: human milk bank, sick-newborn, mortality, neonatal nutrition
137 The Academic Importance of the Arts in Fostering Belonging
Authors: Ana Handel, Jamal Ellerbe, Sarah Kanzaki, Natalie White, Nathan Ousey, Sean Gallagher
Abstract:
A sense of belonging is the ability for individuals to feel they are a necessary part of whatever organization or community they find themselves in. In an academic setting, a sense of belonging is key to a student's success. The collected research points to this sense of belonging in academic settings as a significant contributor to students' levels of engagement and trust. When universities leverage the arts, students are provided with more opportunities to engage and feel confident in their surroundings. This allows for greater potential to develop within academic and social settings. The arts also call for the promotion of diversity, equity, and inclusion by showcasing works of artists from all different backgrounds, thus allowing students to gain cultural knowledge and be able to embrace differences. Equity, diversity, and inclusion are all emotional facets of belonging. Equity relates to the concept of making the conscious choice to recognize opportunities to incorporate inclusive and diverse ideals into different thought processes and collaboration. Inclusion involves providing equal access to opportunities and resources for people of all in-groups. In an inclusive culture, individuals are able to maximize their potential with the confidence they have gained through an accepting environment. A variety of members in academic communities have noted it may be beneficial to build certain events surrounding the arts into course requirements in order to ensure students are expanding their horizons and exposing themselves to the arts. These academics also recommend incorporating the arts into extracurricular activities, such as Greek life, in order to appeal to large groups of students. Once students have an understanding of the rich knowledge cultivated through exploring the arts, they will feel more comfortable in their surroundings and thus more confident to become involved in other areas of their university.
A number of universities, including West Chester and Carnegie Mellon, have instituted programs aiming to provide students with the necessary tools and resources to feel comfortable in their educational settings. Different programs include references to hotlines for discrimination and offices for diversity, equity, and inclusion. Staff members have also been provided with means of combating biases and increasing feelings of belongingness in order to properly support and communicate with students. These tools have successfully allowed universities to foster inviting environments where students of all backgrounds feel they belong, as well as strengthening the community's diversity, equity, and inclusion. Through demonstrating concepts of diversity, equity, and inclusion by introducing the arts into learning spaces, students can find a sense of belonging within their academic environments. It is essential to understand these topics and how they work together to achieve a common goal. The efforts of universities have made much progress in shedding light on different cultures and ideas to show students their full potential and opportunities. Once students feel more comfortable within their organizations, engagement will increase substantially.
Keywords: arts, belonging, engagement, inclusion
136 Bridging the Educational Gap: A Curriculum Framework for Mass Timber Construction Education and Comparative Analysis of Physical vs. Virtual Prototypes in Construction Management
Authors: Farnaz Jafari
Abstract:
The surge in mass timber construction represents a pivotal moment in sustainable building practices, yet the lack of comprehensive education in construction management poses a challenge in harnessing this innovation effectively. This research endeavors to bridge this gap by developing a curriculum framework integrating mass timber construction into undergraduate and industry certificate programs. To optimize learning outcomes, the study explores the impact of two prototype formats, Virtual Reality (VR) simulations and physical mock-ups, on students' understanding and skill development. The curriculum framework aims to equip future construction managers with a holistic understanding of mass timber, covering its unique properties, construction methods, building codes, and sustainable advantages. The study adopts a mixed-methods approach, commencing with a systematic literature review and leveraging surveys and interviews with educators and industry professionals to identify existing educational gaps. The iterative development process involves incorporating stakeholder feedback into the curriculum. The evaluation of prototype impact employs pre- and post-tests administered to participants engaged in pilot programs. Through qualitative content analysis and quantitative statistical methods, the study seeks to compare the effectiveness of VR simulations and physical mock-ups in conveying knowledge and skills related to mass timber construction. The anticipated findings will illuminate the strengths and weaknesses of each approach, providing insights for future curriculum development. The curriculum's expected contribution to sustainable construction education lies in its emphasis on practical application, bridging the gap between theoretical knowledge and hands-on skills. The research also seeks to establish a standard for mass timber construction education, contributing to the field through a unique comparative analysis of VR simulations and physical mock-ups.
The study's significance extends to the development of best practices and evidence-based recommendations for integrating technology and hands-on experiences in construction education. By addressing current educational gaps and offering a comparative analysis, this research aims to enrich the construction management education experience and pave the way for broader adoption of sustainable practices in the industry. The envisioned curriculum framework is designed for versatile integration, catering to undergraduate programs and industry training modules, thereby enhancing the educational landscape for aspiring construction professionals. Ultimately, this study underscores the importance of proactive educational strategies in preparing industry professionals for the evolving demands of the construction landscape, facilitating a seamless transition towards sustainable building practices.
Keywords: curriculum framework, mass timber construction, physical vs. virtual prototypes, sustainable building practices
135 Digital Skepticism in a Legal Philosophical Approach
Authors: Dr. Bendes Ákos
Abstract:
Digital skepticism, a critical stance towards digital technology and its pervasive influence on society, presents significant challenges when analyzed from a legal philosophical perspective. This abstract aims to explore the intersection of digital skepticism and legal philosophy, emphasizing the implications for justice, rights, and the rule of law in the digital age. Digital skepticism arises from concerns about privacy, security, and the ethical implications of digital technology. It questions the extent to which digital advancements enhance or undermine fundamental human values. Legal philosophy, which interrogates the foundations and purposes of law, provides a framework for examining these concerns critically. One key area where digital skepticism and legal philosophy intersect is in the realm of privacy. Digital technologies, particularly data collection and surveillance mechanisms, pose substantial threats to individual privacy. Legal philosophers must grapple with questions about the limits of state power and the protection of personal autonomy. They must consider how traditional legal principles, such as the right to privacy, can be adapted or reinterpreted in light of new technological realities. Security is another critical concern. Digital skepticism highlights vulnerabilities in cybersecurity and the potential for malicious activities, such as hacking and cybercrime, to disrupt legal systems and societal order. Legal philosophy must address how laws can evolve to protect against these new forms of threats while balancing security with civil liberties. Ethics plays a central role in this discourse. Digital technologies raise ethical dilemmas, such as the development and use of artificial intelligence and machine learning algorithms that may perpetuate biases or make decisions without human oversight. 
Legal philosophers must evaluate the moral responsibilities of those who design and implement these technologies and consider the implications for justice and fairness. Furthermore, digital skepticism prompts a reevaluation of the concept of the rule of law. In an increasingly digital world, maintaining transparency, accountability, and fairness becomes more complex. Legal philosophers must explore how legal frameworks can ensure that digital technologies serve the public good and do not entrench power imbalances or erode democratic principles. Finally, the intersection of digital skepticism and legal philosophy has practical implications for policy-making. Legal scholars and practitioners must work collaboratively to develop regulations and guidelines that address the challenges posed by digital technology. This includes crafting laws that protect individual rights, ensure security, and promote ethical standards in technology development and deployment. In conclusion, digital skepticism provides a crucial lens for examining the impact of digital technology on law and society. A legal philosophical approach offers valuable insights into how legal systems can adapt to protect fundamental values in the digital age. By addressing privacy, security, ethics, and the rule of law, legal philosophers can help shape a future where digital advancements enhance, rather than undermine, justice and human dignity.
Keywords: legal philosophy, privacy, security, ethics, digital skepticism
134 Detection and Identification of Antibiotic Resistant UPEC Using FTIR-Microscopy and Advanced Multivariate Analysis
Authors: Uraib Sharaha, Ahmad Salman, Eladio Rodriguez-Diaz, Elad Shufan, Klaris Riesenberg, Irving J. Bigio, Mahmoud Huleihel
Abstract:
Antimicrobial drugs have played an indispensable role in controlling illness and death associated with infectious diseases in animals and humans. However, the increasing resistance of bacteria to a broad spectrum of commonly used antibiotics has become a global healthcare problem. Many antibiotics have lost their effectiveness since the beginning of the antibiotic era because many bacteria have adapted defenses against them. Rapid determination of the antimicrobial susceptibility of a clinical isolate is often crucial for the optimal antimicrobial therapy of infected patients and in many cases can save lives. The conventional methods for susceptibility testing require the isolation of the pathogen from a clinical specimen by culturing on the appropriate media (this first culturing stage lasts 24 h). Then, chosen colonies are grown on media containing antibiotic(s), using micro-diffusion discs (this second culturing also takes 24 h), in order to determine their susceptibility. Other approaches, such as genotyping methods, the E-test, and automated systems, have also been developed for testing antimicrobial susceptibility. Most of these methods are expensive and time-consuming. Fourier transform infrared (FTIR) microscopy is a rapid, safe, effective, and low-cost method that has been widely and successfully used in different studies for the identification of various biological samples, including bacteria; nonetheless, its true potential in routine clinical diagnosis has not yet been established. The new modern infrared (IR) spectrometers with high spectral resolution enable measuring unprecedented biochemical information from cells at the molecular level. Moreover, the development of new bioinformatics analyses combined with IR spectroscopy becomes a powerful technique, which enables the detection of structural changes associated with resistivity.
The main goal of this study is to evaluate the potential of FTIR microscopy in tandem with machine learning algorithms for rapid and reliable identification of bacterial susceptibility to antibiotics in a time span of a few minutes. The UTI E. coli bacterial samples, which were identified at the species level by MALDI-TOF and examined for their susceptibility by the routine assay (micro-diffusion discs), were obtained from the bacteriology laboratories in Soroka University Medical Center (SUMC). These samples were examined by FTIR microscopy and analyzed by advanced statistical methods. Our results, based on 700 E. coli samples, were promising and showed that by using infrared spectroscopic techniques together with multivariate analysis, it is possible to classify the tested bacteria into sensitive and resistant with a success rate higher than 90% for eight different antibiotics. Based on these preliminary results, it is worthwhile to continue developing the FTIR microscopy technique as a rapid and reliable method for identifying antibiotic susceptibility.
Keywords: antibiotics, E. coli, FTIR, multivariate analysis, susceptibility, UTI
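The classification task described — assigning each measured spectrum to "sensitive" or "resistant" via multivariate analysis — can be sketched with the simplest possible multivariate classifier, a nearest-centroid rule on synthetic spectra. This is only an illustrative stand-in: the study uses real FTIR spectra and more advanced machine learning, and every number below (spectral length, class shift, noise level) is an invented assumption.

```python
import numpy as np

rng = np.random.default_rng(7)
N_POINTS = 200   # number of spectral points per spectrum (assumption)

def make_spectra(shift, n):
    """Synthetic absorbance spectra: shared baseline plus a class-specific offset."""
    base = np.sin(np.linspace(0, 6, N_POINTS))
    return base + shift + rng.normal(scale=0.05, size=(n, N_POINTS))

# Toy training data: 50 "sensitive" and 50 "resistant" spectra.
train_sensitive = make_spectra(0.0, 50)
train_resistant = make_spectra(0.3, 50)
centroids = {"sensitive": train_sensitive.mean(axis=0),
             "resistant": train_resistant.mean(axis=0)}

def classify(spectrum):
    """Assign the label of the nearest class centroid (Euclidean distance)."""
    return min(centroids, key=lambda k: np.linalg.norm(spectrum - centroids[k]))

test_spec = make_spectra(0.3, 1)[0]   # an unseen "resistant"-like spectrum
label = classify(test_spec)
```

Real spectra differ between classes in subtle band shapes rather than a uniform offset, which is why the study needs higher-capacity multivariate methods; the pipeline structure (train on labeled spectra, classify new ones) is the same.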
Procedia PDF Downloads 173
133 Leveraging Multimodal Neuroimaging Techniques to Address in vivo Compensatory and Disintegration Patterns in Neurodegenerative Disorders: Evidence from Cortico-Cerebellar Connections in Multiple Sclerosis
Authors: Efstratios Karavasilis, Foteini Christidi, Georgios Velonakis, Agapi Plousi, Kalliopi Platoni, Nikolaos Kelekis, Ioannis Evdokimidis, Efstathios Efstathopoulos
Abstract:
Introduction: Advanced structural and functional neuroimaging techniques contribute to the study of anatomical and functional brain connectivity and its role in the pathophysiology and symptom heterogeneity of several neurodegenerative disorders, including multiple sclerosis (MS). Aim: In the present study, we applied multiparametric neuroimaging techniques to investigate structural and functional cortico-cerebellar changes in MS patients. Material: We included 51 MS patients (28 with clinically isolated syndrome [CIS], 31 with relapsing-remitting MS [RRMS]) and 51 age- and gender-matched healthy controls (HC) who underwent MRI on a 3.0T scanner. Methodology: The acquisition protocol included high-resolution 3D T1-weighted, diffusion-weighted and echo-planar imaging sequences for the analysis of volumetric, tractography and functional resting-state data, respectively. We performed between-group comparisons (CIS, RRMS, HC) using the CAT12 and CONN16 MATLAB toolboxes for the analysis of volumetric (cerebellar gray matter density) and functional (cortico-cerebellar resting-state functional connectivity) data, respectively. The Brainance suite was used for the analysis of tractography data (cortico-cerebellar white matter integrity; fractional anisotropy [FA]; axial and radial diffusivity [AD; RD]) to reconstruct the cerebellar tracts. Results: Patients with CIS did not show significant gray matter (GM) density differences compared with HC. However, they showed decreased FA, increased diffusivity measures in cortico-cerebellar tracts, and increased cortico-cerebellar functional connectivity. Patients with RRMS showed decreased GM density in cerebellar regions, decreased FA and increased diffusivity measures in cortico-cerebellar WM tracts, as well as a pattern of increased and mostly decreased functional cortico-cerebellar connectivity compared to HC. 
The comparison between CIS and RRMS patients revealed significant GM density differences, reduced FA and increased diffusivity measures in WM cortico-cerebellar tracts, and increased/decreased functional connectivity. The identification of decreased WM integrity and increased functional cortico-cerebellar connectivity without GM changes in CIS, and the pattern of decreased GM density, decreased WM integrity and mostly decreased functional connectivity in RRMS patients, emphasizes the role of compensatory mechanisms in early disease stages and the disintegration of structural and functional networks with disease progression. Conclusions: In conclusion, our study highlights the added value of multimodal neuroimaging techniques for the in vivo investigation of cortico-cerebellar brain changes in neurodegenerative disorders. A natural extension and future opportunity for leveraging multimodal neuroimaging data is their integration into recently applied machine learning approaches to more accurately classify patients and predict their disease course.
Keywords: advanced neuroimaging techniques, cerebellum, MRI, multiple sclerosis
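The between-group comparisons of diffusion metrics described above boil down to statistical tests on per-subject values. The following sketch illustrates one such test, a Welch two-sample t-test on fractional anisotropy (FA), implemented from its textbook formula. It is not the authors' analysis: the group means, standard deviations, and FA values are entirely synthetic and chosen only for illustration.

```python
# Minimal sketch: Welch two-sample t-test comparing per-subject FA values
# between healthy controls and a patient group, the kind of between-group
# comparison described in the abstract. All numbers are synthetic.
import math
import random

random.seed(1)

# Hypothetical FA samples: 51 subjects per group, controls slightly higher.
controls = [random.gauss(0.45, 0.03) for _ in range(51)]
patients = [random.gauss(0.41, 0.04) for _ in range(51)]


def mean(xs):
    return sum(xs) / len(xs)


def sample_var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)


def welch_t(a, b):
    # Welch's t-statistic and Welch-Satterthwaite degrees of freedom.
    va, vb = sample_var(a) / len(a), sample_var(b) / len(b)
    t = (mean(a) - mean(b)) / math.sqrt(va + vb)
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df


t, df = welch_t(controls, patients)
print(f"t = {t:.2f}, df = {df:.1f}")
```

In practice, toolboxes such as CAT12 and CONN wrap comparisons like this in voxelwise or tractwise models with multiple-comparison correction; the sketch only shows the elementary statistic underneath.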
Procedia PDF Downloads 140