Search results for: multiple data
27507 Analyzing Preservice Teachers’ Attitudes toward Technology
Authors: Ahmet Oguz Akturk, Kemal Izci, Gurbuz Caliskan, Ismail Sahin
Abstract:
Rapid technological developments necessitate that societies closely follow them and adapt accordingly. Education is clearly one of the areas affected by these developments. Analyzing preservice teachers' attitudes toward technology is crucial for both educational and professional purposes, since teacher candidates are essential for educating future individuals living in a technological age. This study aims to analyze preservice teachers' attitudes toward technology and the variables (e.g., gender, daily internet usage, and technological devices owned) that predict those attitudes. A relational survey model was used as the research method, and 329 preservice teachers studying at a large university located in central Turkey participated voluntarily. The results showed that preservice teachers mostly displayed positive attitudes toward technology, and that male preservice teachers' attitudes were more positive than those of female preservice teachers. To analyze the factors predicting preservice teachers' attitudes toward technology, stepwise multiple regression was utilized. The results showed that daily internet use was the strongest predictor of preservice teachers' attitudes toward technology.
Keywords: attitudes toward technology, preservice teachers, gender, stepwise multiple regression analysis
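The stepwise selection described above can be sketched as a greedy forward search over predictors. This is a minimal illustration on simulated data: the predictor names echo the abstract, but the data, effect sizes, and the raw R-squared stopping threshold are assumptions for illustration; statistical packages typically use F-tests or p-value criteria instead.

```python
import numpy as np

def forward_stepwise(X, y, names, threshold=0.01):
    """Greedy forward selection: repeatedly add the predictor that most
    improves R^2, stopping when the improvement falls below `threshold`."""
    selected, remaining, best_r2 = [], list(range(X.shape[1])), 0.0
    while remaining:
        scores = []
        for j in remaining:
            A = np.column_stack([np.ones(len(y)), X[:, selected + [j]]])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ beta
            r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
            scores.append((r2, j))
        r2, j = max(scores)
        if r2 - best_r2 < threshold:
            break
        best_r2, selected = r2, selected + [j]
        remaining.remove(j)
    return [names[j] for j in selected], best_r2

rng = np.random.default_rng(0)
n = 329  # sample size reported in the abstract; the data are simulated
internet_use = rng.normal(size=n)
gender = rng.integers(0, 2, size=n).astype(float)
devices = rng.normal(size=n)
# Attitude driven mainly by internet use (illustrative effect sizes only)
attitude = 0.8 * internet_use + 0.2 * gender + rng.normal(scale=0.5, size=n)
X = np.column_stack([internet_use, gender, devices])
chosen, r2 = forward_stepwise(X, attitude, ["daily_internet_use", "gender", "devices"])
print(chosen[0])
```

On this simulated data, the first predictor entered is the one with the dominant effect, mirroring the abstract's finding that daily internet use was the strongest predictor.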
Procedia PDF Downloads 291
27506 Defining Methodology for Multi Model Software Process Improvement Framework
Authors: Aedah Abd Rahman
Abstract:
Software organisations may implement single or multiple frameworks in order to remain competitive. There is a wide selection of generic Software Process Improvement (SPI) frameworks, best practices, and standards, implemented with different focuses and goals. Issues and difficulties emerge in SPI practice in the context of software development and IT Service Management (ITSM). This research looks into the integration of multiple frameworks from the perspective of software development and ITSM. The research question of this study is how to define the steps of a methodology that solves the multi model software process improvement problem. The objective is to define the research approach and methodologies needed to produce a more integrated and efficient Multi Model Process Improvement (MMPI) solution. A multi-step methodology is used, comprising a case study, framework mapping, and a Delphi study. The research outcome has proven the usefulness and appropriateness of the proposed framework in SPI and quality practice in the Malaysian software industry. This mixed method research approach is used to tackle problems from every angle in the context of software development and services. The methodology facilitates the implementation and management of multi model environments of SPI frameworks across multiple domains.
Keywords: Delphi study, methodology, multi model software process improvement, service management
Procedia PDF Downloads 260
27505 An Ultra-Low Output Impedance Power Amplifier for Tx Array in 7-Tesla Magnetic Resonance Imaging
Authors: Ashraf Abuelhaija, Klaus Solbach
Abstract:
In ultra-high-field MRI scanners (3T and higher), parallel RF transmission techniques using multiple RF chains with multiple transmit elements are a promising approach to overcoming the high-field MRI challenges of inhomogeneity in the RF magnetic field and SAR. However, mutual coupling between the transmit array elements disturbs the desired independent control of the RF waveforms for each element. This contribution demonstrates an 18 dB improvement in decoupling (isolation) performance due to the very low output impedance of our 1 kW power amplifier.
Keywords: EM coupling, inter-element isolation, magnetic resonance imaging (MRI), parallel transmit
Procedia PDF Downloads 495
27504 The Use of Mobile Phone as Enhancement to Mark Multiple Choice Objectives English Grammar and Literature Examination: An Exploratory Case Study of Preliminary National Diploma Students, Abdu Gusau Polytechnic, Talata Mafara, Zamfara State, Nigeria
Authors: T. Abdulkadir
Abstract:
Marking and assessing multiple choice examinations manually in Nigeria has often been regarded as a cumbersome and herculean task. This is usually because large numbers of candidates are known to take the same examination simultaneously. Marking such a mammoth number of booklets daunts even the fastest paid examiners, who often undertake the job with the resulting consequences of stress and boredom. This paper explores the evolution of, and sets out to transform, the marking of multiple choice objective examinations into a creative, or perhaps more relaxing, activity via the use of the mobile phone. A more "pragmatic" method was employed to achieve this work, rather than a formal "in-depth research" based approach, owing to the novelty of the mobile-smartphone e-Marking Scheme. Moreover, the scheme being novel, no recent academic work sharing the same topic concept of the "use of cell phone as an e-marking technique" was found online; hence the dearth of even miscellaneous citations in this work. Anticipated future advancements steered the motive of this paper, which lays the fundamental proposition. The paper introduces for the first time the concept of mobile-smartphone e-marking, the steps to achieve it, and the merits and demerits of the technique, all spelt out in the subsequent pages.
Keywords: cell phone, e-marking scheme (eMS), mobile phone, mobile-smart phone, multiple choice objectives (MCO), smartphone
Procedia PDF Downloads 259
27503 The Mediating Role of Store Personality in the Relationship Between Self-Congruity and Manifestations of Loyalty
Authors: María de los Ángeles Crespo López, Carmen García García
Abstract:
The highly competitive nature of today's globalised marketplace requires that brands and stores develop effective commercial strategies to ensure their economic survival. Maintaining the loyalty of existing customers is one key strategy that yields the best results. Although the relationship between consumers' self-congruity and their manifestations of loyalty towards a store has been investigated, the role of store personality in this relationship remains unclear. In this study, multiple parallel mediation analysis was used to examine the effect of Store Personality on the relationship between consumers' Self-Congruity and their Manifestations of Loyalty. For this purpose, 457 Spanish consumers of the Fnac store completed three self-report questionnaires assessing Store Personality, Self-Congruity, and Store Loyalty. The data were analyzed using the SPSS macro PROCESS. The results revealed that three dimensions of Store Personality, namely Exciting, Close, and Competent Store, positively and significantly mediated the relationship between Self-Congruity and Manifestations of Loyalty, with the indirect effect of Competent Store being the greatest. This means that a consumer with higher levels of Self-Congruity with the store will exhibit more Manifestations of Loyalty when the store is perceived as Exciting, Close, or Competent. These findings suggest that more attention should be paid to the perceived personality of stores when developing effective marketing strategies to maintain or increase consumers' manifestations of loyalty towards stores.
Keywords: multiple parallel mediation, PROCESS, self-congruence, store loyalty, store personality
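The parallel mediation logic described above (what PROCESS model-type analyses compute) can be sketched with ordinary least squares: each indirect effect is the product of an a-path (mediator regressed on the predictor) and a b-path (outcome regressed on the predictor plus all mediators). The data, variable names, and effect sizes below are illustrative assumptions, not the study's Fnac data set, and the bootstrap confidence intervals PROCESS reports are omitted.

```python
import numpy as np

def ols(X, y):
    """Return OLS coefficients for y ~ [1, X]."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

rng = np.random.default_rng(1)
n = 457  # sample size from the abstract; the data themselves are simulated
self_congruity = rng.normal(size=n)
# Three parallel mediators (store-personality dimensions), driven by X
exciting  = 0.5 * self_congruity + rng.normal(scale=0.8, size=n)
close     = 0.4 * self_congruity + rng.normal(scale=0.8, size=n)
competent = 0.6 * self_congruity + rng.normal(scale=0.8, size=n)
loyalty = (0.3 * exciting + 0.3 * close + 0.5 * competent
           + 0.1 * self_congruity + rng.normal(scale=0.5, size=n))

mediators = {"exciting": exciting, "close": close, "competent": competent}
# a-paths: mediator ~ X ;  b-paths: Y ~ X + all mediators (parallel model)
a = {name: ols(self_congruity, m)[1] for name, m in mediators.items()}
Xfull = np.column_stack([self_congruity, exciting, close, competent])
b_coefs = ols(Xfull, loyalty)[2:]  # skip intercept and direct effect c'
indirect = {name: a[name] * b for name, b in zip(mediators, b_coefs)}
largest = max(indirect, key=indirect.get)
print(largest)
```

With the assumed effect sizes, the "competent" mediator carries the largest indirect effect, mirroring the abstract's finding for the Competent Store dimension.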
Procedia PDF Downloads 158
27502 Groundhog Day as a Model for the Repeating Spectator and the Film Academic: Re-Watching the Same Films Again Can Create Different Experiences and Ideas
Authors: Leiya Ho Yin Lee
Abstract:
Groundhog Day (Harold Ramis, 1993) may seem a fairly unremarkable Hollywood comedy of the 1990s, yet it is argued that the film, with its protagonist Phil (Bill Murray), inadvertently but perfectly demonstrates an important aspect of filmmaking, film spectatorship, and film research: repetition. Very rarely does a narrative film use one, and only one, take in its shooting. The multiple 'repeats' of Phil's various endeavours while trapped in a perpetual loop of the same day (from stealing money and tricking a woman into a casual relationship, to his multiple suicides, to eventually helping people in need) make the process of doing multiple 'takes' in filmmaking explicit. But perhaps more significantly, Phil represents a perfect model for the spectator or cinephile who has seen their favourite film so many times that they can remember every single detail. Crucially, the favourite film itself never changes, as it is a recording, but the cinephile's experience of that very same film is most likely different each time they watch it again, just as Phil's character and personality are completely transformed, from selfish and egotistic, to depressed and nihilistic, and ultimately to sympathetic and caring, even though he is living the exact same day. Furthermore, the author did not come up with this stimulating juxtaposition of film spectatorship and Groundhog Day the first time the author saw the film; it took a few casual re-viewings to notice the film's self-reflexivity. Then, when working on it in the author's research, the author had to re-view the film more times, and subsequently noticed even more things previously unnoticed. In this way, Groundhog Day not only stands as a model for filmmaking and film spectatorship; it also illustrates the act of academic research, especially in Film Studies, where repeatedly viewing the same films is a prerequisite before new ideas and concepts are discovered from old material. This also recalls Deleuze's thesis on difference and repetition: repetition creates difference, and it is difference that creates thought.
Keywords: narrative comprehension, repeated viewing, repetition, spectatorship
Procedia PDF Downloads 320
27501 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory
Authors: Xiaochen Mu
Abstract:
Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. 
However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders with the prosperity and long-term development of society's public learning and the entire field of science, culture, and the arts. The intellectual property granting mechanism therefore provides both protection and limitations for the rights holder. This aligns well with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, the granting of data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms, and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property rights legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the "bundle of rights" theory, this paper establishes a specific three-level data rights structure. The paper analyzes the cases Google v. Vidal-Hall, Halliday v. Creation Consumer Finance, Douglas v. Hello Limited, Campbell v. MGN, and Imerman v. Tchenquiz. It concludes that recognizing property rights over personal data, and protecting data under the framework of intellectual property, will be beneficial for establishing the tort of misuse of personal information.
Keywords: data protection, property rights, intellectual property, big data
Procedia PDF Downloads 39
27500 Pathological Disparities in Patients Diagnosed with Prostate Imaging Reporting and Data System 3 Lesions: A Retrospective Study in a High-Volume Academic Center
Authors: M. Reza Roshandel, Tannaz Aghaei Badr, Batoul Khoundabi, Sara C. Lewis, Soroush Rais-Bahrami, John Sfakianos, Reza Mehrazin, Ash K. Tewari
Abstract:
Introduction: Prostate biopsy is the most reliable diagnostic method for choosing the appropriate management of prostate cancer. However, discrepancies between the Gleason grade groups (GG) of different biopsies remain a significant concern. This study aims to assess the association of radiological factors with GG discrepancies in patients with index Prostate Imaging Reporting and Data System (PI-RADS) 3 lesions, using radical prostatectomy (RP) specimens as the most accurate and informative pathology. Methods: This single-institutional retrospective study was performed on a total of 2289 consecutive prostate cancer patients who underwent combined targeted and systematic prostate biopsy followed by RP. The database was searched for patients with index PI-RADS 3 lesions under versions 2 and 2.1. Cancers with PI-RADS 4 or 5 scores were excluded from the study. Patient characteristics and radiologic features were analyzed by multivariable logistic regression. Number density of lesions was defined as the number of lesions per prostatic volume. Results: Of the 151 prostate cancer cases with PI-RADS 3 index lesions, 27% and 17% had upgrades and downgrades at RP, respectively. Analysis of grade changes showed no significant associations between discrepancies and the number or number density of PI-RADS 3 lesions. Moreover, the study showed no significant association of GG changes with race, age, location of the lesions, or prostate volume. Conclusions: This study demonstrated that in PI-RADS 3 cancerous nodules, the chance of grade changes in the final pathology of RP specimens was low. Furthermore, having multiple PI-RADS 3 nodules did not alter this conclusion, as the possibility of grade changes in patients with multiple nodules was similar to that in patients with solitary lesions.
Keywords: prostate, adenocarcinoma, multiparametric MRI, Gleason score, robot-assisted surgery
Procedia PDF Downloads 133
27499 Ambiguity Resolution for Ground-Based Pulse Doppler Radars Using Multiple Medium Pulse Repetition Frequency
Authors: Khue Nguyen Dinh, Loi Nguyen Van, Thanh Nguyen Nhu
Abstract:
In this paper, we propose an adaptive method to resolve ambiguities, together with a ghost-target removal process, to extract targets detected by a ground-based pulse-Doppler radar using medium pulse repetition frequency (PRF) waveforms. The ambiguity resolution method is an adaptive implementation of the coincidence algorithm, applied to a two-dimensional (2D) range-velocity matrix to resolve range and velocity ambiguities simultaneously, with a proposed clustering filter to enhance the anti-error ability of the system. We consider the scenario of multiple-target environments. The ghost-target removal process, which is based on the power after Doppler processing, is proposed to mitigate ghost detections and thereby enhance the performance of ground-based radars using a short PRF schedule in multiple-target environments. Simulation results for a ground-based pulsed Doppler radar model are presented to show the effectiveness of the proposed approach.
Keywords: ambiguity resolution, coincidence algorithm, medium PRF, ghosting removal
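The coincidence idea can be illustrated in one dimension (range only): each medium PRF folds the true range into an apparent range modulo its unambiguous range, and the true range is the candidate whose folded values agree with every PRF. This is a simplified sketch; the paper works on a 2D range-velocity matrix with a clustering filter, which is omitted here, and the PRF values, scan step, and tolerance are assumptions.

```python
C = 3e8  # speed of light, m/s

def unambiguous_range(prf):
    """Maximum unambiguous range for a given PRF."""
    return C / (2.0 * prf)

def resolve_range(apparent, prfs, max_range=120e3, step=50.0, tol=100.0):
    """Scan candidate true ranges and keep the one whose folded value
    best matches the apparent (ambiguous) range at every PRF."""
    best, best_err = None, None
    r = 0.0
    while r <= max_range:
        err = max(abs((r % unambiguous_range(p)) - a)
                  for p, a in zip(prfs, apparent))
        if err <= tol and (best is None or err < best_err):
            best, best_err = r, err
        r += step
    return best

prfs = [9e3, 12e3, 15e3]      # three medium PRFs (assumed values)
true_range = 47_350.0         # metres; folds differently at each PRF
apparent = [true_range % unambiguous_range(p) for p in prfs]
resolved = resolve_range(apparent, prfs)
print(resolved)
```

Note that the folding pattern of these three PRFs repeats with a period set by their greatest common divisor, so the scan window must stay below that combined unambiguous range to avoid a second exact coincidence.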
Procedia PDF Downloads 151
27498 Optimal Design of Step-Stress Partially Accelerated Life Test Using Multiply Censored Exponential Data with Random Removals
Authors: Showkat Ahmad Lone, Ahmadur Rahman, Ariful Islam
Abstract:
The major assumption in accelerated life tests (ALT) is that the mathematical model relating the lifetime of a test unit to the stress is known or can be assumed. In some cases, such life-stress relationships are not known and cannot be assumed, i.e., ALT data cannot be extrapolated to use conditions. In such cases, a partially accelerated life test (PALT) is more suitable, in which tested units are subjected to both normal and accelerated conditions. This study deals with estimating information about failure times of items under step-stress partially accelerated life tests using progressive failure-censored hybrid data with random removals. The life data of the units under test are assumed to follow an exponential life distribution, and the removals from the test are assumed to have a binomial distribution. Point and interval maximum likelihood estimates are obtained for the unknown distribution parameters and the tampering coefficient. An optimum test plan is developed using the D-optimality criterion. The performance of the resulting estimators of the developed model parameters is evaluated and investigated using a simulation algorithm.
Keywords: binomial distribution, D-optimality, multiple censoring, optimal design, partially accelerated life testing, simulation study
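As a toy sketch of one ingredient of the estimation problem above, the maximum likelihood estimate of an exponential failure rate under right censoring has the classic closed form (number of observed failures divided by total time on test). The full step-stress PALT likelihood, with the tampering coefficient and binomial removals, is considerably more involved; all parameter values here are illustrative assumptions.

```python
import numpy as np

def exp_mle_censored(times, observed):
    """Closed-form MLE for an exponential failure rate with right
    censoring: lambda_hat = (# failures) / (total time on test)."""
    times = np.asarray(times, dtype=float)
    observed = np.asarray(observed, dtype=bool)
    return observed.sum() / times.sum()

rng = np.random.default_rng(2)
true_rate = 0.05                      # assumed failure rate per unit time
n = 5000
lifetimes = rng.exponential(1.0 / true_rate, size=n)
censor_at = 30.0                      # fixed (type-I) censoring time, assumed
times = np.minimum(lifetimes, censor_at)
observed = lifetimes <= censor_at     # False = unit still alive at censoring
rate_hat = exp_mle_censored(times, observed)
print(round(rate_hat, 3))
```

The simulation-style check (estimate recovers the true rate as n grows) mirrors, in miniature, the kind of simulation study the abstract uses to evaluate its estimators.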
Procedia PDF Downloads 318
27497 Cloud-Based Multiresolution Geodata Cube for Efficient Raster Data Visualization and Analysis
Authors: Lassi Lehto, Jaakko Kahkonen, Juha Oksanen, Tapani Sarjakoski
Abstract:
The use of raster-formatted data sets in geospatial analysis is increasing rapidly. At the same time, geographic data are being introduced into disciplines outside the traditional domain of geoinformatics, such as climate change, intelligent transport, and immigration studies. These developments call for better methods of delivering raster geodata in an efficient and easy-to-use manner. Data cube technologies have traditionally been used in the geospatial domain for managing Earth Observation data sets, which have strict requirements for the effective handling of time series. The same approach and methodologies can also be applied to managing other types of geospatial data sets. A cloud service-based geodata cube, called GeoCubes Finland, has been developed to support the online delivery and analysis of the most important geospatial data sets with national coverage. The main target group of the service is the academic research institutes in the country. The most significant aspects of the GeoCubes data repository include the use of multiple resolution levels, a cloud-optimized file structure, and a customized, flexible content-access API. Input data sets are pre-processed while being ingested into the repository to bring them into a harmonized form in aspects like georeferencing, sampling resolution, spatial subdivision, and value encoding. All the resolution levels are created using an appropriate generalization method, selected depending on the nature of the source data set. Multiple pre-processed resolutions enable new kinds of online analysis approaches to be introduced. Analysis processes based on interactive visual exploration can be carried out effectively, as the resolution level closest to the visual scale can always be used. In the same way, statistical analysis can be carried out on the resolution levels that best reflect the scale of the phenomenon being studied. Access times remain close to constant, independent of the scale applied in the application.
The cloud service-based approach applied in the GeoCubes Finland repository enables analysis operations to be performed on the server platform, thus making high-performance computing facilities easily accessible. The developed GeoCubes API supports this kind of approach for online analysis. The use of cloud-optimized file structures in data storage enables the fast extraction of subareas. The access API allows vector-formatted administrative areas and user-defined polygons to be used as definitions of subareas for data retrieval. Administrative areas of the country on four levels are readily available from the GeoCubes platform. In addition to the direct delivery of raster data, the service also supports a so-called virtual file format, in which only a small text file is first downloaded. The text file contains links to the raster content on the service platform, and the actual raster data are downloaded on demand, for the spatial area and resolution level required at each stage of the application. Through the geodata cube approach, pre-harmonized geospatial data sets are made accessible to new categories of inexperienced users in an easy-to-use manner. At the same time, the multiresolution nature of the GeoCubes repository enables expert users to introduce new kinds of interactive online analysis operations.
Keywords: cloud service, geodata cube, multiresolution, raster geodata
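The "use the resolution level closest to the visual scale" idea can be sketched as follows: compute the ground size of one screen pixel at the current map scale, then pick the coarsest stored level that still resolves it. The stored cell sizes and the DPI default are assumptions for illustration, not the actual GeoCubes Finland configuration.

```python
# Assumed pre-computed resolution levels (raster cell sizes in metres)
LEVELS_M = [1, 2, 5, 10, 20, 50, 100, 200, 500, 1000]

def pick_level(map_scale_denominator, screen_dpi=96):
    """Ground size of one screen pixel at the given map scale, then the
    coarsest stored level whose cells are no larger than that pixel."""
    metres_per_pixel = map_scale_denominator * 0.0254 / screen_dpi
    candidates = [lv for lv in LEVELS_M if lv <= metres_per_pixel]
    return candidates[-1] if candidates else LEVELS_M[0]

print(pick_level(50_000))     # mid-scale map view
print(pick_level(1_000_000))  # small-scale national overview
```

Because the chosen cell size tracks the pixel size, the number of cells fetched per screen stays roughly constant across zoom levels, which is why access times remain close to constant as described above.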
Procedia PDF Downloads 135
27496 Signaling Using Phase Shifting in Wi-Fi Backscatter System
Authors: Chang-Bin Ha, Young-Min Ko, Seongjoo Lee, Hyoung-Kyu Song
Abstract:
In this paper, a signaling scheme using phase shifting is proposed to improve the performance of the Wi-Fi backscatter system. Because communication in the Wi-Fi backscatter system is based on on-off modulation and impedance modulation at the packet level, the data rate is very low compared to conventional wireless systems. Also, because the Wi-Fi backscatter system is based on an RF-powered device, achieving high reliability is difficult. In order to increase the low data rate, the proposed scheme transmits multiple bits of information during one packet period. In order to increase reliability, the proposed scheme shifts the phase of the signal according to the transmitted information. The simulation results show that the proposed scheme improves throughput performance.
Keywords: phase shifting, RF-powered device, Wi-Fi backscatter system, IoT
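The core idea of conveying multiple bits via phase shifts can be sketched as generic phase-shift keying: two bits select one of four phases, and the receiver recovers them by choosing the nearest constellation phase. This is a Gray-coded QPSK-style illustration of the principle only; it does not reproduce the paper's exact backscatter packet design or channel model.

```python
import cmath
import math

# Gray-coded mapping: adjacent phases differ in exactly one bit
PHASES = {(0, 0): 0.0, (0, 1): math.pi / 2,
          (1, 1): math.pi, (1, 0): 3 * math.pi / 2}

def modulate(bits):
    """Map each bit pair to a unit-magnitude symbol with the chosen phase."""
    return [cmath.exp(1j * PHASES[(bits[i], bits[i + 1])])
            for i in range(0, len(bits), 2)]

def demodulate(symbols):
    """Recover bits by picking the nearest reference phase per symbol."""
    inv = {v: k for k, v in PHASES.items()}
    bits = []
    for s in symbols:
        best = min(PHASES.values(),
                   key=lambda p: abs(s - cmath.exp(1j * p)))
        bits.extend(inv[best])
    return bits

tx = [0, 1, 1, 1, 1, 0, 0, 0]
symbols = modulate(tx)
rx = demodulate(symbols)
print(rx == tx)
```

Because demodulation picks the nearest phase, a moderate phase error (here, anything under a quarter of the constellation spacing) still decodes correctly, which is the reliability argument behind shifting the phase according to the transmitted information.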
Procedia PDF Downloads 442
27495 Symbolic Computation for the Multi-Soliton Solutions of a Class of Fifth-Order Evolution Equations
Authors: Rafat Alshorman, Fadi Awawdeh
Abstract:
By employing a simplified bilinear method, a class of generalized fifth-order KdV (gfKdV) equations, which arise in nonlinear lattices, plasma physics, and ocean dynamics, is investigated. With the aid of symbolic computation, both solitary wave solutions and multiple-soliton solutions are obtained. These new exact solutions extend previous results and help explain the properties of nonlinear solitary waves in many physical models of shallow water. A parametric analysis is carried out to illustrate that the soliton amplitude, width, and velocity are affected by the coefficient parameters in the equation.
Keywords: multiple soliton solutions, fifth-order evolution equations, Cole-Hopf transformation, Hirota bilinear method
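As a reference point for the bilinear construction mentioned above, the textbook KdV case $u_t + 6uu_x + u_{xxx} = 0$ illustrates the method; the expressions below are the standard KdV formulas, not the paper's gfKdV results, which generalize them to fifth order.

```latex
% Cole-Hopf-type substitution turning KdV into Hirota bilinear form:
u = 2\,\partial_x^2 \ln f .
% One-soliton: f = 1 + e^{\theta_1}, \ \theta_i = k_i x - k_i^3 t + \delta_i,
% giving the familiar sech^2 profile travelling at speed k_1^2:
u_1 = \frac{k_1^2}{2}\,\operatorname{sech}^2\!\Big(\frac{\theta_1}{2}\Big).
% Two-soliton: f = 1 + e^{\theta_1} + e^{\theta_2} + a_{12}\,e^{\theta_1+\theta_2},
% with interaction coefficient
a_{12} = \Big(\frac{k_1 - k_2}{k_1 + k_2}\Big)^{2}.
```

The multi-soliton solutions are built the same way, by extending $f$ with further exponential terms and pairwise interaction coefficients; this is where symbolic computation becomes indispensable.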
Procedia PDF Downloads 319
27494 A Large Ion Collider Experiment (ALICE) Diffractive Detector Control System for RUN-II at the Large Hadron Collider
Authors: J. C. Cabanillas-Noris, M. I. Martínez-Hernández, I. León-Monzón
Abstract:
The selection of diffractive events in the ALICE experiment during the first data-taking period (RUN-I) of the Large Hadron Collider (LHC) was limited by the range over which rapidity gaps occur. Better measurements could be achieved by expanding the range in which the production of particles can be detected. For this purpose, the ALICE Diffractive (AD0) detector has been installed and commissioned for the second phase (RUN-II). Any new detector should be able to take data synchronously with all other detectors and be operated through the ALICE central systems. One of the key elements that must be developed for the AD0 detector is the Detector Control System (DCS). The DCS must be designed to operate this detector safely and correctly. Furthermore, the DCS must provide optimum operating conditions for the acquisition and storage of physics data and ensure these are of the highest quality. The operation of AD0 implies the configuration of about 200 parameters, from electronics settings and power supply levels to the archiving of operating-conditions data and the generation of safety alerts. It also includes the automation of procedures to get the AD0 detector ready for taking data under the appropriate conditions for the different run types in ALICE. The performance of the AD0 detector depends on a number of parameters, such as the nominal voltages for each photomultiplier tube (PMT), the threshold levels for accepting or rejecting incoming pulses, the definition of triggers, etc. All these parameters define the efficiency of AD0, and they have to be monitored and controlled through the AD0 DCS. Finally, the AD0 DCS provides the operator with multiple interfaces to execute these tasks, realized as operating panels and scripts running in the background. These features are implemented on a SCADA software platform as a distributed control system integrated into the global control system of the ALICE experiment.
Keywords: AD0, ALICE, DCS, LHC
Procedia PDF Downloads 305
27493 Virtual Reality and Avatars in Education
Authors: Michael Brazley
Abstract:
Virtual Reality (VR) and 3D videos represent the most recent generation of learning technology. They are now being used in professional offices and schools for marketing and education. Technology in the field of design has progressed from two-dimensional drawings to 3D models, using computers and sophisticated software. Virtual Reality is being used as a collaborative medium to allow designers and others to meet and communicate inside models or VR platforms using avatars. This research proposes to teach students from different backgrounds how to take a digital model into a 3D video, then into VR, and finally into VR with multiple avatars communicating with each other in real time. The next step is to develop the model so that people from three or more different locations can meet as avatars in real time, in the same model, and talk to each other. This research is longitudinal, studying the use of 3D videos in graduate design and Virtual Reality in XR (Extended Reality) courses. The research methodology is a combination of quantitative and qualitative methods. The qualitative methods begin with the literature review and case studies; the quantitative methods comprise students' 3D videos, a survey, and Extended Reality (XR) course work. The end product is a VR platform with multiple avatars able to communicate in real time. This research is important because it will allow multiple users to remotely enter a model or VR platform from any location in the world and communicate effectively in real time. It will lead to improved learning and training using Virtual Reality and avatars, and it is generalizable because most colleges, universities, and many citizens own VR equipment and computer labs. This research did produce a VR platform with multiple avatars having the ability to move and speak to each other in real time.
Major implications of the research include, but are not limited to, improved learning, teaching, communication, marketing, designing, and planning. Both hardware and software played a major role in the project's success.
Keywords: virtual reality, avatars, education, XR
Procedia PDF Downloads 98
27492 Resistance Spot Welding of Boron Steel 22MnB5 with Complex Welding Programs
Authors: Szymon Kowieski, Zygmunt Mikno
Abstract:
The study involved the optimization of process parameters during the resistance spot welding of Al-coated martensitic boron steel 22MnB5, used in hot stamping, performed using a programme with a multiple current impulse mode and a programme with variable electrode force. The aim of this research work was to determine the potential for growth in welded joint strength and to identify the expansion of the welding lobe. The process parameters were adjusted on the basis of welding process simulation and confronted with experimental data. 22MnB5 steel is known for its tendency to reach high hardness values in the weld nugget, often leading to interfacial failures (observed in the study-related tests). In addition, during resistance spot welding, many production-related factors can affect process stability, e.g., welding lobe narrowing, and lead to deterioration of quality. Resistance spot welding performed using the above-named welding programme, featuring three levels of force, made it possible to achieve an 82% extension of the welding lobe. Joints made using the multiple current impulse programme, where the total welding time was below 1.4 s, revealed a change in the peeling failure mode (to full plug) and a 10% increase in weld tensile shear strength.
Keywords: 22MnB5, hot stamping, interfacial fracture, resistance spot welding, simulation, single lap joint, welding lobe
Procedia PDF Downloads 386
27491 The Use of Medical Biotechnology to Treat Genetic Disease
Authors: Rachel Matar, Maxime Merheb
Abstract:
Chemical drugs were used for many centuries as the only way to treat diseases, until gene therapy emerged in the 1960s. Gene therapy is based on the insertion, correction, or inactivation of genes to treat people with genetic illness (1). Gene therapy has worked wonders in Parkinson's disease, Alzheimer's disease, and multiple sclerosis, in addition to holding great promise for the treatment of deadly diseases such as many types of cancer and autoimmune diseases (2). This method implies the use of recombinant DNA technology with the help of different viral and non-viral vectors (3). It is nowadays used in somatic cells as well as embryos and gametes. Besides all the benefits of gene therapy, this technique is deemed by some opponents to be an ethically unacceptable treatment, as it implies manipulating the genes of living organisms.
Keywords: gene therapy, genetic disease, cancer, multiple sclerosis
Procedia PDF Downloads 541
27490 Demographic Variations of Multiple Sclerosis Patients between Britain and Kuwait
Authors: Ali Fuad Ashour
Abstract:
Introduction: Multiple sclerosis (MS) is a chronic, progressive, degenerative disease that affects the central nervous system (CNS). Its symptoms have been described as debilitating: MS is reported to have a negative impact on patients' mental activities, bring a lower quality of life, lead to unemployment, cause distress and psychological disorders, generate low levels of motivation and self-esteem, and result in disability and neurological impairment. The aim of this study was to compare the effects of MS on patients from Britain and Kuwait. Methodology: A questionnaire was distributed to 200 individuals with MS (100 Kuwaiti and 100 British). The questionnaire consists of three parts: 1. general demographics; 2. disease-specific data (symptoms, severity levels, relapse frequency, and support system); and 3. attitudes towards physical exercise. Results: A response rate of 62% from the British sample and 50% from the Kuwaiti sample was achieved. 84% of the sample (n=52) were 41 years old or over. The duration of the disease was less than 10 years in 43.4% of British and 68% of Kuwaiti respondents. The majority of British respondents (56.5%) reported the disease severity to be moderate, while the majority of Kuwaiti respondents reported it to be mild (72%). The annual relapse rates in Kuwait were relatively low, with 82% of the Kuwaiti sample having one relapse per year, compared to 64.5% of the British. The most common symptoms reported by British respondents were balance (75.8%), fatigue (74.2%), and weakness (71%), and by Kuwaiti respondents were fatigue (86%), balance (76%), and weakness (66%). The help and support for MS were by far more diverse for the British than the Kuwaiti respondents. Discussion: The results unveiled marked differences between the two groups of British and Kuwaiti MS patients in terms of patients' age, disease duration, and severity.
The overwhelming majority of Kuwaiti patients were young individuals who had been with the disease for a relatively short period of time, and their MS in most cases was mild. On the other hand, British patients were relatively older, many had been with the disease for a long period of time, and their average MS condition was more serious than that of their Kuwaiti counterparts. The main support in Kuwait comes from the neurologist, who primarily prescribes medications and advises patients to try to be active. The Kuwaiti respondents thought that a lack of encouragement was the main reason for them not engaging in social activities.Keywords: multiple sclerosis, Kuwait, exercise, demographic
Procedia PDF Downloads 118
27489 Non-Linear Regression Modeling for Composite Distributions
Authors: Mostafa Aminzadeh, Min Deng
Abstract:
Modeling loss data is an important part of actuarial science. Actuaries use models to predict future losses and manage financial risk, which can also be beneficial for marketing purposes. In the insurance industry, small claims happen frequently while large claims are rare. Traditional distributions such as the Normal, Exponential, and Inverse Gaussian are not suitable for describing insurance data, which often show skewness and fat tails. Several authors have studied classical and Bayesian inference for parameters of composite distributions, such as Exponential-Pareto, Weibull-Pareto, and Inverse Gamma-Pareto. These models separate small to moderate losses from large losses using a threshold parameter. This research introduces a computational approach using a nonlinear regression model for loss data that relies on multiple predictors. Simulation studies were conducted to assess the accuracy of the proposed estimation method and confirmed that it provides precise estimates for the regression parameters. It is important to note that this approach can be applied to a dataset only if goodness-of-fit tests confirm that the composite distribution under study fits the data well. To demonstrate the computations, a real data set from the insurance industry is analyzed. A Mathematica program uses Fisher scoring as the iteration method to obtain the maximum likelihood estimates (MLE) of the regression parameters.Keywords: maximum likelihood estimation, Fisher scoring method, non-linear regression models, composite distributions
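The threshold idea behind these composite models can be sketched as an exponential body spliced to a Pareto tail, with the mixing weight fixed by continuity at the threshold. This is a minimal illustration only: the parameter values are invented, and the paper's actual models and regression structure are richer.

```python
import numpy as np
from scipy import stats, integrate

def composite_exp_pareto_pdf(x, lam=1.0, alpha=2.5, theta=2.0):
    """Spliced loss density: an Exponential(lam) body (renormalized below
    the threshold theta) and a Pareto(alpha) tail on [theta, inf).
    The mixing weight r is fixed by requiring continuity at theta."""
    body = stats.expon(scale=1.0 / lam)
    tail = stats.pareto(b=alpha, scale=theta)   # support starts at theta
    a = body.pdf(theta) / body.cdf(theta)       # body density at the join
    b = tail.pdf(theta)                         # tail density at the join
    r = b / (a + b)                             # continuity: r*a = (1-r)*b
    x = np.asarray(x, dtype=float)
    below = r * body.pdf(x) / body.cdf(theta)
    above = (1.0 - r) * tail.pdf(x)
    return np.where(x <= theta, below, above)

# Sanity check: the two pieces integrate to r and 1 - r, so the total is 1.
left, _ = integrate.quad(lambda t: float(composite_exp_pareto_pdf(t)), 0.0, 2.0)
right, _ = integrate.quad(lambda t: float(composite_exp_pareto_pdf(t)), 2.0, np.inf)
```

By construction the spliced density is proper and continuous; a threshold like this is what separates the frequent small claims from the rare large ones in such models.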
Procedia PDF Downloads 32
27488 Frequency Analysis Using Multiple Parameter Probability Distributions for Rainfall to Determine Suitable Probability Distribution in Pakistan
Authors: Tasir Khan, Yejuan Wang
Abstract:
The study of extreme rainfall events is very important for flood management in river basins and the design of water conservancy infrastructure. Evaluation of quantiles of annual maximum rainfall (AMRF) is required in different environmental fields, agricultural operations, renewable energy sources, climatology, and the design of different structures. Therefore, annual maximum rainfall (AMRF) analysis was performed at different stations in Pakistan. Multiple probability distributions, namely log-normal (LN), generalized extreme value (GEV), Gumbel (max), and Pearson type III (P3), were used to find the most appropriate distributions at the different stations. The L-moments method was used to evaluate the distribution parameters. The Anderson-Darling test, Kolmogorov-Smirnov test, and chi-square test showed that two distributions, namely Gumbel (max) and LN, were the most appropriate. The quantile estimate of a multi-parameter probability distribution characterizes extreme rainfall at a specific location and is therefore important for decision-makers and planners who design and construct different structures. This result provides an indication of the consequences of these multi-parameter distributions for the study of sites, peak flow prediction, and the design of hydrological maps, and can therefore support hydraulic structure design and flood management.Keywords: RAMSE, multiple frequency analysis, annual maximum rainfall, L-moments
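A minimal sketch of the candidate-ranking step described above. This is illustrative only: scipy fits by maximum likelihood rather than the L-moments method used in the study, and the station series here is simulated, not the Pakistani data.

```python
import numpy as np
from scipy import stats

def rank_candidate_distributions(sample, candidates=("lognorm", "genextreme", "gumbel_r")):
    """Fit each candidate distribution to the annual-maximum sample and
    rank by the Kolmogorov-Smirnov statistic (smaller = closer fit)."""
    results = []
    for name in candidates:
        dist = getattr(stats, name)
        params = dist.fit(sample)                 # MLE fit (the paper uses L-moments)
        ks = stats.kstest(sample, name, args=params)
        results.append((name, ks.statistic, ks.pvalue))
    return sorted(results, key=lambda r: r[1])

# Simulated stand-in for one station's annual maximum rainfall series (mm).
rng = np.random.default_rng(0)
amrf = stats.gumbel_r.rvs(loc=50.0, scale=12.0, size=80, random_state=rng)
ranking = rank_candidate_distributions(amrf)
```

The best-ranked candidate is `ranking[0]`; in practice one would confirm the choice with the Anderson-Darling and chi-square tests as the paper does.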
Procedia PDF Downloads 81
27487 A Clustering-Based Approach for Weblog Data Cleaning
Authors: Amine Ganibardi, Cherif Arab Ali
Abstract:
This paper addresses the data cleaning issue as a part of web usage data preprocessing within the scope of Web Usage Mining. Weblog data recorded by web servers within log files reflect usage activity, i.e., end-users' clicks and underlying user-agents' hits. As Web Usage Mining is interested in end-users' behavior, user-agents' hits are regarded as noise to be cleaned off before mining. Filtering hits from clicks is not trivial for two reasons: (i) a server records requests interlaced in sequential order regardless of their source or type, and (ii) website resources may be set up as requestable interchangeably by end-users and user-agents. Current methods are content-centric, based on filtering heuristics of relevant/irrelevant items in terms of some cleaning attributes, i.e., the filetype extensions of the website's resources, the resources pointed to by hyperlinks/URIs, HTTP methods, user-agents, etc. These methods need exhaustive extra-weblog data and prior knowledge of the relevant and/or irrelevant items to be assumed as clicks or hits within the filtering heuristics. Such methods are not appropriate for the dynamic/responsive Web for three reasons: (i) resources may be set up as clickable by end-users regardless of their type, (ii) a website's resources may be indexed by frame names without filetype extensions, and (iii) web contents are generated and cancelled differently from one end-user to another. In order to overcome these constraints, a clustering-based cleaning method centered on the logging structure is proposed. This method focuses on the statistical properties of the logging structure at the level of the requested and referring resources attributes. It is insensitive to logging content and does not need extra-weblog data. The statistical property used captures the structure of the logging generated by webpage requests in terms of clicks and hits.
Since a webpage consists of a single URI and several components, this structure results in a single-click-to-multiple-hits ratio in terms of the requested and referring resources. Thus, the clustering-based method is meant to identify two clusters by applying an appropriate distance to the frequency matrix at the requested and referring resources levels. As the ratio of clicks to hits is single to multiple, the clicks cluster is the smaller one in number of requests. Hierarchical agglomerative clustering based on a pairwise distance (Gower) and average linkage has been applied to four logfiles of dynamic/responsive websites whose click-to-hits ratios range from 1/2 to 1/15. The optimal clustering, selected on the basis of average linkage and maximum inter-cluster inertia, always results in two clusters. The evaluation of the smaller cluster, referred to as the clicks cluster, in terms of confusion matrix indicators yields a true positive rate of 97%. The content-centric cleaning methods, i.e., conventional and advanced cleaning, resulted in a lower rate of 91%. Thus, the proposed clustering-based cleaning outperforms the content-centric methods for dynamic and responsive web designs without the need for any extra-weblog data. Such an improvement in cleaning quality is likely to refine dependent analyses.Keywords: clustering approach, data cleaning, data preprocessing, weblog data, web usage data
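The click/hit separation can be sketched on synthetic per-request frequencies. The numbers below are hypothetical, and Euclidean distance stands in for the Gower distance used in the paper:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical per-request features (requested-resource frequency,
# referring-resource frequency): hit rows repeat popular assets and a
# popular referring page, so both frequencies are high; click rows are rare.
rng = np.random.default_rng(42)
clicks = rng.normal(loc=[1.0, 0.5], scale=0.2, size=(6, 2))
hits = rng.normal(loc=[9.0, 7.0], scale=0.5, size=(54, 2))
X = np.vstack([clicks, hits])

# Average-linkage agglomerative clustering, cut into two clusters.
Z = linkage(X, method="average")
labels = fcluster(Z, t=2, criterion="maxclust")

# The paper's heuristic: the smaller of the two clusters is the clicks cluster.
sizes = {int(lab): int(np.sum(labels == lab)) for lab in np.unique(labels)}
click_label = min(sizes, key=sizes.get)
click_rows = np.flatnonzero(labels == click_label)
```

Here `click_rows` recovers the six click requests, mirroring the single-to-multiple click/hit ratio the method exploits.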
Procedia PDF Downloads 170
27486 Implementation of an IoT Sensor Data Collection and Analysis Library
Authors: Jihyun Song, Kyeongjoo Kim, Minsoo Lee
Abstract:
Due to the development of information technology and wireless Internet technology, various data are being generated in various fields. These data are advantageous in that they provide real-time information to the users themselves. However, when the data are accumulated and analyzed, more varied information can be extracted. In addition, the development and dissemination of boards such as the Arduino and Raspberry Pi have made it possible to easily test various sensors, and it is possible to collect sensor data directly by using database application tools such as MySQL. These directly collected data can be used for various research and can be useful as data for data mining. However, there are many difficulties in using such a board to collect data, especially when the user is not a computer programmer or is using it for the first time. Even if data are collected, a lack of expert knowledge or experience may cause difficulties in data analysis and visualization. In this paper, we aim to construct a library for sensor data collection and analysis to overcome these problems.Keywords: clustering, data mining, DBSCAN, k-means, k-medoids, sensor data
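As an illustration of the analysis side, here is a self-contained k-means pass over simulated sensor readings. The data and the two-regime setup are invented for the sketch; the paper's library also covers DBSCAN and k-medoids:

```python
import numpy as np

def kmeans_two(X, n_iter=50):
    """Minimal Lloyd-style k-means with k=2, deterministically seeded with
    the first reading and the reading farthest from it."""
    centers = X[[0, int(np.argmax(np.linalg.norm(X - X[0], axis=1)))]].copy()
    for _ in range(n_iter):
        # assign each reading to its nearest center
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each center to the mean of its assigned readings
        new_centers = np.array([X[labels == j].mean(axis=0) for j in range(2)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

# Hypothetical sensor readings: (temperature in C, relative humidity in %)
# drawn from two operating regimes.
rng = np.random.default_rng(1)
cool = rng.normal(loc=[18.0, 40.0], scale=1.0, size=(40, 2))
warm = rng.normal(loc=[30.0, 70.0], scale=1.0, size=(40, 2))
readings = np.vstack([cool, warm])
labels, centers = kmeans_two(readings)
```

With well-separated regimes the two clusters recover the two operating conditions, which is the kind of pattern such a library would surface from raw sensor logs.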
Procedia PDF Downloads 378
27485 Angle of Arrival Estimation Using Maximum Likelihood Method
Authors: Olomon Wu, Hung Lu, Nick Wilkins, Daniel Kerr, Zekeriya Aliyazicioglu, H. K. Hwang
Abstract:
Multiple Input Multiple Output (MIMO) radar has received increasing attention in recent years. MIMO radar has many advantages over conventional phased array radar, such as improved target detection, resolution enhancement, and interference suppression. In this paper, results are presented from a simulation study of MIMO Uniformly-Spaced Linear Array (ULA) antennas. The performance is investigated under varied parameters, including array size, Pseudo Random (PN) sequence length, number of snapshots, and Signal to Noise Ratio (SNR). The results for the MIMO system are compared to those of a traditional array antenna.Keywords: MIMO radar, phased array antenna, target detection, radar signal processing
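For orientation, a minimal single-source sketch of maximum-likelihood angle-of-arrival estimation on a ULA. The snapshots are simulated, and the paper's MIMO waveforms, PN sequences, and multi-target scenarios are not modeled here:

```python
import numpy as np

def steering(theta_deg, n_elems):
    """ULA steering vector, half-wavelength element spacing."""
    n = np.arange(n_elems)
    return np.exp(1j * np.pi * n * np.sin(np.radians(theta_deg)))

def ml_aoa(snapshots, n_elems):
    """Single-source ML angle estimate via a grid search: for one source
    this reduces to maximizing beamformed power over the angle grid."""
    grid = np.arange(-90.0, 90.5, 0.5)
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]  # sample covariance
    power = [np.real(steering(t, n_elems).conj() @ R @ steering(t, n_elems))
             for t in grid]
    return float(grid[int(np.argmax(power))])

# Simulated data: one source at 20 degrees, 8 elements, 200 snapshots.
rng = np.random.default_rng(3)
n_elems, n_snap, true_theta = 8, 200, 20.0
a = steering(true_theta, n_elems)[:, None]
s = rng.normal(size=(1, n_snap)) + 1j * rng.normal(size=(1, n_snap))
noise = 0.1 * (rng.normal(size=(n_elems, n_snap)) + 1j * rng.normal(size=(n_elems, n_snap)))
X = a @ s + noise
est = ml_aoa(X, n_elems)
```

At this SNR the grid search lands on the true angle to within the 0.5-degree grid spacing; array size and snapshot count are the knobs the paper's simulation study varies.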
Procedia PDF Downloads 541
27484 Government (Big) Data Ecosystem: Definition, Classification of Actors, and Their Roles
Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis
Abstract:
Organizations, including governments, generate (big) data that are high in volume, velocity, and veracity, and come from a variety of sources. Public administrations are using (big) data, implementing base registries, and enforcing data sharing across the entire government to deliver (big) data related integrated services, provide insights to users, and support good governance. Government (big) data ecosystem actors represent distinct entities that provide data, consume data, manipulate data to offer paid services, and extend data services, such as data storage and hosting services, to other actors. In this research work, we perform a systematic literature review. The key objectives of this paper are to propose a robust definition of the government (big) data ecosystem and a classification of government (big) data ecosystem actors and their roles. We showcase a graphical view of actors, roles, and their relationships in the government (big) data ecosystem, and we discuss our research findings. We did not find many published research articles about the government (big) data ecosystem, including its definition and the classification of actors and their roles. Therefore, we drew ideas for the government (big) data ecosystem from numerous areas in the literature, including scientific research data, humanitarian data, open government data, and industry data.Keywords: big data, big data ecosystem, classification of big data actors, big data actors roles, definition of government (big) data ecosystem, data-driven government, eGovernment, gaps in data ecosystems, government (big) data, public administration, systematic literature review
Procedia PDF Downloads 162
27483 Kissing Cervical Spine Schwannomas in a Young Female from a Low Resource Setting: A Case Report
Authors: Joseph Mary Ssembatya, Blessing Michael Taremwa
Abstract:
Background: Multiple schwannomas are typically associated with neurofibromatosis type 1 (NF1), but rare cases occur independently of neurofibromatosis. Schwannomas are benign, slow-growing tumors, primarily affecting the cervical and lumbar spine. When large, they may extend over multiple vertebral levels, posing surgical challenges. Case Presentation: A 13-year-old Ugandan Munyankore female patient presented with a 6-year history of progressive quadriparesis, particularly in the lower limbs. Clinical examination showed hypertonia and hyperreflexia, with no indicators of neurofibromatosis or prior trauma. MRI revealed two “kissing” schwannomas extending from C2 to T2 in the cervical spine. Decompressive surgery was performed through laminoplasty and partial lesion resection, and histology confirmed schwannoma. Two weeks postoperatively, the patient experienced cerebrospinal fluid (CSF) leakage, neck pain, and headache, which required re-operation and duraplasty. Following these interventions, the patient’s neurological status stabilized, with noted improvement in lower limb strength. Discussion: “Kissing” schwannomas are most frequently documented in the cerebellopontine angle, rarely in the spine, and even more rarely in children. While multiple schwannomas are often associated with NF2, this case had no family history or clinical signs of the disorder. Giant invasive spinal schwannomas (GISS) that span multiple vertebrae demand intricate surgical approaches due to their proximity to neurovascular structures. Conclusion: This is the first reported case of kissing cervical schwannomas in a young patient from a low- to middle-income country. Surgical decompression, though challenging, is critical for neurological recovery in such advanced cases.Keywords: kissing schwannoma, cervical spine, low resource, young, uganda
Procedia PDF Downloads 13
27482 Accounting Knowledge Management and Value Creation of SME in Chatuchak Market: Case Study Ceramics Product
Authors: Runglaksamee Rodkam
Abstract:
The purpose of this research was to study the influence of accountants’ potential performance on their working process, in a case study of Government Savings Banks in the northeast of Thailand. The independent variables included accounting knowledge, accounting skill, accounting value, accounting ethics, and accounting attitude, while the dependent variable was the success of the working process. A total of 155 accountants working for Government Savings Banks were selected by random sampling, and a questionnaire was used as the tool for collecting data. The statistical analyses in this research included percentage, mean, and multiple regression analyses. The findings revealed that the majority of accountants were female, aged between 35 and 40 years old. Most of the respondents had an undergraduate degree with ten years of experience. Moreover, the factors of accounting knowledge, accounting skill, accounting value, accounting ethics, and accounting attitude were all rated at a high level. The findings from the regression analysis revealed a causal relationship in which the observed variables could explain at least 51 percent of the success in the accountants’ working process.Keywords: influence, potential performance, success, working process
Procedia PDF Downloads 256
27481 Factors Predicting Preventive Behavior for Osteoporosis in University Students
Authors: Thachamon Sinsoongsud, Noppawan Piaseu
Abstract:
This predictive study aimed to 1) describe self-efficacy for risk reduction and preventive behavior for osteoporosis, and 2) examine factors predicting preventive behavior for osteoporosis in nursing students. Through purposive sampling, the sample included 746 nursing students in a public university in Bangkok, Thailand. Data were collected by a self-reported questionnaire on self-efficacy and preventive behavior for osteoporosis, and analyzed using descriptive statistics and stepwise multiple regression analysis. Results revealed that the majority of the students were female (98.3%) with a mean age of 19.86 ± 1.26 years. The students had self-efficacy and preventive behavior for osteoporosis at a moderate level. Self-efficacy and level of education together predicted 35.2% of the variance in preventive behavior for osteoporosis (p < .001). The results suggest approaches for promoting preventive behavior for osteoporosis through enhancing self-efficacy among nursing students in a public university in Bangkok, Thailand.Keywords: osteoporosis, self-efficacy, preventive behavior, nursing students
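The variance-explained figure comes from a multiple regression of behavior scores on the predictors. A hedged sketch with simulated scores follows; the variable names, effect sizes, and sample are invented, not the study's data:

```python
import numpy as np

# Simulated stand-ins for the two retained predictors and the outcome.
rng = np.random.default_rng(7)
n = 200
self_efficacy = rng.normal(3.0, 0.5, n)
education = rng.integers(1, 5, n).astype(float)   # hypothetical year of study
behavior = 0.8 * self_efficacy + 0.3 * education + rng.normal(0.0, 0.4, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), self_efficacy, education])
beta, *_ = np.linalg.lstsq(X, behavior, rcond=None)

# R^2: share of outcome variance explained by the predictors,
# the analogue of the 35.2% reported in the abstract.
pred = X @ beta
r2 = 1.0 - np.sum((behavior - pred) ** 2) / np.sum((behavior - behavior.mean()) ** 2)
```

A stepwise procedure, as used in the study, would add or drop predictors one at a time based on how much each changes this R^2.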
Procedia PDF Downloads 378
27480 Threat Analysis: A Technical Review on Risk Assessment and Management of National Testing Service (NTS)
Authors: Beenish Urooj, Ubaid Ullah, Sidra Riasat
Abstract:
National Testing Service-Pakistan (NTS) is an agency in Pakistan that conducts student success appraisal examinations. In this research paper, we present a security model for the NTS organization. The security model depicts certain security countermeasures for a better defense against certain types of breaches and system malware. We provide a security roadmap, which will help the organization execute its further goals and maintain security standards and policies. We also cover multiple aspects of securing the organization's environment, introducing the processes, architecture, data classification, auditing approaches, survey responses, data handling, and training and risk awareness for the company. The primary contribution is the Risk Survey, based on a maturity model, meant to assess and examine employee training and knowledge of risks in the company's activities.Keywords: NTS, risk assessment, threat factors, security, services
Procedia PDF Downloads 70
27479 Lennox-Gastaut Syndrome Associated with Dysgenesis of Corpus Callosum
Authors: A. Bruce Janati, Muhammad Umair Khan, Naif Alghassab, Ibrahim Alzeir, Assem Mahmoud, M. Sammour
Abstract:
Rationale: Lennox-Gastaut syndrome (LGS) is an electro-clinical syndrome composed of the triad of mental retardation, multiple seizure types, and characteristic generalized slow spike-wave complexes in the EEG. In this article, we report on two patients with LGS whose brain MRI showed dysgenesis of the corpus callosum (CC). We review the literature and stress the role of the CC in the genesis of secondary bilateral synchrony (SBS). Method: This was a clinical study conducted at King Khalid Hospital. Results: The EEG was consistent with LGS in patient 1 and showed unilateral slow spike-wave complexes in patient 2. The MRI showed hypoplasia of the splenium of the CC in patient 1, and global hypoplasia of the CC combined with Joubert syndrome in patient 2. Conclusion: Based on the data, we proffer the following hypotheses: 1. Hypoplasia of the CC interferes with the functional integrity of this structure. 2. The genu of the CC plays a pivotal role in the genesis of secondary bilateral synchrony. 3. Electrodecremental seizures in LGS emanate from pacemakers generated in the brain stem, in particular the mesencephalon, projecting abnormal signals to the cortex via thalamic nuclei. 4. Unilateral slow spike-wave complexes in the context of mental retardation and multiple seizure types may represent a variant of LGS, justifying neuroimaging studies.Keywords: EEG, Lennox-Gastaut syndrome, corpus callosum, MRI
Procedia PDF Downloads 446
27478 The Effect of Peer Pressure and Leisure Boredom on Substance Use Among Adolescents in Low-Income Communities in Cape Town
Authors: Gaironeesa Hendricks, Shazly Savahl, Maria Florence
Abstract:
The aim of the study was to determine whether peer pressure and leisure boredom influence substance use among adolescents in low-income communities in Cape Town. Non-probability sampling was used to select 296 adolescents between the ages of 16 and 18 from schools located in two low-income communities. The measurement tools included the Drug Use Disorders Identification Test and the Resistance to Peer Influence and Leisure Boredom Scales. Multiple regression revealed that the combined influence of peer pressure and leisure boredom predicted substance use, with peer pressure emerging as a stronger predictor of substance use than leisure boredom.Keywords: substance use, peer pressure, leisure boredom, adolescents, multiple regression
Procedia PDF Downloads 598