Search results for: statistical arbitrage
3652 Estimating Knowledge Flow Patterns of Business Method Patents with a Hidden Markov Model
Authors: Yoonjung An, Yongtae Park
Abstract:
Knowledge flows are a critical source of faster technological progress and stronger economic growth. Knowledge flows have accelerated dramatically with the establishment of the patent system, in which each patent is required by law to disclose sufficient technical information for the invention to be recreated. Patent analysis has therefore been widely used to help investigate technological knowledge flows. However, the existing research is limited in terms of both subject and approach. In particular, most previous studies did not cover business method (BM) patents, although they are important drivers of knowledge flows, as are other patents. In addition, these studies usually focus on static analysis of knowledge flows. Some use approaches that incorporate the time dimension, yet they still fail to trace a truly dynamic process of knowledge flows. Therefore, we investigate dynamic patterns of knowledge flows driven by BM patents using a Hidden Markov Model (HMM). An HMM is a popular statistical tool for modeling a wide range of time series data, with no general theoretical limit in regard to statistical pattern classification. Accordingly, it enables characterizing knowledge patterns that may differ by patent, sector, country and so on. We run the model on sets of backward citations and forward citations to compare the patterns of knowledge utilization and knowledge dissemination.
Keywords: business method patents, dynamic pattern, Hidden-Markov Model, knowledge flow
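To illustrate the kind of citation-sequence modelling described above, the sketch below fits a Gaussian Hidden Markov Model to hypothetical yearly citation counts with the hmmlearn library; the data, the three-state assumption, and all variable names are illustrative assumptions rather than details taken from the study.

```python
# A minimal sketch, assuming hypothetical citation-count series and a 3-state HMM.
import numpy as np
from hmmlearn.hmm import GaussianHMM

# Hypothetical yearly forward-citation counts for two BM patents (illustrative only).
patent_a = np.array([0, 1, 3, 7, 9, 6, 4, 2, 1, 0], dtype=float)
patent_b = np.array([1, 2, 2, 5, 8, 8, 5, 3, 2, 1], dtype=float)

# hmmlearn expects one stacked observation matrix plus the length of each sequence.
X = np.concatenate([patent_a, patent_b]).reshape(-1, 1)
lengths = [len(patent_a), len(patent_b)]

# Three latent states could stand for, e.g., latent, diffusion, and decay phases.
model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=200, random_state=0)
model.fit(X, lengths)

states = model.predict(X, lengths)   # most likely hidden-state sequence
print("Transition matrix:\n", model.transmat_)
print("Decoded states:", states)
```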
Procedia PDF Downloads 328
3651 Effects of Process Parameter Variation on the Surface Roughness of Rapid Prototyped Samples Using Design of Experiments
Authors: R. Noorani, K. Peerless, J. Mandrell, A. Lopez, R. Dalberto, M. Alzebaq
Abstract:
Rapid prototyping (RP) is an additive manufacturing technology used in industry that works by systematically depositing layers of working material to construct larger, computer-modeled parts. A key challenge associated with this technology is that RP parts often feature undesirable levels of surface roughness for certain applications. To combat this phenomenon, an experimental technique called Design of Experiments (DOE) can be employed during the growth procedure to statistically analyze which RP growth parameters are most influential on part surface roughness. Utilizing DOE to identify such factors is important because it is a technique that can be used to optimize a manufacturing process, which saves time and money and increases product quality. In this study, a four-factor/two-level DOE experiment was performed to investigate the effect of temperature, layer thickness, infill percentage, and infill speed on the surface roughness of RP prototypes. Samples were grown using the sixteen different possible growth combinations associated with a four-factor/two-level study, and the surface roughness data were then gathered for each set of factors. After applying DOE statistical analysis to these data, it was determined that layer thickness played the most significant role in the prototype surface roughness.
Keywords: rapid prototyping, surface roughness, design of experiments, statistical analysis, factors and levels
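As a sketch of how main effects can be extracted from a two-level full factorial design like the one above, the following code computes the effect of each factor as the difference between the mean response at its high and low levels; the sixteen roughness values and the factor coding are made-up illustrative numbers, not the study's data.

```python
# A minimal sketch of main-effect estimation for a 2^4 full factorial design (assumed data).
import itertools
import numpy as np
import pandas as pd

factors = ["temperature", "layer_thickness", "infill_percentage", "infill_speed"]

# All 16 runs of a two-level design, coded -1 (low) / +1 (high).
design = pd.DataFrame(list(itertools.product([-1, 1], repeat=4)), columns=factors)

# Hypothetical surface-roughness responses (micrometres), one per run.
rng = np.random.default_rng(0)
design["roughness"] = (10 + 4 * design["layer_thickness"] + 1 * design["temperature"]
                       + rng.normal(0, 0.5, size=len(design)))

# Main effect = mean(response at +1) - mean(response at -1) for each factor.
for f in factors:
    high = design.loc[design[f] == 1, "roughness"].mean()
    low = design.loc[design[f] == -1, "roughness"].mean()
    print(f"{f:>18}: effect = {high - low:+.2f}")
```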
Procedia PDF Downloads 262
3650 Premalignant and Malignant Lesions of Uterine Polyps: Analysis at a University Hospital
Authors: Manjunath A. P., Al-Ajmi G. M., Al Shukri M., Girija S
Abstract:
Introduction: This study aimed to compare the ability of hysteroscopy and ultrasonography to diagnose uterine polyps, and to correlate the ultrasonographic and hysteroscopic findings with various clinical factors and the histopathology of uterine polyps. Methods: This is a retrospective study conducted at the Department of Obstetrics and Gynaecology at Sultan Qaboos University Hospital from 2014 to 2019. All women undergoing hysteroscopy for suspected uterine polyps were included. All relevant data were obtained from the electronic patient record and analysed using SPSS. Results: A total of 77 eligible women were analysed. The mean age of the patients was 40 years. The clinical risk factors obesity, hypertension, and diabetes mellitus showed no statistically significant association with the presence of uterine polyps (p-value > 0.005). Although 20 women (52.6%) with uterine polyps had a thickened endometrium (>11 mm), there was no statistically significant association (p-value > 0.005). The sensitivity and specificity of ultrasonography in the detection of uterine polyps were 39% and 65%, respectively, whereas for hysteroscopy they were 89% and 20%, respectively. The prevalence of malignant and premalignant lesions was 1.85% and 7.4%, respectively. Conclusion: This study found that obesity, hypertension, and diabetes mellitus were not associated with the presence of uterine polyps. There was no association between a thick endometrium and uterine polyps. The sensitivity is higher for hysteroscopy, whereas the specificity is higher for sonography in detecting uterine polyps. The prevalence of malignancy was very low in uterine polyps.
Keywords: endometrial polyps, hysteroscopy, ultrasonography, premalignant, malignant
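The sensitivity and specificity figures quoted above come from standard 2x2 confusion-matrix arithmetic; the short sketch below shows that calculation on made-up counts (the cell values are illustrative assumptions, not the study's data).

```python
# A minimal sketch of sensitivity/specificity from a 2x2 table (hypothetical counts).
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); Specificity = TN / (TN + FP)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Illustrative counts for an imaging test against the histopathology reference standard.
sens, spec = sensitivity_specificity(tp=34, fn=4, tn=8, fp=31)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```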
Procedia PDF Downloads 129
3649 On Pooling Different Levels of Data in Estimating Parameters of Continuous Meta-Analysis
Authors: N. R. N. Idris, S. Baharom
Abstract:
A meta-analysis may be performed using aggregate data (AD) or individual patient data (IPD). In practice, studies may be available at both the IPD and AD levels. In this situation, both the IPD and AD should be utilised in order to maximize the available information. The statistical advantages of combining studies from different levels have not been fully explored. This study aims to quantify the statistical benefits of including available IPD when conducting a conventional summary-level meta-analysis. Simulated meta-analyses were used to assess the influence of the levels of data on the overall meta-analysis estimates based on IPD-only, AD-only and the combination of IPD and AD (mixed data, MD), under different study scenarios. The percentage relative bias (PRB), root mean-square-error (RMSE) and coverage probability were used to assess the efficiency of the overall estimates. The results demonstrate that available IPD should always be included in a conventional meta-analysis using summary-level data, as it significantly increases the accuracy of the estimates. On the other hand, if more than 80% of the available data are at the IPD level, including the AD does not provide significant differences in terms of accuracy of the estimates. Additionally, combining the IPD and AD has a moderating effect on the bias of the treatment effect estimates, as the IPD tends to overestimate the treatment effects, while the AD has the tendency to produce underestimated effect estimates. These results may provide some guidance in deciding whether a significant benefit is gained by pooling the two levels of data when conducting a meta-analysis.
Keywords: aggregate data, combined-level data, individual patient data, meta-analysis
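A compact way to see the IPD/AD pooling idea described above is a small simulation that combines study-level estimates by inverse-variance weighting, where some studies contribute their raw (IPD-like) observations and others contribute only a summary estimate; everything in the sketch (true effect, study sizes, variances) is an illustrative assumption, not the simulation design used in the paper.

```python
# A minimal sketch of pooling IPD-derived and AD-only study estimates (assumed setup).
import numpy as np

rng = np.random.default_rng(1)
true_effect, n_studies, n_per_study = 0.5, 10, 50

estimates, variances = [], []
for s in range(n_studies):
    y = rng.normal(true_effect, 1.0, size=n_per_study)   # patient-level outcomes
    if s < 4:                                             # studies with IPD available
        est, var = y.mean(), y.var(ddof=1) / n_per_study  # estimate from the raw data
    else:                                                 # AD-only: published summary
        est, var = y.mean(), 1.0 / n_per_study            # reported summary variance
    estimates.append(est)
    variances.append(var)

w = 1.0 / np.array(variances)                             # inverse-variance weights
pooled = np.sum(w * np.array(estimates)) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
prb = 100 * (pooled - true_effect) / true_effect          # percentage relative bias
print(f"pooled effect = {pooled:.3f} (SE {se:.3f}); PRB = {prb:+.1f}%")
```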
Procedia PDF Downloads 375
3648 Thermal Behavior of a Ventilated Façade Using Perforated Ceramic Bricks
Authors: Helena López-Moreno, Antoni Rodríguez-Sánchez, Carmen Viñas-Arrebola, Cesar Porras-Amores
Abstract:
The ventilated façade has great advantages when compared to traditional façades, as it reduces the air conditioning thermal loads due to the stack effect induced by solar radiation in the air chamber. A ventilated façade can be used to optimize energy consumption not only in newly built buildings but also in existing ones, opening the field of implementation to energy retrofitting works. In this sense, the following three façade prototypes were designed, built and further analyzed in this research: non-ventilated façade (NVF), slightly ventilated façade (SLVF) and strongly ventilated façade (STVF). The construction characteristics of the three façades are based on the Spanish building regulation, the “Technical Building Code”. The façades have been monitored with type-K thermocouples on a representative day of the summer season in Madrid (Spain). Moreover, an analysis of variance (ANOVA) with repeated measures, studying the thermal lag in the ventilated and non-ventilated façades, has been designed. Results show that the STVF façade presents higher levels of thermal inertia, as the thermal lag reduces up to 100% (daily mean) compared to the non-ventilated façade. In addition, the statistical analysis proves that an increase in the size of the ventilation holes in the STVF façade does not improve the thermal lag significantly (p > 0.05) when compared to the SLVF façade.
Keywords: ventilated façade, energy efficiency, thermal behavior, statistical analysis
Procedia PDF Downloads 492
3647 Group Sequential Covariate-Adjusted Response Adaptive Designs for Survival Outcomes
Authors: Yaxian Chen, Yeonhee Park
Abstract:
Driven by evolving FDA recommendations, modern clinical trials demand innovative designs that strike a balance between statistical rigor and ethical considerations. Covariate-adjusted response-adaptive (CARA) designs bridge this gap by utilizing patient attributes and responses to skew treatment allocation in favor of the treatment that is best for an individual patient’s profile. However, existing CARA designs for survival outcomes often hinge on specific parametric models, constraining their applicability in clinical practice. In this article, we address this limitation by introducing a CARA design for survival outcomes (CARAS) based on the Cox model and a variance estimator. This method addresses issues of model misspecification and enhances the flexibility of the design. We also propose a group sequential overlap-weighted log-rank test to preserve the type I error rate in the context of group sequential trials, and we use extensive simulation studies to demonstrate the clinical benefit, statistical efficiency, and robustness to model misspecification of the proposed method compared to traditional randomized controlled trial designs and response-adaptive randomization designs.
Keywords: Cox model, log-rank test, optimal allocation ratio, overlap weight, survival outcome
Procedia PDF Downloads 64
3646 Resistance and Sub-Resistances of RC Beams Subjected to Multiple Failure Modes
Authors: F. Sangiorgio, J. Silfwerbrand, G. Mancini
Abstract:
Geometric and mechanical properties all influence the resistance of RC structures and may, in certain combinations of property values, increase the risk of a brittle failure of the whole system. This paper presents a statistical and probabilistic investigation of the resistance of RC beams designed according to Eurocodes 2 and 8, and subjected to multiple failure modes, under both the natural variation of material properties and the uncertainty associated with cross-section and transverse reinforcement geometry. A full probabilistic model based on the JCSS Probabilistic Model Code is derived. Different beams are studied through material nonlinear analysis via Monte Carlo simulations. The resistance model is consistent with Eurocode 2. Both a multivariate statistical evaluation and a data clustering analysis of the outcomes are then performed. Results show that the ultimate load behaviour of RC beams subjected to flexural and shear failure modes seems to be mainly influenced by the combination of the mechanical properties of both longitudinal reinforcement and stirrups, and the tensile strength of concrete, of which the latter appears to affect the overall response of the system in a nonlinear way. The model uncertainty of the resistance model used in the analysis undoubtedly plays an important role in interpreting the results.
Keywords: modelling, Monte Carlo simulations, probabilistic models, data clustering, reinforced concrete members, structural design
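The following sketch illustrates the Monte Carlo idea behind this kind of study: sample the random material and geometric properties, evaluate a (here deliberately simplified) resistance function for each draw, and summarize the resulting distribution. The resistance expression, the distributions, and the parameter values are illustrative assumptions and not the JCSS-based model of the paper.

```python
# A minimal Monte Carlo sketch of RC beam bending resistance (simplified, assumed model).
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Illustrative random variables (units: MPa, mm, mm^2); not the paper's parameterization.
fy = rng.lognormal(mean=np.log(500), sigma=0.05, size=n)   # steel yield strength
fc = rng.lognormal(mean=np.log(30), sigma=0.15, size=n)    # concrete compressive strength
As = rng.normal(1000, 20, size=n)                          # longitudinal reinforcement area
b, d = 300.0, rng.normal(450, 5, size=n)                   # width, effective depth

# Simplified rectangular stress-block bending resistance, converted to kNm.
x = As * fy / (0.85 * fc * b)
M_R = As * fy * (d - 0.4 * x) / 1e6

print(f"mean resistance = {M_R.mean():.1f} kNm")
print(f"5% fractile     = {np.quantile(M_R, 0.05):.1f} kNm")
print(f"coeff. of var.  = {M_R.std() / M_R.mean():.3f}")
```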
Procedia PDF Downloads 472
3645 A Machine Learning Approach for Anomaly Detection in Environmental IoT-Driven Wastewater Purification Systems
Authors: Giovanni Cicceri, Roberta Maisano, Nathalie Morey, Salvatore Distefano
Abstract:
The main goal of this paper is to present a solution for a water purification system based on an Environmental Internet of Things (EIoT) platform to monitor and control water quality, and on machine learning (ML) models to support decision making and speed up the water purification processes. A real case study has been implemented by deploying an EIoT platform and a network of devices, called Gramb meters and belonging to the Gramb project, on wastewater purification systems located in Calabria, southern Italy. The data thus collected are used to control the wastewater quality, detect anomalies and predict the behaviour of the purification system. To this end, three different statistical and machine learning models have been adopted and compared: Autoregressive Integrated Moving Average (ARIMA), Long Short Term Memory (LSTM) autoencoder, and Facebook Prophet (FP). The results demonstrated that the ML solution (LSTM) outperforms the classical statistical approaches (ARIMA, FP) in terms of accuracy, efficiency and effectiveness in monitoring and controlling the wastewater purification processes.
Keywords: environmental internet of things, EIoT, machine learning, anomaly detection, environment monitoring
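As a flavour of the statistical baseline mentioned above, the sketch below fits an ARIMA model to a synthetic water-quality series with statsmodels and flags points whose residuals exceed three standard deviations; the series, the (2,1,2) order, and the threshold are illustrative assumptions rather than the paper's configuration, and the LSTM autoencoder that the study found superior is not reproduced here.

```python
# A minimal ARIMA residual-based anomaly detection sketch (synthetic data, assumed order).
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(7)
t = np.arange(500)
series = 7.0 + 0.5 * np.sin(2 * np.pi * t / 48) + rng.normal(0, 0.1, size=t.size)
series[250] += 1.5                      # inject one anomalous pH-like reading

fit = ARIMA(series, order=(2, 1, 2)).fit()
resid = fit.resid
threshold = 3 * resid.std()             # flag residuals beyond 3 standard deviations

anomalies = np.where(np.abs(resid) > threshold)[0]
print("flagged indices:", anomalies)
```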
Procedia PDF Downloads 151
3644 Cognitions of Physical Education Supervisors and Teachers for Conceptions of Effective Teaching Related to the Concerns Theory
Authors: Ali M. Alsagheir
Abstract:
Effective teaching is considered one of the main research fields of teaching, and its fundamental aim is to identify the most successful methods that make teaching fruitful. Undoubtedly, these methods are factors shared by all parties concerned with the educational process, such as instructors, directors, parents, and others. This study aimed to identify the cognitions of physical education supervisors and teachers for conceptions of effective teaching according to the concerns theory. A questionnaire was used to collect the data of the study; the sample contained 230 teachers and supervisors. The results showed that the average of the conceptions of effective teaching expressions for the study sample decreases with progress through the stages of teaching development in general. The study showed no statistically significant difference between teachers and supervisors on the cores of teaching principles and teaching tasks, although the results showed statistically significant differences on the core of teaching achievements between supervisors and teachers, in favor of supervisors. The study ended with recommendations that can help increase the effectiveness of teaching, such as: setting clear and specific standards for the effectiveness of teaching on which the teacher's performance is based, constructing practical courses that focus on equipping both supervisors and teachers with the skills and strategies of effective teaching, and attending to children's achievement as an important factor and a strong indicator of the effectiveness of teaching and learning.
Keywords: concerns theory, effective teaching, physical education, supervisors, teachers
Procedia PDF Downloads 410
3643 Difference Between Planning Target Volume (PTV) Based Slow-CT and Internal Target Volume (ITV) Based 4DCT Imaging Techniques in Stereotactic Body Radiotherapy for Lung Cancer: A Comparative Study
Authors: Madhumita Sahu, S. S. Tiwary
Abstract:
The radiotherapy of lung carcinoma has always been difficult and a matter of great concern. The significant intra-fraction movement caused by non-rhythmic respiratory motion poses a great challenge for the treatment of lung cancer using ionizing radiation. The present study compares the accuracy of target volume measurement using Slow-CT and 4DCT imaging in SBRT for lung tumors. The experimental samples were extracted from patients with lung cancer who underwent SBRT. Slow-CT and 4DCT images were acquired under free breathing for each patient. PTVs were delineated on the Slow-CT images. Similarly, ITVs were delineated on each of the 4DCT volumes. Volumetric and statistical analyses were performed for each patient by measuring the corresponding PTV and ITV volumes. The study showed that (1) the maximum deviation observed between the Slow-CT-based PTV and the 4DCT imaging-based ITV is 248.58 cc; (2) the minimum deviation observed between the Slow-CT-based PTV and the 4DCT imaging-based ITV is 5.22 cc; (3) the mean deviation observed between the Slow-CT-based PTV and the 4DCT imaging-based ITV is 63.21 cc. The present study concludes that the irradiated volume (ITV) with 4DCT is smaller than the PTV with Slow-CT. A better and more precise treatment could be delivered with 4DCT imaging by sparing 63.21 cc of body volume on average.
Keywords: CT imaging, 4DCT imaging, lung cancer, statistical analysis
Procedia PDF Downloads 24
3642 Exploring Fertility Dynamics in the MENA Region: Distribution, Determinants, and Temporal Trends
Authors: Dena Alhaloul
Abstract:
The Middle East and North Africa (MENA) region is characterized by diverse cultures, economies, and social structures. Fertility rates in MENA have seen significant changes over time, with variations among countries and subregions. Understanding fertility patterns in this region is essential due to its impact on demographic dynamics, healthcare, labor markets, and social policies. Rising or declining fertility rates have far-reaching consequences for the region's socioeconomic development. The main thrust of this study is to comprehensively examine fertility rates in the Middle East and North Africa (MENA) region. It aims to understand the distribution, determinants, and temporal trends of fertility rates in MENA countries. The study seeks to provide insights into the factors influencing fertility decisions, assess how fertility rates have evolved over time, and potentially develop statistical models to characterize these trends. Methodologically, the study uses descriptive statistics to summarize and visualize fertility rate data, regression analyses to identify determinants of fertility rates, and statistical modeling to characterize temporal trends in fertility rates. In conclusion, the research will contribute to a deeper understanding of fertility dynamics in the MENA region, shedding light on the distribution of fertility rates, their determinants, and historical trends.
Keywords: fertility, distribution, modeling, regression
Procedia PDF Downloads 81
3641 The Evaluation of Complete Blood Cell Count-Based Inflammatory Markers in Pediatric Obesity and Metabolic Syndrome
Authors: Mustafa M. Donma, Orkide Donma
Abstract:
Obesity is defined as a severe chronic disease characterized by a low-grade inflammatory state. Therefore, inflammatory markers gained utmost importance during the evaluation of obesity and metabolic syndrome (MetS), a disease characterized by central obesity, elevated blood pressure, increased fasting blood glucose and elevated triglycerides or reduced high density lipoprotein cholesterol (HDL-C) values. Some inflammatory markers based upon the complete blood cell count (CBC) are available. In this study, it was asked which inflammatory marker was the best for evaluating the differences between various obesity groups. 514 pediatric individuals were recruited. 132 children with MetS, 155 morbidly obese (MO), 90 obese (OB), 38 overweight (OW) and 99 children with normal BMI (N-BMI) were included in the scope of this study. Obesity groups were constituted using age- and sex-dependent body mass index (BMI) percentiles tabulated by the World Health Organization. MetS components were determined in order to specify children with MetS. CBCs were determined using an automated hematology analyzer. HDL-C analysis was performed. Using CBC parameters and HDL-C values, ratio markers of inflammation, covering the neutrophil-to-lymphocyte ratio (NLR), derived neutrophil-to-lymphocyte ratio (dNLR), platelet-to-lymphocyte ratio (PLR), lymphocyte-to-monocyte ratio (LMR), and monocyte-to-HDL-C ratio (MHR), were calculated. Statistical analyses were performed. The statistical significance degree was considered as p < 0.05. There was no statistically significant difference among the groups in terms of platelet count, neutrophil count, lymphocyte count, monocyte count, and NLR. PLR differed significantly between OW and N-BMI as well as MetS. The monocyte-to-HDL-C ratio exhibited statistically significant differences between the MetS group and the N-BMI, OB, and MO groups. HDL-C values differed between the MetS group and the N-BMI, OW, OB, and MO groups. MHR was the ratio that exhibited the best performance among the CBC-based inflammatory markers. On the other hand, when MHR was compared to HDL-C alone, it was suggested that HDL-C gave much more valuable information. Therefore, this parameter still keeps its value from the diagnostic point of view. Our results suggest that MHR can be an inflammatory marker during the evaluation of pediatric MetS, but the predictive value of this parameter was not superior to that of HDL-C during the evaluation of obesity.
Keywords: children, complete blood cell count, high density lipoprotein cholesterol, metabolic syndrome, obesity
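The ratio markers listed above are simple arithmetic combinations of CBC and lipid values; the sketch below computes them for one hypothetical patient record (illustrative numbers only; the dNLR definition used here, neutrophils divided by leukocytes minus neutrophils, is a common convention and is an assumption rather than a detail from the study).

```python
# A minimal sketch computing CBC-based inflammatory ratios from one assumed record.
record = {
    "neutrophils": 4.2,   # 10^9/L
    "lymphocytes": 2.8,   # 10^9/L
    "monocytes":   0.45,  # 10^9/L
    "platelets":   310,   # 10^9/L
    "leukocytes":  8.1,   # 10^9/L (total WBC)
    "hdl_c":       42,    # mg/dL
}

nlr  = record["neutrophils"] / record["lymphocytes"]
dnlr = record["neutrophils"] / (record["leukocytes"] - record["neutrophils"])
plr  = record["platelets"]   / record["lymphocytes"]
lmr  = record["lymphocytes"] / record["monocytes"]
mhr  = record["monocytes"]   / record["hdl_c"]

print(f"NLR={nlr:.2f}  dNLR={dnlr:.2f}  PLR={plr:.1f}  LMR={lmr:.2f}  MHR={mhr:.4f}")
```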
Procedia PDF Downloads 129
3640 Analysis of Operating Speed on Four-Lane Divided Highways under Mixed Traffic Conditions
Authors: Chaitanya Varma, Arpan Mehar
Abstract:
The present study demonstrates the procedure for analysing speed data collected on various four-lane divided sections in India. Field data for the study were collected at different straight and curved sections on rural highways with the help of a radar speed gun and a video camera. The data collected at the sections were analysed and parameters pertaining to the speed distributions were estimated. Different statistical distributions were fitted to the vehicle-type speed data and to the mixed traffic speed data. It was found that the vehicle-type speed data follow either the normal or the log-normal distribution, whereas the mixed traffic speed data follow more than one type of statistical distribution. The most common fits observed for the mixed traffic speed data were the beta distribution and the Weibull distribution. Separate operating speed models based on traffic and roadway geometric parameters were proposed in the present study. Operating speed models with traffic parameters and curve geometry parameters were established. Two different operating speed models were proposed with the variables 1/R and Ln(R), and each was found to be realistic over a different range of curve radii. The models developed in the present study are simple and realistic and can be used for forecasting operating speed on four-lane highways.
Keywords: highway, mixed traffic flow, modeling, operating speed
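A quick way to reproduce this kind of distribution screening is to fit several candidate distributions with SciPy and compare their Kolmogorov-Smirnov statistics, as sketched below on synthetic spot-speed data; the candidate set and the synthetic sample are assumptions for illustration, not the field data of the study.

```python
# A minimal sketch: fit candidate distributions to spot-speed data and rank by KS statistic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
speeds = rng.lognormal(mean=np.log(70), sigma=0.15, size=400)   # synthetic speeds, km/h

candidates = {
    "normal": stats.norm,
    "log-normal": stats.lognorm,
    "beta": stats.beta,
    "weibull": stats.weibull_min,
}

for name, dist in candidates.items():
    params = dist.fit(speeds)                      # maximum-likelihood fit
    ks_stat, p_value = stats.kstest(speeds, dist(*params).cdf)
    print(f"{name:>10}: KS = {ks_stat:.3f}, p = {p_value:.3f}")
```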
Procedia PDF Downloads 460
3639 Role of DatScan in the Diagnosis of Parkinson's Disease
Authors: Shraddha Gopal, Jayam Lazarus
Abstract:
Aims: To study the referral practice and impact of the DAT scan in the diagnosis or exclusion of Parkinson’s disease. Settings and Design: A retrospective study. Materials and Methods: A retrospective study of the results of 60 patients who were referred for a DAT scan over a period of 2 years from the Department of Neurology at Northern Lincolnshire and Goole NHS Trust. The reason for DAT scan referral was noted under five categories against Parkinson’s disease: drug-induced Parkinsonism, essential tremors, diagnostic dilemma, not responding to Parkinson’s treatment, and others. We assessed the number of patients who were diagnosed with Parkinson’s disease against the number of patients in whom Parkinson’s disease was excluded or an alternative diagnosis was made. Statistical Methods: Microsoft Excel was used for data collection and statistical analysis. Results: 30 of the 60 scans were performed to confirm the diagnosis of early Parkinson’s disease, 13 were done to differentiate essential tremors from Parkinsonism, 6 were performed to exclude drug-induced Parkinsonism, 5 were done to look for an alternative diagnosis as the patients were not responding to anti-Parkinson medication, and 6 indications were outside the recommended guidelines. 55% of cases were confirmed with a diagnosis of Parkinson’s disease. 43.33% had Parkinson’s disease excluded. 33 of the 60 scans showed bilateral abnormalities and confirmed the clinical diagnosis of Parkinson’s disease. Conclusion: The DAT scan provides valuable information, confirming Parkinson’s disease in 55% of patients and excluding the diagnosis in 43.33% of patients, aiding an alternative diagnosis.
Keywords: DATSCAN, Parkinson's disease, diagnosis, essential tremors
Procedia PDF Downloads 232
3638 Comparison of Statistical Variables for Vaccinated and Unvaccinated Children in Measles Cases in Khyber Pukhtun Khwa
Authors: Inayatullah Khan, Afzal Khan, Hamzullah Khan, Afzal Khan
Abstract:
Objectives: The objective of this study was to compare different statistical variables for vaccinated and unvaccinated children in measles cases. Material and Methods: This cross-sectional comparative study was conducted at the Isolation Ward, Department of Paediatrics, Lady Reading Hospital (LRH), Peshawar, from April 2012 to March 2013. A total of 566 admitted cases of measles were enrolled. Data regarding age, sex, address, vaccination status, measles contact, hospital stay and outcome were collected and recorded on a proforma. History of measles vaccination was ascertained either by checking the vaccination cards or by parental recall. Results: Of the 566 cases of measles, 211 (39%) were vaccinated and 345 (61%) were unvaccinated. Three hundred and ten (54.80%) patients were male and 256 (45.20%) were female, with a male-to-female ratio of 1.2:1. The age range was from 1 year to 14 years, with a mean age (SD) of 3.2 ± 2 years. The majority (371, 65.5%) of the patients were 1-3 years old. Mean hospital stay was 3.08 days, with a range of 1-10 days and a standard deviation of ± 1.15. A history of measles contact was present in 393 (69.4%) cases. Forty-eight patients expired, giving a mortality rate of 8.5%. Conclusion: The majority of the children in Khyber Pukhtunkhwa are unvaccinated and unprotected against measles. Among the children who contracted measles, 39% were vaccinated, which indicates measles vaccine failure. This figure is clearly higher than that accepted for the measles vaccine (2-10%).
Keywords: measles, vaccination, immunity, population
Procedia PDF Downloads 444
3637 Transformation of Health Communication Literacy in Information Technology during Pandemic in 2019-2022
Authors: K. Y. S. Putri, Heri Fathurahman, Yuki Surisita, Widi Sagita, Kiki Dwi Arviani
Abstract:
Society needs the assistance of academics in understanding and becoming skilled in health communication literacy. Information technology develops very fast, while health communication literacy skills for obtaining health information during the pandemic have not kept pace with that development. The research question is whether health communication literacy influenced information technology use for health information during the pandemic in Indonesia. The purpose of the study is to determine the influence of health communication literacy on information technology use for health information during the pandemic in Indonesia. The concepts of health communication literacy and information technology are used in this study, and previous research supports it. Quantitative research methods were applied by disseminating questionnaires. The validity and reliability tests of this study were positive, so the analysis could proceed to the next statistical steps. The descriptive results for the health communication literacy variable were positive in all dimensions, as were all dimensions of information technology. The statistical tests showed that health communication literacy had a strong influence on information technology. Respondents in this study have high information technology skills, so health communication literacy for obtaining health information during the 2019-2022 pandemic is needed. The study recommends that academics remain engaged, as they are still very much needed by the community in the development of society during the pandemic.
Keywords: health information, health information needs, literacy health communication, information technology
Procedia PDF Downloads 140
3636 Examining the Attitudes of Pre-School Teachers towards Values Education in Terms of Gender, School Type, Professional Seniority and Location
Authors: Hatice Karakoyun, Mustafa Akdag
Abstract:
This study has been made to examine the attitudes of pre-school teachers towards values education. The study was conducted as a general survey model. The study’s working group contains 108 pre-school teachers who worked in Diyarbakır, Turkey. In this study, the Values Education Attitude Scale (VEAS), which was developed by Yaşaroğlu (2014), was used. In order to analyze the data on sociodemographic structure, percentage and frequency values were examined. The Kolmogorov-Smirnov method was used to determine whether the data were normally distributed. During data analysis, the Kolmogorov-Smirnov test and the normal curve histograms were examined to determine which statistical analyses should be applied to the scale, and it was found that the distribution was not normal. Thus, the Mann-Whitney U test, one of the nonparametric statistical analysis techniques, was used to test the differences in the scores obtained from the scale in terms of the independent variables. According to the analyses, pre-school teachers’ attitudes toward values education are positive. According to the scale item with the highest average, pre-school teachers think that values education is very important for students’ and children’s future. The variables included in the study (gender, seniority, age group, education, school type, school location) seem to have no effect on the attitude scores of the pre-school teachers who joined the study.
Keywords: attitude scale, pedagogy, pre-school teacher, values education
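The normality-check-then-nonparametric-test workflow described above can be sketched in a few lines with SciPy, as shown below on synthetic attitude scores for two groups; the data and group labels are illustrative assumptions, not the study's responses.

```python
# A minimal sketch: Kolmogorov-Smirnov normality check, then Mann-Whitney U test (assumed data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
group_a = rng.integers(60, 100, size=54).astype(float)   # e.g., one group's scale scores
group_b = rng.integers(55, 100, size=54).astype(float)   # e.g., the comparison group's scores

scores = np.concatenate([group_a, group_b])
z = (scores - scores.mean()) / scores.std(ddof=1)
ks_stat, ks_p = stats.kstest(z, "norm")                   # normality check on standardized scores
print(f"KS normality test: stat={ks_stat:.3f}, p={ks_p:.3f}")

# If normality is rejected, compare the groups with the nonparametric Mann-Whitney U test.
u_stat, u_p = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
print(f"Mann-Whitney U: U={u_stat:.1f}, p={u_p:.3f}")
```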
Procedia PDF Downloads 248
3635 A Semantic and Concise Structure to Represent Human Actions
Authors: Tobias Strübing, Fatemeh Ziaeetabar
Abstract:
Humans usually manipulate objects with their hands. To represent these actions in a simple and understandable way, we need to use a semantic framework. For this purpose, the Semantic Event Chain (SEC) method has already been presented, which is based on the touching and non-touching relations between manipulated objects in a scene. This method was improved by a computational model, the so-called enriched Semantic Event Chain (eSEC), which incorporates the information of static (e.g. top, bottom) and dynamic spatial relations (e.g. moving apart, getting closer) between objects in an action scene. This leads to better action prediction as well as the ability to distinguish between more actions. Each eSEC manipulation descriptor is a huge matrix with thirty rows and a massive set of possible spatial relations between each pair of manipulated objects. The current eSEC framework has so far only been used in the category of manipulation actions, which eventually involve two hands. Here, we would like to extend this approach to a whole-body action descriptor and build a conjoint activity representation structure. For this purpose, we need a statistical analysis to modify the current eSEC by summarizing it while preserving its features, and we introduce a new version called Enhanced eSEC (e2SEC). This summarization can be done from two points of view: 1) reducing the number of rows in an eSEC matrix, 2) shrinking the set of possible semantic spatial relations. To achieve this, we computed the importance of each matrix row in a statistical way, to see whether a particular row can be removed while all manipulations remain distinguishable from each other. On the other hand, we examined which semantic spatial relations can be merged without compromising the unity of the predefined manipulation actions. Therefore, by performing the above analyses, we obtained the new e2SEC framework, which has 20% fewer rows, 16.7% fewer static spatial relations and 11.1% fewer dynamic spatial relations. This simplification, while preserving the salient features of a semantic structure in representing actions, has a tremendous impact on the recognition and prediction of complex actions, as well as on the interactions between humans and robots. It also creates a comprehensive platform to integrate with body limb descriptors and dramatically increases system performance, especially in complex real-time applications such as human-robot interaction prediction.
Keywords: enriched semantic event chain, semantic action representation, spatial relations, statistical analysis
Procedia PDF Downloads 126
3634 Blue Eyes and Blonde Hair in Mass Media: A News Discourse Analysis of Western Media on the News Coverage of Ukraine
Authors: Zahra Mehrabbeygi
Abstract:
This research sets out to analyze and survey discourse variety and news image-making in western media regarding the news coverage of the Russian army's intrusion into Ukraine. The analysis covers the news coverage of Ukraine in the period from February 2022 to May 2022 in five western media outlets: BBC, CBS, NBC, Al Jazeera, and the Telegraph. This research attempts to discover some facts about the news policies of the five western news agencies under the circumstances of the Ukraine-Russia war. Critical theories in the news, such as Framing, Media Imperialism of News, Image Making, Discourse, and Ideology, were applied to achieve this goal. The research methodology uses Van Dijk's discourse exploration method based on discourse analysis. The research's statistical population comprises all the news about racial discrimination during the mentioned period. After surveying the statistical population with targeted sampling, the researcher randomly selected ten news cases for exploration. The research findings show that the western media have similarities in their texts via lexical items, polarization, citations, persons, and institutions. The findings also point to presuppositions, connotations, and components of consensus agreement and underlying predicates in the opening, middle, and closing events. The reaction of some western media not only shows their bewilderment but also exposes their prejudices rooted in racism.
Keywords: news discourse analysis, western media, racial discrimination, Ukraine-Russia war
Procedia PDF Downloads 97
3633 A Statistical-Algorithmic Approach for the Design and Evaluation of a Fresnel Solar Concentrator-Receiver System
Authors: Hassan Qandil
Abstract:
Using a statistical algorithm incorporated in MATLAB, four types of non-imaging Fresnel lenses are designed: spot-flat, linear-flat, dome-shaped and semi-cylindrical-shaped. The optimization employs a statistical ray-tracing methodology of the incident light, mainly considering the effects of chromatic aberration, varying focal lengths, solar inclination and azimuth angles, lens and receiver apertures, and the optimum number of prism grooves. While adopting an equal-groove-width assumption for the poly-methyl-methacrylate (PMMA) prisms, the main target is to maximize the ray intensity on the receiver’s aperture and therefore achieve higher values of heat flux. The algorithm outputs prism angles and 2D sketches. 3D drawings are then generated via AutoCAD and linked to COMSOL Multiphysics software to simulate the lenses under solar ray conditions, which provides optical and thermal analysis at both the lens’s and the receiver’s apertures, with conditions set as per the Dallas, TX weather data. Once the lenses’ characterization is finalized, receivers are designed based on the optimized aperture size. Several cavity shapes, including triangular, arc-shaped and trapezoidal, are tested while coupled with a variety of receiver materials, working fluids, heat transfer mechanisms, and enclosure designs. A vacuum-reflective enclosure is also simulated for an enhanced thermal absorption efficiency. Each receiver type is simulated via COMSOL while coupled with the optimized lens. A lab-scale prototype of the optimum lens-receiver configuration is then fabricated for experimental evaluation. Application-based testing is also performed for the selected configuration, including that of a photovoltaic-thermal cogeneration system and a solar furnace system. Finally, some future research work is pointed out, including the coupling of the collector-receiver system with an end-user power generator, and the use of a multi-layered genetic algorithm for comparative studies.
Keywords: COMSOL, concentrator, energy, fresnel, optics, renewable, solar
Procedia PDF Downloads 155
3632 Investigating the Relationship between Place Attachment and Sustainable Development of Urban Spaces
Authors: Hamid Reza Zeraatpisheh, Ali Akbar Heidari, Soleiman Mohammadi Doust
Abstract:
This study has examined the relationship between place attachment and the sustainable development of urban spaces. To this end, the components of place identity, emotional attachment, place attachment and social bonding, which together constitute place attachment, were measured by means of a standardized questionnaire in three domains: place identity (cognitive), emotional attachment (affective), and place attachment and social bonding (behavioral). To measure sustainable development, three components of sustainable development (society, economy and environment) have been considered. The study is descriptive. The standard questionnaire of Safarnia has been used to measure the place attachment variable, and to measure the sustainable development variable a questionnaire has been constructed by the researcher based on the combined theoretical framework. The statistical population of this research has been the city of Shiraz, and the statistical sample has been Hafeziyeh. SPSS software has been used to analyze the data through both descriptive and inferential statistics. In the inferential statistics, the Pearson correlation coefficient has been used to examine the hypotheses. In this study, the variable of place attachment is high and sustainable development is also at a high level. These results suggest a positive relationship between attachment to place and sustainable development.
Keywords: place attachment, sustainable development, economy-society-environment, Hafez's tomb
Procedia PDF Downloads 701
3631 Quantitative Structure–Activity Relationship Analysis of Some Benzimidazole Derivatives by Linear Multivariate Method
Authors: Strahinja Z. Kovačević, Lidija R. Jevrić, Sanja O. Podunavac Kuzmanović
Abstract:
The relationship between the antibacterial activity of eighteen different substituted benzimidazole derivatives and their molecular characteristics was studied using a chemometric QSAR (Quantitative Structure–Activity Relationships) approach. QSAR analysis has been carried out on the inhibitory activity towards Staphylococcus aureus, by using molecular descriptors, as well as the minimal inhibitory concentration (MIC). Molecular descriptors were calculated from the optimized structures. Principal component analysis (PCA) followed by hierarchical cluster analysis (HCA) and multiple linear regression (MLR) was performed in order to select the molecular descriptors that best describe the antibacterial behavior of the compounds investigated, and to determine the similarities between molecules. The HCA grouped the molecules into separate clusters with similar inhibitory activity. PCA showed a classification of the molecules very similar to that of the HCA, and revealed which descriptors contribute to that classification. MLR equations that represent MIC as a function of the in silico molecular descriptors were established. The statistical significance of the estimated models was confirmed by standard statistical measures and cross-validation parameters (SD = 0.0816, F = 46.27, R = 0.9791, R2CV = 0.8266, R2adj = 0.9379, PRESS = 0.1116). These parameters indicate the possibility of applying the established chemometric models in the prediction of the antibacterial behaviour of the studied derivatives and of structurally very similar compounds.
Keywords: antibacterial, benzimidazole, molecular descriptors, QSAR
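The PCA-HCA-MLR workflow described above can be outlined with scikit-learn and SciPy as below; the random descriptor matrix, the number of clusters, and the selected descriptors are placeholders for illustration and do not reproduce the published model.

```python
# A minimal sketch of a PCA + hierarchical clustering + MLR QSAR workflow (synthetic data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(5)
X = rng.normal(size=(18, 6))                   # 18 molecules x 6 hypothetical descriptors
mic = 1.5 + 0.8 * X[:, 0] - 0.5 * X[:, 2] + rng.normal(0, 0.1, 18)   # synthetic log(MIC)

# PCA: which descriptor combinations explain most of the variance between molecules?
pca = PCA(n_components=2).fit(X)
print("explained variance ratio:", pca.explained_variance_ratio_)

# HCA: group molecules with similar descriptor profiles.
clusters = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")
print("cluster labels:", clusters)

# MLR: regress activity on a subset of descriptors and report R^2.
mlr = LinearRegression().fit(X[:, [0, 2]], mic)
print("R^2 =", round(mlr.score(X[:, [0, 2]], mic), 3))
```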
Procedia PDF Downloads 364
3630 The 1st Personal Pronouns as Evasive Devices in the 2016 Taiwanese Presidential Debate
Authors: Yan-Chi Chen
Abstract:
This study aims to investigate the 1st personal pronouns as evasive devices used by presidential candidates in the 2016 Taiwanese Presidential Debate within the framework of critical discourse analysis (CDA). This study finds that the personal pronoun ‘I’ is the most frequent personal pronoun in the 2016 Taiwanese Presidential Debate. Generally speaking, the first personal pronouns were used most in the presidential debate, compared with the second and the third personal pronouns. Hence, a further quantitative analysis is conducted to explore the correlation between the frequencies of the two 1st personal pronouns and the other pronouns. Results show that the number of the personal pronoun ‘I’ increases from 26 to 49, while the personal pronoun ‘we’ decreases from 43 to 15 during the debate. Though it seems the personal pronoun ‘I’ has a higher tendency in pronominal choice, statistical evidence demonstrated that the personal pronoun ‘we’ has the greater statistical significance (p<0.0002), compared with that of ‘I’ (p<0.0116). The comparatively small p-value of the personal pronoun ‘we’ means it has a stronger correlation with the overall pronominal choice, and that the personal pronoun ‘we’ is more likely to be used than the personal pronoun ‘I’. Therefore, this study concludes that the pronominal choice varies with different evasive strategies. The ingrained functions of these personal pronouns are mainly categorized as ‘agreement’ and ‘justification’. The personal pronoun ‘we’ is preferred in the agreement evasive strategies, and ‘I’ is used for justifying oneself. In addition, the personal pronoun ‘we’ can function as both an ‘inclusive’ and an ‘exclusive’ personal pronoun, which gives ‘we’ additional functions not limited to agreement evasive strategies. In conclusion, although the personal pronoun ‘I’ has the highest occurrences, the personal pronoun ‘we’ is more closely related to the first-person pronoun choices.
Keywords: critical discourse analysis (CDA), evasive devices, the 1st personal pronouns, the 2016 Taiwanese Presidential Debate
Procedia PDF Downloads 165
3629 Detecting Earnings Management via Statistical and Neural Networks Techniques
Authors: Mohammad Namazi, Mohammad Sadeghzadeh Maharluie
Abstract:
Predicting earnings management is vital for capital market participants, financial analysts and managers. The aim of this research is to respond to this query: is there a significant difference between the regression model and neural network models in predicting earnings management, and which one leads to a superior prediction of it? In approaching this question, a Linear Regression (LR) model was compared with two neural networks, a Multi-Layer Perceptron (MLP) and a Generalized Regression Neural Network (GRNN). The population of this study includes 94 companies listed on the Tehran Stock Exchange (TSE) from 2003 to 2011. After the results of all models were acquired, ANOVA was used to test the hypotheses. In general, the summary of statistical results showed that the precision of the GRNN did not exhibit a significant difference in comparison with the MLP. In addition, the mean square errors of the MLP and GRNN showed a significant difference from that of the multivariable LR model. These findings support the notion of nonlinear behavior of earnings management. Therefore, it is more appropriate for capital market participants to analyze earnings management based upon neural network techniques, and not to adopt linear regression models.
Keywords: earnings management, generalized linear regression, neural networks multi-layer perceptron, Tehran stock exchange
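The comparison described above can be sketched as follows: a linear regression and an MLP from scikit-learn, plus a small hand-rolled GRNN (a kernel-weighted average of training targets), all scored by mean squared error on synthetic data; the data, network sizes, and bandwidth are illustrative assumptions rather than the paper's specification.

```python
# A minimal sketch comparing LR, MLP, and a simple GRNN on synthetic data (assumed setup).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 5))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.1, 400)   # nonlinear target
X_tr, X_te, y_tr, y_te = X[:300], X[300:], y[:300], y[300:]

def grnn_predict(X_train, y_train, X_test, sigma=0.5):
    """GRNN as a Nadaraya-Watson style kernel-weighted average of training targets."""
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

preds = {
    "LR": LinearRegression().fit(X_tr, y_tr).predict(X_te),
    "MLP": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                        random_state=0).fit(X_tr, y_tr).predict(X_te),
    "GRNN": grnn_predict(X_tr, y_tr, X_te),
}
for name, p in preds.items():
    print(f"{name:>4}: MSE = {mean_squared_error(y_te, p):.4f}")
```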
Procedia PDF Downloads 422
3628 Intra-miR-ExploreR, a Novel Bioinformatics Platform for Integrated Discovery of MiRNA:mRNA Gene Regulatory Networks
Authors: Surajit Bhattacharya, Daniel Veltri, Atit A. Patel, Daniel N. Cox
Abstract:
miRNAs have emerged as key post-transcriptional regulators of gene expression; however, the identification of biologically relevant target genes for this epigenetic regulatory mechanism remains a significant challenge. To address this knowledge gap, we have developed a novel tool in R, Intra-miR-ExploreR, that facilitates integrated discovery of miRNA targets by incorporating target databases and novel target prediction algorithms, using statistical methods including Pearson and Distance Correlation on microarray data, to arrive at high-confidence intragenic miRNA target predictions. We have explored the efficacy of this tool using Drosophila melanogaster as a model organism for bioinformatics analyses and functional validation. A number of putative targets were obtained, which were also validated using qRT-PCR analysis. Additional features of the tool include downloadable text files containing GO analysis from DAVID and PubMed links to literature related to the gene sets. Moreover, we are constructing interaction maps of intragenic miRNAs, using both microarray and RNA-seq data, focusing on neural tissues to uncover the regulatory codes via which these molecules regulate gene expression to direct cellular development.
Keywords: miRNA, miRNA:mRNA target prediction, statistical methods, miRNA:mRNA interaction network
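The correlation filtering at the heart of this kind of target discovery can be illustrated as below: compute the Pearson correlation between a miRNA expression profile and each candidate mRNA across samples, and keep strong negative correlations (miRNAs typically repress their targets). The expression values, the threshold, and the gene names are illustrative assumptions; the published tool is written in R and also uses distance correlation, which is not reproduced here.

```python
# A minimal sketch of Pearson-correlation filtering of candidate miRNA:mRNA pairs (toy data).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
mirna = rng.normal(size=12)                          # miRNA expression across 12 samples

candidates = {
    "geneA": -0.9 * mirna + rng.normal(0, 0.3, 12),  # anti-correlated (likely target)
    "geneB": rng.normal(size=12),                    # unrelated
    "geneC": 0.8 * mirna + rng.normal(0, 0.3, 12),   # positively correlated
}

for gene, expr in candidates.items():
    r, p = pearsonr(mirna, expr)
    flag = "candidate target" if (r < -0.6 and p < 0.05) else "filtered out"
    print(f"{gene}: r = {r:+.2f}, p = {p:.3f} -> {flag}")
```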
Procedia PDF Downloads 511
3627 On the Fourth-Order Hybrid Beta Polynomial Kernels in Kernel Density Estimation
Authors: Benson Ade Eniola Afere
Abstract:
This paper introduces a family of fourth-order hybrid beta polynomial kernels developed for statistical analysis. The assessment of these kernels' performance centers on two critical metrics: asymptotic mean integrated squared error (AMISE) and kernel efficiency. Through the utilization of both simulated and real-world datasets, a comprehensive evaluation was conducted, facilitating a thorough comparison with conventional fourth-order polynomial kernels. The evaluation procedure encompassed the computation of AMISE and efficiency values for both the proposed hybrid kernels and the established classical kernels. The consistently observed trend was the superior performance of the hybrid kernels when compared to their classical counterparts. This trend persisted across diverse datasets, underscoring the resilience and efficacy of the hybrid approach. By leveraging these performance metrics and conducting evaluations on both simulated and real-world data, this study furnishes compelling evidence in favour of the superiority of the proposed hybrid beta polynomial kernels. The discernible enhancement in performance, as indicated by lower AMISE values and higher efficiency scores, strongly suggests that the proposed kernels offer heightened suitability for statistical analysis tasks when compared to traditional kernels.
Keywords: AMISE, efficiency, fourth-order Kernels, hybrid Kernels, Kernel density estimation
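For readers unfamiliar with higher-order kernels, the sketch below implements kernel density estimation with a classical fourth-order (Epanechnikov-type) polynomial kernel, i.e. one whose second moment vanishes; this is one of the conventional kernels the hybrid family is compared against, not the proposed hybrid beta kernel itself, and the bandwidth and sample are illustrative assumptions.

```python
# A minimal KDE sketch with a classical fourth-order polynomial kernel (not the hybrid kernel).
import numpy as np

def k4(u):
    """Fourth-order Epanechnikov-type kernel: integrates to 1, zero second moment."""
    return np.where(np.abs(u) <= 1, (15 / 32) * (3 - 10 * u**2 + 7 * u**4), 0.0)

def kde(x_grid, data, h):
    u = (x_grid[:, None] - data[None, :]) / h
    return k4(u).mean(axis=1) / h          # (1 / n h) * sum of kernel evaluations

rng = np.random.default_rng(9)
data = np.concatenate([rng.normal(-2, 1, 300), rng.normal(2, 0.5, 200)])   # bimodal sample
grid = np.linspace(-6, 6, 200)
density = kde(grid, data, h=0.6)

approx_integral = density.sum() * (grid[1] - grid[0])
print("estimated density integrates to ~", round(approx_integral, 3))
```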
Procedia PDF Downloads 70
3626 Frame to Frameless: Stereotactic Operation Progress in Robot Time
Authors: Zengmin Tian, Bin Lv, Rui Hui, Yupeng Liu, Chuan Wang, Qing Liu, Hongyu Li, Yan Qi, Li Song
Abstract:
Objective: Robots have been used to replace the frame in recent years. This paper investigates the safety and effectiveness of frameless stereotactic surgery in the treatment of children with cerebral palsy. Methods: Clinical data of 425 children with spastic cerebral palsy were retrospectively analyzed. The patients were treated with robot-assisted frameless stereotactic surgery for nuclear mass destruction. Motor function was evaluated with the Gross Motor Function Measure-88 (GMFM-88) before the operation, and at 1 week and 3 months after the operation, respectively. Statistical analysis was performed. Results: The postoperative CT showed that the destruction area covered the predetermined target in all the patients. Minimal bleeding of the puncture channel occurred in 2 patients, and mild fever in 3 cases; otherwise, no severe surgical complications occurred. The GMFM-88 scores were 49.1±22.5 before the operation, and 52.8±24.2 and 64.2±21.4 at 1 week and 3 months after the operation, respectively. There was a statistically significant difference between before and after the operation (P<0.01). After 3 months, the total effective rate was 98.1%, and the average improvement rate of motor function was 24.3%. Conclusion: Having replaced the traditional frame, robot-assisted frameless stereotactic surgery is safe and reliable for children with spastic cerebral palsy and has positive significance in improving patients’ motor function.
Keywords: cerebral palsy, robotics, stereotactic techniques, frameless operation
Procedia PDF Downloads 88
3625 Effect of White Roofing on Refrigerated Buildings
Authors: Samuel Matylewicz, K. W. Goossen
Abstract:
The deployment of white or cool (high albedo) roofing is a common energy savings recommendation for a variety of buildings all over the world. Here, the effect of a white roof on the energy savings of an ice rink facility in the northeastern US is determined by measuring the effect of solar irradiance on the consumption of the rink's ice refrigeration system. The consumption of the refrigeration system was logged over a year, along with multiple weather vectors, and a statistical model was applied. The experimental model indicates that the expected savings of replacing the existing grey roof with a white roof on the consumption of the refrigeration system are only 4.7%. This overall result of the statistical model is confirmed by isolated instances of otherwise similar weather days, cloudy versus sunny, where there was no measurable difference in refrigeration consumption beyond the noise in the local data, which was a few percent. This compares with a simple theoretical calculation that indicates 30% savings. The difference is attributed to a lack of convective cooling of the roof in the theoretical model. The best experimental model shows relative effects of the weather vectors dry bulb temperature, solar irradiance, wind speed, and relative humidity on refrigeration consumption of 1, 0.026, 0.163, and -0.056, respectively. This result can have an impact on decisions to apply white roofing to refrigerated buildings in general.
Keywords: cool roofs, solar cooling load, refrigerated buildings, energy-efficient building envelopes
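The relative-effect figures above are the kind of output a multiple regression on standardized weather variables produces; the sketch below shows that calculation with statsmodels on synthetic data, where the coefficients, variable names, and data are illustrative assumptions rather than the rink measurements.

```python
# A minimal sketch: regress refrigeration consumption on standardized weather vectors (toy data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 365
df = pd.DataFrame({
    "dry_bulb": rng.normal(15, 8, n),        # deg C
    "irradiance": rng.uniform(0, 800, n),    # W/m^2
    "wind": rng.uniform(0, 10, n),           # m/s
    "rel_humidity": rng.uniform(20, 90, n),  # %
})
df["consumption"] = (5.0 + 0.9 * df["dry_bulb"] + 0.002 * df["irradiance"]
                     + 0.1 * df["wind"] - 0.01 * df["rel_humidity"]
                     + rng.normal(0, 1, n))

weather = df.drop(columns="consumption")
X = (weather - weather.mean()) / weather.std()           # standardize the weather vectors
model = sm.OLS(df["consumption"], sm.add_constant(X)).fit()

# Coefficients relative to the dry-bulb coefficient, analogous to the reported relative effects.
rel = model.params.drop("const") / model.params["dry_bulb"]
print(rel.round(3))
```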
Procedia PDF Downloads 129
3624 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models
Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg
Abstract:
Storm surge is an abnormal rise in water level caused by a storm. Accurate prediction of a storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques that combine several individual forecasts to produce an overall, presumably better, forecast. There exist some simple ensemble modeling techniques in the literature. For instance, Model Output Statistics (MOS) and running mean-bias removal are widely used techniques in the storm surge prediction domain. However, these methods have some drawbacks. For instance, MOS is based on multiple linear regression and needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced methods. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting. This application creates a better forecast of sea level using a combination of several instances of Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether the ensemble models perform better than any single forecast. Therefore, we need to identify the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models, using correlation and standard deviation as weights in combining different forecast models. Third, we use these ensembles and compare them with several existing models in the literature to forecast storm surge levels. We then investigate whether developing a complex ensemble model is indeed needed. To achieve this goal, we use a simple average (one of the simplest and most widely used ensemble models) as a benchmark. Predicting the peak surge level during a storm, as well as the precise time at which this peak occurs, is crucial; thus, we develop a statistical platform to compare the performance of various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual peak and its time. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, in this study we consider them as a single contiguous hurricane event. The data set used for this study is generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally have better performance compared to the simple average ensemble technique.
Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction
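The weighting ideas mentioned above (a simple average versus error- or correlation-based combinations) can be compared in a few lines, as sketched below on synthetic surge series; the member forecasts, weights, and RMSE values are illustrative assumptions, not results from the NYHOPS data.

```python
# A minimal sketch comparing simple-average and weighted ensembles of surge forecasts (toy data).
import numpy as np

rng = np.random.default_rng(8)
t = np.linspace(0, 48, 97)                                  # hours
truth = 1.5 * np.exp(-((t - 30) ** 2) / 40)                 # synthetic surge peak (m)

# Three hypothetical member forecasts with different biases and noise levels.
members = np.stack([
    truth + rng.normal(0.00, 0.10, t.size),
    truth + rng.normal(0.15, 0.20, t.size),
    truth + rng.normal(-0.10, 0.30, t.size),
])

def rmse(f):
    return np.sqrt(np.mean((f - truth) ** 2))

simple_avg = members.mean(axis=0)

# Weight members by inverse RMSE on a "training" window (first 48 points).
train_err = np.array([np.sqrt(np.mean((m[:48] - truth[:48]) ** 2)) for m in members])
w = (1 / train_err) / (1 / train_err).sum()
weighted = np.tensordot(w, members, axes=1)

print("member RMSEs:", [round(rmse(m), 3) for m in members])
print("simple average RMSE:", round(rmse(simple_avg), 3))
print("inverse-error weighted RMSE:", round(rmse(weighted), 3))
```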
Procedia PDF Downloads 309
3623 The Importance of Knowledge Innovation for External Audit on Anti-Corruption
Authors: Adel M. Qatawneh
Abstract:
This paper aimed to determine the importance of knowledge innovation for external audit on anti-corruption in all the Jordanian banks listed on the Amman Stock Exchange (ASE). The importance of the study arises from the need to recognize knowledge innovation for external audit and anti-corruption in light of developments in the world of business. The variables that will be affected by external audit innovation are: reliability of financial data, relevance of financial data, consistency of financial data, full disclosure of financial data, and protection of the rights of investors. To achieve the objectives of the study, a questionnaire was designed and distributed to the population of Jordanian banks listed on the Amman Stock Exchange. The data analysis found that the banks in Jordan attach a positive importance to knowledge innovation for external audit on anti-corruption and agree on its benefit. The statistical analysis showed that knowledge innovation for external audit had a positive impact on anti-corruption and that external audit has a statistically significant relationship with anti-corruption, reliability of financial data, consistency of financial data, full disclosure of financial data, and protection of the rights of investors.
Keywords: knowledge innovation, external audit, anti-corruption, Amman Stock Exchange
Procedia PDF Downloads 465