Search results for: Statistical Approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16643

15353 Understanding Tacit Knowledge and DIKW

Authors: Bahadir Aydin

Abstract:

Today it is difficult to reach accurate knowledge because of the sheer mass of data. This flood of data makes the environment more and more chaotic. Data is a main pillar of intelligence, and there is a close tie between knowledge and intelligence. Information gathered from different sources can be modified, interpreted, and classified by using a knowledge development process. This process is applied in order to attain intelligence, and within it the effect of knowledge is crucial. Knowledge is classified as explicit and tacit knowledge. Tacit knowledge can be seen as "only the tip of the iceberg": it accounts for much more than we assume throughout the intelligence cycle. If the concept of intelligence is scrutinized, it can be seen that it contains risks and threats as well as success. The main purpose of every organization is to be successful by eliminating risks and threats. Therefore, there is a need to connect or fuse existing information and the processes which can be used to develop it. With the help of this process, the decision-maker can be presented with a clear, holistic understanding as early as possible in the decision-making process. Planning, execution, and assessment are the key functions that connect information to knowledge. Shifting from the current traditional reactive approach to a proactive knowledge development approach would reduce extensive duplication of work in the organization. With this new approach to the process, knowledge can be used more effectively.

Keywords: knowledge, intelligence cycle, tacit knowledge, DIKW

Procedia PDF Downloads 502
15352 Competition between Verb-Based Implicit Causality and Theme Structure's Influence on Anaphora Bias in Mandarin Chinese Sentences: Evidence from Corpus

Authors: Linnan Zhang

Abstract:

Linguists as well as psychologists have shown great interest in implicit causality in reference processing. However, the most frequently used approaches to this issue are psychological experiments (such as eye tracking or self-paced reading). The present research is corpus-based and assisted by the statistical software R. Its main focus is the competition between verb-based implicit causality and the theme structure's influence on anaphora bias in Mandarin Chinese sentences. In Accessibility Theory, it is believed that salience, also known as accessibility, and relevance are two important factors in reference processing. The theme structure, a special syntactic structure in Chinese, determines the salience of an antecedent on the syntactic level, while verb-based implicit causality is a key factor in the relevance between antecedent and anaphora. It is therefore a study of anaphora that combines psychology with linguistics. From the analysis of corpus sentences together with multinomial logistic regression, the major findings of the present study are as follows: 1. When the sentence is stated in a 'cause-effect' structure, the theme structure will always be the antecedent, whether forward-biased or backward-biased verbs co-occur; in non-theme structures, the anaphora bias tends to be the opposite of the verb bias. 2. When the sentence is stated in an 'effect-cause' structure, the theme structure will not always be the antecedent, the influence of verb-based implicit causality outweighs that of the theme structure, and the anaphora bias is the same as the verb bias. All the results indicate that implicit causality functions conditionally and that the noun in the theme structure is not the high-salience antecedent under all circumstances.
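
A minimal sketch of the kind of multinomial logistic regression described above, written in Python rather than the R workflow used by the author; the sentence codings, variable names, and antecedent categories are hypothetical stand-ins for the corpus annotation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# hypothetical per-sentence codings:
# verb_bias (1 = forward-biased), theme_structure (1 = present), cause_effect (1 = cause-effect order)
X = np.array([
    [1, 1, 1], [1, 0, 1], [0, 1, 1], [0, 0, 1],
    [1, 1, 0], [0, 0, 0], [1, 0, 0], [0, 1, 0],
    [1, 1, 1], [0, 0, 0], [1, 0, 1], [0, 1, 0],
])
# antecedent category: 0 = NP1, 1 = NP2, 2 = noun in the theme structure
y = np.array([2, 0, 2, 1, 2, 1, 0, 1, 2, 0, 1, 2])

clf = LogisticRegression(max_iter=1000).fit(X, y)   # fits a multinomial logit for the 3 classes
print("coefficients per antecedent class:\n", clf.coef_)
print("predicted antecedent for a cause-effect theme sentence with a forward-biased verb:",
      clf.predict([[1, 1, 1]])[0])
```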

Keywords: accessibility theory, anaphora, theme structure, verb-based implicit causality

Procedia PDF Downloads 183
15351 Value Engineering and Its Impact on Drainage Design Optimization for Penang International Airport Expansion

Authors: R.M. Asyraf, A. Norazah, S.M. Khairuddin, B. Noraziah

Abstract:

Designing a system at present is a vital, challenging task: the design philosophy must be maintained in economical ways. This paper examines the value engineering (VE) approach applied to infrastructure works, namely stormwater drainage. The method was adopted after the consultants had completed the detailed design. A Function Analysis System Technique (FAST) diagram and the VE job plan phases (information, function analysis, creative judgement, development, and recommendation) are used to scrutinize the initial design of the stormwater drainage. An estimated cost reduction of 2% over the initial proposal was obtained using the VE approach. This cost reduction comes from the design optimization of the drainage foundation and structural system, where the pile design and drainage base structure are optimized. Likewise, the design of the on-site detention tank (OSD) pump was revised and contributed to the cost reduction obtained. This case study shows that the VE approach can be an important tool in optimizing a design to reduce costs.

Keywords: value engineering, function analysis system technique, stormwater drainage, cost reduction

Procedia PDF Downloads 131
15350 An Advanced Automated Brain Tumor Diagnostics Approach

Authors: Berkan Ural, Arif Eser, Sinan Apaydin

Abstract:

Medical image processing has generally become a challenging task nowadays, and processing of brain MRI images is one of the more difficult parts of this area. This study proposes a well-defined hybrid approach consisting of tumor detection, extraction, and analysis steps. The approach is built around a computer-aided diagnostics system for identifying and detecting tumor formation in any region of the brain, intended for early prediction of brain tumors using advanced image processing and probabilistic neural network methods. In this approach, advanced noise removal functions and image processing methods such as automatic segmentation and morphological operations are used to detect the brain tumor boundaries and to obtain the important feature parameters of the tumor region. All stages of the approach are implemented in MATLAB. First, the tumor is detected and the tumor area is contoured with a colored circle by the computer-aided diagnostics program. Then, the tumor is segmented and morphological operations are applied to increase the visibility of the tumor area. In parallel, the tumor area and important shape-based features are calculated. Finally, using the probabilistic neural network method and a set of classification steps, the tumor area and the type of the tumor are obtained. A future aim of this study is to detect the severity of lesions across classes of brain tumor through advanced multi-class classification and neural network stages and to create a user-friendly environment using a GUI in MATLAB. In the experimental part of the study, 100 images are used to train the diagnostics system and 100 out-of-sample images are used to test and check the results. The preliminary results demonstrate high classification accuracy for the neural network structure. These results also motivate extending the framework to detect and localize tumors in other organs.
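
A minimal sketch of the shape-feature classification stage, written as a Python stand-in rather than the MATLAB pipeline described above; a Gaussian naive Bayes classifier stands in for the probabilistic neural network, and the feature values and labels are hypothetical.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB   # stand-in for a probabilistic neural network

# hypothetical shape-based features per segmented tumor region:
# [area (px), perimeter (px), circularity, mean intensity]
X_train = np.array([
    [1200, 140, 0.77, 0.62], [1800, 190, 0.63, 0.71], [600, 95, 0.84, 0.55],
    [2500, 260, 0.47, 0.80], [900, 115, 0.81, 0.58], [2100, 230, 0.50, 0.76],
])
y_train = np.array([0, 1, 0, 1, 0, 1])       # 0 = benign-like, 1 = malignant-like (hypothetical)

clf = GaussianNB().fit(X_train, y_train)

X_test = np.array([[1500, 170, 0.66, 0.69]])  # features of a newly segmented region
print("predicted class:", clf.predict(X_test)[0],
      "posterior probabilities:", clf.predict_proba(X_test)[0].round(3))
```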

Keywords: image processing algorithms, magnetic resonance imaging, neural network, pattern recognition

Procedia PDF Downloads 397
15349 The Determinants of Customer’s Purchase Intention of Islamic Credit Card: Evidence from Pakistan

Authors: Nasir Mehmood, Muhammad Yar Khan, Anam Javeed

Abstract:

This study aims to scrutinize the factors that impact customers' intention to purchase an Islamic credit card and the nexus of product knowledge and religiosity with the attitude of potential Islamic credit card customers. The theory of reasoned action strengthens the idea that intentions, owing to their proven predictive power, are most likely to instigate the intended consumer behavior. In particular, the study examines the relationships of perceived financial cost (PFC), subjective norms (SN), and attitude (ATT) with the intention to purchase Islamic credit cards. Using a convenience sampling approach, data were collected from 450 customers of banks located in Rawalpindi and Islamabad. A five-point Likert scale self-administered questionnaire was used to collect the data. The data were analyzed using the Statistical Package for the Social Sciences (SPSS) through principal component and multiple regression analysis. The results suggest that customers' religiosity and product knowledge are strong indicators of attitude towards buying Islamic credit cards. Likewise, subjective norms, attitude, and perceived financial cost have a significant positive impact on customers' purchase intent for Islamic banks' credit cards. This study models a useful path for future researchers to further investigate the underlying phenomenon, along with a variety of psychodynamic factors, which is still in its infancy, at least in the Pakistani banking sector. The study also provides insight for practitioners and Islamic bank managers, directing their efforts toward educating customers about the use of Islamic credit cards and other financial products.
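
A minimal sketch, in Python rather than the SPSS procedures mentioned above, of principal component extraction followed by multiple regression on purchase intention; the survey scores and construct names here are hypothetical, synthetic stand-ins for the questionnaire data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 120
# hypothetical Likert-scale construct scores
religiosity = rng.integers(1, 6, n).astype(float)
knowledge = rng.integers(1, 6, n).astype(float)
subjective_norms = rng.integers(1, 6, n).astype(float)
perceived_cost = rng.integers(1, 6, n).astype(float)
attitude = 0.5 * religiosity + 0.4 * knowledge + rng.normal(0, 0.5, n)
intention = 0.4 * attitude + 0.3 * subjective_norms + 0.2 * perceived_cost + rng.normal(0, 0.5, n)

# principal component step: how much variance the first two components explain
items = np.column_stack([religiosity, knowledge, subjective_norms, perceived_cost, attitude])
pca = PCA(n_components=2).fit(items)
print("explained variance ratio:", pca.explained_variance_ratio_.round(3))

# multiple regression of intention on ATT, SN, and PFC
X = np.column_stack([attitude, subjective_norms, perceived_cost])
reg = LinearRegression().fit(X, intention)
print("regression coefficients (ATT, SN, PFC):", reg.coef_.round(3))
```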

Keywords: attitude, Islamic credit card, religiosity, subjective norms

Procedia PDF Downloads 127
15348 Study on the Relationship between Obesity Indicators and Mineral Status in Qatari Adults

Authors: Alaa A. H. Shehada, Eman Abdelnasser Abouhassanein, Reem Mohsen Ali, Joyce J. Moawad, Hiba Bawadi, Abdelhamid Kerkadi

Abstract:

Background: The association between obesity and micronutrient deficiencies is well documented; among the minerals that have been widely studied are zinc, iron, and magnesium. Objectives: This study aims to determine the association between obesity indices and mineral status among Qatari adults. Methods: Secondary data were obtained from Qatar Biobank. A total of 414 healthy Qataris aged 20-50 years were randomly selected from the database. Anthropometric measurements (waist circumference, weight, and height), body fat, and mineral status (Fe, Mg, Ca, K, Na) were obtained for all selected participants. Differences in anthropometric measurements and mineral status were analyzed by t-test or ANOVA. Spearman correlation coefficients were determined to assess the association between minerals and anthropometric variables. Statistical significance for the hypothesis tests was set at p < 0.05. All statistical analyses were performed using SPSS software version 23.0. Results: Iron, calcium, and sodium levels decreased with increasing body mass index. Moreover, only iron showed a significant correlation as waist circumference and waist-to-height ratio increased. Additionally, calcium, iron, magnesium, and sodium had statistically significant negative correlations with total body fat percentage and trunk fat percentage. In general, there were statistically significant negative correlations between anthropometrics and minerals. Conclusion: Body fat and trunk fat percentage had a significant inverse relationship with iron, calcium, sodium, and magnesium, while there was no correlation between body fat or trunk fat percentage and potassium.
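
A minimal sketch, in Python rather than the SPSS workflow described above, of the Spearman correlation and independent t-test analyses; the sample values are synthetic and not the Qatar Biobank data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 414
body_fat_pct = rng.normal(30, 8, n)                          # hypothetical total body fat %
iron = 20 - 0.15 * body_fat_pct + rng.normal(0, 2, n)        # built-in negative association
potassium = rng.normal(4.3, 0.4, n)                          # no association by construction

# Spearman correlation between each mineral and body fat percentage
for name, mineral in [("iron", iron), ("potassium", potassium)]:
    rho, p = stats.spearmanr(mineral, body_fat_pct)
    print(f"{name}: rho = {rho:.3f}, p = {p:.4f}")

# independent t-test comparing iron between two adiposity groups (hypothetical median split)
high_fat = body_fat_pct > np.median(body_fat_pct)
t, p = stats.ttest_ind(iron[high_fat], iron[~high_fat])
print(f"iron by adiposity group: t = {t:.2f}, p = {p:.4f}")
```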

Keywords: Qatar biobank, body fat distribution, mineral status, Qatari adults

Procedia PDF Downloads 130
15347 Predictive Semi-Empirical NOx Model for Diesel Engine

Authors: Saurabh Sharma, Yong Sun, Bruce Vernham

Abstract:

Accurate prediction of NOx emission is a continuous challenge in the field of diesel engine-out emission modeling. Performing experiments for every condition and scenario costs a significant amount of money and man-hours; therefore, a model-based development strategy has been implemented to address this issue. NOx formation is highly dependent on the burned gas temperature and the O2 concentration inside the cylinder. Current empirical models are developed by calibrating parameters representing the engine operating conditions against the measured NOx, which limits the prediction of purely empirical models to the region where they have been calibrated. An alternative solution is presented in this paper, which focuses on utilizing in-cylinder combustion parameters to form a predictive semi-empirical NOx model. The outcome of this work is a fast and predictive NOx model built from physical parameters and empirical correlations. The model is developed from steady-state data collected over the entire operating region of the engine and from a predictive combustion model developed in Gamma Technologies (GT)-Power using the Direct Injection (DI)-Pulse combustion object. In this approach, the temperatures in both the burned and unburned zones are considered during the combustion period, i.e., from Intake Valve Closing (IVC) to Exhaust Valve Opening (EVO). The oxygen concentration consumed in the burned zone and the trapped fuel mass are also considered while developing the reported model. Several statistical methods are used to construct the model, including individual machine learning methods and ensemble machine learning methods. A detailed validation of the model on multiple diesel engines is reported in this work. A substantial number of cases is tested for different engine configurations over a large span of speed and load points. Different sweeps of operating conditions such as Exhaust Gas Recirculation (EGR), injection timing, and Variable Valve Timing (VVT) are also considered for the validation. The model shows very good predictability and robustness at both sea-level and altitude conditions under different ambient conditions. Its advantages, such as high accuracy and robustness at different operating conditions, low computational time, and the lower number of data points required for calibration, establish a platform on which the model-based approach can be used for the engine calibration and development process. Moreover, this work aims to establish a framework for future model development for other targets such as soot, Combustion Noise Level (CNL), and the NO2/NOx ratio.
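
A minimal sketch, not the reported model, of fitting individual and ensemble machine learning regressors to predict NOx from in-cylinder combustion parameters; the data below are synthetic stand-ins for the steady-state measurements, and the feature names are assumed.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(11)
n = 400
burned_T = rng.uniform(1600, 2600, n)          # burned-zone temperature (K), hypothetical
o2_conc = rng.uniform(0.02, 0.12, n)           # burned-zone O2 mole fraction, hypothetical
trapped_fuel = rng.uniform(10, 80, n)          # trapped fuel mass (mg), hypothetical
nox = 5e-3 * np.exp(burned_T / 400) * np.sqrt(o2_conc) * trapped_fuel + rng.normal(0, 2, n)

X = np.column_stack([burned_T, o2_conc, trapped_fuel])
X_tr, X_te, y_tr, y_te = train_test_split(X, nox, test_size=0.25, random_state=0)

# compare an individual method with two ensemble methods on held-out points
for model in (LinearRegression(), RandomForestRegressor(random_state=0),
              GradientBoostingRegressor(random_state=0)):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, "R2 =", round(r2_score(y_te, model.predict(X_te)), 3))
```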

Keywords: diesel engine, machine learning, NOₓ emission, semi-empirical

Procedia PDF Downloads 101
15346 Variable Mapping: From Bibliometrics to Implications

Authors: Przemysław Tomczyk, Dagmara Plata-Alf, Piotr Kwiatek

Abstract:

Literature review is indispensable in research, and one of the key techniques used in it is bibliometric analysis, of which science mapping is one method. The classic approach that dominates this area today consists of mapping areas, keywords, terms, authors, or citations, and it is also used for literature reviews in the field of marketing. The development of technology means that researchers and practitioners use the capabilities of commercially available software for this purpose. The use of science mapping software tools (e.g., VOSviewer, SciMAT, Pajek) in recent publications involves the implementation of a literature review, and it is useful in areas with a relatively high number of publications. Although this well-grounded science mapping approach has been applied in literature reviews, performing them remains a painstaking task, especially if authors would like to draw precise conclusions about the studied literature and uncover potential research gaps. The aim of this article is to identify to what extent a new approach to science mapping, variable mapping, holds an advantage over the classic science mapping approach in terms of research problem formulation and content/thematic analysis for literature reviews. To perform the analysis, a set of 5 articles on customer ideation was chosen. Next, keyword mapping of these articles was carried out in the VOSviewer science mapping software and compared with a variable map prepared manually for the same articles. Seven independent expert judges (management scientists at different levels of expertise) assessed the usability of both approaches for formulating the research problem and for content/thematic analysis. The results show the advantage of variable mapping in the formulation of the research problem and in thematic/content analysis. First, the ability to identify a research gap is clearly visible due to the transparent and comprehensive analysis of the relationships between variables, not only keywords. Second, the analysis of relationships between variables enables the creation of a story that indicates the directions of those relationships. Demonstrating the advantage of the new approach over the classic one may be a significant step towards developing a new approach to the synthesis of literature and its reviews. Variable mapping seems to allow scientists to build clear and effective models presenting the scientific achievements of a chosen research area in one simple map. Additionally, developing software that automates the variable mapping process on large data sets could be a breakthrough in conducting literature research.

Keywords: bibliometrics, literature review, science mapping, variable mapping

Procedia PDF Downloads 96
15345 The Impact of Quality Cost on Revenue Sharing in Supply Chain Management

Authors: Fayza M. Obied-Allah

Abstract:

Meeting customers' needs and creating quality and value while reducing costs through supply chain management provides challenges and opportunities for companies and researchers. In light of these challenges, modern ideas must help counter them and exploit the opportunities; perhaps this paper will be one such contribution. This paper discusses the impact of quality cost on revenue sharing as one of the most important incentives for configuring business networks. Costs directly affect the size of the income generated by a business network, so this paper investigates the impact of quality costs on business network revenue and on the decision to share that revenue among the companies in the supply chain. This paper develops the quality cost approach to align with the modern era: the developed model includes five categories, adding to the four well-known categories (prevention costs, appraisal costs, internal failure costs, and external failure costs) a new category developed in this research as a new vision of the relationship between quality costs and industrial innovation. This new category is recycle cost. The paper is organized into six sections. Section I gives an overview of quality costs in the supply chain. Section II discusses revenue sharing between the parties in the supply chain. Section III investigates the impact of quality costs on the revenue-sharing decision between partners in the supply chain. Section IV presents a survey study and its statistical results. Section V discusses the results and points out future research opportunities. Finally, Section VI summarizes the theoretical and practical results of the paper.

Keywords: quality cost, recycle cost, revenue sharing, supply chain management

Procedia PDF Downloads 427
15344 Challenges Faced by Visually Impaired Children and Their Parents in Doing Homework Assignments Using Braille

Authors: Shazia Farooq Mirza

Abstract:

The purpose of this study was to explore the challenges faced by visually impaired children and their parents in doing homework assignments using Braille. The study followed a quantitative approach and was descriptive in nature. It took place in 6 public and special private schools in Lahore. A total of 177 visually impaired children in grades 4-10 and 153 parents of visually impaired children were the volunteer participants of this study, selected through a convenience sampling method. A survey method was adopted for data collection, and for this purpose two self-developed, validated questionnaires were used as instruments. The instruments were constructed by exploring the factors and sub-factors from the literature review. Thirty students with visual impairment and 30 parents of students with visual impairment completed the questionnaires as a pilot study, which ensured the reliability of the instruments. Data were analyzed using the Statistical Package for the Social Sciences and fully interpreted. Findings revealed that the common challenges faced by the students with visual impairment were physical stress, readiness, Braille knowledge, Braille skill, and communication. The major challenges faced by the parents of the students with visual impairment were the availability of helping material, the availability of reading material, Braille knowledge, Braille skills, school and family interactions, behavior management, and the environment and equipment. Conclusions were drawn on the basis of the major findings, and future suggestions are given in light of the conclusions. This study will be beneficial for children with visual impairment, their parents, special education teachers, and the policymakers of special schools.

Keywords: challenges, visually impaired children, homework, parents, braille

Procedia PDF Downloads 100
15343 Analysis of Earthquake Potential and Shock Level Scenarios in South Sulawesi

Authors: Takhul Bakhtiar

Abstract:

In South Sulawesi Province, there is an active Walanae Fault that causes this area to frequently experience earthquakes. This study aims to determine the seismicity level of the region in order to estimate the potential for future earthquakes. The estimated earthquake potential is then used to build a scenario model of the expected shaking level as an effort to mitigate earthquake disasters in the region. The method used in this study is the Gutenberg-Richter method through a statistical likelihood approach. The study used earthquake data for the South Sulawesi region from 1972 to 2022. The research area lies at coordinates 3.5°–5.5° South Latitude and 119.5°–120.5° East Longitude and is divided into two segments: the northern segment at 3.5°–4.5° South Latitude and 119.5°–120.5° East Longitude, and the southern segment at 4.5°–5.5° South Latitude and 119.5°–120.5° East Longitude. The study uses earthquake parameters with a magnitude > 1 and a depth < 50 km. The results of the analysis show that the probability of an earthquake with magnitude M = 7 in the northern segment within the next ten years is estimated at 98.81%, with an estimated shaking level of VI-VII MMI around the cities of Pare-Pare, Barru, Pinrang, and Soppeng and IV-V MMI in the cities of Bulukumba, Selayar, Makassar, and Gowa. In the southern segment, the probability of an earthquake with magnitude M = 7 within the next ten years is estimated at 32.89%, with an estimated shaking level of VI-VII MMI in the cities of Bulukumba, Selayar, Makassar, and Gowa, and III-IV MMI around the cities of Pare-Pare, Barru, Pinrang, and Soppeng.
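
A minimal sketch, not the authors' code, of the Gutenberg-Richter / likelihood workflow described above: estimating the b-value with the Aki maximum-likelihood formula, extrapolating the annual rate of M ≥ 7 events, and converting it to a ten-year probability under a Poisson assumption. The magnitude catalogue below is synthetic.

```python
import numpy as np

def aki_b_value(magnitudes, m_min):
    """Aki (1965) maximum-likelihood estimate of the Gutenberg-Richter b-value."""
    m = np.asarray(magnitudes)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)

def annual_rate(magnitudes, years, m_min, b, m_target):
    """Annual rate of events with M >= m_target from log10 N(M>=m) = a - b*m."""
    rate_mmin = np.sum(np.asarray(magnitudes) >= m_min) / years
    a = np.log10(rate_mmin) + b * m_min
    return 10 ** (a - b * m_target)

def poisson_probability(rate, window_years):
    """Probability of at least one event in the window (Poisson assumption)."""
    return 1.0 - np.exp(-rate * window_years)

# hypothetical 50-year catalogue for one segment
mags = np.random.default_rng(0).exponential(scale=0.6, size=2000) + 1.0
b = aki_b_value(mags, m_min=1.0)
lam = annual_rate(mags, years=50, m_min=1.0, b=b, m_target=7.0)
print(f"b-value = {b:.2f}, P(M>=7 in 10 yr) = {poisson_probability(lam, 10):.2%}")
```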

Keywords: Gutenberg Richter, likelihood method, seismicity, shakemap and MMI scale

Procedia PDF Downloads 107
15342 An Essay on Origamic and Isomorphic Approach as Interface of Form in Architectural Basic Design Education

Authors: Gamze Atay, Altay Colak

Abstract:

It is a fact that today's technology shapes the change and development of architectural forms by creating different perspectives. This research is an experimental study that explores how architectural forms emerging from this process of change and development can be integrated into design education through traditional design tools. An examination of practices in the studio environment shows that students who have just started their architectural education have difficulty arriving at form. The main objective of this study has been to enable students to use and interpret different disciplines in the design process to improve their perception of form. In this sense, origami, defined as "the art of paper folding", and isomorphic (equally formed) approaches have been used with design studio students at the beginning stage as methods for three-dimensional thinking and form creation. The two methods were examined with students in three stages: analysis, creation, and outcome. As a result of the study, it was seen that using different disciplines as a method during form creation gave the students' designs originality, freedom, and dynamism.

Keywords: architectural form, design education, isomorphic approach, origamic approach

Procedia PDF Downloads 130
15341 The Role of Artificial Intelligence in Criminal Procedure

Authors: Herke Csongor

Abstract:

Artificial intelligence (AI) has been used in the decision-making process of the criminal justice system in the United States of America for decades. In the field of law, including criminal law, AI can provide serious assistance in decision-making in many places. The paper reviews four main areas where AI already plays a role in the criminal justice system and where it is expected to play an increasingly important one. The first area is predictive policing: a number of algorithms are used to prevent the commission of crimes by predicting potential crime locations or perpetrators. This includes so-called hot-spot analysis, crime linking, and predictive coding. The second area is Big Data analysis: huge data sets are already opaque to human analysts and therefore unprocessable by hand. Law is one of the largest producers of digital documents (since not only decisions but nowadays the entire document material is available digitally), and this volume can only be handled with the help of computer programs, on which the development of AI systems can have an increasing impact. The third area is criminal statistical data analysis. The collection of statistical data using traditional methods required enormous human resources. AI is a huge step forward in that it can analyze the database itself: a collection according to any requested aspect can be available in a few seconds, and the AI can indicate whether it finds an important connection from the point of view of either crime prevention or crime detection. Finally, the use of AI during decision-making in both the investigative and judicial fields is analyzed in detail. While some are skeptical about the future role of AI in decision-making, many believe that the question is not whether AI will participate in decision-making, but only when and to what extent it will transform the current decision-making system.

Keywords: artificial intelligence, international criminal cooperation, planning and organizing of the investigation, risk assessment

Procedia PDF Downloads 20
15340 Efficacy of the Use of Different Teaching Approaches of Math Teachers

Authors: Nilda San Miguel, Elymar Pascual

Abstract:

The main focus of this study is to explore the effective approaches to teaching Mathematics that are being applied in public schools in school year 2018-2019. This research was written as an output connected to the district-wide School Learning Action Cell (DISLAC) on Math teaching approaches recently conducted in Victoria, Laguna. Fifty-four math teachers from 17 schools in Victoria became the respondents of this study. A qualitative research method was applied. Teachers' responses to the following concerns were gathered, analyzed, and interpreted: (1) evaluation of the recently conducted DISLAC, (2) status of the use of different approaches, (3) perception of the effective use of approaches, (4) preference for an approach to explore in classroom sessions, (5) factors affecting the choice of approach, (6) difficulties encountered, and (7) perceived benefit to learners. Results showed that the conduct of DISLAC was very highly satisfactory (mean 4.41). Teachers regarded the collaborative approach as very highly effective (mean 4.74). Fifty-two percent of the teachers are using the collaborative approach, 17% constructivist, 11% integrative, 11% inquiry-based, and 9% reflective. The reflective approach was chosen by most of the respondents (29%) to be explored in future sessions. The difficulties encountered by teachers in using the different approaches are: (1) learners' difficulty in following instructions, (2) lack of focus, (3) lack of willingness and cooperation, (4) teachers' lack of mastery in using different approaches, and (5) lack of time for preparing visual aids because of time mismanagement. Teachers deemed that the use of various teaching approaches can help learners achieve (1) mastery of competency, (2) increased communication, (3) improved confidence, (4) facility in comprehension, and (5) better academic output. The results obtained from this study can be used as an input for SLACs. Recommendations at the end of the study were given to school/district heads and future researchers.

Keywords: approaches, collaborative, constructivism, inquiry-based, integrative, reflective

Procedia PDF Downloads 263
15339 The Importance of the Historical Approach in the Linguistic Research

Authors: Zoran Spasovski

Abstract:

The paper briefly discusses the significance and benefits of the historical approach in the research of languages by presenting examples from the fields of phonetics and phonology, lexicology, morphology, syntax, and even onomastics (toponymy and anthroponymy). The examples from phonetics/phonology include insights into animal speech and its evolution into human speech and the evolution of the sounds of human speech from vowels to glides and consonants and from velar to palatal consonants, etc., drawing on well-known examples from earlier researchers. Those from lexicology briefly show the formation of lexemes and their evolution; morphology and syntax are explained through examples of the development of grammatical and syntactic forms; and the importance of the historical approach in the research of place names and personal names is briefly outlined through examples of place names, personal names, and surnames in different languages, together with the conclusions that follow from them.

Keywords: animal speech, glottogenesis, grammar forms, lexicology, place-names, personal names, surnames, syntax categories

Procedia PDF Downloads 63
15338 Discriminant Shooting-Related Statistics between Winners and Losers 2023 FIBA U19 Basketball World Cup

Authors: Navid Ebrahmi Madiseh, Sina Esfandiarpour-Broujeni, Rahil Razeghi

Abstract:

Introduction: Quantitative analysis of game-related statistical parameters is widely used to evaluate basketball performance at both the individual and team levels. Non-free-throw shooting plays a crucial role as the primary scoring method, holding significant importance in the technical aspect of the game. The predictive value of game-related statistics in relation to various contextual and situational variables has been explored, and many similarities and differences have been found between different age groups and levels of competition. For instance, in the World Basketball Championships after the 2010 rule change, 2-point field goals distinguished winners from losers in women's games but not in men's games, and the impact of successful 3-point field goals on women's games was minimal. This study aimed to identify and compare discriminant shooting-related statistics between winning and losing teams in the men's and women's FIBA U19 Basketball World Cup 2023 tournaments. Method: Data from 112 observations (2 per game) of 16 teams (for each gender) in the FIBA U19 Basketball World Cup 2023 were selected as samples. The data were obtained from the official FIBA website using Python. Specific information was extracted, organized into a DataFrame, and consisted of twelve variables, including shooting percentages, attempts, and scoring ratios for 3-pointers, mid-range shots, paint shots, and free throws, defined as follows: (1) Made% = successful attempts of a scoring type / total attempts of that scoring type; (2) Free-throw-pts% (free throw score ratio) = (free throw score / total score) × 100; (3) Mid-pts% (mid-range score ratio) = (mid-range score / total score) × 100; (4) Paint-pts% (paint score ratio) = (paint score / total score) × 100; (5) 3p-pts% (three-point score ratio) = (three-point score / total score) × 100. Independent t-tests were used to examine significant differences in shooting-related statistical parameters between winning and losing teams for both genders. Statistical significance was set at p < 0.05. All statistical analyses were completed with SPSS, Version 18. Results: The results showed that 3p-made%, mid-pts%, paint-made%, paint-pts%, mid-attempts, and paint-attempts were significantly different between winners and losers in men (t = -3.465, p < 0.05; t = 3.681, p < 0.05; t = -5.884, p < 0.05; t = -3.007, p < 0.05; t = 2.549, p < 0.05; t = -3.921, p < 0.05). For women, significant differences between winners and losers were found for 3p-made%, 3p-pts%, paint-made%, and paint-attempts (t = -6.429, p < 0.05; t = -1.993, p < 0.05; t = -1.993, p < 0.05; t = -4.115, p < 0.05; t = 2.451, p < 0.05). Discussion: The research aimed to compare shooting-related statistics between winners and losers in men's and women's teams at the FIBA U19 Basketball World Cup 2023. Results indicated that men's winners excelled in 3p-made%, paint-made%, paint-pts%, paint-attempts, and mid-attempts, consistent with previous studies. This study found that losers in men's teams had higher mid-pts% than winners, which was inconsistent with previous findings; it has previously been indicated that winners tend to prioritize statistically efficient shots while forcing the opponent to take mid-range shots. In women's games, significant differences in 3p-made%, 3p-pts%, paint-made%, and paint-attempts were observed, indicating that winners relied on riskier outside scoring strategies. Overall, winners exhibited higher accuracy in paint and 3-point shooting than losers, but they also relied more on outside offensive strategies. Additionally, winners acquired a higher ratio of their points from 3-point shots, which demonstrates their confidence in their skills and willingness to take risks at this competitive level.
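
A minimal sketch, in Python rather than the SPSS workflow used in the study, of the ratio formulas (1)-(5) and the independent t-test described above; the per-team observations below are hypothetical.

```python
import numpy as np
from scipy import stats

def made_pct(made, attempts):
    """Shooting percentage of one shot type, formula (1)."""
    return made / attempts * 100.0

def pts_ratio(score_of_type, total_score):
    """Score ratio of one shot type, formulas (2)-(5), e.g. 3p-pts%."""
    return score_of_type / total_score * 100.0

# hypothetical samples: 3p-pts% for winning vs losing teams
winners_3p_pts = np.array([32.1, 28.4, 35.0, 30.2, 27.8, 33.5])
losers_3p_pts = np.array([22.3, 25.1, 20.8, 24.6, 23.0, 21.7])

t, p = stats.ttest_ind(winners_3p_pts, losers_3p_pts)   # independent two-sample t-test
print(f"t = {t:.3f}, p = {p:.4f}, significant at 0.05: {p < 0.05}")
```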

Keywords: gender, losers, shoot-statistic, U19, winners

Procedia PDF Downloads 78
15337 Africa and the Gas Supply Crisis to European Countries under the Russian-Ukrainian War: A Study on the Nigerian-Algerian Gas Pipeline Project's Importance

Authors: Mohammed Lamine Benaouda

Abstract:

This paper seeks to shed light on the African continent's role in the crisis of natural gas supplies to European countries that resulted from the repercussions of the Russian-Ukrainian war, by examining the case of re-launching the Trans-Saharan Gas Pipeline project between Nigeria and Algeria and clarifying the mutual, long-run strategic importance of this project. The paper relies on analytical and statistical methods to determine, on the one hand, the impact the project could have on the huge needs of the European gas market and, on the other, to monitor the various economic gains for Algeria and Nigeria; in addition, a comparative approach is used to assess the possible effects of the project's success and economic feasibility for all its beneficiaries. The paper finds that complexity has multiplied in the global energy market in general, and in the European one in particular, following the repercussions of the Russian-Ukrainian war, and that African countries have become extremely important poles in the arena of the international struggle over resources, which allows them a margin for maneuvering and for regional and global influence in various fields. With regard to the research outcomes and future scope, the researcher believes that the African continent, in light of international competition and conflict, as well as the current rebalancing of power in the international system, will play very important roles, especially given its enormous natural and human capabilities, which enable it to weigh in on future conflicts over energy and spheres of influence.

Keywords: Algeria, Nigeria, West Africa, ECOWAS, gas supplies, Russia, Ukraine

Procedia PDF Downloads 58
15336 Multi-Stage Classification for Lung Lesion Detection on CT Scan Images Applying Medical Image Processing Technique

Authors: Behnaz Sohani, Sahand Shahalinezhad, Amir Rahmani, Aliyu Aliyu

Abstract:

Recently, medical imaging, and specifically medical image processing, has become one of the most dynamically developing areas of medical science. It has led to the emergence of new approaches to the prevention, diagnosis, and treatment of various diseases. In the process of diagnosing lung cancer, medical professionals rely on computed tomography (CT) scans, in which failure to correctly identify masses can lead to incorrect diagnosis or sampling of lung tissue. Identification and demarcation of masses when detecting cancer within lung tissue are critical challenges in diagnosis. In this work, a segmentation system based on image processing techniques has been applied for detection purposes. In particular, the use and validation of a novel lung cancer detection algorithm are presented through simulation, performed on CT images using multilevel thresholding. The proposed technique consists of segmentation, feature extraction, and feature selection and classification. In more detail, the features carrying useful information are selected after feature extraction. Eventually, the output image of the lung cancer is obtained with 96.3% accuracy and 87.25%. The purpose of feature extraction in the proposed approach is to transform the raw data into a more usable form for subsequent statistical processing. Future steps will involve employing the current feature extraction method to achieve more accurate resulting images, including further details available to machine vision systems to recognise objects in lung CT scan images.
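
A minimal sketch, not the authors' implementation, of multilevel-threshold segmentation followed by simple shape-feature extraction; a built-in sample image stands in for a lung CT slice, and the cleanup parameters are assumed values.

```python
import numpy as np
from skimage import data, filters, morphology, measure

# a built-in sample image stands in for a lung CT slice here; in practice the
# slice would be loaded from the scan (e.g. with skimage.io.imread or pydicom)
image = data.camera() / 255.0

# multilevel (multi-Otsu) thresholding: split intensities into 3 classes
thresholds = filters.threshold_multiotsu(image, classes=3)
regions = np.digitize(image, bins=thresholds)

# keep the brightest class as the candidate mask, then clean it up morphologically
mask = regions == regions.max()
mask = morphology.remove_small_objects(mask, min_size=64)
mask = morphology.binary_closing(mask, morphology.disk(3))

# simple shape-based features for each candidate region
for prop in measure.regionprops(measure.label(mask)):
    print(prop.label, prop.area, round(prop.eccentricity, 3))
```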

Keywords: lung cancer detection, image segmentation, lung computed tomography (CT) images, medical image processing

Procedia PDF Downloads 73
15335 E-Management and Firm Performance: An Empirical Study in Tunisian Firms

Authors: Khlif Hamadi

Abstract:

The principal aim of our research is to analyze the impact of adopting an e-management approach on the performance of Tunisian firms. The structural equation modeling method was adopted to conduct our exploratory and confirmatory analysis. The results arising from the questionnaire sent to 155 e-managers affirm that the adoption of the e-management approach influences the performance of Tunisian firms. The questionnaire results also show that e-management favors the deployment of ICT usage and contributes enormously to the performance of the modern enterprise. The theoretical and practical implications of the study, as well as directions for future research, are discussed.

Keywords: e-management, ICT Deployment, organizational performance, e-manager

Procedia PDF Downloads 326
15334 Interaction between Space Syntax and Agent-Based Approaches for Vehicle Volume Modelling

Authors: Chuan Yang, Jing Bie, Panagiotis Psimoulis, Zhong Wang

Abstract:

Modelling and understanding vehicle volume distribution over the urban network are essential for urban design and transport planning. The space syntax approach has been widely applied as the main conceptual and methodological framework for contemporary vehicle volume models, with the help of the statistical method of multiple regression analysis (MRA). However, the MRA model with space syntax variables is limited in predicting vehicle volume because it does not account for the crossed effect of urban configurational characteristics and socio-economic factors. The aim of this paper is to construct models that capture the combined impact of street network structure and socio-economic factors. We present a multilevel linear (ML) and an agent-based (AB) vehicle volume model at the urban scale, both grounded in the space syntax theoretical framework. The ML model allows random effects of urban configurational characteristics across different urban contexts, while the AB model incorporates transformed space syntax components of the MRA models into the agents' spatial behaviour. The three models were implemented in the same urban environment. The ML model exhibits superiority over the original MRA model in identifying the relative impacts of configurational characteristics and macro-scale socio-economic factors that shape vehicle movement distribution over the city. Compared with the ML model, the suggested AB model demonstrates the ability to estimate vehicle volume in the urban network while considering the combined effects of configurational characteristics and land-use patterns at the street segment level.
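
A minimal sketch, not the authors' model, of a multilevel linear model with random intercepts across urban districts; the street-segment data, column names, and grouping are hypothetical stand-ins for the space syntax and socio-economic variables.

```python
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical street-segment data: vehicle volume, space syntax integration,
# land-use density, grouped by urban district
df = pd.DataFrame({
    "volume":      [420, 380, 510, 290, 305, 620, 450, 390, 270, 480],
    "integration": [1.2, 1.0, 1.5, 0.8, 0.9, 1.8, 1.3, 1.1, 0.7, 1.4],
    "density":     [35, 30, 42, 22, 25, 50, 38, 31, 20, 40],
    "district":    ["A", "A", "A", "B", "B", "B", "C", "C", "C", "C"],
})

# multilevel (mixed-effects) model: fixed effects for the predictors,
# random intercept per district to capture context-specific differences
model = smf.mixedlm("volume ~ integration + density", df, groups=df["district"])
result = model.fit()
print(result.summary())
```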

Keywords: space syntax, vehicle volume modeling, multilevel model, agent-based model

Procedia PDF Downloads 123
15333 Systematic Approach for Energy-Supply-Orientated Production Planning

Authors: F. Keller, G. Reinhart

Abstract:

The efficient and economic allocation of resources is one of the main goals in the field of production planning and control. Nowadays, a new variable is gaining importance throughout the planning process: energy. Energy efficiency has already been widely discussed in the literature, but with a strong focus on reducing the overall amount of energy used in production. This paper provides a brief systematic approach showing how energy-supply orientation can be used for energy-cost-efficient production planning, thus combining the ideas of energy efficiency and energy flexibility.

Keywords: production planning, production control, energy-efficiency, energy-flexibility, energy-supply

Procedia PDF Downloads 624
15332 Integration of EEG and Motion Tracking Sensors for Objective Measure of Attention-Deficit Hyperactivity Disorder in Pre-Schoolers

Authors: Neha Bhattacharyya, Soumendra Singh, Amrita Banerjee, Ria Ghosh, Oindrila Sinha, Nairit Das, Rajkumar Gayen, Somya Subhra Pal, Sahely Ganguly, Tanmoy Dasgupta, Tanusree Dasgupta, Pulak Mondal, Aniruddha Adhikari, Sharmila Sarkar, Debasish Bhattacharyya, Asim Kumar Mallick, Om Prakash Singh, Samir Kumar Pal

Abstract:

Background: We aim to develop an integrated device comprising a single-probe EEG sensor and a CCD-based motion sensor for a more objective measure of Attention-Deficit Hyperactivity Disorder (ADHD). While the integrated device (MAHD) relies on the EEG signal (spectral density of the beta wave) for the assessment of attention during a given structured task (painting three segments of a circle using three different colors, namely red, green, and blue), the CCD sensor captures the movement pattern of the subjects engaged in a continuous performance task (CPT). A statistical analysis of the attention and movement patterns was performed, and the accuracy of the completed tasks was analysed using indigenously developed software. The device with the embedded software, called MAHD, is intended to improve certainty with criterion E (i.e., whether symptoms are better explained by another condition). Methods: We used the EEG signal from a single-channel dry sensor placed on the frontal lobe of the head of the subjects (3-5-year-old pre-schoolers). During the painting of the three segments of a circle using three distinct colors (red, green, and blue), the absolute power of the delta and beta EEG waves from the subjects was found to be correlated with the relaxation and attention/cognitive load conditions. While the relaxation condition of the subject hints at hyperactivity, the more direct CCD-based motion sensor is used to track the physical movement of the subject engaged in a continuous performance task (CPT), i.e., separating variously colored balls from one table to another. We used our indigenously developed software for the statistical analysis to derive a scale for the objective assessment of ADHD, and we compared our scale with clinical ADHD evaluation. Results: In a limited clinical trial with preliminary statistical analysis, we found a significant correlation between the objective assessment of the ADHD subjects and the clinician's conventional evaluation. Conclusion: MAHD, the integrated device, is intended as an auxiliary tool to improve the accuracy of ADHD diagnosis by supporting greater criterion E certainty.
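
A minimal sketch, not the device firmware or the authors' software, of estimating delta- and beta-band absolute power from a single-channel EEG trace with Welch's method; the signal below is synthetic and the sampling rate is an assumed value.

```python
import numpy as np
from scipy.signal import welch

fs = 256                                   # assumed sampling rate in Hz
t = np.arange(0, 60, 1 / fs)               # one minute of signal
eeg = (np.sin(2 * np.pi * 2 * t) +         # delta component (~2 Hz)
       0.4 * np.sin(2 * np.pi * 20 * t) +  # beta component (~20 Hz)
       0.2 * np.random.default_rng(1).standard_normal(t.size))

def band_power(signal, fs, low, high):
    """Absolute band power by summing the Welch power spectral density over the band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    band = (freqs >= low) & (freqs <= high)
    return psd[band].sum() * (freqs[1] - freqs[0])

delta = band_power(eeg, fs, 0.5, 4)   # relaxation-related
beta = band_power(eeg, fs, 13, 30)    # attention / cognitive-load-related
print(f"delta power = {delta:.3f}, beta power = {beta:.3f}, beta/delta = {beta/delta:.3f}")
```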

Keywords: ADHD, CPT, EEG signal, motion sensor, psychometric test

Procedia PDF Downloads 82
15331 Verification of Satellite and Observation Measurements to Build Solar Energy Projects in North Africa

Authors: Samy A. Khalil, U. Ali Rahoma

Abstract:

Alongside ground measurements of solar radiation, satellite data have been routinely utilized to estimate solar energy; however, the temporal coverage of satellite data has some limits. Reanalysis, also known as "retrospective analysis" of the atmosphere's parameters, is produced by fusing the output of NWP (Numerical Weather Prediction) models with observational data from a variety of sources, including ground, satellite, ship, and aircraft observations. The result is a comprehensive record of the parameters affecting weather and climate. The effectiveness of the ERA-5 reanalysis dataset for North Africa was evaluated against high-quality surface measurements using statistical analysis, estimating the distribution of global solar radiation (GSR) over five chosen areas in North Africa during the ten-year period from 2011 to 2020. To investigate seasonal change in dataset performance, a seasonal statistical analysis was conducted, which showed a considerable difference in errors throughout the year. Altering the temporal resolution of the data used for comparison alters the performance of the dataset: better performance is indicated by the monthly mean values of the data, but data accuracy is degraded. Solar resource assessment and power estimation are discussed using the ERA-5 solar radiation data. The average values of the mean bias error (MBE), root mean square error (RMSE), and mean absolute error (MAE) of the reanalysis solar radiation data vary from 0.079 to 0.222, 0.055 to 0.178, and 0.0145 to 0.198, respectively, over the study period, and the correlation coefficient (R²) varies from 0.93 to 0.99. The objective of this research is to provide a reliable representation of the world's solar radiation to aid the use of solar energy in all sectors.
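
A minimal sketch, not the authors' workflow, of the validation statistics named above (MBE, RMSE, MAE, R²) between reanalysis estimates and ground measurements; the arrays are hypothetical daily GSR values.

```python
import numpy as np

def validation_stats(estimated, observed):
    e, o = np.asarray(estimated, float), np.asarray(observed, float)
    diff = e - o
    mbe = diff.mean()                          # mean bias error
    rmse = np.sqrt((diff ** 2).mean())         # root mean square error
    mae = np.abs(diff).mean()                  # mean absolute error
    r2 = np.corrcoef(e, o)[0, 1] ** 2          # squared correlation coefficient
    return mbe, rmse, mae, r2

era5 = np.array([5.8, 6.1, 6.4, 5.2, 7.0, 6.7])     # kWh/m^2/day, hypothetical
ground = np.array([5.6, 6.3, 6.2, 5.4, 6.8, 6.9])   # matching ground measurements

mbe, rmse, mae, r2 = validation_stats(era5, ground)
print(f"MBE={mbe:.3f}, RMSE={rmse:.3f}, MAE={mae:.3f}, R^2={r2:.3f}")
```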

Keywords: solar energy, ERA-5 analysis data, global solar radiation, North Africa

Procedia PDF Downloads 81
15330 Understanding Rural Teachers’ Perceived Intention of Using Play in ECCE Mathematics Classroom: Strength-Based Approach

Authors: Nyamela M. ‘Masekhohola, Khanare P. Fumane

Abstract:

The downward trend in mathematics attainment at all levels in Lesotho is compounded by the absence of innovative approaches to teaching and learning in early childhood. However, studies have shown that play pedagogy can be used to mitigate the challenges of mathematics education. Despite the benefits of play pedagogy for rural learners, its full potential has not been realized in early childhood care and education (ECCE) classrooms to improve children's performance in mathematics, because the adoption of play pedagogy depends on a strength-based approach. This study explores the potential of play pedagogy to improve mathematics education in early childhood care and education in Lesotho. The strength-based approach is known for its advocacy of recognizing and utilizing children's strengths, capacities, and interests. However, this approach and its promising attributes are not well known in Lesotho. In particular, little is known about the attributes of play pedagogy that are essential for improving mathematics education in ECCE programs in Lesotho. To identify such attributes and strengthen mathematics education, this systematic review examines published evidence on the strengths of play pedagogy that support the teaching and learning of mathematics in ECCE. The purpose of this review is, therefore, to identify and define the strengths of play pedagogy that support mathematics education. Moreover, the study intends to understand rural teachers' perceived intention of using play in ECCE math classrooms through a strength-based approach. Eight key strengths were found (cues for reflection, edutainment, mathematics language development, creativity and imagination, cognitive promotion, exploration, classification, and skills development). This study is the first to identify and define the strength-based attributes of play pedagogy for improving the teaching and learning of mathematics in ECCE centers in Lesotho. The findings reveal which opportunities teachers find important for improving the teaching of mathematics as early as the ECCE programs. We conclude by discussing the implications of the literature for stimulating dialogues towards formulating strength-based approaches to teaching mathematics, as well as reflecting on the broader contributions of play pedagogy as an asset for improving mathematics in Lesotho and beyond.

Keywords: early childhood education, mathematics education, Lesotho, play pedagogy, strength-based approach

Procedia PDF Downloads 117
15329 New Gas Geothermometers for the Prediction of Subsurface Geothermal Temperatures: An Optimized Application of Artificial Neural Networks and Geochemometric Analysis

Authors: Edgar Santoyo, Daniel Perez-Zarate, Agustin Acevedo, Lorena Diaz-Gonzalez, Mirna Guevara

Abstract:

Four new gas geothermometers have been derived from a multivariate geochemometric analysis of a geothermal fluid chemistry database; two of them use the natural logarithm of the CO₂ and H₂S concentrations (mmol/mol), respectively, and the other two use the natural logarithm of the H₂S/H₂ and CO₂/H₂ ratios. As a strict compilation criterion, the database was created with the gas-phase composition of fluids and bottomhole temperatures (BHTM) measured in producing wells. The calibration of the geothermometers was based on the geochemical relationship existing between the gas-phase composition of well discharges and the equilibrium temperatures measured at bottomhole conditions. Multivariate statistical analysis together with artificial neural networks (ANN) was successfully applied to correlate the gas-phase compositions and the BHTM. The predicted or simulated bottomhole temperatures (BHTANN), defined as output neurons or simulation targets, were statistically compared with the measured temperatures (BHTM). The coefficients of the new geothermometers were obtained from an optimized self-adjusting training algorithm applied to approximately 2,080 ANN architectures with 15,000 simulation iterations each. The self-adjusting training algorithm used the well-known Levenberg-Marquardt model and was used to calculate: (i) the number of neurons of the hidden layer; (ii) the training factor and the training patterns of the ANN; (iii) the linear correlation coefficient, R; (iv) the synaptic weighting coefficients; and (v) the statistical parameter Root Mean Squared Error (RMSE) to evaluate the prediction performance between the BHTM and the simulated BHTANN. The prediction performance of the new gas geothermometers, together with the predictions inferred from sixteen previously developed and well-known gas geothermometers, was statistically evaluated using an external database to avoid bias. The statistical evaluation was performed through the analysis of the lowest RMSE values computed among the predictions of all the gas geothermometers. The new gas geothermometers developed in this work have been successfully used for predicting subsurface temperatures in high-temperature geothermal systems of Mexico (e.g., Los Azufres, Mich., Los Humeros, Pue., and Cerro Prieto, B.C.) as well as in a blind geothermal system (known as Acoculco, Puebla). The latest results of the gas geothermometers (inferred from gas-phase compositions of soil-gas bubble emissions) compare well with the temperatures measured in two wells of the blind geothermal system of Acoculco, Puebla (México). Details of this new development are outlined in the present research work. Acknowledgements: The authors acknowledge the funding received from the CeMIE-Geo P09 project (SENER-CONACyT).
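
A minimal sketch, not the authors' Levenberg-Marquardt setup, of training a small neural network to map gas-phase composition to bottomhole temperature and evaluating it with RMSE on a held-out subset; the data are synthetic stand-ins for the compiled well database.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(42)
n = 200
ln_co2 = rng.uniform(3.0, 7.0, n)          # ln(CO2, mmol/mol), hypothetical
ln_h2s = rng.uniform(0.0, 4.0, n)          # ln(H2S, mmol/mol), hypothetical
X = np.column_stack([ln_co2, ln_h2s, ln_co2 - ln_h2s])
bht = 150 + 25 * ln_co2 + 10 * ln_h2s + rng.normal(0, 5, n)   # synthetic BHT (deg C)

# one hidden layer of 6 neurons; the adam solver stands in for Levenberg-Marquardt
model = MLPRegressor(hidden_layer_sizes=(6,), max_iter=5000, random_state=0)
model.fit(X[:150], bht[:150])                       # training subset
pred = model.predict(X[150:])                       # external validation subset
rmse = mean_squared_error(bht[150:], pred) ** 0.5   # evaluation metric used above
print(f"validation RMSE = {rmse:.2f} deg C")
```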

Keywords: artificial intelligence, gas geochemistry, geochemometrics, geothermal energy

Procedia PDF Downloads 329
15328 Lateral Retroperitoneal Transpsoas Approach: A Practical Minimal Invasive Surgery Option for Treating Pyogenic Spondylitis of the Lumbar Vertebra

Authors: Sundaresan Soundararajan, Chor Ngee Tan

Abstract:

Introduction: Pyogenic spondylitis, otherwise treated conservatively with long-term antibiotics, requires surgical debridement and reconstruction in about 10% to 20% of cases. The classical approach adopted by many surgeons has always been the anterior approach, which ensures thorough and complete debridement. This, however, comes with high rates of morbidity due to the nature of its access. The direct lateral retroperitoneal approach, whose use has been growing in degenerative lumbar disease, has the potential to treat pyogenic spondylitis given its ease of access and relatively low risk of complications. Aims/Objectives: The objective of this study was to evaluate the effectiveness and clinical outcome of lateral approach surgery in the surgical management of pyogenic spondylitis of the lumbar spine. Methods: A retrospective chart analysis was done on all patients who presented with pyogenic spondylitis (lumbar discitis/vertebral osteomyelitis) and had undergone direct lateral retroperitoneal lumbar vertebral debridement and posterior instrumentation between 2014 and 2016. Data on blood loss, surgical operating time, surgical complications, clinical outcomes, and fusion rates were recorded. Results: A total of 6 patients (3 male and 3 female) underwent this procedure at a single institution by a single surgeon during the defined period. One patient presented with an infected implant (PLIF) and vertebral osteomyelitis, while the other five presented with single-level spondylodiscitis. All patients underwent lumbar debridement, iliac strut grafting, and posterior instrumentation (revision of screws for the infected PLIF case). The mean operating time was 308.3 minutes for all 6 cases. Mean blood loss was 341 cc (range 200 cc to 600 cc). The presenting symptom of back pain resolved in all 6 cases, while the 2 cases that presented with lower limb weakness showed improvement in their neurological deficits. One patient had a dislodged strut graft during posterior instrumentation and needed graft revision intraoperatively. Infective markers subsequently normalized in all patients. All subjects also showed radiological evidence of fusion at 6-month follow-up. Conclusions: The lateral approach to treating pyogenic spondylitis is a viable option, as it allows debridement and reconstruction without the risks that come with other anterior approaches. It allows efficient debridement, short surgical time, moderate blood loss, and a low risk of vascular injury. The clinical outcomes and fusion rates achieved with this approach also support its use as a practical MIS surgical option for such infection cases.

Keywords: lateral approach, minimally invasive, pyogenic spondylitis, XLIF

Procedia PDF Downloads 160
15327 Active Features Determination: A Unified Framework

Authors: Meenal Badki

Abstract:

We address the issue of active feature determination, where the objective is to determine the set of examples for which additional data (such as lab tests) need to be gathered, given a large number of examples with only some features (such as demographics) and some examples with all the features (such as the complete Electronic Health Record). We note that certain features may be more costly, unique, or laborious to gather. Our proposal is a general active learning approach that is independent of classifiers and similarity metrics. It allows us to identify examples that differ from the full data set and obtain all the features for the examples that match. Our comprehensive evaluation, driven by four authentic clinical tasks, shows the efficacy of this approach.
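
A minimal sketch of one way such an active acquisition step could look, not the paper's method: choose the partially observed examples whose cheap features are farthest from any fully observed example, and request the costly features for those. The data, distance choice, and budget are hypothetical.

```python
import numpy as np
from sklearn.metrics import pairwise_distances

rng = np.random.default_rng(0)
partial = rng.normal(size=(500, 4))     # examples with only cheap features (e.g. demographics)
full = rng.normal(size=(50, 4))         # examples that already have all features

# distance from each partially observed example to its nearest fully observed one
d = pairwise_distances(partial, full).min(axis=1)

budget = 10                              # how many lab tests we can afford
query_idx = np.argsort(d)[-budget:]      # the most "novel" examples w.r.t. the full set
print("request complete features for examples:", sorted(query_idx.tolist()))
```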

Keywords: feature determination, classification, active learning, sample-efficiency

Procedia PDF Downloads 54
15326 Efficient Wind Fragility Analysis of Concrete Chimney under Stochastic Extreme Wind Incorporating Temperature Effects

Authors: Soumya Bhattacharjya, Avinandan Sahoo, Gaurav Datta

Abstract:

Wind fragility analysis of chimneys is often carried out disregarding the temperature effect. However, the combined effect of wind and temperature is the most critical limit state for chimney design. Hence, in the present paper, an efficient fragility analysis for a concrete chimney is explored under the combined wind and temperature effect. Wind time histories are generated from Davenport's power spectral density function using the weighted amplitude wave superposition technique. Fragility analysis is often carried out in a full Monte Carlo simulation framework, which requires extensive computational time. Thus, in the present paper, an efficient adaptive metamodelling technique is adopted to judiciously approximate the limit state function, which is subsequently used in the simulation framework. This saves substantial computational time and makes the approach computationally efficient. Uncertainty in the wind speed, wind-load-related parameters, and resistance-related parameters is considered. The results of the full simulation approach, the conventional metamodelling approach, and the proposed adaptive metamodelling approach are compared, and the effect of disregarding temperature in wind fragility analysis is highlighted.
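
A minimal sketch, not the authors' implementation, of generating a fluctuating wind speed time history from the Davenport spectrum by weighted amplitude wave superposition; the mean wind speed, drag coefficient, and record length are assumed values.

```python
import numpy as np

U10, kappa = 25.0, 0.005          # assumed mean wind speed at 10 m (m/s) and surface drag coefficient

def davenport_psd(f, U10=U10, kappa=kappa):
    """One-sided Davenport spectrum of along-wind turbulence, S(f) in (m/s)^2/Hz."""
    x = 1200.0 * f / U10
    return 4.0 * kappa * U10 ** 2 * x ** 2 / (f * (1.0 + x ** 2) ** (4.0 / 3.0))

dt, T = 0.1, 120.0                 # time step and record length (s)
n = int(T / dt)
freqs = np.fft.rfftfreq(n, dt)[1:]            # discrete frequencies, skip f = 0
df = freqs[1] - freqs[0]

rng = np.random.default_rng(7)
phases = rng.uniform(0, 2 * np.pi, freqs.size)
amps = np.sqrt(2.0 * davenport_psd(freqs) * df)     # weighted amplitude of each harmonic

# superpose the harmonics with random phases to obtain one realization
t = np.arange(n) * dt
fluctuation = (amps[:, None] * np.cos(2 * np.pi * freqs[:, None] * t + phases[:, None])).sum(axis=0)
wind = U10 + fluctuation
print(f"mean = {wind.mean():.2f} m/s, std = {wind.std():.2f} m/s")
```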

Keywords: adaptive metamodelling technique, concrete chimney, fragility analysis, stochastic extreme wind load, temperature effect

Procedia PDF Downloads 202
15325 Calculating Approach of Thermal Conductivity of 8 YSZ in Different Relative Humidities Corresponding to Low Water Contents

Authors: Yun Chol Kang, Myong Nam Kong, Nam Chol Yu, Jin Sim Kim, Un Yong Paek, Song Ho Kim

Abstract:

This study focuses on a calculation approach for the thermal conductivity of 8 mol% yttria-stabilized zirconia (8YSZ) in different relative humidities corresponding to low water contents. When the water content in 8YSZ is low, water droplets can accumulate in the neck regions. We assume that spherical water droplets are randomly located in the neck regions formed by grains and surrounded by the pores. Based on this, a new hypothetical pore constituted by air and water is proposed using microstructural modeling. We consider 8YSZ a two-phase material constituted by the solid region and the hypothetical pore region in which the water droplets randomly penetrate the pores. The thermal conductivity of the hypothetical pore is calculated using a parallel resistance model for low water contents, and the effective thermal conductivity of the 8YSZ material constituted by the solid and the hypothetical pore in different relative humidities is calculated using EMPT. When the number of water layers on the surface of 8YSZ is less than 1.5, the proposed approach gives a good interpretation of the experimental results. When the theoretical value of the number of water layers on the 8YSZ surface is 1, the water content is not enough to cover the internal solid surface completely. Thus, the proposed approach gives a better interpretation of the experimental results in different relative humidities for which the number of water layers on the surface of 8YSZ is less than 1.5.
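
A minimal sketch, not the authors' derivation, of the parallel mixing rule for the air-water hypothetical pore and a simple Maxwell-Eucken effective-medium estimate (used here as a generic stand-in for the EMPT step) for the solid plus pore composite; all conductivity values, the porosity, and the humidity-to-water-fraction mapping are assumed.

```python
k_air, k_water, k_solid = 0.026, 0.60, 2.2   # W/(m K), assumed bulk conductivities

def pore_conductivity_parallel(water_fraction):
    """Parallel (resistance-in-parallel) mixing: pore treated as air and water side by side."""
    return water_fraction * k_water + (1.0 - water_fraction) * k_air

def effective_conductivity_maxwell(k_matrix, k_inclusion, porosity):
    """Maxwell-Eucken effective-medium estimate for dilute spherical pores."""
    num = 2 * k_matrix + k_inclusion - 2 * porosity * (k_matrix - k_inclusion)
    den = 2 * k_matrix + k_inclusion + porosity * (k_matrix - k_inclusion)
    return k_matrix * num / den

porosity = 0.35                                                # assumed pore volume fraction
for rh, water_frac in [(10, 0.02), (50, 0.08), (90, 0.15)]:    # hypothetical RH -> water fraction
    k_pore = pore_conductivity_parallel(water_frac)
    k_eff = effective_conductivity_maxwell(k_solid, k_pore, porosity)
    print(f"RH={rh}%  k_pore={k_pore:.3f}  k_eff={k_eff:.3f} W/(m K)")
```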

Keywords: 8YSZ, microstructure, thermal conductivity, relative humidity

Procedia PDF Downloads 71
15324 The Comparison of Parental Childrearing Styles and Anxiety in Children with Stuttering and Normal Population

Authors: Pegah Farokhzad

Abstract:

The family has a crucial role in maintaining the physical, social, and mental health of children. Most of the mental and anxiety problems of children reflect complex interpersonal situations among family members, especially parents. In other words, children's anxiety problems are correlated with deficient relationships among family members and improper child-rearing styles. Parental child-rearing styles lead to positive and negative consequences which affect children's mental health. Therefore, the present research aimed to compare parental child-rearing styles and anxiety in children with stuttering and in the normal population, and also to study the relationship between parental child-rearing styles and children's anxiety. The research sample included 54 boys with stuttering and 54 normal boys who were selected from the children (boys) of Tehran, Iran, in the age range of 5 to 8 years in 2013. In order to collect data, the Baumrind Child-Rearing Styles Inventory and the Spence Parental Anxiety Inventory were used. Appropriate descriptive statistics, multivariate analysis of variance, and t-tests for independent groups were used to test the study hypotheses. The statistical analyses demonstrated that there was a significant difference between stuttering boys and normal boys in anxiety (t = 7.601, p < 0.01), but there was no significant difference between stuttering boys and normal boys in parental child-rearing styles (F = 0.129). No significant relationship was found between parental child-rearing styles and children's anxiety (F = 0.135, p < 0.05). It can be concluded that the influential agents in a child's social world are parents, school, teachers, peers, and the media; thus, parental child-rearing styles are not the only factors influencing children's anxiety, and other factors, including genetics, environment, and the child's own experiences, affect anxiety as well. Details are discussed.

Keywords: child rearing styles, anxiety, stuttering, Iran

Procedia PDF Downloads 479