Search results for: logical layout analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28383

27693 Behavior of Steel Moment Frames Subjected to Impact Load

Authors: Hyungoo Kang, Minsung Kim, Jinkoo Kim

Abstract:

This study investigates the performance of 2D and 3D steel moment frames subjected to vehicle collision at a first-story column using LS-DYNA. The finite element models of vehicles provided by the National Crash Analysis Center (NCAC) are used for numerical analysis. Nonlinear dynamic time-history analyses of the 2D and 3D model structures are carried out based on the arbitrary column removal scenario, and the vertical displacements of the damaged structures are compared with those obtained from collision analysis. The analysis results show that the model structure remains stable when the speed of the vehicle is 40 km/h. However, at speeds of 80 and 120 km/h, both the 2D and 3D structures fail by progressive collapse. The vertical displacement of the damaged joint obtained from collision analysis is significantly larger than the displacement computed based on the arbitrary column removal scenario.

Keywords: vehicle collision, progressive collapse, FEM, LS-DYNA

Procedia PDF Downloads 342
27692 Buddhism: Its Socio-Economic Relevance in the Present Changing World

Authors: Bandana Bhattacharya

Abstract:

‘Buddhism’ signifies the ‘ism’ based on the Buddha’s life and teachings, or the gospel of the Buddha as recorded in the literature available in Pali, Sanskrit, Buddhist Sanskrit, Prakrit and even in non-Indian languages, wherein it is described as a very abstruse, complex and lofty philosophy of life, or ‘the way of life’, preached by the Buddha. It has another side too: the applicability of the Buddha’s tenets to the needs of the present society, in which human life and outlook have changed profoundly. Applied Buddhism signifies this applicability of the Buddha’s noble tenets. Along with theological exposition and textual criticism of the Buddha’s discourses, it has now become almost obligatory for Buddhist scholars to re-interpret Buddhism from modern perspectives. Basically, Applied Buddhism defines a ‘way of life’ that may transform the quality or essence of life under changed circumstances, places and times. If we observe the present situation of the world, we find that problems such as health, economics, politics, global warming, population explosion, pollution of all types, scarcity of essential commodities, and the indiscriminate use of human, natural and water resources are becoming more pronounced day by day. Against such a backdrop, Applied Buddhism, or rather Buddhism itself, may be the only instrument left for mankind to address such human achievements, lapses and problems. The Buddha’s doctrine is itself called ‘akālika’, timeless. On the eve of the Mahāparinibbāṇa at Kusinara, the Blessed One allowed His disciples to change, modify and alter His minor teachings according to the needs of the future, although He made some utterances which would eternally remain fresh. Hence Buddhism has been able to occupy a prominent place in modern life because of its timeless applicability, emanating from a set of eternal values.
The logical and scientific outlook of the Buddha may be traced in His very first sermon, the Dhammacakkapavattana-Sutta, where He advised avoiding two extremes, namely constant attachment to sensual pleasures (Kāmasukhallikānuyoga) and devotion to self-mortification, which is painful as well as unprofitable, and asked followers to adopt the Majjhimapaṭipadā, the ‘Middle Path’, which is applicable even today in every sphere of human life and whose absence is the root cause of many present problems. This paper is a humble attempt to highlight the relevance of Buddhism in the present society.

Keywords: applied Buddhism, ecology, self-awareness, value

Procedia PDF Downloads 125
27691 NanoFrazor Lithography for Advanced 2D and 3D Nanodevices

Authors: Zhengming Wu

Abstract:

NanoFrazor lithography systems were developed as a first true alternative or extension to standard mask-less nanolithography methods like electron beam lithography (EBL). In contrast to EBL, they are based on thermal scanning probe lithography (t-SPL), in which a heatable ultra-sharp probe tip with an apex of a few nm is used for patterning and simultaneously inspecting complex nanostructures. The heat impact from the probe on a thermally responsive resist generates the high-resolution nanostructures. The patterning depth of each individual pixel can be controlled with better than 1 nm precision using an integrated in-situ metrology method. Furthermore, the inherent imaging capability of the NanoFrazor technology allows for markerless overlay, which has been achieved with sub-5 nm accuracy, and supports stitching layout sections together with < 10 nm error. Pattern transfer from such resist features at below 10 nm resolution was demonstrated. The technology has proven its value as an enabler of new kinds of ultra-high-resolution nanodevices as well as for improving the performance of existing device concepts. The application range for this new nanolithography technique is very broad, spanning from ultra-high-resolution 2D and 3D patterning to chemical and physical modification of matter at the nanoscale. Nanometer-precise markerless overlay and non-invasiveness to sensitive materials are among the key strengths of the technology. However, while patterning at below 10 nm resolution is achieved, significantly increasing the patterning speed at the expense of resolution is not feasible using the heated tip alone. Towards this end, an integrated laser write head for direct laser sublimation (DLS) of the thermal resist has been introduced for significantly faster patterning of micrometer- to millimeter-scale features. 
Remarkably, the areas patterned by the tip and the laser are seamlessly stitched together and both processes work on the very same resist material enabling a true mix-and-match process with no developing or any other processing steps in between. The presentation will include examples for (i) high-quality metal contacting of 2D materials, (ii) tuning photonic molecules, (iii) generating nanofluidic devices and (iv) generating spintronic circuits. Some of these applications have been enabled only due to the various unique capabilities of NanoFrazor lithography like the absence of damage from a charged particle beam.

Keywords: nanofabrication, grayscale lithography, 2D materials device, nano-optics, photonics, spintronic circuits

Procedia PDF Downloads 72
27690 Fault Tree Analysis and Bayesian Network for Fire and Explosion of Crude Oil Tanks: Case Study

Authors: B. Zerouali, M. Kara, B. Hamaidi, H. Mahdjoub, S. Rouabhia

Abstract:

In this paper, a safety analysis is presented for crude oil tanks to prevent undesirable events that may cause catastrophic accidents. The estimation of the probability of damage to industrial systems is carried out through a series of steps, in accordance with a specific methodology. In this context, this work involves developing an assessment and risk-analysis tool for the crude oil tank system, based primarily on the identification of the various potential causes of crude oil tank fire and explosion through Fault Tree Analysis (FTA), followed by improved risk modelling with Bayesian Networks (BNs). The Bayesian approach to the evaluation of failure and the quantification of risks is a dynamic analysis approach; for this reason, BNs were selected as the analytical tool in this study. The research concludes that Bayesian networks offer a distinct and effective method for safety analysis because of the flexibility of their structure, which makes them suitable for a wide variety of accident scenarios.
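
The FTA-to-BN mapping described above can be sketched numerically. The gate structure and probabilities below are hypothetical illustrations, not the paper's model: a top event fed by an OR gate over an ignited leak (itself an AND gate) and an overpressure event. The Bayesian-network view reproduces the fault-tree probability and additionally supports diagnostic queries:

```python
from itertools import product

# Hypothetical fault tree for a tank fire: TOP = OR(leak_ignited, overpressure),
# with leak_ignited = AND(leak, ignition). All probabilities are illustrative.
p_leak, p_ign, p_over = 0.05, 0.10, 0.02

# Classical FTA: combine independent gates analytically.
p_leak_ignited = p_leak * p_ign                       # AND gate
p_top_fta = 1 - (1 - p_leak_ignited) * (1 - p_over)   # OR gate

# Same structure viewed as a Bayesian network with deterministic gate nodes:
# enumerate the joint distribution, then answer a diagnostic query
# P(leak | TOP) that a plain fault tree does not give directly.
joint = {}
for leak, ign, over in product([0, 1], repeat=3):
    p = ((p_leak if leak else 1 - p_leak)
         * (p_ign if ign else 1 - p_ign)
         * (p_over if over else 1 - p_over))
    top = int((leak and ign) or over)
    joint[(leak, ign, over, top)] = p

p_top_bn = sum(p for (l, i, o, t), p in joint.items() if t)
p_leak_given_top = sum(p for (l, i, o, t), p in joint.items() if t and l) / p_top_bn
print(round(p_top_fta, 4), round(p_top_bn, 4), round(p_leak_given_top, 4))
```

The diagnostic query is what makes the BN "dynamic" in the abstract's sense: evidence about the top event updates beliefs about the basic events.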

Keywords: bayesian networks, crude oil tank, fault tree, prediction, safety

Procedia PDF Downloads 664
27689 A Study on Sentiment Analysis Using Various ML/NLP Models on Historical Data of Indian Leaders

Authors: Sarthak Deshpande, Akshay Patil, Pradip Pandhare, Nikhil Wankhede, Rushali Deshmukh

Abstract:

Sentiment analysis is among the most significant tasks for any language and is a key area of NLP that has recently made impressive strides. Several models and datasets are available for these tasks in popular and commonly used languages such as English, Russian, and Spanish. While sentiment analysis research is performed extensively for such languages, it lags behind for low-resource regional languages such as Hindi and Marathi. Marathi is one of the languages included in the 8th Schedule of the Indian Constitution; it is the third most widely spoken language in the country and is primarily spoken in the Deccan region, which encompasses Maharashtra and Goa. There is insufficient study of sentiment analysis methods for Marathi text due to the lack of available resources and information. Therefore, this project proposes the use of different ML/NLP models for the analysis of Marathi data from comments below YouTube content, tweets, and Instagram posts. We aim to achieve a short and precise analysis and summary of the related data using our dataset (dates, names, root words) and lexicons to locate exact information.
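
A minimal sketch of the lexicon-based approach named in the keywords, using a tiny toy lexicon of romanized tokens (the entries and negator are illustrative, not from the project's resources; a real Marathi system would use a curated lexicon and proper Devanagari tokenization and stemming):

```python
# Toy lexicon of romanized Marathi-like tokens; weights are illustrative only.
LEXICON = {"chhan": 1.0, "sundar": 1.0, "uttam": 1.5,   # positive
           "vait": -1.0, "kharab": -1.5}                # negative
NEGATORS = {"nahi"}  # flips the polarity of the following word

def sentiment(tokens):
    # Sum lexicon weights, flipping the sign after a negator token.
    score, flip = 0.0, 1
    for tok in tokens:
        if tok in NEGATORS:
            flip = -1
            continue
        score += flip * LEXICON.get(tok, 0.0)
        flip = 1
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment(["he", "gaane", "chhan", "aahe"]))          # positive
print(sentiment(["he", "gaane", "nahi", "chhan", "aahe"]))  # negation flips it
```

ML models would replace the hand-written weights with learned ones, but the scoring pipeline (tokenize, look up, aggregate) stays the same.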

Keywords: multilingual sentiment analysis, Marathi, natural language processing, text summarization, lexicon-based approaches

Procedia PDF Downloads 76
27688 Future of Electric Power Generation Technologies: Environmental and Economic Comparison

Authors: Abdulrahman A. Bahaddad, Mohammed Beshir

Abstract:

The objective of this paper is to demonstrate and describe eight different types of power generation technologies and to understand the history and future trends of each technology. In addition, a comparative analysis between these technologies will be presented with respect to their cost analysis and associated performance.

Keywords: conventional power generation, economic analysis, environmental impact, renewable energy power generation

Procedia PDF Downloads 135
27687 Penetration Analysis for Composites Applicable to Military Vehicle Armors, Aircraft Engines and Nuclear Power Plant Structures

Authors: Dong Wook Lee

Abstract:

This paper describes a method for analyzing penetration for composite material using an explicit nonlinear Finite Element Analysis (FEA). This method may be used in the early stage of design for the protection of military vehicles, aircraft engines and nuclear power plant structures made of composite materials. This paper deals with simple ballistic penetration tests for composite materials and the FEA modeling method and results. The FEA was performed to interpret the ballistic field test phenomenon regarding the damage propagation in the structure subjected to local foreign object impact.

Keywords: computer aided engineering, finite element analysis, impact analysis, penetration analysis, composite material

Procedia PDF Downloads 124
27686 Factors Related to Teachers’ Analysis of Classroom Assessments

Authors: Hussain A. Alkharusi, Said S. Aldhafri, Hilal Z. Alnabhani, Muna Alkalbani

Abstract:

Analysing classroom assessments is one of the responsibilities of the teacher. It aims at improving the teacher’s instruction and assessment as well as student learning. The present study investigated factors that might explain variation in teachers’ practices regarding the analysis of classroom assessments. The factors considered in the investigation included gender, in-service assessment training, teaching load, teaching experience, knowledge in assessment, attitude towards quantitative aspects of assessment, and self-perceived competence in analysing assessments. Participants were 246 in-service teachers in Oman. Results of a stepwise multiple linear regression analysis revealed that self-perceived competence was the only significant factor explaining the variance in teachers’ analysis of assessments. Implications for research and practice are discussed.
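
The first step of the stepwise selection used in the study can be sketched on simulated data (the variables and effect size below are invented for illustration, not the study's data): enter the predictor with the largest squared correlation with the outcome.

```python
import random

random.seed(0)

def pearson(x, y):
    # Pearson correlation coefficient, implemented with the standard formula.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

# Simulated data for 246 teachers in which analysis practice depends mainly
# on self-perceived competence; the other predictors are pure noise.
n = 246
competence = [random.gauss(0, 1) for _ in range(n)]
experience = [random.gauss(0, 1) for _ in range(n)]
load       = [random.gauss(0, 1) for _ in range(n)]
practice   = [0.6 * c + random.gauss(0, 0.8) for c in competence]

# Forward stepwise step 1: enter the predictor with the largest r^2.
# Later steps would enter further predictors only if they significantly
# raise R^2 (partial F-test), which noise predictors here would fail.
candidates = {"competence": competence, "experience": experience,
              "teaching load": load}
r2 = {name: pearson(x, practice) ** 2 for name, x in candidates.items()}
best = max(r2, key=r2.get)
print(best, round(r2[best], 3))
```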

Keywords: analysis of assessment, classroom assessment, in-service teachers, self-competence

Procedia PDF Downloads 333
27685 Using Analytical Hierarchy Process and TOPSIS Approaches in Designing a Finite Element Analysis Automation Program

Authors: Ming Wen, Nasim Nezamoddini

Abstract:

Sophisticated numerical simulations like finite element analysis (FEA) involve a complicated process from model setup to post-processing tasks that requires replication of time-consuming steps. Utilizing an FEA automation program simplifies the complexity of the involved steps while minimizing human errors in analysis setup, calculations, and results processing. One of the main challenges in designing FEA automation programs is to identify user requirements and link them to possible design alternatives. This paper presents a decision-making framework to design a Python-based FEA automation program for modal analysis, frequency response analysis, and random vibration fatigue (RVF) analysis procedures. The analytical hierarchy process (AHP) and the technique for order preference by similarity to ideal solution (TOPSIS) are applied to evaluate design alternatives considering the feedback received from experts and program users.
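
The AHP-then-TOPSIS pipeline can be sketched end to end. The pairwise judgments, criteria, and alternative scores below are hypothetical stand-ins (the consistency-ratio check that AHP normally requires is omitted for brevity):

```python
# --- AHP: derive criterion weights from a 3x3 pairwise comparison matrix ---
# Criteria (hypothetical): accuracy, speed, usability.
A = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
gm = [(row[0] * row[1] * row[2]) ** (1 / 3) for row in A]  # row geometric means
w = [g / sum(gm) for g in gm]                              # normalized weights

# --- TOPSIS: rank three alternatives by closeness to the ideal solution ---
# Rows = alternatives, columns = benefit criteria (illustrative scores).
X = [[0.9, 0.4, 0.7],
     [0.7, 0.8, 0.6],
     [0.5, 0.9, 0.9]]
norms = [sum(X[i][j] ** 2 for i in range(3)) ** 0.5 for j in range(3)]
V = [[w[j] * X[i][j] / norms[j] for j in range(3)] for i in range(3)]
best  = [max(V[i][j] for i in range(3)) for j in range(3)]  # ideal solution
worst = [min(V[i][j] for i in range(3)) for j in range(3)]  # anti-ideal

def dist(v, ref):
    return sum((a - b) ** 2 for a, b in zip(v, ref)) ** 0.5

closeness = [dist(V[i], worst) / (dist(V[i], worst) + dist(V[i], best))
             for i in range(3)]
ranking = sorted(range(3), key=lambda i: -closeness[i])
print([round(c, 3) for c in closeness], ranking)
```

With these judgments the accuracy-heavy first alternative ranks highest, showing how the AHP weights steer the TOPSIS ordering.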

Keywords: finite element analysis, FEA, random vibration fatigue, process automation, analytical hierarchy process, AHP, TOPSIS, multiple-criteria decision-making, MCDM

Procedia PDF Downloads 113
27684 Quantile Coherence Analysis: Application to Precipitation Data

Authors: Yaeji Lim, Hee-Seok Oh

Abstract:

Coherence analysis measures the linear time-invariant relationship between two data sets and has been studied in various fields such as signal processing, engineering, and medical science. However, classical coherence analysis tends to be sensitive to outliers and focuses only on the mean relationship. In this paper, we generalize the cross periodogram to the quantile cross periodogram, which provides a richer picture of the inter-relationship between two data sets. This is a general version of the Laplace cross periodogram. We prove its asymptotic distribution under long-range dependent processes and compare it with ordinary coherence through numerical examples. We also present a real-data example to confirm the usefulness of quantile coherence analysis.
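
A toy sketch of the quantile cross periodogram idea (the normalization constant and the sample-quantile rule here are conventions chosen for illustration): each series is replaced by the centered indicator of lying at or below its tau-quantile, and the ordinary cross-periodogram of the two indicator series is formed.

```python
import cmath
import random

random.seed(1)

def quantile(xs, tau):
    # Simple order-statistic sample quantile.
    s = sorted(xs)
    return s[int(tau * (len(s) - 1))]

def dft(z):
    # Naive DFT; fine for short illustrative series.
    n = len(z)
    return [sum(z[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def quantile_cross_periodogram(x, y, tau):
    # Indicator processes for the tau-quantile level, centered, then the
    # ordinary cross-periodogram of the two indicator series.
    qx, qy = quantile(x, tau), quantile(y, tau)
    ix = [1.0 if v <= qx else 0.0 for v in x]
    iy = [1.0 if v <= qy else 0.0 for v in y]
    mx, my = sum(ix) / len(ix), sum(iy) / len(iy)
    ix = [v - mx for v in ix]
    iy = [v - my for v in iy]
    X, Y = dft(ix), dft(iy)
    n = len(x)
    return [X[k] * Y[k].conjugate() / (2 * cmath.pi * n) for k in range(n)]

n = 64
x = [random.gauss(0, 1) for _ in range(n)]
y = [0.8 * x[t] + 0.2 * random.gauss(0, 1) for t in range(n)]  # dependent pair
I = quantile_cross_periodogram(x, y, tau=0.25)  # lower-tail relationship
print(len(I), abs(I[0]))
```

Because the indicators are bounded, a single outlier in x or y can move at most one indicator value, which is the robustness the abstract contrasts with classical coherence.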

Keywords: coherence, cross periodogram, spectrum, quantile

Procedia PDF Downloads 392
27683 Financial Reports and Common Ownership: An Analysis of the Mechanisms Common Owners Use to Induce Anti-Competitive Behavior

Authors: Kevin Smith

Abstract:

Publicly traded companies in the US are legally obligated to host earnings calls that discuss their most recent financial reports. During these calls, investors are able to ask companies questions about these reports and about the future direction of the company. This paper examines whether common institutional owners use these calls as a way to indirectly signal to companies in their portfolios not to take actions that could hurt the common owner's interests. The paper uses transcripts taken from the earnings calls of the six largest health insurance companies in the US from 2014 to 2019. The data are analyzed using text analysis and sentiment analysis to look for patterns in the statements made by common owners. The analysis found that common owners were more likely to recommend against direct price competition and instead redirect the insurance companies towards more passive actions, like investing in new technologies. This result indicates a mechanism that common owners use to reduce competition in the health insurance market.

Keywords: common ownership, text analysis, sentiment analysis, machine learning

Procedia PDF Downloads 76
27682 Use of Artificial Intelligence and Two Object-Oriented Approaches (k-NN and SVM) for the Detection and Characterization of Wetlands in the Centre-Val de Loire Region, France

Authors: Bensaid A., Mostephaoui T., Nedjai R.

Abstract:

Nowadays, wetlands are the subject of contradictory debates opposing scientific, political and administrative meanings. Indeed, given their multiple services (drinking water, irrigation, hydrological regulation, mineral, plant and animal resources...), wetlands concentrate many socio-economic and biodiversity issues. In some regions, they can cover vast areas (>100 thousand ha) of the landscape, such as the Camargue area in the south of France, inside the Rhone delta. The high biological productivity of wetlands, the strong natural selection pressures and the diversity of aquatic environments have produced many species of plants and animals that are found nowhere else. These environments are tremendous carbon sinks and biodiversity reserves; depending on their age, composition and surrounding environmental conditions, wetlands play an important role in global climate projections. Covering more than 3% of the earth's surface, wetlands have experienced since the beginning of the 1990s a tremendous revival of interest, which has resulted in the multiplication of inventories, scientific studies and management experiments. The geographical and physical characteristics of the wetlands of the central region conceal a large number of natural habitats that harbour a great biological diversity. These wetlands are still influenced by human activities, especially agriculture, which affects their layout and functioning. In this perspective, decision-makers need to delimit spatial objects (natural habitats) in a certain way to be able to take action. Wetlands are no exception to this rule, even if it seems a difficult exercise to delimit a type of environment whose main characteristic is often to occupy the transition between aquatic and terrestrial environments. 
However, it is possible to map wetlands with databases derived from the interpretation of photos and satellite images, such as the European Corine Land Cover database, which allows quantifying and characterizing the typical wetland types at each location. Scientific studies have shown limitations when using high-spatial-resolution images (SPOT, Landsat, ASTER) for the identification and characterization of small wetlands (about 1 hectare). To address this limitation, it is important to note that these wetlands generally represent spatially complex features; the use of very high spatial resolution images (>3 m) is therefore necessary to map both small and large areas. Moreover, with the recent evolution of artificial intelligence (AI), deep learning methods for satellite image processing have shown much better performance compared to traditional processing based only on pixel structures. Our research work is based on spectral and textural analysis of very high resolution images (SPOT and IRC orthoimages) using two object-oriented approaches, the nearest neighbour approach (k-NN) and the Support Vector Machine approach (SVM). The k-NN approach gave good results for the delineation of wetlands (wet marshes and moors, ponds, artificial wetland water body edges, mountain wetlands, river edges and brackish marshes), with a kappa index higher than 85%.
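
The k-NN classifier named above can be sketched in a few lines. The (NIR, red, texture) feature vectors and class labels below are synthetic stand-ins for image-object attributes, not the study's data:

```python
def knn_predict(train, query, k=3):
    # train: list of (feature_vector, label); majority vote among k nearest
    # neighbours under squared Euclidean distance.
    nearest = sorted(train,
                     key=lambda fl: sum((a - b) ** 2 for a, b in zip(fl[0], query)))
    votes = {}
    for _, label in nearest[:k]:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)

# Hypothetical (NIR, red, texture) signatures: water-logged surfaces
# reflect little NIR, dry vegetated surfaces much more.
train = [((0.10, 0.20, 0.30), "wetland"), ((0.15, 0.25, 0.35), "wetland"),
         ((0.20, 0.15, 0.40), "wetland"), ((0.70, 0.50, 0.10), "dry"),
         ((0.80, 0.55, 0.15), "dry"),     ((0.75, 0.60, 0.05), "dry")]

print(knn_predict(train, (0.18, 0.22, 0.33)))  # lands in the wetland cluster
print(knn_predict(train, (0.72, 0.50, 0.10)))  # lands in the dry cluster
```

In object-oriented image analysis, each "query" would be a segmented image object with averaged spectral and textural attributes rather than a single pixel.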

Keywords: land development, GIS, sand dunes, segmentation, remote sensing

Procedia PDF Downloads 72
27681 Against the Idea of Public Power as Free Will

Authors: Donato Vese

Abstract:

According to the common interpretation, in a legal system public powers are established by law; exceptions are admitted in an emergency or in particular relationships with public power. However, it is currently agreed that the law allows public administration a margin of decision, even in the case of non-discretionary acts. Hence, the administrative decision not exclusively established by law becomes the rule in the ordinary state of things, not only in a state of exception. This paper aims to analyze and discuss different ideas on discretionary power under the Rule of Law and the Rechtsstaat. Observing the legal literature in Europe and in North and South America, discretionary power can be described as follows: it can be considered a margin that the law accords to the executive power for political decisions, or a choice between different interpretations of vague legal provisions. In essence, this explanation admits for the executive a decision not established, or at least not exclusively established, by law. This means that the discretionary power of public administration integrates the law. However, integrating the law does not mean deciding according to the law; it means integrating the law with a decision involving public power. Consequently, discretionary power is essentially free will. In this perspective, the Rule of Law and the Rechtsstaat are also notions explained differently. Recently, we can observe how the European notion of the Rechtsstaat is founded on the formal validity of the law; therefore, for this notion, public authority decisions not regulated by law represent a problem. Thus, different systems of law integration have been proposed in the legal literature, such as values, democracy, reasonableness, and so on. This paper aims to verify how, looking at those integration clauses from a logical viewpoint, integration based on recourse to the legal system itself does not resolve the problem. 
The aforementioned integration clauses are legal rules that require hard work to explain the correct meaning of the law; in particular, they introduce dangerous criteria in favor of the political majority. A different notion of public power can be proposed, with two main features: (a) sovereignty belongs to persons and not to the state, and (b) fundamental rights are not granted but recognized by Constitutions. Hence, public power is a system based on fundamental rights. According to this approach, the public interest can be defined as the concrete maximization of fundamental rights enjoyment. In this way, integration of law that is vague or subject to several interpretations must be done by referring to the system of fundamental individual rights. We can think, for instance, of fundamental rights that are rights in an objective view but not legal rights, because they are not established by law.

Keywords: administrative discretion, free will, fundamental rights, public power, sovereignty

Procedia PDF Downloads 110
27680 Comparative Study of Dynamic Effect on Analysis Approaches for Circular Tanks Using Codal Provisions

Authors: P. Deepak Kumar, Aishwarya Alok, P. R. Maiti

Abstract:

Liquid storage tanks have become widespread during recent decades due to their extensive usage. Analysis of liquid-containing tanks is known to be complex due to the hydrodynamic force exerted on the tank. The objective of this research is to carry out an analysis of the liquid domain along with structural interaction for various geometries of circular tanks considering seismic effects. An attempt has been made to determine the hydrodynamic pressure distribution on the tank wall considering the impulsive and convective components of the liquid mass. To get a better picture, a comparative study of Draft IS 1893 Part 2, ACI 350.3 and Eurocode 8 for circular tanks has been performed. Further, the differences in the magnitude of shear and moment at the base as obtained from static (IS 3370 Part IV) and dynamic (Draft IS 1893 Part 2) analysis of a ground-supported circular tank highlight the need to move from the old code to the newer code, which is more accurate and reliable.
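
The impulsive/convective split mentioned above can be sketched with Housner-type mass-ratio expressions of the form adopted by ACI 350.3 and Draft IS 1893 (Part 2). The coefficients below are quoted from memory and the tank dimensions are illustrative; verify against the code text before any design use.

```python
import math

def mass_ratios(D, h):
    # Housner-type split of the liquid mass for a ground-supported circular
    # tank. D = inner diameter, h = liquid depth (same length units).
    # Coefficients as commonly given in ACI 350.3-style provisions
    # (assumption -- check the governing code before relying on them).
    x = 0.866 * D / h
    mi_ratio = math.tanh(x) / x                           # impulsive / total
    mc_ratio = 0.230 * (D / h) * math.tanh(3.68 * h / D)  # convective / total
    return mi_ratio, mc_ratio

mi, mc = mass_ratios(D=12.0, h=6.0)  # a squat tank, D/h = 2
print(round(mi, 3), round(mc, 3))    # the two ratios sum to slightly under 1
```

For squat tanks (large D/h) the convective (sloshing) share grows, which is why the codal treatment of sloshing matters for the base shear and moment comparison in the abstract.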

Keywords: liquid filled containers, circular tanks, IS 1893 (part 2), seismic analysis, sloshing

Procedia PDF Downloads 353
27679 Object-Scene: Deep Convolutional Representation for Scene Classification

Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang

Abstract:

Traditional image classification is based on an encoding scheme (e.g., Fisher Vector, Vector of Locally Aggregated Descriptors) over low-level image features (e.g., SIFT, HoG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNNs) carry richer information but lack geometric invariance. For scene classification, scenes contain scattered objects differing in size, category, layout, number and so on. It is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit object-centric and scene-centric information, two CNNs trained on the ImageNet and Places datasets, respectively, are used as pre-trained models to extract deep convolutional features at multiple scales. This produces dense local activations. By analyzing the performance of different CNNs at multiple scales, it is found that each CNN works better in a different scale range. A scale-wise CNN adaptation is reasonable, since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, which is then merged into a single vector using a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of first- and second-order differences. Hence, scale-wise normalization followed by average pooling balances the influence of each scale, since different amounts of features are extracted at each scale. Third, the Fisher Vector representation based on the deep convolutional features is fed to a linear Support Vector Machine, which is a simple yet efficient way to classify the scene categories. 
Experimental results show that scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets can boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which shows that the representation can be applied to other visual recognition tasks.
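
The scale-wise normalization step can be sketched in isolation (the vectors below are tiny stand-ins for per-scale Fisher Vectors; the L2-normalize-then-average-pool reading of the paper's "scale-wise normalization" is an assumption based on the abstract):

```python
def l2_normalize(v):
    # Scale a vector to unit Euclidean length (no-op for the zero vector).
    norm = sum(a * a for a in v) ** 0.5
    return [a / norm for a in v] if norm > 0 else v

def scale_wise_merge(per_scale_vectors):
    # Normalize each scale's aggregated descriptor separately, then
    # average-pool across scales so no single scale dominates the merge.
    normalized = [l2_normalize(v) for v in per_scale_vectors]
    n = len(normalized)
    return [sum(v[j] for v in normalized) / n
            for j in range(len(normalized[0]))]

# Two scales: the second produced many more activations, hence 100x larger
# raw values, but points in the same direction as the first.
fv_scale1 = [0.2, 0.1, 0.4]
fv_scale2 = [20.0, 10.0, 40.0]
merged = scale_wise_merge([fv_scale1, fv_scale2])
print([round(x, 4) for x in merged])
```

Without the per-scale normalization, the second scale's magnitude would swamp the first; with it, both scales contribute equally to the merged vector that feeds the linear SVM.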

Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization

Procedia PDF Downloads 333
27678 Argumentative and Enunciative Analysis of Spanish Political Discourse

Authors: Cristina Diez

Abstract:

One of the most important challenges of discourse analysis is to find the linguistic mechanisms of subjectivity. The present article aims to raise the need for an argumentative and enunciative analysis to reach the subjective tissue of language. The intention is to prove that the instructions inscribed in the language itself indicate how a statement is to be interpreted, and that argumentative value is implied at the semantic level. For this, the theory of argumentation of Ducrot and Anscombre will be implemented. First, a reflection on the study of subjectivity and enunciation in language will be exposed, followed by concrete proposals on the linguistic mechanisms that speakers use, either consciously or unconsciously, to finally focus on the argumentative tools that political discourse uses in order to influence the audience.

Keywords: argumentation, enunciation, discourse analysis, subjectivity

Procedia PDF Downloads 204
27677 Study on 3D FE Analysis on Normal and Osteoporosis Mouse Models Based on 3-Point Bending Tests

Authors: Tae-min Byun, Chang-soo Chon, Dong-hyun Seo, Han-sung Kim, Bum-mo Ahn, Hui-suk Yun, Cheolwoong Ko

Abstract:

In this study, a 3-point bending computational analysis of normal and osteoporosis mouse models was performed based on the micro-CT image information of the femurs. The finite element analysis (FEA) found average maximum forces of 1.68 N (normal group) and 1.39 N (osteoporosis group), and average stiffnesses of 4.32 N/mm (normal group) and 3.56 N/mm (osteoporosis group). Compared with the 3-point bending test results, the maximum force and the stiffness differed by a factor of about 9.4 in the normal group and about 11.2 in the osteoporosis group. The difference between the analysis and the test was highly significant, and this result points to needed improvements in the material properties applied to the computational analysis of this study. In the next study, the material properties of the mouse femur will be refined through additional computational analyses and tests.
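
The stiffness figures above (in N/mm) come from the slope of a force-displacement curve, which can be sketched as a least-squares fit over the linear region. The sample data are illustrative, not the study's measurements:

```python
def stiffness(displacements, forces):
    # Least-squares slope of force vs. displacement over the linear region.
    n = len(displacements)
    md, mf = sum(displacements) / n, sum(forces) / n
    num = sum((d - md) * (f - mf) for d, f in zip(displacements, forces))
    den = sum((d - md) ** 2 for d in displacements)
    return num / den

disp  = [0.0, 0.10, 0.20, 0.30, 0.40]   # crosshead displacement, mm
force = [0.0, 0.43, 0.86, 1.30, 1.73]   # measured force, N (illustrative)
print(round(stiffness(disp, force), 2))  # slope in N/mm
```

In practice the fit window would be restricted to the pre-yield portion of the curve before computing the slope.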

Keywords: 3-point bending test, mouse, osteoporosis, FEA

Procedia PDF Downloads 353
27676 Assessment and Forecasting of the Impact of Negative Environmental Factors on Public Health

Authors: Nurlan Smagulov, Aiman Konkabayeva, Akerke Sadykova, Arailym Serik

Abstract:

Introduction. Adverse environmental factors do not immediately lead to pathological changes in the body. They can instead drive the growth of pre-pathology, characterized by shifts in physiological, biochemical, immunological and other indicators of the body's state. These disorders are unstable, reversible and indicative of the body's reactions, making it possible to judge objectively the internal structure of adaptive reactions at the level of individual organs and systems. For the body to show a stable response to the chronic effects of unfavorable environmental factors of low intensity (compared to factors of the production environment), a period called the «lag time» is needed. Results obtained without considering this factor distort reality and, for the most part, cannot reliably support the main conclusions of a study. A technique is therefore needed that reduces methodological errors and combines mathematical logic and statistical methods with a medical point of view; neglecting this ultimately affects the obtained results and risks false correlations. Objective. Development of a methodology for assessing and predicting the impact of environmental factors on population health, considering the «lag time». Methods. Research objects: environmental indicators and population morbidity indicators. The database on the environmental state was compiled from the monthly newsletters of Kazhydromet. Data on population morbidity were obtained from regional statistical yearbooks. When processing the statistical data, a time interval (lag) was determined for each «argument-function» pair, that is, the interval after which the effect of a harmful factor (argument) fully manifests itself in the indicators of the organism's state (function). The lag value was determined from the cross-correlation functions of arguments (environmental indicators) with functions (morbidity). 
Correlation coefficients (r) and their reliability (t), Fisher's criterion (F) and the share of influence (R2) of the main factor (argument) on the indicator (function) were calculated as percentages. Results. The ecological situation of an industrially developed region has an impact on health indicators, but with some nuances. Fundamentally different results were obtained when the data were processed considering the «lag time»: a pronounced correlation was revealed after the two databases (ecology-morbidity) were shifted against each other. For example, the lag period was 4 years for dust concentration and general morbidity, and 3 years for childhood morbidity. These periods accounted for the maximum values of the correlation coefficients and the largest share of the influencing factor. Similar results were observed for the concentrations of soot, dioxide, etc. Comprehensive statistical processing using multiple correlation-regression variance analysis confirms the correctness of the above statement. This method provided an integrated approach to predicting the degree of pollution of the main environmental components and to identifying the most dangerous combinations of concentrations of the leading negative environmental factors. Conclusion. The method of assessing the «environment-public health» system considering the «lag time» is qualitatively different from the traditional one (without considering the «lag time»). The results differ significantly and are more amenable to a logical explanation of the obtained dependencies. The method makes it possible to present the quantitative and qualitative dependence within the «environment-public health» system in a different way.
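
The lag-selection step can be sketched as follows: shift the morbidity series back by each candidate lag and keep the lag that maximizes the cross-correlation with the exposure series. The synthetic yearly data below embed a known 3-year delay for illustration; they are not the study's data.

```python
def pearson(x, y):
    # Pearson correlation coefficient.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

def best_lag(exposure, morbidity, max_lag=6):
    # Correlate exposure[t] with morbidity[t + lag] for each candidate lag
    # and return the lag giving the maximum correlation.
    scores = {}
    for lag in range(max_lag + 1):
        x = exposure[:len(exposure) - lag] if lag else exposure[:]
        y = morbidity[lag:]
        scores[lag] = pearson(x, y)
    return max(scores, key=scores.get), scores

# Synthetic yearly series with a built-in 3-year delayed health response.
exposure = [3, 5, 2, 8, 6, 4, 9, 7, 5, 6, 8, 4]              # e.g. dust level
morbidity = [0, 0, 0] + [10 * v + 1 for v in exposure][:-3]  # responds 3 yrs on
lag, scores = best_lag(exposure, morbidity)
print(lag, round(scores[lag], 3))
```

Real series would give a peak rather than a perfect correlation at the true lag, and the significance of r at that lag would still need to be tested.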

Keywords: ecology, morbidity, population, lag time

Procedia PDF Downloads 82
27675 Examining the Changes in Complexity, Accuracy, and Fluency in Japanese L2 Writing Over an Academic Semester

Authors: Robert Long

Abstract:

This study presents the results of a one-year study of the evolution of complexity, accuracy, and fluency (CAF) in the compositions of Japanese L2 university students over an academic semester. One goal was to determine whether writing abilities improved over the term; another was to examine methods of editing. Participants had 30 minutes to write each essay, with an additional 10 minutes allotted for editing. For the editing phase, participants were divided into two groups: one utilized an online grammar checker, while the other self-edited their initial manuscripts. There was a total of 159 students from three different institutions. The research questions focused on determining whether CAF had evolved over the year, identifying potential variations in editing techniques, and describing the connections between the CAF dimensions. According to the findings, there was some improvement in accuracy (fewer errors) on all three measures, whereas there was a marked decline in complexity and fluency. As for the second research aim, concerning the interaction among the three dimensions (CAF) and whether possible increases in fluency are offset by decreases in grammatical accuracy, the results showed a logical high correlation between clauses and word counts, between mean length of T-unit (MLT) and coordinate phrases per T-unit (CP/T), and between MLT and clauses per T-unit (C/T); furthermore, word counts and the errors-per-100-words ratio correlated highly with error-free clause totals (EFCT). Syntactic complexity correlated negatively with EFCT, indicating that greater syntactic complexity is associated with decreased accuracy. 
Concerning differences in error correction between those who self-edited and those who used an online grammar correction tool, the results indicated that the variable of error-free clause ratios (EFCR) showed the greatest difference in accuracy, with fewer errors noted for writers using an online grammar checker. As for possible differences between the first and second (edited) drafts regarding CAF, the results indicated positive changes in accuracy, with the most significant change seen in complexity (CP/T and MLT), while changes in fluency were relatively insignificant. The results also indicated significant differences among the three institutions, with Fujian University of Technology showing the most fluency and accuracy. These findings suggest that to raise students' awareness of their overall writing development, teachers should support them in developing more complex syntactic structures, improving their fluency, and making more effective use of online grammar checkers.
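
The CAF measures named above can be computed from simple per-essay counts. The tallies below are hypothetical, and the formulas follow the standard definitions of the abbreviations rather than the authors' exact coding scheme.

```python
# Hypothetical per-essay counts, as a rater might tally them.
essay = {
    "words": 240, "t_units": 16, "clauses": 28,
    "coordinate_phrases": 9, "error_free_clauses": 21, "errors": 12,
}

def caf_measures(e):
    """Derive the CAF indices used in L2 writing research from raw counts."""
    return {
        "MLT": e["words"] / e["t_units"],                # mean length of T-unit
        "C/T": e["clauses"] / e["t_units"],              # clauses per T-unit
        "CP/T": e["coordinate_phrases"] / e["t_units"],  # coordinate phrases per T-unit
        "EFCR": e["error_free_clauses"] / e["clauses"],  # error-free clause ratio
        "errors_per_100_words": 100 * e["errors"] / e["words"],
    }

m = caf_measures(essay)
```

Tracking these indices across first and second drafts is what allows the accuracy gains and complexity changes reported above to be quantified.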

Keywords: complexity, accuracy, fluency, writing

Procedia PDF Downloads 42
27674 Failure Analysis and Verification Using an Integrated Method for Automotive Electric/Electronic Systems

Authors: Lei Chen, Jian Jiao, Tingdi Zhao

Abstract:

Failures of automotive electric/electronic systems, which are universally considered to be safety-critical and software-intensive, may cause catastrophic accidents. Analysis and verification of failures in such systems is a major challenge as system complexity increases. Model checking is often employed for formal verification by ensuring that the system model conforms to specified safety properties; the system-level effects of failures are established, and their effects on system behavior are observed through the formal verification. A hazard analysis technique called Systems-Theoretic Process Analysis is capable of identifying design flaws which may cause potential failure hazards, including software and system design errors and unsafe interactions among multiple system components. This paper presents a concept for using model checking integrated with Systems-Theoretic Process Analysis to perform failure analysis and verification of automotive electric/electronic systems. As a result, safety requirements are optimized, and failure propagation paths are found. Finally, an automotive electric/electronic system case study is used to verify the effectiveness and practicability of the method.
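
The model-checking half of the approach can be illustrated in miniature as an exhaustive reachability search over a toy transition system. The states, transitions and safety property below are invented for illustration and are not drawn from the paper's case study.

```python
from collections import deque

# Toy transition system for a hypothetical drive controller: states are
# (mode, command) pairs; the safety property is that the unsafe state
# ("fault", "accelerate") is never reachable.
transitions = {
    ("normal", "idle"): [("normal", "accelerate"), ("fault", "idle")],
    ("normal", "accelerate"): [("normal", "idle"), ("fault", "idle")],
    ("fault", "idle"): [("fault", "brake")],
    ("fault", "brake"): [("fault", "idle")],
}

def violates_safety(initial, unsafe):
    """Exhaustively explore the state space (the core of explicit-state
    model checking) and report whether the unsafe state is reachable."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if state == unsafe:
            return True
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

safe = not violates_safety(("normal", "idle"), ("fault", "accelerate"))
```

In the integrated method, the unsafe states fed to the checker would come from STPA-identified unsafe control actions rather than being chosen by hand.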

Keywords: failure analysis and verification, model checking, system-theoretic process analysis, automotive electric/electronic system

Procedia PDF Downloads 122
27673 Web Development in Information Technology with Javascript, Machine Learning and Artificial Intelligence

Authors: Abdul Basit Kiani, Maryam Kiani

Abstract:

Online developers now have the tools necessary to create web apps that are not only reliable but also highly interactive, thanks to the introduction of JavaScript frameworks and APIs. The objective here is to give a broad overview of recent advances in the area. The fusion of machine learning (ML) and artificial intelligence (AI) has expanded the possibilities for web development: in the rapidly evolving landscape of modern websites, user engagement and personalization have become key factors for success, and websites now incorporate a range of innovative technologies to meet these demands. One such technology is chatbots, which provide users with instant assistance and support, enhancing their overall browsing experience. These intelligent bots are capable of understanding natural language and can answer frequently asked questions, offer product recommendations, and even help with troubleshooting. Clever recommendation systems have also emerged as a powerful tool on modern websites. By analyzing user behavior, preferences, and historical data, these systems can intelligently suggest relevant products, articles, or services tailored to each user's unique interests, saving users valuable time and increasing the chances of conversions and customer satisfaction. Additionally, customization algorithms have changed the way websites interact with users. By leveraging user preferences, browsing history, and demographic information, these algorithms can dynamically adjust the website's layout, content, and functionality to suit individual user needs. This level of personalization enhances user engagement, boosts conversion rates, and ultimately leads to a more satisfying online experience. 
In summary, the integration of chatbots, clever recommendation systems, and customization algorithms into modern websites is transforming the way users interact with online platforms. These advanced technologies not only streamline user experiences but also contribute to increased customer satisfaction, improved conversions, and overall website success.
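
As a rough illustration of the recommendation-system idea, a minimal user-based sketch using cosine similarity follows (written in Python for brevity, though the abstract's context is JavaScript). The users, items and ratings are invented, and a production site would use a dedicated library rather than this hand-rolled version.

```python
import math

# Hypothetical user-item interaction scores (users x articles).
ratings = {
    "alice": {"css-tricks": 5, "js-basics": 3, "ml-intro": 0},
    "bob":   {"css-tricks": 4, "js-basics": 4, "ml-intro": 1},
    "carol": {"css-tricks": 0, "js-basics": 2, "ml-intro": 5},
}

def cosine(u, v):
    """Cosine similarity between two rating vectors over the same items."""
    dot = sum(u[k] * v[k] for k in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(user):
    """Suggest the unseen item rated highest by the most similar other user."""
    others = [(cosine(ratings[user], ratings[o]), o)
              for o in ratings if o != user]
    _, nearest = max(others)
    unseen = [item for item, r in ratings[user].items() if r == 0]
    return max(unseen, key=lambda i: ratings[nearest][i]) if unseen else None
```

The same analyze-behavior-then-rank pattern underlies the customization algorithms described above, just applied to layout and content blocks instead of articles.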

Keywords: Javascript, machine learning, artificial intelligence, web development

Procedia PDF Downloads 81
27672 Analysis of a Strengthening of a Building Reinforced Concrete Structure

Authors: Nassereddine Attari

Abstract:

Each operation to strengthen or repair a structure requires special consideration and the use of methods, tools and techniques appropriate to the situation and to the specific problems of each construction. The aim of this paper is to study the seismic pathology of a reinforced concrete building, to assess its vulnerability using non-linear pushover analysis, and to develop capacity curves for a medium-capacity building in order to estimate its damaged condition.

Keywords: pushover analysis, earthquake, damage, strengthening

Procedia PDF Downloads 430
27671 Analytical Derivative: Importance on Environment and Water Analysis/Cycle

Authors: Adesoji Sodeinde

Abstract:

Analytical derivatives have recently undergone explosive growth in the area of separation techniques, as well as in the detectability of certain compounds and concentrated ions. The gloomy scenario that has characterized the application of analytical derivatives to water analysis, the water cycle and the environment should not be allowed to continue unabated. Thanks to technological advancement in chemical and biochemical analysis, separation techniques are widely used in medical and forensic fields and to measure and assess the environmental and socio-economic impact of alternative control strategies. This improvement is well established where separation and detection techniques are compared to produce decisive results in forensics (for example, gas-liquid chromatography provides the evidence given in court during the prosecution of drunk drivers). Water quality analysis, pH and water temperature measurements can be performed in the field, and the concentration of dissolved free amino acids (DFAA) can also be determined through separation techniques. Some important derivatives and reagents used in separation techniques include: water analysis - total water hardness (EDTA to determine Ca and Mg ions); gas-liquid chromatography - carrier gases such as helium (He) or nitrogen (N); water cycle - animal bone charcoal, activated carbon and ultraviolet (UV) light.
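
The EDTA hardness determination mentioned above reduces to a one-line stoichiometric calculation. The titration volumes and molarity below are hypothetical figures for illustration.

```python
MW_CACO3 = 100.09  # g/mol; total hardness is conventionally reported as CaCO3

def total_hardness_mg_per_l(v_edta_ml, molarity_edta, v_sample_ml):
    """EDTA complexes Ca2+ and Mg2+ 1:1, so moles of EDTA at the endpoint
    equal the moles of hardness ions in the sample."""
    moles = (v_edta_ml / 1000.0) * molarity_edta
    milligrams_as_caco3 = moles * MW_CACO3 * 1000.0
    return milligrams_as_caco3 / (v_sample_ml / 1000.0)

# Hypothetical titration: 12.5 mL of 0.01 M EDTA against a 50 mL sample.
hardness = total_hardness_mg_per_l(12.5, 0.01, 50.0)
```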

Keywords: analytical derivative, environment, water analysis, chemical/biochemical analysis

Procedia PDF Downloads 339
27670 A Series Solution of Fuzzy Integro-Differential Equation

Authors: Maryam Mosleh, Mahmood Otadi

Abstract:

Hybrid differential equations have a wide range of applications in science and engineering. In this paper, the homotopy analysis method (HAM) is applied to obtain a series solution of fuzzy integro-differential equations. Using the homotopy analysis method, it is possible to find the exact solution or an approximate solution of the problem. Comparisons are made between the improved predictor-corrector method, the homotopy analysis method and the exact solution. Finally, we illustrate our approach with some numerical examples.

Keywords: fuzzy number, parametric form of a fuzzy number, fuzzy integro-differential equation, homotopy analysis method

Procedia PDF Downloads 559
27669 Using HABIT to Establish the Chemicals Analysis Methodology for Maanshan Nuclear Power Plant

Authors: J. R. Wang, S. W. Chen, Y. Chiang, W. S. Hsu, J. H. Yang, Y. S. Tseng, C. Shih

Abstract:

In this research, the HABIT analysis methodology was established for Maanshan nuclear power plant (NPP). The Final Safety Analysis Report (FSAR), reports, and other data were used in this study. To evaluate the control room habitability under the CO2 storage burst, the HABIT methodology was used to perform this analysis. The HABIT result was below the R.G. 1.78 failure criteria. This indicates that Maanshan NPP habitability can be maintained. Additionally, the sensitivity study of the parameters (wind speed, atmospheric stability classification, air temperature, and control room intake flow rate) was also performed in this research.
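
The sensitivity study described above amounts to sweeping a grid of parameter combinations and locating the worst case. The concentration function and acceptance limit below are illustrative placeholders, not the actual HABIT model or the R.G. 1.78 criterion.

```python
import itertools

def co2_concentration(wind_speed, stability, air_temp, intake_flow):
    """Placeholder dispersion estimate: concentration rises with atmospheric
    stability and temperature, and falls with wind speed and intake flow."""
    stability_factor = {"A": 0.5, "D": 1.0, "F": 2.0}[stability]
    return (5.0 * stability_factor
            / (wind_speed * (1 + intake_flow / 1000))
            * (1 + 0.002 * (air_temp - 20)))

# Sweep the four parameters the abstract names (illustrative values).
cases = itertools.product([1.0, 2.0, 5.0],   # wind speed, m/s
                          ["A", "D", "F"],   # stability classification
                          [10, 20, 30],      # air temperature, deg C
                          [500, 1000])       # control room intake flow
worst = max(cases, key=lambda c: co2_concentration(*c))
```

The habitability conclusion rests on the worst case of such a sweep remaining below the regulatory limit.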

Keywords: PWR, HABIT, Habitability, Maanshan

Procedia PDF Downloads 446
27668 Prompt Design for Code Generation in Data Analysis Using Large Language Models

Authors: Lu Song Ma Li Zhi

Abstract:

With the rapid advancement of artificial intelligence technology, large language models (LLMs) have become a milestone in the field of natural language processing, demonstrating remarkable capabilities in semantic understanding, intelligent question answering, and text generation. These models are gradually penetrating various industries, particularly showcasing significant application potential in the data analysis domain. However, retraining or fine-tuning these models requires substantial computational resources and ample downstream task datasets, which poses a significant challenge for many enterprises and research institutions. Without modifying the internal parameters of the large models, prompt engineering techniques can rapidly adapt these models to new domains. This paper proposes a prompt design strategy aimed at leveraging the capabilities of large language models to automate the generation of data analysis code. By carefully designing prompts, data analysis requirements can be described in natural language, which the large language model can then understand and convert into executable data analysis code, thereby greatly enhancing the efficiency and convenience of data analysis. This strategy not only lowers the threshold for using large models but also significantly improves the accuracy and efficiency of data analysis. Our approach includes requirements for the precision of natural language descriptions, coverage of diverse data analysis needs, and mechanisms for immediate feedback and adjustment. Experimental results show that with this prompt design strategy, large language models perform exceptionally well in multiple data analysis tasks, generating high-quality code and significantly shortening the data analysis cycle. This method provides an efficient and convenient tool for the data analysis field and demonstrates the enormous potential of large language models in practical applications.
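
A minimal sketch of the prompt-design strategy might look like the following. The template fields, dataset schema and the `call_llm` stand-in are assumptions for illustration, not the authors' actual prompts or model interface.

```python
# Structured template wrapping a natural-language analysis request, per the
# strategy described above: precise description, explicit constraints, and a
# fixed contract for the generated code.
PROMPT_TEMPLATE = """You are a data analysis assistant.
Dataset schema: {schema}
Task (natural language): {task}
Constraints:
- Return only runnable Python using pandas.
- Name the final result `result`.
- Do not read or write files other than {path}.
"""

def build_prompt(schema, task, path):
    return PROMPT_TEMPLATE.format(schema=schema, task=task, path=path)

prompt = build_prompt(
    schema="sales.csv: date (YYYY-MM-DD), region (str), revenue (float)",
    task="Compute monthly total revenue per region and sort descending.",
    path="sales.csv",
)
# The prompt would then be sent to the model, e.g. code = call_llm(prompt),
# and the returned code executed in a sandbox, with errors fed back as the
# "immediate feedback and adjustment" mechanism the abstract mentions.
```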

Keywords: large language models, prompt design, data analysis, code generation

Procedia PDF Downloads 43
27667 Creation of a Clinical Tool for Diagnosis and Treatment of Skin Disease in HIV Positive Patients in Malawi

Authors: Alice Huffman, Joseph Hartland, Sam Gibbs

Abstract:

Dermatology is often a neglected specialty in low-resource settings, despite the high morbidity associated with skin disease. This becomes even more significant when associated with HIV infection, as dermatological conditions are more common and aggressive in HIV positive patients. African countries have the highest HIV infection rates and skin conditions are frequently misdiagnosed and mismanaged, because of a lack of dermatological training and educational material. The frequent lack of diagnostic tests in the African setting renders basic clinical skills all the more vital. This project aimed to improve diagnosis and treatment of skin disease in the HIV population in a district hospital in Malawi. A basic dermatological clinical tool was developed and produced in collaboration with local staff and based on available literature and data collected from clinics. The aim was to improve diagnostic accuracy and provide guidance for the treatment of skin disease in HIV positive patients. A literature search within Embase, Medline and Google scholar was performed and supplemented through data obtained from attending 5 Antiretroviral clinics. From the literature, conditions were selected for inclusion in the resource if they were described as specific, more prevalent, or extensive in the HIV population or have more adverse outcomes if they develop in HIV patients. Resource-appropriate treatment options were decided using Malawian Ministry of Health guidelines and textbooks specific to African dermatology. After the collection of data and discussion with local clinical and pharmacy staff a list of 15 skin conditions was included and a booklet created using the simple layout of a picture, a diagnostic description of the disease and treatment options. Clinical photographs were collected from local clinics (with full consent of the patient) or from the book ‘Common Skin Diseases in Africa’ (permission granted if fully acknowledged and used in a not-for-profit capacity). 
This tool was evaluated by the local staff, alongside an educational teaching session on skin disease. This project aimed to reduce uncertainty in diagnosis and provide guidance for appropriate treatment in HIV patients by gathering information into one practical and manageable resource. To further this project, we hope to review the effectiveness of the tool in practice.

Keywords: dermatology, HIV, Malawi, skin disease

Procedia PDF Downloads 206
27666 Meta-Analysis of the Impact of Positive Psychological Capital on Employees Outcomes: The Moderating Role of Tenure

Authors: Hyeondal Jeong, Yoonjung Baek

Abstract:

This research examines the effects of positive psychological capital (PsyCap) on employee outcomes (satisfaction, commitment, organizational citizenship behavior, innovation behavior and individual creativity) through a meta-analysis of articles published in the Republic of Korea. The results show that positive psychological capital has a positive effect on employee behavior. Heterogeneity was identified among the studies included in the analysis, so contextual factors such as team tenure were examined as moderators. The moderating effect of team tenure was not statistically significant. Implications are discussed based on the results of the analysis.
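
The pooling step of such a meta-analysis can be sketched with a DerSimonian-Laird random-effects model on Fisher-z-transformed correlations. The study correlations and sample sizes below are hypothetical, and the abstract does not specify that this particular estimator was used.

```python
import math

# Hypothetical correlations between PsyCap and an outcome from k studies.
studies = [(0.45, 120), (0.38, 85), (0.52, 200), (0.30, 60)]  # (r, N)

def pooled_effect(studies):
    """DerSimonian-Laird random-effects pooling of correlations."""
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r, _ in studies]  # Fisher z
    ws = [n - 3 for _, n in studies]          # inverse variance: var(z) = 1/(n-3)
    z_fixed = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    q = sum(w * (z - z_fixed) ** 2 for w, z in zip(ws, zs))  # heterogeneity Q
    c = sum(ws) - sum(w * w for w in ws) / sum(ws)
    tau2 = max(0.0, (q - (len(studies) - 1)) / c)            # between-study var
    ws_re = [1 / (1 / w + tau2) for w in ws]
    z_re = sum(w * z for w, z in zip(ws_re, zs)) / sum(ws_re)
    r_pooled = (math.exp(2 * z_re) - 1) / (math.exp(2 * z_re) + 1)
    return r_pooled, q

r_pooled, q = pooled_effect(studies)
```

A large Q relative to k-1 is the heterogeneity signal that motivates testing moderators such as team tenure.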

Keywords: positive psychological capital, satisfaction, commitment, OCB, creativity, meta-analysis

Procedia PDF Downloads 316
27665 Magnetic Lines of Force and Diamagnetism

Authors: Angel Pérez Sánchez

Abstract:

Magnetic attraction or repulsion is not the product of a strange force acting from afar but comes from lines of force anchored inside the magnet, much as one can move a small block of reinforced concrete by pulling the steel rods that protrude from its interior. This approach serves as a basis for studying the behavior of diamagnetic materials. The significance of this study is to unify all diamagnetic phenomena (the movement of grapes, copper approaching a magnet, magnet levitation, etc.) with a single explanation. The method followed consisted of the observation of hundreds of diamagnetism experiments (on copper, aluminum, grapes, tomatoes, and bismuth), including the creation of new experiments, together with logical deduction from these observations. When a magnet approaches a hanging grape, diamagnetism seems to consist not only of a slight repulsion but also of a slight attraction at a small distance. Replacing the grape with a copper sphere, the sphere behaves like the grape, pushing and pulling a nearby magnet. Diamagnetism could thus be redefined as follows: there are materials that do not magnetize their internal structure when a magnet approaches, as ferromagnetic materials do, but they do allow magnetic lines of force to run through their interior, enhancing them without creating lines of force of their own. A magnet levitates above a superconducting ceramic because, near its poles, the magnet gives the lines a force greater than the superconductor can enhance; a little further from the magnet, the enhancement of the lines by the superconductor exceeds the strength provided by the magnet, owing to the distance from the magnet's pole. The point where these balance defines the magnet's levitation band, and the anchoring effect of the lines is what ultimately keeps the magnet and the superconductor at a certain distance. The magnet seeks to levitate in the area in which the magnetic lines are strongest, near the magnet's poles. 
Pouring ferrofluid onto a magnet, lines of force can be observed coming out of the poles. On other occasions, diamagnetic materials simply enhance the lines they receive without changing their position, since their own weight is greater than the strength of the enhanced lines (this is the case with grapes and copper). Magnets and diamagnetic materials seek the place where the lines of force are most enhanced, which is at a small distance; once the ideal distance is established, they tend to keep it by pushing or pulling on each other. At a certain distance from the magnet, the power exerted by diamagnetic materials is greater than the force of the lines in the vicinity of the magnet's poles. All diamagnetic phenomena (the levitation of copper, aluminum, grapes, tomatoes and bismuth, and the levitation of a magnet above a superconducting ceramic) can now be explained with the support of magnetic lines of force.

Keywords: diamagnetism, magnetic levitation, magnetic lines of force, enhancing magnetic lines

Procedia PDF Downloads 92
27664 Efficiency of the Slovak Commercial Banks Applying the DEA Window Analysis

Authors: Iveta Řepková

Abstract:

The aim of this paper is to estimate the efficiency of Slovak commercial banks employing the Data Envelopment Analysis (DEA) window analysis approach over the period 2003-2012. The research is based on unbalanced panel data for the Slovak commercial banks, and an undesirable output was included in the analysis of banking efficiency. It was found that the most efficient banks were Postovabanka, UniCredit Bank and Istrobanka in the CCR model, and Slovenskasporitelna, Istrobanka and UniCredit Bank in the BCC model. On the contrary, the least efficient banks were found to be Privatbanka and CitiBank. The largest banks in the Slovak banking market were less efficient than medium-sized and small banks. The results show that average efficiency increased during the period 2003-2008 and then decreased during the period 2010-2011 as a result of the financial crisis.
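
In the special case of one input and one output, CCR efficiency reduces to each unit's output/input ratio divided by the best ratio in its window, which makes the window-analysis mechanics easy to sketch. The bank-year figures below are hypothetical, and the paper's actual models use multiple inputs and outputs solved as linear programs.

```python
# Hypothetical bank-year data: bank -> {year: (input, output)},
# e.g. (deposits, loans) in millions.
data = {
    "BankA": {2003: (100, 80), 2004: (110, 95), 2005: (120, 100)},
    "BankB": {2003: (50, 45), 2004: (55, 46), 2005: (60, 57)},
}

def window_efficiency(data, window):
    """Score every bank-year against all bank-years inside the window,
    treating each bank-year as a separate DMU (the window-analysis idea)."""
    pool = [(bank, year, out / inp)
            for bank, years in data.items()
            for year, (inp, out) in years.items() if year in window]
    best = max(ratio for _, _, ratio in pool)
    return {(bank, year): round(ratio / best, 3) for bank, year, ratio in pool}

# A two-year window; successive overlapping windows trace efficiency trends.
scores = window_efficiency(data, window={2003, 2004})
```

Sliding the window forward one year at a time yields the efficiency trajectories from which the 2003-2008 rise and post-crisis decline were read.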

Keywords: data envelopment analysis, efficiency, Slovak banking sector, window analysis

Procedia PDF Downloads 359