Search results for: hierarchical text classification models
4134 Decolonial Aesthetics in Ronnie Govender’s At the Edge and Other Cato Manor Stories
Authors: Rajendra Chetty
Abstract:
Decolonial aesthetics departs and delinks from colonial ideas about ‘the arts’ and the modernist/colonial work of aesthetics. Education is trapped in the western epistemic and hermeneutical vocabulary, hence it is necessary to introduce new concepts and work the entanglement between co-existing concepts. This paper will discuss the contribution of Ronnie Govender, a South African writer, to building decolonial sensibilities and delinking from the grand narrative of the colonial and apartheid literary landscape in Govender’s text, At the Edge and Other Cato Manor Stories. Govender uses the world of art to make a decolonial statement. Decolonial artists have to work in the entanglement of power and engage with a border epistemology. Govender’s writings depart from an embodied consciousness of the colonial wound and move toward healing. Border thinking and doing (artistic creativity) is precisely the decolonial methodology posited by Linda T. Smith, where theory comes in the form of storytelling. Govender’s stories engage with the wounds inflicted by racism and patriarchy, two pillars of eurocentric knowing, sensing, and believing that sustain a structure of knowledge. This structure is embedded in the characters, institutions, and languages that regulate and manage the world of the excluded. Healing is the process of delinking, of regaining pride, dignity, and humanity, not through the psychoanalytic cure but through the popular healer. The legacies of the community of Cato Manor, which was pushed off its land, are built into his stories. Decoloniality then is a concept that carries the experience of liberation struggles and recognizes the strenuous conditions of marginalized people together with their strength, wisdom, and endurance. Govender’s unique performative prose reconstructs and resurrects the lives of the people of Cato Manor, their vitality and humor, pain and humiliation: a vibrant and racially integrated community destroyed by the regime’s notorious racial laws. The paper notes that Govender’s objective with his plays and stories was to open windows to both the pain and joy of life; a mission that is not didactic but one of shining a torch on both mankind’s waywardness and its inspiring and often moving achievements against huge odds. Keywords: Govender, decoloniality, delinking, exclusion, racism, Cato Manor
Procedia PDF Downloads 157
4133 Characterization of Atmospheric Aerosols by Developing a Cascade Impactor
Authors: Sapan Bhatnagar
Abstract:
Micron size particles emitted from different sources and produced by combustion have serious negative effects on human health and environment. They can penetrate deep into our lungs through the respiratory system. Determination of the amount of particulates present in the atmosphere per cubic meter is necessary to monitor, regulate and model atmospheric particulate levels. Cascade impactor is used to collect the atmospheric particulates and by gravimetric analysis, their concentration in the atmosphere of different size ranges can be determined. Cascade impactors have been used for the classification of particles by aerodynamic size. They operate on the principle of inertial impaction. It consists of a number of stages each having an impaction plate and a nozzle. Collection plates are connected in series with smaller and smaller cutoff diameter. Air stream passes through the nozzle and the plates. Particles in the stream having large enough inertia impact upon the plate and smaller particles pass onto the next stage. By designing each successive stage with higher air stream velocity in the nozzle, smaller diameter particles will be collected at each stage. Particles too small to be impacted on the last collection plate will be collected on a backup filter. Impactor consists of 4 stages each made of steel, having its cut-off diameters less than 10 microns. Each stage is having collection plates, soaked with oil to prevent bounce and allows the impactor to function at high mass concentrations. Even after the plate is coated with particles, the incoming particle will still have a wet surface which significantly reduces particle bounce. The particles that are too small to be impacted on the last collection plate are then collected on a backup filter (microglass fiber filter), fibers provide larger surface area to which particles may adhere and voids in filter media aid in reducing particle re-entrainment.Keywords: aerodynamic diameter, cascade, environment, particulates, re-entrainment
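The stage cut-off diameters described above follow from the inertial-impaction (Stokes number) relation for a round impactor jet. The sketch below is a hedged illustration only: the nozzle diameter, jet velocity, particle density and the 50% Stokes number of 0.24 are textbook-style assumptions, not parameters reported for this impactor.

```python
import math

def cunningham(d_p, mfp=0.0665e-6):
    """Cunningham slip correction factor for a particle of diameter d_p [m]."""
    kn = 2.0 * mfp / d_p
    return 1.0 + kn * (1.257 + 0.4 * math.exp(-1.1 / kn))

def cutoff_diameter(d_jet, u_jet, rho_p=1000.0, mu=1.81e-5, stk50=0.24, iters=20):
    """Iteratively solve the 50% cut-off diameter d50 from the Stokes-number
    relation Stk50 = rho_p * Cc * d50**2 * U / (9 * mu * D_jet)."""
    d50 = 1e-6  # initial guess [m]
    for _ in range(iters):
        cc = cunningham(d50)
        d50 = math.sqrt(9.0 * mu * d_jet * stk50 / (rho_p * u_jet * cc))
    return d50

# Example with illustrative values: 3 mm nozzle, 10 m/s jet velocity
print(f"d50 = {cutoff_diameter(3e-3, 10.0) * 1e6:.2f} micrometres")
```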
Procedia PDF Downloads 320
4132 An Overview of New Era in Food Science and Technology
Authors: Raana Babadi Fathipour
Abstract:
The strict requirements of scientific journals that experimental data be shown to be statistically (in)significant have driven a steep increase in the use and development of statistical software. Notably, the use of mathematical and statistical methods, including chemometrics and many other statistical methods and algorithms, in food science and technology has expanded steeply over the last 20 years. The computational tools available can be used not only to run statistical analyses such as univariate and bivariate tests, multivariate calibration and the development of complex models, but also to run simulations of different scenarios for a given set of inputs, or simply to make predictions for particular data sets or conditions. A quick search of the most reputable scientific databases (PubMed, ScienceDirect, Scopus) shows that statistical methods have gained enormous ground in many areas. Keywords: food science, food technology, food safety, computational tools
Procedia PDF Downloads 67
4131 Quality Analysis of Vegetables Through Image Processing
Authors: Abdul Khalique Baloch, Ali Okatan
Abstract:
The quality analysis of food and vegetables from images is a hot topic nowadays, with researchers improving on previous findings through different techniques and methods. In this research we reviewed the literature, identified gaps in it, proposed an improved approach, designed the algorithm, and developed software to measure quality from images, comparing the results with previous work. The application uses an open-source dataset and the Python language with the TensorFlow Lite framework. The focus of this research is sorting food and vegetables from images: after processing the images, the application sorts and grades them, producing fewer errors than manual, human-based sorting. Digital picture datasets were created and the collected images were arranged by class. The classification accuracy of the system was about 94%. As fruits and vegetables play a main role in day-to-day life, their quality is essential in evaluating agricultural produce, and customers always want to buy good-quality fruits and vegetables. This document is about quality detection of fruits and vegetables using images. Many customers suffer from unhealthy fruits and vegetables provided by suppliers, and no proper quality-measurement standard is followed by hotel managements. We have developed software to measure the quality of fruits and vegetables from images, indicating whether they are fresh or rotten. Algorithms reviewed in this work include ResNet, VGG16, CNN and transfer learning for grading and feature extraction. Keywords: deep learning, computer vision, image processing, rotten fruit detection, fruits quality criteria, vegetables quality criteria
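To make the transfer-learning idea concrete, the following is a minimal sketch of a VGG16-based freshness classifier exported to TensorFlow Lite. The directory names, class layout, preprocessing and training settings are assumptions for illustration; the paper's own dataset, architecture details and reported 94% accuracy are not reproduced here.

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

# Hypothetical directory layout: one sub-folder per class (e.g. fresh/, rotten/)
train_ds = tf.keras.utils.image_dataset_from_directory(
    "produce_images/train", image_size=(224, 224), batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "produce_images/val", image_size=(224, 224), batch_size=32)
num_classes = len(train_ds.class_names)

# Frozen VGG16 convolutional base plus a small trainable classification head.
# A simple rescaling is used here for brevity; vgg16.preprocess_input is the
# canonical preprocessing for the pretrained weights.
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False

model = models.Sequential([
    layers.Rescaling(1.0 / 255),
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=5)

# Convert to TensorFlow Lite for on-device deployment
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
with open("produce_quality.tflite", "wb") as f:
    f.write(tflite_model)
```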
Procedia PDF Downloads 70
4130 Exercise and Geriatric Depression: a Scoping Review of the Research Evidence
Authors: Samira Mehrabi
Abstract:
Geriatric depression is a common late-life mental health disorder that increases morbidity and mortality. It has been shown that exercise is effective in alleviating symptoms of geriatric depression. However, inconsistencies across studies and lack of optimal dose-response of exercise for improving geriatric depression have made it challenging to draw solid conclusions on the effectiveness of exercise in late-life depression. Purpose: To further investigate the moderators of the effectiveness of exercise on geriatric depression across the current body of evidence. Methods: Based on the Arksey and O’Malley framework, an extensive search strategy was performed by exploring PubMed, Scopus, Sport Discus, PsycInfo, ERIC, and IBSS without limitations in the time frame. Eight systematic reviews with empirical results that evaluated the effect of exercise on depression among people aged ≥ 60 years were identified and their individual studies were screened for inclusion. One additional study was found through the hand searching of reference lists. After full-text screening and applying inclusion and exclusion criteria, 21 studies were retained for inclusion. Results: The review revealed high variability in characteristics of the exercise interventions and outcome measures. Sample characteristics, nature of comparators, main outcome assessment, and baseline severity of depression also varied notably. Mind-body and aerobic exercises were found to significantly reduce geriatric depression. However, results on the relationship between resistance training and improvements in geriatric depression were inconsistent, and results of the intensity-related antidepressant effects of exercise interventions were mixed. Extensive use of self-reported questionnaires for the main outcome assessment and lack of evidence on the relationship between depression severity and observed effects were of the other important highlights of the review. Conclusion: Several literature gaps were found regarding the potential effect modifiers of exercise and geriatric depression. While acknowledging the complexity of establishing recommendations on the exercise variables and geriatric depression, future studies are required to understand the interplay and threshold effect of exercise for treating geriatric depression.Keywords: exercise, geriatric depression, healthy aging, older adults, physical activity intervention, scoping review
Procedia PDF Downloads 107
4129 Conceptualizing of Priorities in the Dynamics of Public Administration Contemporary Reforms
Authors: Larysa Novak-Kalyayeva, Aleksander Kuczabski, Orystlava Sydorchuk, Nataliia Fersman, Tatyana Zemlinskaia
Abstract:
The article presents the results of the creative analysis and comparison of trends in the development of the theory of public administration during the period from the second half of the 20th to the beginning of the 21st century. The process of conceptualization of the priorities of public administration in the dynamics of reforming was held under the influence of such factors as globalization, integration, information and technological changes and human rights is examined. The priorities of the social state in the concepts of the second half of the 20th century are studied. Peculiar approaches to determining the priorities of public administration in the countries of "Soviet dictatorship" in Central and Eastern Europe in the same period are outlined. Particular attention is paid to the priorities of public administration regarding the interaction between public power and society and the development of conceptual foundations for the modern managerial process. There is a thought that the dynamics of the formation of concepts of the European governance is characterized by the sequence of priorities: from socio-economic and moral-ethical to organizational-procedural and non-hierarchical ones. The priorities of the "welfare state" were focused on the decent level of material wellbeing of population. At the same time, the conception of "minimal state" emphasized priorities of human responsibility for their own fate under the conditions of minimal state protection. Later on, the emphasis was placed on horizontal ties and redistribution of powers and competences of "effective state" with its developed procedures and limits of responsibility at all levels of government and in close cooperation with the civil society. The priorities of the contemporary period are concentrated on human rights in the concepts of "good governance" and all the following ones, which recognize the absolute priority of public administration with compliance, provision and protection of human rights. There is a proved point of view that civilizational changes taking place under the influence of information and technological imperatives also stipulate changes in priorities, redistribution of emphases and update principles of managerial concepts on the basis of publicity, transparency, departure from traditional forms of hierarchy and control in favor of interactivity and inter-sectoral interaction, decentralization and humanization of managerial processes. The necessity to permanently carry out the reorganization, by establishing the interaction between different participants of public power and social relations, to establish a balance between political forces and social interests on the basis of mutual trust and mutual understanding determines changes of social, political, economic and humanitarian paradigms of public administration and their theoretical comprehension. The further studies of theoretical foundations of modern public administration in interdisciplinary discourse in the context of ambiguous consequences of the globalizational and integrational processes of modern European state-building would be advisable. This is especially true during the period of political transformations and economic crises which are the characteristic of the contemporary Europe, especially for democratic transition countries.Keywords: concepts of public administration, democratic transition countries, human rights, the priorities of public administration, theory of public administration
Procedia PDF Downloads 174
4128 A Conceptual Framework of Digital Twin for Homecare
Authors: Raja Omman Zafar, Yves Rybarczyk, Johan Borg
Abstract:
This article proposes a conceptual framework for the application of digital twin technology in home care. The main goal is to bridge the gap between advanced digital twin concepts and their practical implementation in home care. This study uses a literature review and thematic analysis approach to synthesize existing knowledge and proposes a structured framework suitable for homecare applications. The proposed framework integrates key components such as IoT sensors, data-driven models, cloud computing, and user interface design, highlighting the importance of personalized and predictive homecare solutions. This framework can significantly improve the efficiency, accuracy, and reliability of homecare services. It paves the way for the implementation of digital twins in home care, promoting real-time monitoring, early intervention, and better outcomes.Keywords: digital twin, homecare, older adults, healthcare, IoT, artificial intelligence
Procedia PDF Downloads 71
4127 Simulation Model of Induction Heating in COMSOL Multiphysics
Authors: K. Djellabi, M. E. H. Latreche
Abstract:
The induction heating phenomenon depends on various factors, making the problem highly nonlinear. The mathematical analysis of this problem in most cases is very difficult and it is reduced to simple cases. Another knowledge of induction heating systems is generated in production environments, but these trial-error procedures are long and expensive. The numerical models of induction heating problem are another approach to reduce abovementioned drawbacks. This paper deals with the simulation model of induction heating problem. The simulation model of induction heating system in COMSOL Multiphysics is created. In this work we present results of numerical simulations of induction heating process in pieces of cylindrical shapes, in an inductor with four coils. The modeling of the inducting heating process was made with the software COMSOL Multiphysics Version 4.2a, for the study we present the temperature charts.Keywords: induction heating, electromagnetic field, inductor, numerical simulation, finite element
Procedia PDF Downloads 316
4126 Hydrological-Economic Modeling of Two Hydrographic Basins of the Coast of Peru
Authors: Julio Jesus Salazar, Manuel Andres Jesus De Lama
Abstract:
There are very few models that serve to analyze the use of water in the socio-economic process. On the supply side, the joint use of groundwater has been considered in addition to the simple limits on the availability of surface water. In addition, we have worked on waterlogging and the effects on water quality (mainly salinity). In this paper, a 'complex' water economy is examined; one in which demands grow differentially not only within but also between sectors, and one in which there are limited opportunities to increase consumptive use. In particular, high-value growth, the growth of the production of irrigated crops of high value within the basins of the case study, together with the rapidly growing urban areas, provides a rich context to examine the general problem of water management at the basin level. At the same time, the long-term aridity of nature has made the eco-environment in the basins located on the coast of Peru very vulnerable, and the exploitation and immediate use of water resources have further deteriorated the situation. The presented methodology is the optimization with embedded simulation. The wide basin simulation of flow and water balances and crop growth are embedded with the optimization of water allocation, reservoir operation, and irrigation scheduling. The modeling framework is developed from a network of river basins that includes multiple nodes of origin (reservoirs, aquifers, water courses, etc.) and multiple demand sites along the river, including places of consumptive use for agricultural, municipal and industrial, and uses of running water on the coast of Peru. The economic benefits associated with water use are evaluated for different demand management instruments, including water rights, based on the production and benefit functions of water use in the urban agricultural and industrial sectors. This work represents a new effort to analyze the use of water at the regional level and to evaluate the modernization of the integrated management of water resources and socio-economic territorial development in Peru. It will also allow the establishment of policies to improve the process of implementation of the integrated management and development of water resources. The input-output analysis is essential to present a theory about the production process, which is based on a particular type of production function. Also, this work presents the Computable General Equilibrium (CGE) version of the economic model for water resource policy analysis, which was specifically designed for analyzing large-scale water management. As to the platform for CGE simulation, GEMPACK, a flexible system for solving CGE models, is used for formulating and solving CGE model through the percentage-change approach. GEMPACK automates the process of translating the model specification into a model solution program.Keywords: water economy, simulation, modeling, integration
Procedia PDF Downloads 155
4125 Role of Endotherapy vs Surgery in the Management of Traumatic Pancreatic Injury: A Tertiary Center Experience
Authors: Thinakar Mani Balusamy, Ratnakar S. Kini, Bharat Narasimhan, Venkateswaran A. R, Pugazhendi Thangavelu, Mohammed Ali, Prem Kumar K., Kani Sheikh M., Sibi Thooran Karmegam, Radhakrishnan N., Mohammed Noufal
Abstract:
Introduction: Pancreatic injury remains a complicated condition requiring an individualized case by case approach to management. In this study, we aim to analyze the varied presentations and treatment outcomes of traumatic pancreatic injury in a tertiary care center. Methods: All consecutive patients hospitalized at our center with traumatic pancreatic injury between 2013 and 2017 were included. The American Association for Surgery of Trauma (AAST) classification was used to stratify patients into five grades of severity. Outcome parameters were then analyzed based on the treatment modality employed. Results: Of the 35 patients analyzed, 26 had an underlying blunt trauma with the remaining nine presenting due to penetrating injury. Overall in-hospital mortality was 28%. 19 of these patients underwent exploratory laparotomy with the remaining 16 managed nonoperatively. Nine patients had a severe injury ( > grade 3) – of which four underwent endotherapy, three had stents placed and one underwent an endoscopic pseudocyst drainage. Among those managed nonoperatively, three underwent a radiological drainage procedure. Conclusion: Mortality rates were clearly higher in patients managed operatively. This is likely a result of significantly higher degrees of major associated non-pancreatic injuries and not just a reflection of surgical morbidity. Despite this, surgical management remains the mainstay of therapy, especially in higher grades of pancreatic injury. However we would like to emphasize that endoscopic intervention definitely remains the preferred treatment modality when the clinical setting permits. This is especially applicable in cases of main pancreatic duct injury with ascites as well as pseudocysts.Keywords: endotherapy, non-operative management, surgery, traumatic pancreatic injury
Procedia PDF Downloads 207
4124 Continuous and Discontinuous Modeling of Wellbore Instability in Anisotropic Rocks
Authors: C. Deangeli, P. Obentaku Obenebot, O. Omwanghe
Abstract:
The study focuses on the analysis of wellbore instability in rock masses affected by weakness planes. The occurrence of failure in such a type of rocks can occur in the rock matrix and/ or along the weakness planes, in relation to the mud weight gradient. In this case the simple Kirsch solution coupled with a failure criterion cannot supply a suitable scenario for borehole instabilities. Two different numerical approaches have been used in order to investigate the onset of local failure at the wall of a borehole. For each type of approach the influence of the inclination of weakness planes has been investigates, by considering joint sets at 0°, 35° and 90° to the horizontal. The first set of models have been carried out with FLAC 2D (Fast Lagrangian Analysis of Continua) by considering the rock material as a continuous medium, with a Mohr Coulomb criterion for the rock matrix and using the ubiquitous joint model for accounting for the presence of the weakness planes. In this model yield may occur in either the solid or along the weak plane, or both, depending on the stress state, the orientation of the weak plane and the material properties of the solid and weak plane. The second set of models have been performed with PFC2D (Particle Flow code). This code is based on the Discrete Element Method and considers the rock material as an assembly of grains bonded by cement-like materials, and pore spaces. The presence of weakness planes is simulated by the degradation of the bonds between grains along given directions. In general the results of the two approaches are in agreement. However the discrete approach seems to capture more complex phenomena related to local failure in the form of grain detachment at wall of the borehole. In fact the presence of weakness planes in the discontinuous medium leads to local instability along the weak planes also in conditions not predicted from the continuous solution. In general slip failure locations and directions do not follow the conventional wellbore breakout direction but depend upon the internal friction angle and the orientation of the bedding planes. When weakness plane is at 0° and 90° the behaviour are similar to that of a continuous rock material, but borehole instability is more severe when weakness planes are inclined at an angle between 0° and 90° to the horizontal. In conclusion, the results of the numerical simulations show that the prediction of local failure at the wall of the wellbore cannot disregard the presence of weakness planes and consequently the higher mud weight required for stability for any specific inclination of the joints. Despite the discrete approach can simulate smaller areas because of the large number of particles required for the generation of the rock material, however it seems to investigate more correctly the occurrence of failure at the miscroscale and eventually the propagation of the failed zone to a large portion of rock around the wellbore.Keywords: continuous- discontinuous, numerical modelling, weakness planes wellbore, FLAC 2D
Procedia PDF Downloads 499
4123 Land Use Dynamics of Ikere Forest Reserve, Nigeria Using Geographic Information System
Authors: Akintunde Alo
Abstract:
The incessant encroachments into the forest ecosystem by the farmers and local contractors constitute a major threat to the conservation of genetic resources and biodiversity in Nigeria. To propose a viable monitoring system, this study employed Geographic Information System (GIS) technology to assess the changes that occurred for a period of five years (between 2011 and 2016) in Ikere forest reserve. Landsat imagery of the forest reserve was obtained. For the purpose of geo-referencing the acquired satellite imagery, ground-truth coordinates of some benchmark places within the forest reserve was relied on. Supervised classification algorithm, image processing, vectorization and map production were realized using ArcGIS. Various land use systems within the forest ecosystem were digitized into polygons of different types and colours for 2011 and 2016, roads were represented with lines of different thickness and colours. Of the six land-use delineated, the grassland increased from 26.50 % in 2011 to 45.53% in 2016 of the total land area with a percentage change of 71.81 %. Plantations of Gmelina arborea and Tectona grandis on the other hand reduced from 62.16 % in 2011 to 27.41% in 2016. The farmland and degraded land recorded percentage change of about 176.80 % and 8.70 % respectively from 2011 to 2016. Overall, the rate of deforestation in the study area is on the increase and becoming severe. About 72.59% of the total land area has been converted to non-forestry uses while the remnant 27.41% is occupied by plantations of Gmelina arborea and Tectona grandis. Interestingly, over 55 % of the plantation area in 2011 has changed to grassland, or converted to farmland and degraded land in 2016. The rate of change over time was about 9.79 % annually. Based on the results, rapid actions to prevail on the encroachers to stop deforestation and encouraged re-afforestation in the study area are recommended.Keywords: land use change, forest reserve, satellite imagery, geographical information system
Procedia PDF Downloads 357
4122 Naphtha Catalytic Reform: Modeling and Simulation of Unity
Authors: Leal Leonardo, Pires Carlos Augusto de Moraes, Casiraghi Magela
Abstract:
In this work were realized the modeling and simulation of the catalytic reformer process, of ample form, considering all the equipment that influence the operation performance. Considered it a semi-regenerative reformer, with four reactors in series intercalated with four furnaces, two heat exchanges, one product separator and one recycle compressor. A simplified reactional system was considered, involving only ten chemical compounds related through five reactions. The considered process was the applied to aromatics production (benzene, toluene, and xylene). The models developed to diverse equipment were interconnecting in a simulator that consists of a computer program elaborate in FORTRAN 77. The simulation of the global model representative of reformer unity achieved results that are compatibles with the literature ones. It was then possible to study the effects of operational variables in the products concentration and in the performance of the unity equipment.Keywords: catalytic reforming, modeling, simulation, petrochemical engineering
Procedia PDF Downloads 516
4121 Estimating Big Five Personality Expressions with a Tiered Information Framework
Authors: Laura Kahn, Paul Rodrigues, Onur Savas, Shannon Hahn
Abstract:
An empirical understanding of an individual's personality expression can have a profound impact on organizations seeking to strengthen team performance and improve employee retention. A team's personality composition can impact overall performance. Creating a tiered information framework that leverages proxies for a user's social context and lexical and linguistic content provides insight into location-specific personality expression. We leverage the layered framework to examine domain-specific, psychological, and lexical cues within social media posts. We apply DistilBERT natural language transfer learning models with real world data to examine the relationship between Big Five personality expressions of people in Science, Technology, Engineering and Math (STEM) fields.Keywords: big five, personality expression, social media analysis, workforce development
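A minimal sketch of how DistilBERT can score Big Five trait expression in a social-media post is shown below, using the Hugging Face transformers API. The checkpoint name, the multi-label sigmoid head and the trait ordering are illustrative assumptions; the base model would first need fine-tuning on labelled posts before the scores are meaningful, and the paper's own data and fine-tuned weights are not reproduced here.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

TRAITS = ["openness", "conscientiousness", "extraversion", "agreeableness", "neuroticism"]

# Hypothetical setup: distilbert-base-uncased would be fine-tuned on labelled
# posts with num_labels=5 and a multi-label (sigmoid) classification head.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=len(TRAITS),
    problem_type="multi_label_classification")

def score_post(text: str) -> dict:
    """Return a per-trait expression score in [0, 1] for one post."""
    inputs = tokenizer(text, truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    probs = torch.sigmoid(logits).squeeze(0).tolist()
    return dict(zip(TRAITS, probs))

print(score_post("Spent the weekend debugging our flight-control firmware. Loved every minute."))
```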
Procedia PDF Downloads 139
4120 Cubic Trigonometric B-Spline Approach to Numerical Solution of Wave Equation
Authors: Shazalina Mat Zin, Ahmad Abd. Majid, Ahmad Izani Md. Ismail, Muhammad Abbas
Abstract:
The generalized wave equation models various problems in sciences and engineering. In this paper, a new three-time level implicit approach based on cubic trigonometric B-spline for the approximate solution of wave equation is developed. The usual finite difference approach is used to discretize the time derivative while cubic trigonometric B-spline is applied as an interpolating function in the space dimension. Von Neumann stability analysis is used to analyze the proposed method. Two problems are discussed to exhibit the feasibility and capability of the method. The absolute errors and maximum error are computed to assess the performance of the proposed method. The results were found to be in good agreement with known solutions and with existing schemes in literature.Keywords: collocation method, cubic trigonometric B-spline, finite difference, wave equation
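For reference, a generic form of the three-time-level scheme described above is sketched below: a central difference in time combined with a weighted average of the spatial term, with the space dependence carried by the cubic trigonometric B-spline interpolant. The weighting parameter θ and the notation are assumptions for illustration; the authors' exact basis functions and stability conditions are those given in the paper.

```latex
% Generalized wave equation in one space dimension:
\[
  \frac{\partial^{2} u}{\partial t^{2}} = c^{2}\,\frac{\partial^{2} u}{\partial x^{2}},
  \qquad a \le x \le b,\; t > 0 .
\]
% Three-time-level discretization: central difference in time and a
% theta-weighted average of the spatial derivative (theta is assumed here):
\[
  \frac{u^{\,n+1} - 2u^{\,n} + u^{\,n-1}}{(\Delta t)^{2}}
  = c^{2}\Bigl[\theta\, u_{xx}^{\,n+1} + (1 - 2\theta)\, u_{xx}^{\,n}
               + \theta\, u_{xx}^{\,n-1}\Bigr],
\]
% with the space dependence at each time level approximated by the cubic
% trigonometric B-spline interpolant  u^{n}(x) \approx \sum_{j} c_{j}^{\,n}\, TB_{j}(x).
```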
Procedia PDF Downloads 542
4119 Comparison of Equivalent Linear and Non-Linear Site Response Model Performance in Kathmandu Valley
Authors: Sajana Suwal, Ganesh R. Nhemafuki
Abstract:
Evaluation of ground response under earthquake shaking is crucial in geotechnical earthquake engineering. Damage due to seismic excitation is mainly correlated to local geological and geotechnical conditions. It is evident from the past earthquakes (e.g. 1906 San Francisco, USA, 1923 Kanto, Japan) that the local geology has strong influence on amplitude and duration of ground motions. Since then significant studies has been conducted on ground motion amplification revealing the importance of influence of local geology on ground. Observations from the damaging earthquakes (e.g. Nigata and San Francisco, 1964; Irpinia, 1980; Mexico, 1985; Kobe, 1995; L’Aquila, 2009) divulged that non-uniform damage pattern, particularly in soft fluvio-lacustrine deposit is due to the local amplification of seismic ground motion. Non-uniform damage patterns are also observed in Kathmandu Valley during 1934 Bihar Nepal earthquake and recent 2015 Gorkha earthquake seemingly due to the modification of earthquake ground motion parameters. In this study, site effects resulting from amplification of soft soil in Kathmandu are presented. A large amount of subsoil data was collected and used for defining the appropriate subsoil model for the Kathamandu valley. A comparative study of one-dimensional total-stress equivalent linear and non-linear site response is performed using four strong ground motions for six sites of Kathmandu valley. In general, one-dimensional (1D) site-response analysis involves the excitation of a soil profile using the horizontal component and calculating the response at individual soil layers. In the present study, both equivalent linear and non-linear site response analyses were conducted using the computer program DEEPSOIL. The results show that there is no significant deviation between equivalent linear and non-linear site response models until the maximum strain reaches to 0.06-0.1%. Overall, it is clearly observed from the results that non-linear site response model perform better as compared to equivalent linear model. However, the significant deviation between two models is resulted from other influencing factors such as assumptions made in 1D site response, lack of accurate values of shear wave velocity and nonlinear properties of the soil deposit. The results are also presented in terms of amplification factors which are predicted to be around four times more in case of non-linear analysis as compared to equivalent linear analysis. Hence, the nonlinear behavior of soil prevails the urgent need of study of dynamic characteristics of the soft soil deposit that can specifically represent the site-specific design spectra for the Kathmandu valley for building resilient structures from future damaging earthquakes.Keywords: deep soil, equivalent linear analysis, non-linear analysis, site response
Procedia PDF Downloads 291
4118 A Consideration of Dialectal and Stylistic Shifts in Literary Translation
Authors: Pushpinder Syal
Abstract:
Literary writing carries the stamp of the current language of its time. In translating such texts, it becomes a challenge to capture such reflections which may be evident at several levels: the level of dialectal use of language by characters in stories, the alterations in syntax as tools of writers’ individual stylistic choices, the insertion of quasi-proverbial and gnomic utterances, and even the level of the pragmatics of narrative discourse. Discourse strategies may differ between earlier and later texts, reflecting changing relationships between narrators and readers in changed cultural and social contexts. This paper is a consideration of these features by an approach that combines historicity with a description, contextualizing language change within a discourse framework. The process of translating a collection of writings of Punjabi literature spanning 100 years was undertaken for this study and it was observed that the factor of the historicity of language was seen to play a role. While intended for contemporary readers, the translation of literature over the span of a century poses the dual challenge of needing to possess both accessibility and immediacy as well as adherence to the 'old world' styles of communicating and narrating. The linguistic changes may be observed in a more obvious sense in the difference of diction and word formation – with evidence of more hybridized and borrowed forms in modern and contemporary writings, as compared to the older writings. The latter not only contain vestiges of proverbs and folk sayings, but are also closer to oral speech styles. These will be presented and analysed in the form of chronological listing and by these means, the social process of translation from orality to written text can be seen as traceable in the above-mentioned works. More subtle and underlying shifts can be seen through the analysis of speech acts and implicatures in the same literature, in which the social relationships underlying language use are evident as discourse systems of belief and understanding. They present distinct shifts in worldview as seen at different points in time. However, some continuities of language and style are also clearly visible, and these aid the translator in putting together a set of thematic links which identify the literature of a region and community, and constitute essential outcomes in the effort to preserve its distinctive nature.Keywords: cultural change, dialect, historicity, stylistic variation
Procedia PDF Downloads 130
4117 Separating Landform from Noise in High-Resolution Digital Elevation Models through Scale-Adaptive Window-Based Regression
Authors: Anne M. Denton, Rahul Gomes, David W. Franzen
Abstract:
High-resolution elevation data are becoming increasingly available, but typical approaches for computing topographic features, like slope and curvature, still assume small sliding windows, for example, of size 3x3. That means that the digital elevation model (DEM) has to be resampled to the scale of the landform features that are of interest. Any higher resolution is lost in this resampling. When the topographic features are computed through regression that is performed at the resolution of the original data, the accuracy can be much higher, and the reported result can be adjusted to the length scale that is relevant locally. Slope and variance are calculated for overlapping windows, meaning that one regression result is computed per raster point. The number of window centers per area is the same for the output as for the original DEM. Slope and variance are computed by performing regression on the points in the surrounding window. Such an approach is computationally feasible because of the additive nature of regression parameters and variance. Any doubling of window size in each direction only takes a single pass over the data, corresponding to a logarithmic scaling of the resulting algorithm as a function of the window size. Slope and variance are stored for each aggregation step, allowing the reported slope to be selected to minimize variance. The approach thereby adjusts the effective window size to the landform features that are characteristic to the area within the DEM. Starting with a window size of 2x2, each iteration aggregates 2x2 non-overlapping windows from the previous iteration. Regression results are stored for each iteration, and the slope at minimal variance is reported in the final result. As such, the reported slope is adjusted to the length scale that is characteristic of the landform locally. The length scale itself and the variance at that length scale are also visualized to aid in interpreting the results for slope. The relevant length scale is taken to be half of the window size of the window over which the minimum variance was achieved. The resulting process was evaluated for 1-meter DEM data and for artificial data that was constructed to have defined length scales and added noise. A comparison with ESRI ArcMap was performed and showed the potential of the proposed algorithm. The resolution of the resulting output is much higher and the slope and aspect much less affected by noise. Additionally, the algorithm adjusts to the scale of interest within the region of the image. These benefits are gained without additional computational cost in comparison with resampling the DEM and computing the slope over 3x3 images in ESRI ArcMap for each resolution. In summary, the proposed approach extracts slope and aspect of DEMs at the lengths scales that are characteristic locally. The result is of higher resolution and less affected by noise than existing techniques.Keywords: high resolution digital elevation models, multi-scale analysis, slope calculation, window-based regression
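A simplified sketch of the multi-scale idea is given below in Python/NumPy. It refits a least-squares plane to every non-overlapping 2^k x 2^k window at each level and keeps, per cell of the finest output grid, the slope of the scale with the lowest residual variance. The paper instead accumulates the regression sums additively from one level to the next, which is faster but yields the same estimates; the window levels, the synthetic test surface and the assumption that the DEM dimensions are multiples of 2**levels are all illustrative.

```python
import numpy as np

def scale_adaptive_slope(dem, cell=1.0, levels=5):
    """Fit a plane to every 2**k x 2**k window (k = 1..levels), keep slope and
    residual variance per window, and report, per cell of the finest (2x2)
    output grid, the slope of the minimum-variance scale."""
    h, w = dem.shape
    var_stack, slope_stack = [], []
    for k in range(1, levels + 1):
        size = 2 ** k
        blocks = dem.reshape(h // size, size, w // size, size).astype(float)
        yy, xx = np.mgrid[0:size, 0:size] * cell          # within-window coordinates
        xx, yy = xx - xx.mean(), yy - yy.mean()           # centred, so the fit decouples
        xb, yb = xx[None, :, None, :], yy[None, :, None, :]
        zc = blocks - blocks.mean(axis=(1, 3), keepdims=True)
        gx = (zc * xb).sum(axis=(1, 3)) / (xx ** 2).sum() # plane gradient dz/dx per window
        gy = (zc * yb).sum(axis=(1, 3)) / (yy ** 2).sum() # plane gradient dz/dy per window
        fit = gx[:, None, :, None] * xb + gy[:, None, :, None] * yb
        var = ((zc - fit) ** 2).mean(axis=(1, 3))         # residual variance per window
        slope = np.degrees(np.arctan(np.hypot(gx, gy)))   # slope in degrees per window
        up = np.ones((size // 2, size // 2))              # replicate down to the 2x2 grid
        var_stack.append(np.kron(var, up))
        slope_stack.append(np.kron(slope, up))
    var_stack, slope_stack = np.stack(var_stack), np.stack(slope_stack)
    best = var_stack.argmin(axis=0)                       # index of lowest-variance scale
    slope_map = np.take_along_axis(slope_stack, best[None], axis=0)[0]
    length_scale = cell * 2.0 ** best                     # half the winning window size
    return slope_map, length_scale

# Synthetic 256x256 surface, purely for demonstration
dem = np.random.default_rng(0).normal(size=(256, 256)).cumsum(axis=0).cumsum(axis=1)
slope_deg, scale = scale_adaptive_slope(dem, cell=1.0, levels=5)
print(slope_deg.shape, scale.min(), scale.max())
```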
Procedia PDF Downloads 129
4116 Using Combination of Sets of Features of Molecules for Aqueous Solubility Prediction: A Random Forest Model
Authors: Muhammet Baldan, Emel Timuçin
Abstract:
Generally, absorption and bioavailability increase as solubility increases; therefore, it is crucial to predict solubility in drug discovery applications. Molecular descriptors and molecular properties are traditionally used for the prediction of water solubility. Various key descriptor sets are used for this purpose, namely Dragon descriptors, Morgan descriptors, MACCS keys, etc., and each has different prediction capabilities, with success varying between data sets. Structural features are another commonly used source of inputs for solubility prediction. However, few if any studies combine three or more sets of properties or descriptors to produce a more powerful prediction model. Unlike available models, we used a combination of these features in a random forest machine learning model for improved solubility prediction, thereby contributing to drug discovery systems. Keywords: solubility, random forest, molecular descriptors, MACCS keys
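A sketch of the combined-feature approach is shown below: physico-chemical descriptors, Morgan fingerprint bits and MACCS keys are concatenated into one feature vector and fed to a random forest regressor. The CSV file name, column names and the particular descriptors chosen are assumptions for illustration, not the study's actual data or feature set.

```python
import numpy as np
import pandas as pd
from rdkit import Chem
from rdkit.Chem import AllChem, Descriptors, MACCSkeys
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

def featurize(smiles: str) -> np.ndarray:
    """Concatenate three feature families for one molecule: a few
    physico-chemical descriptors, Morgan (ECFP-like) bits and MACCS keys."""
    mol = Chem.MolFromSmiles(smiles)
    physchem = [Descriptors.MolWt(mol), Descriptors.MolLogP(mol),
                Descriptors.TPSA(mol), Descriptors.NumHDonors(mol),
                Descriptors.NumHAcceptors(mol), Descriptors.NumRotatableBonds(mol)]
    morgan = AllChem.GetMorganFingerprintAsBitVect(mol, radius=2, nBits=1024)
    maccs = MACCSkeys.GenMACCSKeys(mol)
    return np.concatenate([physchem, list(morgan), list(maccs)])

# Hypothetical ESOL-style table with SMILES and measured log-solubility columns
data = pd.read_csv("solubility.csv")              # columns assumed: smiles, logS
X = np.vstack([featurize(s) for s in data["smiles"]])
y = data["logS"].values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)
model = RandomForestRegressor(n_estimators=500, random_state=42, n_jobs=-1)
model.fit(X_tr, y_tr)
print("test R^2:", r2_score(y_te, model.predict(X_te)))
```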
Procedia PDF Downloads 46
4115 Managers’ Mobile Information Behavior in an Openness Paradigm Era
Authors: Abd Latif Abdul Rahman, Zuraidah Arif, Muhammad Faizal Iylia, Mohd Ghazali, Asmadi Mohammed Ghazali
Abstract:
Mobile information is a significant access point for human information activities. Theories and models of human information behavior have developed over several decades but have not yet considered the role of the user’s computing device in digital information interactions. This paper reviews the literature that leads to developing a conceptual framework of a study on the managers mobile information behavior. Based on the literature review, dimensions of mobile information behavior are identified, namely, dimension information needs, dimension information access, information retrieval and dimension of information use. The study is significant to understand the nature of librarians’ behavior in searching, retrieving and using information via the mobile device. Secondly, the study would provide suggestions about various kinds of mobile applications which organization can provide for their staff to improve their services.Keywords: mobile information behavior, information behavior, mobile information, mobile devices
Procedia PDF Downloads 349
4114 Mathematical Modeling and Optimization of Burnishing Parameters for 15NiCr6 Steel
Authors: Tarek Litim, Ouahiba Taamallah
Abstract:
The present paper investigates the effect of burnishing on the surface integrity of a component made of 15NiCr6 steel. The work presents a statistical study based on regression, and Taguchi's design allowed the development of mathematical models to predict the output responses as a function of the technological parameters studied. The response surface methodology (RSM) showed the simultaneous influence of the burnishing parameters and identified the optimal processing parameters. ANOVA analysis of the results validated the prediction models, with determination coefficients R = 90.60% and 92.41% for roughness and hardness, respectively. Furthermore, a multi-objective optimization identified a regime characterized by P = 10 kgf, i = 3 passes, and f = 0.074 mm/rev, which favours minimum roughness and maximum hardness. The result was validated by desirability values of D = 0.99 and 0.95 for roughness and hardness, respectively. Keywords: 15NiCr6 steel, burnishing, surface integrity, Taguchi, RSM, ANOVA
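The composite desirability used in such multi-objective optimization is the geometric mean of individual Derringer desirabilities, one for the minimised roughness and one for the maximised hardness. A small sketch follows; the response bounds and predicted values in it are invented for illustration (they are not the paper's values) and are only chosen so the individual desirabilities come out near the reported 0.99 and 0.95.

```python
import numpy as np

def d_smaller_is_better(y, low, high):
    """Derringer desirability for a response to be minimised (e.g. roughness)."""
    return np.clip((high - y) / (high - low), 0.0, 1.0)

def d_larger_is_better(y, low, high):
    """Derringer desirability for a response to be maximised (e.g. hardness)."""
    return np.clip((y - low) / (high - low), 0.0, 1.0)

# Predicted responses at P = 10 kgf, i = 3 passes, f = 0.074 mm/rev;
# bounds and values below are assumptions, not the paper's data
d_ra = d_smaller_is_better(y=0.21, low=0.20, high=1.20)    # roughness Ra [um]
d_hv = d_larger_is_better(y=395.0, low=250.0, high=400.0)  # hardness [HV]

D = (d_ra * d_hv) ** 0.5        # composite (geometric-mean) desirability
print(f"d_roughness={d_ra:.2f}, d_hardness={d_hv:.2f}, D={D:.2f}")
```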
Procedia PDF Downloads 191
4113 Forecasting Issues in Energy Markets within a Reg-ARIMA Framework
Authors: Ilaria Lucrezia Amerise
Abstract:
Electricity markets throughout the world have undergone substantial changes. Accurate, reliable, clear and comprehensible modeling and forecasting of different variables (loads and prices in the first instance) have become increasingly important. In this paper, we describe the current state of the art, focusing on reg-SARMA methods, which have proven flexible enough to accommodate electricity price/load behavior satisfactorily. More specifically, we discuss: 1) the dichotomy between point and interval forecasts; 2) the difficult choice between stochastic predictors (e.g. climatic variation) and deterministic predictors (e.g. calendar variables); 3) the trade-off between modelling a single aggregate time series and creating separate, potentially different models of sub-series. The noteworthy point we would like to bring out is that prices and loads require different approaches that appear irreconcilable, even though they must be made reconcilable for the interests and activities of energy companies. Keywords: interval forecasts, time series, electricity prices, reg-SARIMA methods
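A minimal sketch of a reg-SARMA-style model is given below using statsmodels' SARIMAX: the load series is regressed on deterministic calendar dummies while the errors follow a seasonal ARMA process, and both point and interval forecasts are produced. The file name, column names, hourly frequency and model orders are assumptions for illustration, not the paper's specification.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical hourly load series; file and column names are illustrative
load = pd.read_csv("hourly_load.csv", parse_dates=["timestamp"],
                   index_col="timestamp")["load_mw"]

# Deterministic calendar regressors: weekend flag and a Monday dummy
exog = pd.DataFrame({
    "weekend": (load.index.dayofweek >= 5).astype(int),
    "monday": (load.index.dayofweek == 0).astype(int),
}, index=load.index)

# Regression with SARMA errors and a daily (24-hour) seasonal cycle;
# the orders here are illustrative, not tuned
model = SARIMAX(load, exog=exog, order=(1, 0, 1), seasonal_order=(1, 1, 1, 24),
                enforce_stationarity=False, enforce_invertibility=False)
fit = model.fit(disp=False)

# Point forecast plus a 95% interval for the next day, given future calendar values
future_index = pd.date_range(load.index[-1] + pd.Timedelta(hours=1), periods=24, freq="h")
future_exog = pd.DataFrame({
    "weekend": (future_index.dayofweek >= 5).astype(int),
    "monday": (future_index.dayofweek == 0).astype(int),
}, index=future_index)
forecast = fit.get_forecast(steps=24, exog=future_exog)
print(forecast.predicted_mean.head())
print(forecast.conf_int(alpha=0.05).head())
```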
Procedia PDF Downloads 131
4112 Synaesthetic Metaphors in Persian: a Cognitive Corpus Based and Comparative Perspective
Authors: A. Afrashi
Abstract:
Introduction: Synaesthesia is a term denoting the perception or description of the perception of one sense modality in terms of another. In literature, synaesthesia refers to a technique adopted by writers to present ideas, characters or places in such a manner that they appeal to more than one sense like hearing, seeing, smell etc. at a given time. In everyday language too we find many examples of synaesthesia. We commonly hear phrases like ‘loud colors’, ‘frozen silence’ and ‘warm colors’, ‘bitter cold’ etc. Empirical cognitive studies have proved that synaesthetic representations both in literature and everyday languages are constrained ie. they do not map randomly among sensory domains. From the beginning of the 20th century Synaesthesia has been a research domain both in literature and structural linguistics. However the exploration of cognitive mechanisms motivating synaesthesia, have made it an important topic in 21st century cognitive linguistics and literary studies. Synaesthetic metaphors are linguistic representations of those mental mechanisms, the study of which reveals invaluable facts about perception, cognition and conceptualization. According to the main tenets of cognitive approach to language and literature, unified and similar cognitive mechanisms are active both in everyday language and literature, and synaesthesia is one of those cognitive mechanisms. Main objective of the present research is to answer the following questions: What types of sense transfers are accessible in Persian synaesthetic metaphors. How are these types of sense transfers cognitively explained. What are the results of cross-linguistic comparative study of synaestetic metaphors based on the existing observations? Methodology: The present research employs a cognitive - corpus based method, and the theoretical framework adopted to analyze linguistic synaesthesia is the contemporary theory of metaphor, where conceptual metaphor is the result of systemic mappings across cognitive domains. Persian Language Data- base (PLDB) in the Institute for Humanities and Cultural Studies which consists mainly of Persian modern prose, is searched for synaesthetic metaphors. Then for each metaphorical structure, the source and target domains are determined. Then sense transfers are identified and the types of synaesthetic metaphors recognized. Findings: Persian synaesthetic metaphors conform to the hierarchical distribution principle, according to which transfers tend to go from touch to taste to smell to sound and to sight, not vice versa. In other words mapping from more accessible or basic concepts onto less accessible or less basic ones seems more natural. Furthermore the most frequent target domain in Persian synaesthetic metaphors is sound. Certain characteristics of Persian synaesthetic metaphors are comparable with existing related researches carried on English, French, Hungarian and Chinese synaesthetic metaphors. Conclusion: Cognitive corpus based approaches to linguistic synaesthesia, are applicable to stylistics and literary criticism and this recent research domain is an efficient approach to study cross linguistic variations to find out which of the five senses is dominant cross linguistically and cross culturally as the target domain in metaphorical mappings , and so forth receiving dominance in conceptualizations.Keywords: cognitive semantics, conceptual metaphor, synaesthesia, corpus based approach
Procedia PDF Downloads 562
4111 Secure Optical Communication System Using Quantum Cryptography
Authors: Ehab AbdulRazzaq Hussein
Abstract:
Quantum cryptography (QC) is an emerging technology for secure key distribution with single-photon transmissions. In contrast to classical cryptographic schemes, the security of QC schemes is guaranteed by the fundamental laws of nature. Their security stems from the impossibility to distinguish non-orthogonal quantum states with certainty. A potential eavesdropper introduces errors in the transmissions, which can later be discovered by the legitimate participants of the communication. In this paper, the modeling approach is proposed for QC protocol BB84 using polarization coding. The single-photon system is assumed to be used in the designed models. Thus, Eve cannot use beam-splitting strategy to eavesdrop on the quantum channel transmission. The only eavesdropping strategy possible to Eve is the intercept/resend strategy. After quantum transmission of the QC protocol, the quantum bit error rate (QBER) is estimated and compared with a threshold value. If it is above this value the procedure must be stopped and performed later again.Keywords: security, key distribution, cryptography, quantum protocols, Quantum Cryptography (QC), Quantum Key Distribution (QKD).
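The intercept/resend attack discussed above has a well-known signature: after basis sifting, Eve's measurements introduce a quantum bit error rate of about 25%, far above the threshold at which the protocol is aborted. The following sketch simulates an idealized, lossless polarization-coded BB84 exchange with a full intercept/resend eavesdropper; the photon count and random seed are arbitrary, and channel noise and losses are ignored.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 20000                      # number of transmitted single photons

# Alice chooses random bits and random bases (0 = rectilinear, 1 = diagonal)
alice_bits = rng.integers(0, 2, N)
alice_bases = rng.integers(0, 2, N)

def measure(bits, prep_bases, meas_bases):
    """Measure polarisation-encoded qubits: the bit is preserved when the
    preparation and measurement bases match, otherwise the outcome is 50/50."""
    random_outcome = rng.integers(0, 2, bits.size)
    return np.where(prep_bases == meas_bases, bits, random_outcome)

# Eve intercepts every photon, measures in a random basis and resends
eve_bases = rng.integers(0, 2, N)
eve_bits = measure(alice_bits, alice_bases, eve_bases)

# Bob measures what Eve resent, in his own random bases
bob_bases = rng.integers(0, 2, N)
bob_bits = measure(eve_bits, eve_bases, bob_bases)

# Sifting: keep only positions where Alice's and Bob's bases agree
sift = alice_bases == bob_bases
qber = np.mean(alice_bits[sift] != bob_bits[sift])
print(f"sifted key length: {sift.sum()}, QBER = {qber:.3f}")   # ~0.25 expected
```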
Procedia PDF Downloads 406
4110 Probing Mechanical Mechanism of Three-Hinge Formation on a Growing Brain: A Numerical and Experimental Study
Authors: Mir Jalil Razavi, Tianming Liu, Xianqiao Wang
Abstract:
Cortical folding, characterized by convex gyri and concave sulci, has an intrinsic relationship to the brain’s functional organization. Understanding the mechanism of the brain’s convoluted patterns can provide useful clues into normal and pathological brain function. During the development, the cerebral cortex experiences a noticeable expansion in volume and surface area accompanied by tremendous tissue folding which may be attributed to many possible factors. Despite decades of endeavors, the fundamental mechanism and key regulators of this crucial process remain incompletely understood. Therefore, to taking even a small role in unraveling of brain folding mystery, we present a mechanical model to find mechanism of 3-hinges formation in a growing brain that it has not been addressed before. A 3-hinge is defined as a gyral region where three gyral crests (hinge-lines) join. The reasons that how and why brain prefers to develop 3-hinges have not been answered very well. Therefore, we offer a theoretical and computational explanation to mechanism of 3-hinges formation in a growing brain and validate it by experimental observations. In theoretical approach, the dynamic behavior of brain tissue is examined and described with the aid of a large strain and nonlinear constitutive model. Derived constitute model is used in the computational model to define material behavior. Since the theoretical approach cannot predict the evolution of cortical complex convolution after instability, non-linear finite element models are employed to study the 3-hinges formation and secondary morphological folds of the developing brain. Three-dimensional (3D) finite element analyses on a multi-layer soft tissue model which mimics a small piece of the brain are performed to investigate the fundamental mechanism of consistent hinge formation in the cortical folding. Results show that after certain amount growth of cortex, mechanical model starts to be unstable and then by formation of creases enters to a new configuration with lower strain energy. By further growth of the model, formed shallow creases start to form convoluted patterns and then develop 3-hinge patterns. Simulation results related to 3-hinges in models show good agreement with experimental observations from macaque, chimpanzee and human brain images. These results have great potential to reveal fundamental principles of brain architecture and to produce a unified theoretical framework that convincingly explains the intrinsic relationship between cortical folding and 3-hinges formation. This achieved fundamental understanding of the intrinsic relationship between cortical folding and 3-hinges formation would potentially shed new insights into the diagnosis of many brain disorders such as schizophrenia, autism, lissencephaly and polymicrogyria.Keywords: brain, cortical folding, finite element, three hinge
Procedia PDF Downloads 236
4109 A Comparison of Neural Network and DOE-Regression Analysis for Predicting Resource Consumption of Manufacturing Processes
Authors: Frank Kuebler, Rolf Steinhilper
Abstract:
Artificial neural networks (ANN) as well as Design of Experiments (DOE) based regression analysis (RA) are mainly used for modeling of complex systems. Both methodologies are commonly applied in process and quality control of manufacturing processes. Due to the fact that resource efficiency has become a critical concern for manufacturing companies, these models needs to be extended to predict resource-consumption of manufacturing processes. This paper describes an approach to use neural networks as well as DOE based regression analysis for predicting resource consumption of manufacturing processes and gives a comparison of the achievable results based on an industrial case study of a turning process.Keywords: artificial neural network, design of experiments, regression analysis, resource efficiency, manufacturing process
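A small sketch of the comparison described above is given below: a DOE-style second-order polynomial regression and a one-hidden-layer neural network are cross-validated on the same data. The synthetic turning-process data, factor ranges and network size are assumptions for illustration; the industrial case-study data of the paper are not reproduced.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler, PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for a turning process: cutting speed, feed and depth of cut
# as factors, energy consumption as the response (purely illustrative data)
rng = np.random.default_rng(1)
X = rng.uniform([100, 0.05, 0.5], [300, 0.30, 3.0], size=(120, 3))
y = (0.02 * X[:, 0] + 40 * X[:, 1] + 3 * X[:, 2]
     + 0.15 * X[:, 0] * X[:, 1] * X[:, 2] + rng.normal(0, 0.5, 120))

# DOE-style model: second-order polynomial (main effects + interactions) regression
doe_model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                          LinearRegression())
# ANN model: one-hidden-layer perceptron on standardised inputs
ann_model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                                       random_state=1))

for name, model in [("DOE regression", doe_model), ("ANN", ann_model)]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean CV R^2 = {r2.mean():.3f}")
```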
Procedia PDF Downloads 524
4108 Solutions of Fractional Reaction-Diffusion Equations Used to Model the Growth and Spreading of Biological Species
Authors: Kamel Al-Khaled
Abstract:
Reaction-diffusion equations are commonly used in population biology to model the spread of biological species. In this paper, we propose a fractional reaction-diffusion equation, where the classical second derivative diffusion term is replaced by a fractional derivative of order less than two. Based on the symbolic computation system Mathematica, Adomian decomposition method, developed for fractional differential equations, is directly extended to derive explicit and numerical solutions of space fractional reaction-diffusion equations. The fractional derivative is described in the Caputo sense. Finally, the recent appearance of fractional reaction-diffusion equations as models in some fields such as cell biology, chemistry, physics, and finance, makes it necessary to apply the results reported here to some numerical examples.Keywords: fractional partial differential equations, reaction-diffusion equations, adomian decomposition, biological species
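For reference, a generic space-fractional reaction-diffusion equation consistent with the description above, together with the Caputo definition of the spatial derivative, is written out below; the particular reaction term f and the Fisher-type example are illustrative choices, not necessarily the exact models treated in the paper.

```latex
% Space-fractional reaction-diffusion equation (generic form):
\[
  \frac{\partial u(x,t)}{\partial t}
    = D\,\frac{\partial^{\alpha} u(x,t)}{\partial x^{\alpha}} + f\bigl(u(x,t)\bigr),
  \qquad 1 < \alpha \le 2 ,
\]
% with the Caputo fractional derivative of order alpha (1 < alpha < 2):
\[
  \frac{\partial^{\alpha} u(x,t)}{\partial x^{\alpha}}
    = \frac{1}{\Gamma(2-\alpha)}
      \int_{0}^{x} (x-s)^{1-\alpha}\,
      \frac{\partial^{2} u(s,t)}{\partial s^{2}}\, \mathrm{d}s ,
\]
% which reduces to the classical diffusion term when alpha -> 2. Fisher's
% choice f(u) = r u (1 - u) is a typical reaction term when modelling the
% spread of a biological species.
```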
Procedia PDF Downloads 375
4107 Automated 3D Segmentation System for Detecting Tumor and Its Heterogeneity in Patients with High Grade Ovarian Epithelial Cancer
Authors: Dimitrios Binas, Marianna Konidari, Charis Bourgioti, Lia Angela Moulopoulou, Theodore Economopoulos, George Matsopoulos
Abstract:
High grade ovarian epithelial cancer (OEC) is fatal gynecological cancer and the poor prognosis of this entity is closely related to considerable intratumoral genetic heterogeneity. By examining imaging data, it is possible to assess the heterogeneity of tumorous tissue. This study proposes a methodology for aligning, segmenting and finally visualizing information from various magnetic resonance imaging series in order to construct 3D models of heterogeneity maps from the same tumor in OEC patients. The proposed system may be used as an adjunct digital tool by health professionals for personalized medicine, as it allows for an easy visual assessment of the heterogeneity of the examined tumor.Keywords: image segmentation, ovarian epithelial cancer, quantitative characteristics, image registration, tumor visualization
Procedia PDF Downloads 213
4106 Comparative Correlation Investigation of Polynuclear Aromatic Hydrocarbons (PAHs) in Soils of Different Land Uses: Sources Evaluation Perspective
Authors: O. Onoriode Emoyan, E. Eyitemi Akporhonor, Charles Otobrise
Abstract:
Polycyclic Aromatic Hydrocarbons (PAHs) are formed mainly as a result of incomplete combustion of organic materials during industrial, domestic activities or natural occurrence. Their toxicity and contamination of terrestrial and aquatic ecosystem have been established. Though with limited validity index, previous research has focused on PAHs isomer pair ratios of variable physicochemical properties in source identification. The objective of this investigation was to determine the empirical validity of Pearson correlation coefficient (PCC) and cluster analysis (CA) in PAHs source identification along soil samples of different land uses. Therefore, 16 PAHs grouped as endocrine disruption substances (EDSs) were determined in 10 sample stations in top and sub soils seasonally. PAHs was determined the use of Varian 300 gas chromatograph interfaced with flame ionization detector. Instruments and reagents used are of standard and chromatographic grades respectively. PCC and CA results showed that the classification of PAHs along kinetically and thermodyanamically-favoured and those derived directly from plants product through biologically mediated processes used in source signature is about the predominance PAHs are likely to be. Therefore the observed PAHs in the studied stations have trace quantities of the vast majority of the sixteen un-substituted PAHs which may ultimately inhabit the actual source signature authentication. Type and extent of bacterial metabolism, transformation products/substrates, and environmental factors such as: salinity, pH, oxygen concentration, nutrients, light intensity, temperature, co-substrates and environmental medium are hereby recommended as factors to be considered when evaluating possible sources of PAHs.Keywords: comparative correlation, kinetically and thermodynamically-favored PAHs, pearson correlation coefficient, cluster analysis, sources evaluation
Procedia PDF Downloads 419
4105 An Artificial Intelligence Framework to Forecast Air Quality
Authors: Richard Ren
Abstract:
Air pollution is a serious danger to international well-being and economies - it will kill an estimated 7 million people every year, costing world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (i.e., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate the inaccuracies, weaknesses, and biases of any one individual model. Over time, the framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California, using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall model accuracy of 92%. The combined model predicts more accurately than any of the individual models and reliably forecasts season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy. This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework that leverages multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and governments to implement effective pollution control measures.
Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms
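As a hedged sketch of the ensemble idea described above (not the study's actual code), the three base learners, their averaged prediction, and mean-decrease-in-accuracy feature importance could be prototyped with scikit-learn; the CSV name, column names, and model settings are assumptions, and features are assumed to be numerically encoded:

import pandas as pd
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.inspection import permutation_importance
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical feature table: season, weekday/weekend flag, forecast weather,
# and lagged pollutant readings; "aqi_class" is an assumed categorical target.
data = pd.read_csv("la_air_quality.csv")
X = data.drop(columns=["aqi_class"])
y = data["aqi_class"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Three base learners, averaged by soft voting to play the role of the
# combined model described in the abstract.
ensemble = VotingClassifier(
    estimators=[
        ("logreg", make_pipeline(StandardScaler(),
                                 LogisticRegression(max_iter=1000))),
        ("forest", RandomForestClassifier(n_estimators=300, random_state=0)),
        ("mlp", make_pipeline(StandardScaler(),
                              MLPClassifier(hidden_layer_sizes=(64, 32),
                                            max_iter=500, random_state=0))),
    ],
    voting="soft",
)
ensemble.fit(X_train, y_train)
print("Combined accuracy:", ensemble.score(X_test, y_test))

# Mean decrease in accuracy under feature permutation, used here as the
# predictor-importance measure mentioned in the abstract.
imp = permutation_importance(ensemble, X_test, y_test, n_repeats=10,
                             random_state=0, scoring="accuracy")
for name, score in sorted(zip(X.columns, imp.importances_mean),
                          key=lambda p: -p[1]):
    print(f"{name}: {score:.3f}")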
Procedia PDF Downloads 127