Search results for: integrated models of reading comprehension
2465 Deep Learning-Based Automated Structure Deterioration Detection for Building Structures: A Technological Advancement for Ensuring Structural Integrity
Authors: Kavita Bodke
Abstract:
Structural health monitoring (SHM) is a growing field, necessitating distinct methodologies to address its expanding scope effectively. In this study, we developed an automatic structural damage identification approach that covers three distinct threats to a building's structural integrity: fractures within the structure, dampness within the structure, and corrosion inside the structure. The study employs image classification techniques to discern between intact and impaired structures in structural image data. The aim is automatic damage detection that yields the probability of each damage class being present in an image; based on these probabilities, we can determine which class is more likely, or which type of damage is more severe, than the others. Photographs captured by a mobile camera serve as the input to the image classification system. Image classification was employed to perform both multi-class and multi-label classification, with the objective of categorizing structural data according to the presence of cracks, moisture, and corrosion. For multi-class image classification, the study employed three distinct methods: Random Forest, Multilayer Perceptron, and CNN. For multi-label image classification, the models employed were ResNet, Xception, and Inception.
Keywords: SHM, CNN, deep learning, multi-class classification, multi-label classification
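As an illustration of the multi-label set-up described above, the sketch below shows how per-image probabilities for crack, dampness and corrosion can be produced with a sigmoid output layer; the Keras Xception backbone, input size and training settings are assumptions for illustration rather than the authors' exact configuration.

```python
# Hedged sketch: multi-label damage classification (crack / dampness / corrosion).
# Assumes TensorFlow/Keras and 299x299 RGB photos; hyperparameters are illustrative.
import tensorflow as tf

NUM_CLASSES = 3  # crack, dampness, corrosion

backbone = tf.keras.applications.Xception(
    include_top=False, weights=None, input_shape=(299, 299, 3))
x = tf.keras.layers.GlobalAveragePooling2D()(backbone.output)
# Sigmoid (not softmax) so each damage class gets its own independent probability.
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="sigmoid")(x)
model = tf.keras.Model(backbone.input, outputs)

model.compile(optimizer="adam",
              loss="binary_crossentropy",   # one binary decision per label
              metrics=[tf.keras.metrics.AUC(multi_label=True)])

# model.fit(train_images, train_labels, ...)  # labels are multi-hot vectors, e.g. [1, 0, 1]
# probs = model.predict(photo_batch)          # per-image probability for each damage class
```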
Procedia PDF Downloads 36
2464 Determinants of Customer Value in Online Retail Platforms
Authors: Mikko Hänninen
Abstract:
This paper explores the effect online retail platforms have on customer behavior and retail patronage through an inductive multi-case study. Existing research on retail platforms and ecosystems generally focus on competition between platform members and most papers maintain a managerial perspective with customers seen mainly as merely one stakeholder of the value-exchange relationship. It is proposed that retail platforms change the nature of customer relationships compared to traditional brick-and-mortar or e-commerce retailers. With online retail platforms such as Alibaba, Amazon and Rakuten gaining increasing traction with their platform based business models, the purpose of this paper is to define retail platforms and look at how leading retail platforms are able to create value for their customers, in order to foster meaningful customer’ relationships. An analysis is conducted on the major global retail platforms with a focus specifically on understanding the tools in place for creating customer value in order to show how retail platforms create and maintain customer relationships for fostering customer loyalty. The results describe the opportunities and challenges retailers face when competing against platform based businesses and outline the advantages as well as disadvantages that platforms bring to individual consumers. Based on the inductive case research approach, five theoretical propositions on consumer behavior in online retail platforms are developed that also form the basis of further research with this research making both a practical as well as theoretical contribution to platform research streams.Keywords: retail, platform, ecosystem, e-commerce, loyalty
Procedia PDF Downloads 283
2463 Long Run Estimates of Population, Consumption and Economic Development of India: An ARDL Bounds Testing Approach of Cointegration
Authors: Sanjay Kumar, Arumugam Sankaran, Arjun K., Mousumi Das
Abstract:
Domestic consumption and population growth have a positive impact on economic growth and development, as observed in the Harrod-Domar and endogenous growth models. The paper contradicts the Solow growth model, which argues that population growth has a detrimental impact on per capita and steady-state growth; unlike the Solow model, the paper observes that per capita income growth never falls to zero and remains positive. Hence, our goal is to investigate the relationship among population, domestic consumption and economic growth in India. For this estimation, annual data from 1980-2016 have been collected from the World Development Indicators and the Reserve Bank of India. To capture the long-run as well as short-run dynamics among the variables, we employ the ARDL bounds testing approach to cointegration, followed by the modified Wald causality test to determine the direction of causality. The cointegration and ARDL estimates reveal a long-run positive and statistically significant relationship among the variables under study. At the same time, the causality test shows that a causal relationship exists among the variables. This calls for policies with a long-run perspective that strengthen the capabilities and entitlements of people and stabilize domestic demand, so as to serve long-run and short-run growth and the stability of the economy.
Keywords: cointegration, consumption, economic development, population growth
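The bounds-testing step can be sketched as follows: estimate an unrestricted error-correction regression and F-test the joint significance of the lagged level terms against the Pesaran-Shin-Smith critical bounds. This is a minimal sketch assuming a lag order of one and illustrative variable and file names, not the authors' exact specification.

```python
# Hedged sketch of the ARDL bounds-testing idea: regress the first difference of
# GDP on lagged levels and lagged differences, then F-test the lagged level terms.
# The file name, variable names and lag order of one are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("india_1980_2016.csv")   # hypothetical file: gdp, consumption, population
data = pd.DataFrame({
    "d_gdp":  df["gdp"].diff(),
    "l_gdp":  df["gdp"].shift(1),
    "l_cons": df["consumption"].shift(1),
    "l_pop":  df["population"].shift(1),
    "d_cons": df["consumption"].diff(),
    "d_pop":  df["population"].diff(),
}).dropna()

uecm = smf.ols("d_gdp ~ l_gdp + l_cons + l_pop + d_cons + d_pop", data=data).fit()
# Bounds test: joint F-statistic on the lagged level terms, to be compared with the
# Pesaran-Shin-Smith lower and upper critical bounds (not computed here).
print(uecm.f_test("l_gdp = 0, l_cons = 0, l_pop = 0"))
```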
Procedia PDF Downloads 159
2462 Radio Frequency Heating of Iron-Filled Carbon Nanotubes for Cancer Treatment
Authors: L. Szymanski, S. Wiak, Z. Kolacinski, G. Raniszewski, L. Pietrzak, Z. Staniszewska
Abstract:
There exist more than one hundred different types of cancer, and therefore no particular treatment is offered to people struggling with this disease. The character of treatment proposed to a patient will depend on a variety of factors such as type of the cancer diagnosed, advancement of the disease, its location in the body, as well as personal preferences of a patient. None of the commonly known methods of cancer-fighting is recognised as a perfect cure, however great advances in this field have been made over last few decades. Once a patient is diagnosed with cancer, he is in need of medical care and professional treatment for upcoming months, and in most cases even for years. Among the principal modes of treatment offered by medical centres, one can find radiotherapy, chemotherapy, and surgery. All of them can be applied separately or in combination, and the relative contribution of each is usually determined by medical specialist in agreement with a patient. In addition to the conventional treatment option, every day more complementary and alternative therapies are integrated into mainstream care. There is one promising cancer modality - hyperthermia therapy which is based on exposing body tissues to high temperatures. This treatment is still being investigated and is not widely available in hospitals and oncological centres. There are two kinds of hyperthermia therapies with direct and indirect heating. The first is not commonly used due to low efficiency and invasiveness, while the second is deeply investigated and a variety of methods have been developed, including ultrasounds, infrared sauna, induction heating and magnetic hyperthermia. The aim of this work was to examine possibilities of heating magnetic nanoparticles under the influence of electromagnetic field for cancer treatment. For this purpose, multiwalled carbon nanotubes used as nanocarriers for iron particles were investigated for its heating properties. The samples were subjected to an alternating electromagnetic field with frequency range between 110-619 kHz. Moreover, samples with various concentrations of carbon nanotubes were examined. The lowest frequency of 110 kHz and sample containing 10 wt% of carbon nanotubes occurred to influence the most effective heating process. Description of hyperthermia therapy aiming at enhancing currently available cancer treatment was also presented in this paper. Most widely applied conventional cancer modalities such as radiation or chemotherapy were also described. Methods for overcoming the most common obstacles in conventional cancer modalities, such as invasiveness and lack of selectivity, has been presented in magnetic hyperthermia characteristics, which explained the increasing interest of the treatment.Keywords: hyperthermia, carbon nanotubes, cancer colon cells, ligands
Procedia PDF Downloads 266
2461 Institutional Legitimacy and Professional Boundary: Western Medicine-Trained Doctors' Attitudes and Behaviors toward Traditional Chinese Medicine
Authors: Xiaoli Tian
Abstract:
The recent growing interest in and use of complementary and alternative medicine is a global phenomenon. In many regions, traditional Chinese medicine (TCM), an important type of complementary and alternative medicine, has been formally integrated into the healthcare system. Consequently, today’s doctors face increasing requests and questions from patients regarding TCM. However, studies of TCM focus either on patients’ approaches to TCM and Western medicine (WM) or on the politics involved in the institutionalization of TCM. To our knowledge, sociological studies on doctors’ attitudes toward TCM are rare. This paper compares the receptivity of WM-trained Chinese doctors to TCM in Hong Kong and mainland China, in order to evaluate the interplay between professional training and dominant medical paradigms, on the one hand, and institutional legitimacy and government and client pressures to accept TCM, on the other. Based on survey and in-depth interviews with Western-medicine doctors in Hong Kong and mainland China, this research finds that: there is major difference between Western-medicine doctors’ attitude toward traditional Chinese medicine (TCM) in Hong Kong and mainland China. Doctors in Hong Kong are still suspicious toward TCM, no matter if they have exposure to TCM or not. Even some doctors who have much knowledge about TCM, such as got a diploma or certificate in TCM or tried TCM themselves, are still suspicious. This is because they hold up to the ideal of 'evidence-based medicine' and emphasize the kind of evidence based on randomized controlled trial (RCT). To Western medicine doctors in Hong Kong, this is the most reliable type of evidence for any medical practice, but it is lacking in TCM. This is the major reason why they do not trust TCM and would not refer patients to TCM in clinical practices. In contrast, western medicine doctors in mainland China also know about randomized controlled trial (RCT) and believe that’s the most reliable evidence, but they tend to think experience-based evidence is also reliable. On this basis, they think TCM also has clinical effectiveness. Research findings reveal that legitimacy based on institutional arrangements is a relevant factor, but how doctors understand their professional boundaries also play an important role. Doctors in Hong Kong are more serious about a strict professional boundary between Western medicine and TCM because they benefited from it, such as a very prestigious status and high income. Doctors in mainland China tend to be flexible about professional boundaries because they never benefited from a well-defined strict professional boundary. This is related to a long history of the lack of professionalism in China but is also aggravated by the increasing state support of TCM.Keywords: evidence-based decision-making, institutional legitimacy, professional behavior, traditional Chinese medicine
Procedia PDF Downloads 184
2460 Asynchronous Low Duty Cycle Media Access Control Protocol for Body Area Wireless Sensor Networks
Authors: Yasin Ghasemi-Zadeh, Yousef Kavian
Abstract:
Wireless body area network (WBAN) technology has gained considerable popularity over the last decade, with a wide range of medical applications. This paper presents an asynchronous media access control (MAC) protocol based on the B-MAC protocol, with an application to medical scenarios. WBAN applications suffer from serious problems with energy, latency, link reliability (quality of the wireless link) and throughput, mainly due to the size of sensor networks and the characteristics of the human body. To overcome these problems and improve link reliability, we concentrate on a MAC layer that supports mobility models for medical applications. In the presented protocol, preamble frames are divided into sub-frames according to a threshold level. The main reason for creating shorter preambles is link reliability: because of factors such as the body's water content, body signals are attenuated on some frequency bands, causing fading and shadowing; by increasing link reliability, these effects are reduced. For the mobility model, we use the MoBAN model and extend it to additional areas. The presented asynchronous MAC protocol is modeled in the OMNeT++ simulator. The results demonstrate increased link reliability compared to the B-MAC protocol, with a packet reception ratio (PRR) of 92%, and coverage of more mobility areas than the MoBAN model.
Keywords: wireless body area networks (WBANs), MAC protocol, link reliability, mobility, biomedical
Procedia PDF Downloads 369
2459 Financial Regulation and the Twin Peaks Model in a Developing and Developed Country Contexts: An Institutional Theory Perspective
Authors: Pumela Msweli, Dexter L. Ryneveldt
Abstract:
This paper seeks to shed light on institutional logics and institutionalization processes that influence the successful implementation of financial sector regulations. We use the neo-institutional theory lens to interrogate how the newly promulgated Financial Sector Regulations Act (FSRA) provides for the institutionalisation of the Twin Peaks Model. With the enactment of FSRA, previous financial regulatory institutions were dismantled, and new financial regulators established. In point, the Financial Services Conduct Authority (FSCA) replaced the Financial Services Board (FSB), and accordingly, the Prudential Authority (PA) was established. FSRA is layered with complexities that make it mandatory to co-exist, cooperate, and collaborate with other institutions to fulfill FSRA’s overall financial stability objective. We use content analysis of the financial regulations that established the Twin Peaks Models (TPM) in South Africa and in the Netherlands, to map out the three-stage institutionalization processes: (1) habitualisation, (2) objectification and (3) sedimentation. This allowed for a comparison of how South Africa, as a developing country and Netherlands as a developed country, have institutionalized the Twin Peak model. We provide valuable insights into how differences in the institutional and societal logics of the developing and developed contexts shape the institutionalization of financial regulations.Keywords: financial industry, financial regulation, financial stability, institutionalisation, habitualization, objectification, sedimentation, twin peaks model
Procedia PDF Downloads 159
2458 Building a Model for Information Literacy Education in School Settings
Authors: Tibor Koltay
Abstract:
Among varied new literacies, information literacy is not only the best-known one but displays numerous models and frameworks. Nonetheless, there is still a lack of its complex theoretical model that could be applied to information literacy education in public (K12) education, which often makes use of constructivist approaches. This paper aims to present the main features of such a model. To develop a complex model, the literature and practice of phenomenographic and sociocultural theories, as well as discourse analytical approaches to information literacy, have been reviewed. Besides these constructivist and expressive based educational approaches, the new model is intended to include the innovation of coupling them with a cognitive model that takes developing informational and operational knowledge into account. The convergences between different literacies (information literacy, media literacy, media and information literacy, and data literacy) were taken into account, as well. The model will also make use of a three-country survey that examined secondary school teachers’ attitudes to information literacy. The results of this survey show that only a part of the respondents feel properly prepared to teach information literacy courses, and think that they can teach information literacy skills by themselves, while they see a librarian as an expert in educating information literacy. The use of the resulting model is not restricted to enhancing theory. It is meant to raise the level of awareness about information literacy and related literacies, and the next phase of the model’s development will be a pilot study that verifies the usefulness of the methodology for practical information literacy education in selected Hungarian secondary schools.Keywords: communication, data literacy, discourse analysis, information literacy education, media and information literacy media literacy, phenomenography, public education, sociocultural theory
Procedia PDF Downloads 147
2457 Socioeconomic Status and Mortality in Older People with Angina: A Population-Based Cohort Study in China
Authors: Weiju Zhou, Alex Hopkins, Ruoling Chen
Abstract:
Background: The income gap between richer and poorer in China has widened over the past 40 years, and the number of deaths among people with angina has been rising. It is unclear whether socioeconomic status (SES) is associated with increased mortality in older people with angina. Methods: Data from a cohort study of 2,380 participants aged ≥ 65 years, randomly recruited from urban communities in five provinces of China, were examined. Cohort members were interviewed to record socio-demographic and risk factors and document doctor-diagnosed angina at baseline, and were followed up for 3-10 years, including monitoring of vital status. Multivariate Cox regression models were employed to examine all-cause mortality in relation to low SES. Results: The cohort follow-up identified 373 deaths, including 41 deaths among 208 angina patients. Compared to participants without angina (n=2,172), patients with angina had increased mortality (multivariate adjusted hazard ratio (HR) 1.41, 95% CI 1.01-1.97). Within angina patients, the risk of mortality increased with low satisfactory income (2.51, 1.08-5.85) and having financial problems (4.00, 1.07-15.00), but not significantly with levels of education and occupation. In non-angina participants, none of these four SES indicators were associated with mortality. There was a significant interaction effect between angina and low satisfactory income on mortality. Conclusions: In China, having low income and financial problems increases mortality in older people with angina. Strategies to improve economic circumstances in older people could help reduce inequality in angina survival.
Keywords: angina, mortality, older people, socio-economic status
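A minimal sketch of the multivariate Cox regression step, assuming the lifelines library and illustrative column names rather than the study's actual variables:

```python
# Hedged sketch of a multivariate Cox proportional hazards model for all-cause
# mortality; the file and column names and the covariate list are assumptions.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("angina_cohort.csv")   # hypothetical file
# Expected columns: follow-up time in years, death indicator (1 = died),
# plus the SES indicators and confounders of interest.
covariates = ["low_income", "financial_problem", "education", "occupation",
              "age", "sex", "smoking"]

cph = CoxPHFitter()
cph.fit(df[["time_years", "died"] + covariates],
        duration_col="time_years", event_col="died")
cph.print_summary()   # hazard ratios (HR) with 95% confidence intervals
```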
Procedia PDF Downloads 118
2456 Learner's Difficulties Acquiring English: The Case of Native Speakers of Rio de La Plata Spanish Towards Justifying the Need for Corpora
Authors: Maria Zinnia Bardas Hoffmann
Abstract:
Contrastive Analysis (CA) is the systematic comparison of two languages. It stems from the notion that errors are caused by interference of the L1 system in the acquisition process of an L2. CA is a useful tool for understanding the nature of learning and acquisition. This method also promises a path to understanding the underlying cognitive processes, even where other factors such as intrinsic motivation and teaching strategies have been found to best explain students' problems in acquisition. The study of CA is justified not only by the need for a deeper understanding of the nature of SLA, but also as an invaluable source of clues, at a cognitive level, about the general processes involved in rule formation and abstract thought. It is relevant for cross-disciplinary studies and the fields of Computational Thought, Natural Language Processing, Applied Linguistics, Cognitive Linguistics and Math Theory. That said, this paper also addresses CA's own set of constraints and limitations. Finally, this paper (a) aims to identify some of the difficulties students may encounter in their learning process due to the nature of their specific L1 variety, Rio de la Plata Spanish (RPS), and (b) discusses the necessity for specific models to approach CA.
Keywords: second language acquisition, applied linguistics, contrastive analysis, applied contrastive analysis, English language department, meta-linguistic rules, cross-linguistics studies, computational thought, natural language processing
Procedia PDF Downloads 150
2455 Evaluation of Ceres Wheat and Rice Model for Climatic Conditions in Haryana, India
Authors: Mamta Rana, K. K. Singh, Nisha Kumari
Abstract:
Simulation models, with their interacting soil-weather-plant-atmosphere system, are important tools for assessing crops under changing climate conditions. The CERES-Wheat and CERES-Rice models (DSSAT v4.6) were calibrated and evaluated for Haryana, India, one of the major wheat- and rice-producing states. The simulation runs were made under irrigated conditions with three N-P-K fertilizer application doses to estimate crop yield and other growth parameters along with the phenological development of the crop. The genetic coefficients were derived by iteratively manipulating the relevant coefficients that characterize the phenological processes of the wheat and rice crops until the best match was obtained between the simulated and observed anthesis, physiological maturity and final grain yield. The model was validated by plotting the simulated LAI against remote sensing derived LAI; the LAI product from remote sensing offers the advantage of spatial, timely and accurate crop assessment. For validating the yield and yield components, the error percentage between the observed and simulated data was calculated. The analysis shows that the model can be used to simulate crop yield and yield components for wheat and rice cultivars under different management practices. During validation, the error percentage was less than 10%, indicating the utility of the calibrated model for climate risk assessment in the selected region.
Keywords: simulation model, CERES-wheat and rice model, crop yield, genetic coefficient
Procedia PDF Downloads 305
2454 Sedimentary, Diagenesis and Evaluation of High Quality Reservoir of Coarse Clastic Rocks in Nearshore Deep Waters in the Dongying Sag; Bohai Bay Basin
Authors: Kouassi Louis Kra
Abstract:
The nearshore deep-water gravity flow deposits on the northern steep slope of the Dongying depression, Bohai Bay basin, have been acknowledged as important reservoirs in this rift lacustrine basin. These deep strata, termed coarse clastic sediments and deposited at the foot of the slope, record complex depositional processes and a wide range of diagenetic events, which makes high-quality reservoir prediction difficult. Based on an integrated study of seismic interpretation, sedimentary analysis, petrography, core samples, wireline logging data, 3D seismic and lithological data, the reservoir formation mechanism was deciphered. The Geoframe software was used to analyze 3D seismic data to interpret the stratigraphy and build a sequence stratigraphic framework. Thin section identification and point counts were performed to assess the reservoir characteristics. Schlumberger's PetroMod 1D software was utilized for the simulation of burial history. CL and SEM analyses were performed to reveal diagenetic sequences. Backscattered electron (BSE) images were recorded to define the textural relationships between diagenetic phases. The results showed that the nearshore steep slope deposits mainly consist of conglomerate, gravel sandstone, pebbly sandstone and fine sandstone interbedded with mudstone. The reservoir is characterized by low porosity and ultra-low permeability. The diagenetic reactions include compaction, precipitation of calcite, dolomite, kaolinite and quartz cement, and dissolution of feldspars and rock fragments. The main types of reservoir space are primary intergranular pores, residual intergranular pores, intergranular dissolved pores, and fractures. There are three obvious anomalous high-porosity zones in the reservoir. Overpressure and early hydrocarbon filling are the main reasons for abnormal secondary pore development. Sedimentary facies control the formation of the high-quality reservoir, and oil and gas filling preserves secondary pores from late carbonate cementation.
Keywords: Bohai Bay, Dongying Sag, deep strata, formation mechanism, high-quality reservoir
Procedia PDF Downloads 135
2453 Assessment of Adsorption Properties of Neem Leaves Wastes for the Removal of Congo Red and Methyl Orange
Authors: Muhammad B. Ibrahim, Muhammad S. Sulaiman, Sadiq Sani
Abstract:
Neem leaves were studied as plant-waste-derived adsorbents for the detoxification of Congo Red (CR) and Methyl Orange (MO) from aqueous solutions using the batch adsorption technique. The objectives were to determine the effects of the basic adsorption parameters, namely agitation time, adsorbent dosage, adsorbent particle size, adsorbate loading concentration and initial pH, on the adsorption process, and to characterize the adsorbents by determining their physicochemical properties, the functional groups responsible for the adsorption process using Fourier Transform Infrared (FTIR) spectroscopy, and the surface morphology using scanning electron microscopy (SEM) coupled with energy dispersive X-ray spectroscopy (EDS). The adsorption behaviour of the materials was tested against the Langmuir, Freundlich and related isotherm models. Percent adsorption increased with increasing agitation time (5-240 minutes), adsorbent dosage (100-500 mg) and initial concentration (100-300 mg/L), and with decreasing particle size (≥75 μm to ≤300 μm) of the adsorbents. Both processes are dye pH-dependent, with percent adsorption increasing or decreasing in the acidic (2-6) or alkaline (8-12) range over the studied pH range (2-12). From the experimental data, Langmuir's separation factor (RL) suggests unfavourable adsorption for all processes, and the Freundlich constant (nF) indicates an unfavourable process for CR and MO adsorption, while the mean free energy of adsorption
Keywords: adsorption, congo red, methyl orange, neem leaves
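As one illustration of the isotherm analysis mentioned above, the sketch below fits the Langmuir model to batch equilibrium data and computes the separation factor RL = 1/(1 + KL·C0); the concentration values are invented for illustration only.

```python
# Hedged sketch of fitting the Langmuir isotherm and computing the separation factor;
# all numeric data here are illustrative, not the study's measurements.
import numpy as np
from scipy.optimize import curve_fit

Ce = np.array([12.0, 35.0, 61.0, 90.0, 123.0])   # equilibrium concentration, mg/L (assumed)
qe = np.array([8.1, 15.6, 19.8, 22.0, 23.1])     # amount adsorbed, mg/g (assumed)

def langmuir(Ce, q_max, K_L):
    return q_max * K_L * Ce / (1.0 + K_L * Ce)

(q_max, K_L), _ = curve_fit(langmuir, Ce, qe, p0=(25.0, 0.05))

C0 = 100.0                                       # initial dye concentration, mg/L (assumed)
R_L = 1.0 / (1.0 + K_L * C0)
print(f"q_max = {q_max:.2f} mg/g, K_L = {K_L:.4f} L/mg, R_L = {R_L:.3f}")
# 0 < R_L < 1 indicates favourable adsorption; R_L > 1 indicates unfavourable adsorption.
```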
Procedia PDF Downloads 365
2452 Analysis of Ozone Episodes in the Forest and Vegetation Areas with Using HYSPLIT Model: A Case Study of the North-West Side of Biga Peninsula, Turkey
Authors: Deniz Sari, Selahattin İncecik, Nesimi Ozkurt
Abstract:
Surface ozone, named one of the most critical pollutants of the 21st century, threatens human health, forests and vegetation. In rural areas specifically, surface ozone has significant effects on agricultural production and trees. In this study, in order to understand surface ozone levels in rural areas, we focus on the north-western side of the Biga Peninsula, which is covered by mountainous and forested terrain. Ozone concentrations were measured for the first time with passive sampling at 10 sites and two online monitoring stations in this rural area between 2013 and 2015. Using the daytime hourly O3 measurements during light hours (08:00-20:00) exceeding the threshold of 40 ppb over three months (May, June and July) for agricultural crops, and over six months (April to September) for forest trees, the AOT40 (Accumulated hourly O3 concentrations Over a Threshold of 40 ppb) cumulative index was calculated. AOT40 is defined by EU Directive 2008/50/EC to evaluate whether ozone pollution poses a risk to vegetation, and is calculated from hourly ozone concentrations recorded by monitoring systems. In the present study, we performed trajectory analysis with the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model to trace the long-range transport sources contributing to the high ozone levels in the region. The ozone episodes observed between 2013 and 2015 were analysed using the HYSPLIT model developed by NOAA-ARL. In addition, cluster analysis was used to identify homogeneous groups of air mass transport patterns through air trajectory clustering, grouping similar trajectories in terms of air mass movement. Backward trajectories produced for three years by the HYSPLIT model were assigned to different clusters according to their speed and direction using a k-means clustering algorithm. According to the cluster analysis results, northerly flows into the study area cause high ozone levels in the region. The results show that the ozone values in the study area are above the critical levels for forest and vegetation set by EU Directive 2008/50/EC.
Keywords: AOT40, Biga Peninsula, HYSPLIT, surface ozone
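A minimal sketch of the AOT40 calculation as defined above (sum of the hourly exceedances over 40 ppb during 08:00-20:00, accumulated over the crop or forest season); the file and column names are assumptions.

```python
# Hedged sketch of the AOT40 index from hourly ozone measurements.
import pandas as pd

o3 = pd.read_csv("hourly_ozone.csv", parse_dates=["datetime"])  # hypothetical file
o3 = o3.set_index("datetime")

def aot40(series, months):
    daylight = series.between_time("08:00", "20:00")            # light hours only
    in_season = daylight[daylight.index.month.isin(months)]
    exceedance = (in_season - 40.0).clip(lower=0.0)              # ppb above the 40 ppb threshold
    return exceedance.sum()                                      # accumulated ppb·h over the season

aot40_crops  = aot40(o3["o3_ppb"], months=[5, 6, 7])             # May-July, agricultural crops
aot40_forest = aot40(o3["o3_ppb"], months=[4, 5, 6, 7, 8, 9])    # April-September, forest trees
print(aot40_crops, aot40_forest)
```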
Procedia PDF Downloads 255
2451 Laser-Dicing Modeling: Implementation of a High Accuracy Tool for Laser-Grooving and Cutting Application
Authors: Jeff Moussodji, Dominique Drouin
Abstract:
The highly complex technology requirements of today’s integrated circuits (ICs), lead to the increased use of several materials types such as metal structures, brittle and porous low-k materials which are used in both front end of line (FEOL) and back end of line (BEOL) process for wafer manufacturing. In order to singulate chip from wafer, a critical laser-grooving process, prior to blade dicing, is used to remove these layers of materials out of the dicing street. The combination of laser-grooving and blade dicing allows to reduce the potential risk of induced mechanical defects such micro-cracks, chipping, on the wafer top surface where circuitry is located. It seems, therefore, essential to have a fundamental understanding of the physics involving laser-dicing in order to maximize control of these critical process and reduce their undesirable effects on process efficiency, quality, and reliability. In this paper, the study was based on the convergence of two approaches, numerical and experimental studies which allowed us to investigate the interaction of a nanosecond pulsed laser and BEOL wafer materials. To evaluate this interaction, several laser grooved samples were compared with finite element modeling, in which three different aspects; phase change, thermo-mechanical and optic sensitive parameters were considered. The mathematical model makes it possible to highlight a groove profile (depth, width, etc.) of a single pulse or multi-pulses on BEOL wafer material. Moreover, the heat affected zone, and thermo-mechanical stress can be also predicted as a function of laser operating parameters (power, frequency, spot size, defocus, speed, etc.). After modeling validation and calibration, a satisfying correlation between experiment and modeling, results have been observed in terms of groove depth, width and heat affected zone. The study proposed in this work is a first step toward implementing a quick assessment tool for design and debug of multiple laser grooving conditions with limited experiments on hardware in industrial application. More correlations and validation tests are in progress and will be included in the full paper.Keywords: laser-dicing, nano-second pulsed laser, wafer multi-stack, multiphysics modeling
Procedia PDF Downloads 209
2450 Multivariate Rainfall Disaggregation Using MuDRain Model: Malaysia Experience
Authors: Ibrahim Suliman Hanaish
Abstract:
Disaggregation of daily rainfall using stochastic models formulated on a multivariate approach (MuDRain) is discussed in this paper. Seven rain gauge stations in Peninsular Malaysia are considered, at distances from the reference station ranging from 4 km to 160 km. The hourly rainfall data used cover the period from 1973 to 2008, and the months of July and November are taken as examples of dry and wet periods. Cross-correlation among the rain gauges is derived either from the available hourly rainfall at the neighboring stations or, where this is unavailable, from daily rainfall. This paper discusses the applicability of the MuDRain model for disaggregating daily rainfall into hourly rainfall for both sources of cross-correlation. The goodness of fit of the model was judged by the reproduction of fitting statistics such as the means, variances, coefficients of skewness, lag-zero cross-correlation coefficients and lag-one autocorrelation coefficients. It is found that the correlation coefficients extracted from daily rainfall are slightly higher than the correlations based on the available hourly rainfall, especially for neighboring stations no more than 28 km away. The results also showed that the MuDRain model did not reproduce the statistics very well, and the actual hyetographs were poorly reproduced compared to the synthetic hourly rainfall data. Meanwhile, a good fit was found between the distribution functions of the historical and synthetic hourly rainfall. These discrepancies are unavoidable because of the low cross-correlation of hourly rainfall. The overall performance indicated that the MuDRain model would not be an appropriate choice for disaggregating daily rainfall.
Keywords: rainfall disaggregation, multivariate disaggregation rainfall model, correlation, stochastic model
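The fitting statistics used to judge the disaggregation can be computed as in the sketch below; the array names are illustrative, and the synthetic series is assumed to be aligned hour-by-hour with the historical one.

```python
# Hedged sketch of the goodness-of-fit statistics: means, variances, skewness,
# lag-one autocorrelation and lag-zero cross-correlation of historical vs synthetic series.
import numpy as np
from scipy.stats import skew

def lag1_autocorr(x):
    # correlation between the series and itself shifted by one hour
    return np.corrcoef(x[:-1], x[1:])[0, 1]

def compare(historical, synthetic):
    return {
        "mean":       (historical.mean(), synthetic.mean()),
        "variance":   (historical.var(),  synthetic.var()),
        "skewness":   (skew(historical),  skew(synthetic)),
        "lag1_auto":  (lag1_autocorr(historical), lag1_autocorr(synthetic)),
        # lag-zero cross-correlation between the two aligned series
        "lag0_cross": np.corrcoef(historical, synthetic)[0, 1],
    }

# hist_hourly and synth_hourly would be aligned numpy arrays of hourly rainfall depths:
# print(compare(hist_hourly, synth_hourly))
```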
Procedia PDF Downloads 516
2449 Probabilistic Slope Stability Analysis of Excavation Induced Landslides Using Hermite Polynomial Chaos
Authors: Schadrack Mwizerwa
Abstract:
The characterization and prediction of landslides are crucial for assessing geological hazards and mitigating risks to infrastructure and communities. To that end, this research develops a probabilistic framework for analyzing excavation-induced landslides. The study uses Hermite polynomial chaos, a representation of a non-stationary random process, to analyze the stability of a slope and characterize the failure probability of a real landslide induced by highway construction excavation. The correlation within the data is captured using Karhunen-Loève (KL) expansion theory, and the finite element method is used to analyze the slope's stability. The research contributes to the field of landslide characterization by employing advanced random field approaches, providing valuable insights into the complex nature of landslide behavior and the effectiveness of advanced probabilistic models for risk assessment and management. Data collected from the Baiyuzui landslide, induced by highway construction, are used as an illustrative example. The findings highlight the importance of considering the probabilistic nature of landslides and provide valuable insights into the complex behavior of such hazards.
Keywords: Hermite polynomial chaos, Karhunen-Loeve, slope stability, probabilistic analysis
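A minimal sketch of the Karhunen-Loève step, assuming a one-dimensional random field with an exponential covariance whose parameters are purely illustrative; realisations generated this way would then be fed into the finite element stability model.

```python
# Hedged sketch: truncated Karhunen-Loeve expansion of a 1-D random field
# (e.g. a soil-strength parameter along a slope profile). Parameters are assumptions.
import numpy as np

n_pts, corr_len, sigma = 200, 10.0, 0.3
z = np.linspace(0.0, 100.0, n_pts)                       # coordinate along the profile, m
cov = sigma**2 * np.exp(-np.abs(z[:, None] - z[None, :]) / corr_len)

eigval, eigvec = np.linalg.eigh(cov)                     # eigen-decomposition of the covariance
order = np.argsort(eigval)[::-1]                         # sort eigenpairs, largest first
eigval, eigvec = eigval[order], eigvec[:, order]

n_terms = 20                                             # truncation of the KL series
xi = np.random.standard_normal(n_terms)                  # standard normal KL coefficients
field = eigvec[:, :n_terms] @ (np.sqrt(eigval[:n_terms]) * xi)
# 'field' is one realisation of the zero-mean random field; repeating this many times
# and running the finite element slope model on each gives the failure probability.
```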
Procedia PDF Downloads 76
2448 Passive and Active Spatial Pendulum Tuned Mass Damper with Two Tuning Frequencies
Authors: W. T. A. Mohammed, M. Eltaeb, R. Kashani
Abstract:
The first bending modes of tall asymmetric structures in the two lateral X- and Y-directions have two different natural frequencies. To add tuned damping to these bending modes, one needs to either (a) use two pendulum tuned mass dampers (PTMDs), each with one tuning frequency and each targeting one of the bending modes, or (b) use one PTMD with two tuning frequencies (one in each lateral direction). Option (a), being more massive, requiring more space, and being more expensive, is less attractive than option (b). Considering that the tuning frequency of a pendulum depends mainly on the pendulum length, one way of realizing option (b) is by constraining the swinging length of the pendulum in one direction but not in the other; such a PTMD is dubbed a passive Bi-PTMD. Alternatively, option (b) can be realized by actively setting the tuning frequencies of the PTMD in the two directions. In this work, accurate physical models of the passive Bi-PTMD and the active PTMD are developed and incorporated into the numerical model of a tall asymmetric structure. The model of PTMDs plus structure is used for (a) synthesizing such PTMDs for particular applications and (b) evaluating their damping effectiveness in mitigating the dynamic lateral responses of their target asymmetric structures when perturbed by wind load in the X- and Y-directions. Depending on how elaborate the control scheme is, the active PTMD can be made to yield the same damping effectiveness as either a passive Bi-PTMD of the same size or a passive Bi-PTMD twice as massive as the active PTMD.
Keywords: active tuned mass damper, high-rise building, multi-frequency tuning, vibration control
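Because the tuning frequency of a pendulum depends mainly on its length, f = (1/2π)√(g/L), the two swing lengths of a Bi-PTMD follow directly from the two target frequencies; the frequencies used below are assumed for illustration, not the studied building's values.

```python
# Hedged sketch: swing lengths needed to tune a PTMD to two lateral frequencies,
# using f = (1/(2*pi)) * sqrt(g/L)  =>  L = g / (2*pi*f)**2.
import math

g = 9.81                      # m/s^2
f_x, f_y = 0.20, 0.25         # first bending-mode frequencies in X and Y, Hz (assumed)

L_x = g / (2.0 * math.pi * f_x) ** 2
L_y = g / (2.0 * math.pi * f_y) ** 2
print(f"Swing length for X: {L_x:.1f} m, for Y: {L_y:.1f} m")
# A passive Bi-PTMD constrains the pendulum to the shorter length in one direction
# and lets it swing at the longer length in the other.
```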
Procedia PDF Downloads 105
2447 Integrating Virtual Reality and Building Information Model-Based Quantity Takeoffs for Supporting Construction Management
Authors: Chin-Yu Lin, Kun-Chi Wang, Shih-Hsu Wang, Wei-Chih Wang
Abstract:
A construction superintendent needs to know not only the amount of quantities of cost items or materials completed to develop a daily report or calculate the daily progress (earned value) in each day, but also the amount of quantities of materials (e.g., reinforced steel and concrete) to be ordered (or moved into the jobsite) for performing the in-progress or ready-to-start construction activities (e.g., erection of reinforced steel and concrete pouring). These daily construction management tasks require great effort in extracting accurate quantities in a short time (usually must be completed right before getting off work every day). As a result, most superintendents can only provide these quantity data based on either what they see on the site (high inaccuracy) or the extraction of quantities from two-dimension (2D) construction drawings (high time consumption). Hence, the current practice of providing the amount of quantity data completed in each day needs improvement in terms of more accuracy and efficiency. Recently, a three-dimension (3D)-based building information model (BIM) technique has been widely applied to support construction quantity takeoffs (QTO) process. The capability of virtual reality (VR) allows to view a building from the first person's viewpoint. Thus, this study proposes an innovative system by integrating VR (using 'Unity') and BIM (using 'Revit') to extract quantities to support the above daily construction management tasks. The use of VR allows a system user to be present in a virtual building to more objectively assess the construction progress in the office. This VR- and BIM-based system is also facilitated by an integrated database (consisting of the information and data associated with the BIM model, QTO, and costs). In each day, a superintendent can work through a BIM-based virtual building to quickly identify (via a developed VR shooting function) the building components (or objects) that are in-progress or finished in the jobsite. And he then specifies a percentage (e.g., 20%, 50% or 100%) of completion of each identified building object based on his observation on the jobsite. Next, the system will generate the completed quantities that day by multiplying the specified percentage by the full quantities of the cost items (or materials) associated with the identified object. A building construction project located in northern Taiwan is used as a case study to test the benefits (i.e., accuracy and efficiency) of the proposed system in quantity extraction for supporting the development of daily reports and the orders of construction materials.Keywords: building information model, construction management, quantity takeoffs, virtual reality
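The daily earned-quantity calculation described above reduces to multiplying each identified object's completion percentage by its full takeoff quantities; the object names and numbers below are illustrative only.

```python
# Hedged sketch of the daily earned-quantity step: the superintendent's completion
# ratio per BIM object, observed in the VR walk-through, scales that object's full
# QTO quantities from the integrated database. All data here are invented examples.
full_quantities = {            # from the BIM-based quantity takeoff (per object)
    "column_C12": {"rebar_kg": 850.0, "concrete_m3": 4.2},
    "slab_2F":    {"rebar_kg": 5200.0, "concrete_m3": 38.5},
}
observed_today = {"column_C12": 1.00, "slab_2F": 0.50}   # completion ratios (0-1)

daily_report = {
    obj: {item: qty * pct for item, qty in full_quantities[obj].items()}
    for obj, pct in observed_today.items()
}
print(daily_report)   # completed rebar and concrete quantities for today's report
```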
Procedia PDF Downloads 132
2446 Microscopic Simulation of Toll Plaza Safety and Operations
Authors: Bekir O. Bartin, Kaan Ozbay, Sandeep Mudigonda, Hong Yang
Abstract:
The use of microscopic traffic simulation in evaluating the operational and safety conditions at toll plazas is demonstrated. Two toll plazas in New Jersey are selected as case studies and were developed and validated in Paramics traffic simulation software. In order to simulate drivers’ lane selection behavior in Paramics, a utility-based lane selection approach is implemented in Paramics Application Programming Interface (API). For each vehicle approaching the toll plaza, a utility value is assigned to each toll lane by taking into account the factors that are likely to impact drivers’ lane selection behavior, such as approach lane, exit lane and queue lengths. The results demonstrate that similar operational conditions, such as lane-by-lane toll plaza traffic volume can be attained using this approach. In addition, assessment of safety at toll plazas is conducted via a surrogate safety measure. In particular, the crash index (CI), an improved surrogate measure of time-to-collision (TTC), which reflects the severity of a crash is used in the simulation analyses. The results indicate that the spatial and temporal frequency of observed crashes can be simulated using the proposed methodology. Further analyses can be conducted to evaluate and compare various different operational decisions and safety measures using microscopic simulation models.Keywords: microscopic simulation, toll plaza, surrogate safety, application programming interface
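A minimal sketch of the utility-based lane choice implemented through the API: each open lane receives a weighted score from approach-lane alignment, exit-lane alignment and queue length, and the vehicle takes the highest-scoring lane. The weights and data structures are assumptions, not the calibrated values of the study.

```python
# Hedged sketch of utility-based toll-lane selection for an approaching vehicle.
def lane_utility(lane, vehicle, w_approach=1.0, w_exit=1.0, w_queue=0.5):
    approach_term = -abs(lane["index"] - vehicle["approach_lane"])   # prefer aligned lanes
    exit_term = -abs(lane["index"] - vehicle["target_exit_lane"])    # prefer easy exit
    queue_term = -lane["queue_length"]                               # avoid long queues
    return w_approach * approach_term + w_exit * exit_term + w_queue * queue_term

def choose_toll_lane(lanes, vehicle):
    return max(lanes, key=lambda lane: lane_utility(lane, vehicle))

# Example: four toll lanes with current queue lengths 3, 1, 0 and 4 vehicles.
lanes = [{"index": i, "queue_length": q} for i, q in enumerate([3, 1, 0, 4])]
vehicle = {"approach_lane": 1, "target_exit_lane": 2}
print(choose_toll_lane(lanes, vehicle)["index"])
```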
Procedia PDF Downloads 183
2445 Investigating the Relationship Between Corporate Governance and Financial Performance Considering the Moderating Role of Opinion and Internal Control Weakness
Authors: Fatemeh Norouzi
Abstract:
Financial performance has become one of the important issues in accounting and auditing; companies and their managers pay close attention to it and, for this reason, to the variables that influence it. One of the factors that can affect financial performance is corporate governance, which is examined in this research, although auditing-related issues can also moderate this relationship. This research was therefore conducted to investigate the relationship between corporate governance and financial performance, with regard to the moderating role of opinion and internal control weakness. The research is applied in purpose and, in terms of method, follows a descriptive ex-post-facto design in which stock market data are analyzed. Data were collected from stock exchange records extracted from the website of the Iraqi Stock Exchange; the statistical population consists of all companies listed on the Iraqi Stock Exchange. The statistical sample covers 2014 to 2021 and includes 34 companies. Four different models were specified for the eight research hypotheses, and the analysis was carried out using EXCEL and STATA15 software. Collinearity tests, integration tests, determination of fixed effects and correlation matrix results were used. The results showed that the first four hypotheses were rejected and the second four hypotheses were confirmed.
Keywords: size of the board of directors, duality of the CEO, financial performance, internal control weakness
Procedia PDF Downloads 88
2444 Multi-Level Air Quality Classification in China Using Information Gain and Support Vector Machine
Authors: Bingchun Liu, Pei-Chann Chang, Natasha Huang, Dun Li
Abstract:
Machine Learning and Data Mining are two important tools for extracting useful information and knowledge from large datasets. In machine learning, classification is a widely used technique for predicting qualitative variables and is generally preferred over regression from an operational point of view. Due to the enormous increase in air pollution in various countries, especially China, Air Quality Classification has become one of the most important topics in air quality research and modelling. This study introduces a hybrid classification model based on information theory and the Support Vector Machine (SVM), using the air quality data of four cities in China, namely Beijing, Guangzhou, Shanghai and Tianjin, from Jan 1, 2014 to April 30, 2016. China's Ministry of Environmental Protection classifies daily air quality into 6 levels, namely Serious Pollution, Severe Pollution, Moderate Pollution, Light Pollution, Good and Excellent, based on the respective Air Quality Index (AQI) values. Using information theory, information gain (IG) is calculated and feature selection is performed for both categorical and continuous numeric features. The SVM machine learning algorithm is then applied to the selected features with cross-validation. The final evaluation reveals that the IG and SVM hybrid model performs better than SVM (alone), Artificial Neural Network (ANN) and K-Nearest Neighbours (KNN) models in terms of both accuracy and complexity.
Keywords: machine learning, air quality classification, air quality index, information gain, support vector machine, cross-validation
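A minimal sketch of the hybrid pipeline, with information gain approximated by scikit-learn's mutual information score; the file name, number of retained features and SVM settings are illustrative assumptions rather than the study's configuration.

```python
# Hedged sketch: information-gain-style feature selection followed by an SVM
# with cross-validation, mirroring the IG + SVM hybrid described above.
import pandas as pd
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

data = pd.read_csv("china_air_quality.csv")          # hypothetical file
X = data.drop(columns=["aqi_level"])                 # pollutant and weather features (numeric)
y = data["aqi_level"]                                # 6-level air quality class

model = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(mutual_info_classif, k=8)),   # keep the k most informative features
    ("svm", SVC(kernel="rbf", C=10.0, gamma="scale")),
])
scores = cross_val_score(model, X, y, cv=5)
print(scores.mean())
```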
Procedia PDF Downloads 235
2443 Modern Scotland Yard: Improving Surveillance Policies Using Adversarial Agent-Based Modelling and Reinforcement Learning
Authors: Olaf Visker, Arnout De Vries, Lambert Schomaker
Abstract:
Predictive policing refers to the use of analytical techniques to identify potential criminal activity, and it has been widely implemented by various police departments. As a relatively new area of research, there are, to the authors' knowledge, no absolutely tried-and-true methods, and existing approaches still exhibit a variety of potential problems. One of those problems is closely related to the lack of understanding of how acting on these predictions influences crime itself. The goal of law enforcement is ultimately crime reduction; as such, a policy needs to be established that best facilitates this goal. This research aims to find such a policy by using adversarial agent-based modeling in combination with modern reinforcement learning techniques. We present baseline models for both law enforcement and criminal agents and compare their performance to their respective reinforcement learning models. The experiments show that our smart law enforcement model is capable of reducing crime by making more deliberate choices regarding the locations of potential criminal activity. Furthermore, the smart criminal model exhibits behavior consistent with popular crime theories and outperforms the baseline model in terms of crimes committed and time to capture. It does, however, still suffer from the difficulties of capturing long-term rewards and learning how to handle multiple opposing goals.
Keywords: adversarial, agent based modelling, predictive policing, reinforcement learning
Procedia PDF Downloads 148
2442 A Model of Empowerment Evaluation of Knowledge Management in Private Banks Using Fuzzy Inference System
Authors: Nazanin Pilevari, Kamyar Mahmoodi
Abstract:
The purpose of this research is to provide a model based on a fuzzy inference system for evaluating the empowerment of knowledge management. The first prototype was developed from a study of the literature. In the next step, experts were provided with this model, and after consensus-based revision using the views of experts and the fuzzy Delphi technique, the components and indices of the research model were finalized. Culture, structure, IT and leadership were considered as dimensions of empowerment. Then, in order to collect and extract data for the fuzzy inference system based on knowledge and experience, the experts were interviewed. The values obtained from the designed fuzzy inference system made review and assessment of the organization's knowledge management empowerment possible. After the design and validation of the systems to measure the indices (knowledge management empowerment and the inputs to the fuzzy inference system) at AYANDEH Bank, a questionnaire was used. For this bank, the system output indicates that the status of knowledge management empowerment, culture, organizational structure and leadership is at a moderate level, while information technology empowerment is relatively high. Based on these results, the status of knowledge management empowerment in AYANDEH Bank was moderate. Finally, some suggestions for improving the current situation of the banks were provided. Relative to previous studies, the use of powerful fuzzy inference system tools for assessing knowledge management and knowledge management empowerment, and the application of such an assessment in the field of banking, are the innovations of this research.
Keywords: knowledge management, knowledge management empowerment, fuzzy inference system, fuzzy Delphi
Procedia PDF Downloads 359
2441 Urban Forest Innovation Lab as a Driver to Boost Forest Bioeconomy
Authors: Carmen Avilés Palacios, Camilo Muñoz Arenas, Joaquín García Alfonso, Jesús González Arteaga, Alberto Alcalde Calonge
Abstract:
There is a need for review of the consumption and production models of industrialized states in accordance with the Paris Agreement and the Sustainable Development Goals (1) (OECD, 2016). This constitutes the basis of the bioeconomy (2) that is focused on striking a balance between economic development, social development and environmental protection. Bioeconomy promotes the adequate use and consumption of renewable natural resources (3) and involves developing new products and services adapted to the principles of circular economy: more sustainable (reusable, biodegradable, renewable and recyclable) and with a lower carbon footprint (4). In this context, Urban Forest Innovation Lab (UFIL) grows, an Urban Laboratory for experimentation focused on promoting entrepreneurship in forest bioeconomy (www.uiacuenca.es). UFIL generates local wellness taking sustainable advantage of an endogenous asset, forests. UFIL boosts forest bioeconomy opening its doors of knowledge to pioneers in this field, giving the opportunity to be an active part of a change in the way of understanding the exploitation of natural resources, discovering business, learning strategies and techniques and incubating business ideas So far now, 100 entrepreneurs are incubating their nearly 30 new business plans. UFIL has promoted an ecosystem to connect the rural-urban world that promotes sustainable rural development around the forest.Keywords: bioeconomy, forestry, innovation, entrepreneurship
Procedia PDF Downloads 116
2440 Breast Cancer Survivability Prediction via Classifier Ensemble
Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia
Abstract:
This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups and then tries to find the most important features among the four groups that maximize the weighted average F-score of a given classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER selected by the feature selection module. On top of them, another classifier is used to give the final decision based on the output decisions and confidence scores of each of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all systems published to date when evaluated against the exact same SEER data (period of 1973-2002), giving an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period of 1973-2014), the overall weighted average F-score jumps to 92.4% on the held-out unseen test set.
Keywords: classifier ensemble, breast cancer survivability, data mining, SEER
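The two-level ensemble can be sketched with a stacking classifier in which each base learner sees one feature group and a Naive Bayes meta-learner combines their outputs; here the paper's Bayesian network is approximated by Gaussian Naive Bayes, and the feature groups and column names are assumptions, not the actual SEER selection.

```python
# Hedged sketch of a stacked ensemble over feature groups (a stand-in for the
# paper's decision tree / Bayesian network / Naive Bayes combination).
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import StackingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

group_a = ["age_at_diagnosis", "tumor_size"]          # assumed SEER feature groups
group_b = ["nodes_positive", "nodes_examined"]
group_c = ["grade", "stage", "er_status"]

def on(columns, clf):
    # restrict a classifier to one feature group (expects a pandas DataFrame as X)
    return make_pipeline(ColumnTransformer([("keep", "passthrough", columns)]), clf)

ensemble = StackingClassifier(
    estimators=[
        ("tree", on(group_a, DecisionTreeClassifier(max_depth=6))),
        ("nb_1", on(group_b, GaussianNB())),
        ("nb_2", on(group_c, GaussianNB())),
    ],
    final_estimator=GaussianNB(),   # meta-classifier combining the base outputs
)
# ensemble.fit(X_train, y_train); predictions = ensemble.predict(X_test)
```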
Procedia PDF Downloads 328
2439 Assessment Power and Oscillation Damping Using the POD Controller and Proposed FOD Controller
Authors: Tohid Rahimi, Yahya Naderi, Babak Yousefi, Seyed Hossein Hoseini
Abstract:
Today's interconnected power systems are highly complex in nature, and one of the most important requirements during their operation is reliability and security. Power and frequency oscillation damping mechanisms improve reliability. Because of the slow response of the power system stabilizer (PSS) to major faults such as a three-phase short circuit, FACTS devices, which can control network conditions very quickly, are becoming popular. However, the capability of FACTS devices during a major fault can only be seen when nonlinear models of the FACTS devices and power system equipment are applied. To this end, a model of a multi-machine power system with a FACTS controller is developed in MATLAB/SIMULINK using the Sim Power System (SPS) blockset. Among FACTS devices, the static synchronous series compensator (SSSC) is an effective power flow controller because it can rapidly change its reactance characteristic from inductive to capacitive. The tuning of controller parameters can be performed using different methods; here, the capability of the Genetic Algorithm (GA) motivates its use in the controller parameter tuning process. In this paper, a POD controller is first used for power oscillation damping, but in this configuration the frequency oscillations are not properly damped. Therefore, an FOD controller tuned using the GA is applied, which damps out the frequency oscillations properly while power oscillation damping remains satisfactory.
Keywords: power oscillation damping (POD), frequency oscillation damping (FOD), Static synchronous series compensator (SSSC), Genetic Algorithm (GA)
Procedia PDF Downloads 476
2438 Investigation of the Historical Background of Monumental Mosques in Kocaeli, Turkey by IRT Techniques
Authors: Emre Kishalı, Neslihan TürkmenoğLu Bayraktar
Abstract:
Historical buildings may face various impacts throughout their life cycle. There have been environmental, structural, public works actions on old monuments influencing sustainability and maintenance issues. As a result, ancient monuments can have been undergone various changes in the context of restoration and repair. Currently, these buildings face integrated conditions including city planning macro solutions, old intervention methods, modifications in building envelope and artefacts in terms of conservation. Moreover, documentation of phases is an essential for assessing the historical building, yet it can result in highly complicated and interwoven issues. Herein, two monuments constructed in the 16th century are selected as case studies in Kocaeli, Turkey which are located in different micro climatic conditions and/or exposed to different interventions and which are important for the city as cultural property. Pertev Paşa Mosque (also known as Yenicuma Mosque) -constructed by Architect Sinan-; Gebze Çoban Mustafa Paşa Mosque -constructed in 1523 and known as the work of Architect Sinan but various names asserted as the architect of building according to resources. Active water infiltration and damages, recent material interventions, hidden niches, and foundation techniques of the mosque are investigated via Infrared Thermography under the project of 114K284, “Non-Destructive Test Applications, in the Context of Planned Conservation, through Historical Mosques of Kocaeli: Coban Mustafa Pasa Mosque, Fevziye Mosque and Pertev Pasa Mosque” funded by TUBITAK. It is aimed to reveal active deteriorations on building elements generated by unwanted effects of structural and climatic conditions, historical interventions, and modifications by monitoring the variation of surface temperature and humidity by IRT visualization method which is an important non- destructive process for investigation of monuments in the conservation field in the context of planned conservation. It is also concluded that in-situ monitoring process via IRT through different climatic conditions give substantial information on the behaviour of the envelope to the physical environmental conditions by observation of thermal performance, degradations. However, it is obvious that monitoring of historical buildings cannot be pursued by implementing a single non-destructive technique to have complete data of the structure.Keywords: IRT, non-destructive test, planned conservation, mosque
Procedia PDF Downloads 352
2437 State Capacity and the Adoption of Restrictive Asylum Policies in Developing Countries
Authors: Duncan K. Espenshade
Abstract:
Scholars have established expectations regarding how the political and economic interests of a country's people and elites can influence its migration policies. Most of the scholarship exploring the adoption of migration policies focuses on the developed world, focusing on the cultural, political, and economic influences that drive restrictive policies in developed countries. However, despite the scholarly focus on migration policies in developed countries, most internationally displaced people reside in developing countries. Furthermore, while the political and economic factors that influence migration policy in developed countries are likely at play in developing states, developing states also face unique hurdles to policy formation not present in developed states. Namely, this article explores how state capacity, or in this context, a state's de facto ability to restrict or absorb migration inflows, influences the adoption of migration policies in developing countries. Using Cox-Proportional hazard models and recently introduced data on asylum policies in developing countries, this research finds that having a greater ability to restrict migration flows is associated with a reduced likelihood of adopting liberal asylum policies. Future extensions of this project will explore the adoption of asylum policies as a two-stage process, in which the available decision set of political actors is first constrained by a state's restrictive and absorptive capacity in the first stage, with the political, economic, and cultural factors influencing the policy adopted in the second stage.Keywords: state capacity, international relations, foreign policy, migration
Procedia PDF Downloads 106
2436 Implications of Climate Change and World Uncertainty for Gender Inequality: Global Evidence
Authors: Kashif Nesar Rather, Mantu Kumar Mahalik
Abstract:
The discourse surrounding climate change has gained considerable traction, with a discernible emphasis on its nuanced and consequential impact on gender inequality. Concurrently, escalating global tensions are contributing to heightened uncertainty, potentially influencing gender disparities. Within this framework, this study empirically investigates the implications of climate change and world uncertainty for gender inequality in a balanced panel of 100 economies between 1995 and 2021. The estimated models also control for the effects of globalisation, economic growth, and education expenditure. The panel cointegration tests establish a significant long-run relationship between the variables of the study. Furthermore, the PMG-ARDL (Pooled Mean Group-Autoregressive Distributed Lag) estimation technique confirms that both climate change and world uncertainty perpetuate global gender inequalities. Additionally, the results establish that globalisation, economic growth, and education expenditure exert a mitigating influence on gender inequality, signifying their role in diminishing gender disparities. These findings are further confirmed by the FGLS (Feasible Generalized Least Squares) and DKSE (Driscoll-Kraay Standard Errors) regression methods. Potential policy implications for mitigating the detrimental gender ramifications stemming from climate change and rising world uncertainty are also discussed.
Keywords: gender inequality, world uncertainty, climate change, globalisation, ecological footprint
Procedia PDF Downloads 38