Search results for: demand fragility curve
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4299

3249 Application of Genetic Programming for Evolution of Glass-Forming Ability Parameter

Authors: Manwendra Kumar Tripathi, Subhas Ganguly

Abstract:

A few glass forming ability expressions in terms of characteristic temperatures have been proposed in the literature. Attempts have been made to correlate these expressions with the critical diameter of bulk metallic glass compositions. However, with the advent of new alloys, many exceptions have been noted and reported. In the present approach, a genetic programming based code is developed which evolves an expression in terms of the input variables, i.e., three characteristic temperatures, viz. the glass transition temperature (Tg), the onset crystallization temperature (Tx) and the offset temperature of melting (Tl), with maximum correlation with the critical diameter (Dmax). The evolved expression shows improved correlation with the critical diameter. In addition, the expression can be explained on the basis of the time-temperature transformation curve.
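
A minimal sketch of how such an expression could be evolved, assuming the gplearn library and a hypothetical table of (Tg, Tx, Tl, Dmax) values; this illustrates genetic-programming-based symbolic regression in general, not the authors' actual code:

```python
import numpy as np
from gplearn.genetic import SymbolicRegressor  # assumes gplearn is installed

# Hypothetical characteristic temperatures (K) and critical diameters (mm)
X = np.array([[625, 705, 993],
              [570, 650, 905],
              [683, 760, 1058],
              [602, 671, 940]])        # columns: Tg, Tx, Tl
y = np.array([12.0, 5.0, 25.0, 8.0])  # Dmax

# Evolve an expression in Tg, Tx, Tl whose fitness is its correlation with Dmax
gp = SymbolicRegressor(population_size=2000,
                       generations=30,
                       function_set=('add', 'sub', 'mul', 'div'),
                       metric='pearson',          # correlation-based fitness
                       parsimony_coefficient=0.01,
                       random_state=0)
gp.fit(X, y)
print(gp._program)  # best evolved expression, with X0=Tg, X1=Tx, X2=Tl
```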

Keywords: glass forming ability, genetic programming, bulk metallic glass, critical diameter

Procedia PDF Downloads 330
3248 Data about Loggerhead Sea Turtle (Caretta caretta) and Green Turtle (Chelonia mydas) in Vlora Bay, Albania

Authors: Enerit Sacdanaku, Idriz Haxhiu

Abstract:

This study was conducted in the area of Vlora Bay, Albania. Data about the sea turtles Caretta caretta and Chelonia mydas, belonging to two periods of time (1984–1991; 2008–2014), are given. All data gathered were analyzed using recent methodologies. For all turtles captured (as by-catch), the Curved Carapace Length (CCL) and Curved Carapace Width (CCW) were measured. These data were statistically analyzed; the mean was 67.11 cm for CCL and 57.57 cm for CCW of all individuals studied (n=13). All untagged individuals of marine turtles were tagged using metallic tags (Stockbrand’s titanium tags) with an Albanian address. Sex was determined, and it resulted that 45.4% of individuals were females, 27.3% males and 27.3% juveniles. All turtles were examined for the presence of epibionts. The area of Vlora Bay is used by marine turtles (Caretta caretta) as a migratory corridor to pass from the Mediterranean to the northern part of the Adriatic Sea.

Keywords: Caretta caretta, Chelonia mydas, CCL, CCW, tagging, Vlora Bay

Procedia PDF Downloads 174
3247 Thermodynamic Modelling of Liquid-Liquid Equilibria (LLE) in the Separation of p-Cresol from the Coal Tar by Solvent Extraction

Authors: D. S. Fardhyanti, Megawati, W. B. Sediawan

Abstract:

Coal tar is a liquid by-product of coal gasification and carbonization processes. This liquid oil mixture contains various kinds of useful compounds such as aromatic compounds and phenolic compounds. These compounds are widely used as raw materials for insecticides, dyes, medicines, perfumes, coloring matters, and many others. This research investigates the thermodynamic modelling of liquid-liquid equilibria (LLE) in the separation of p-cresol from coal tar by solvent extraction. The equilibria are modeled with the ternary Wohl, Van Laar, and Three-Suffix Margules models. The values of the parameters involved are obtained by curve-fitting to the experimental data. Based on the comparison between calculated and experimental data, it turns out that, among the three models studied, the Three-Suffix Margules model seems to be the best at predicting the LLE of p-cresol mixtures for these systems.
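
A minimal sketch of the parameter-fitting step, assuming hypothetical tie-line data and a placeholder two-parameter model function; it illustrates least-squares curve-fitting of model parameters to experimental LLE data, not the authors' actual thermodynamic code:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical experimental data: feed composition x and measured
# distribution coefficient K of p-cresol between the two liquid phases
x_exp = np.array([0.05, 0.10, 0.15, 0.20, 0.25])
K_exp = np.array([3.2, 2.9, 2.6, 2.3, 2.1])

def model_K(params, x):
    # Placeholder two-parameter model for the distribution coefficient;
    # in practice this would be the Wohl, Van Laar, or Margules expression.
    a, b = params
    return a * np.exp(-b * x)

def residuals(params):
    return model_K(params, x_exp) - K_exp

fit = least_squares(residuals, x0=[3.0, 1.0])
print("fitted parameters:", fit.x)
print("RMS deviation:", np.sqrt(np.mean(fit.fun ** 2)))
```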

Keywords: coal tar, phenol, Wohl, Van Laar, Three-Suffix Margules

Procedia PDF Downloads 253
3246 Modelling Railway Noise Over Large Areas, Assisted by GIS

Authors: Conrad Weber

Abstract:

The modelling of railway noise over large project areas can be very time consuming in terms of preparing the noise models and calculation time. An open-source GIS program has been utilised to assist with the modelling of operational noise levels for 675 km of railway corridor. A range of GIS algorithms was utilised to break up the noise model area into manageable calculation sizes. GIS was utilised to prepare and filter a range of noise modelling inputs, including building files, land uses and ground terrain. A spreadsheet was utilised to manage the accuracy of key input parameters, including train speeds, train types, curve corrections, bridge corrections and engine notch settings. GIS was utilised to present the final noise modelling results. This paper explains the noise modelling process and how the spreadsheet and GIS were utilised to model this massive project accurately and efficiently.
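
A minimal sketch of the tiling step, assuming the shapely library and a hypothetical corridor geometry; it shows one way a long corridor could be broken into manageable calculation tiles, not the authors' actual GIS workflow:

```python
from shapely.geometry import LineString, box

# Hypothetical railway corridor (coordinates in metres) buffered to the
# study width used for the noise calculations
corridor = LineString([(0, 0), (20000, 5000), (45000, 9000)]).buffer(600)

tile_size = 5000  # 5 km x 5 km calculation tiles
minx, miny, maxx, maxy = corridor.bounds

tiles = []
x = minx
while x < maxx:
    y = miny
    while y < maxy:
        tile = box(x, y, x + tile_size, y + tile_size)
        if tile.intersects(corridor):
            # keep only the part of the tile that overlaps the corridor
            tiles.append(tile.intersection(corridor))
        y += tile_size
    x += tile_size

print(f"{len(tiles)} calculation tiles cover the corridor")
```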

Keywords: noise, modeling, GIS, rail

Procedia PDF Downloads 115
3245 Complete Enumeration Approach for Calculation of Residual Entropy for Diluted Spin Ice

Authors: Yuriy A. Shevchenko, Konstantin V. Nefedev

Abstract:

We consider antiferromagnetic systems of Ising spins located at the sites of hexagonal, triangular and pyrochlore lattices. Such systems can be diluted to a certain concentration level by randomly replacing the magnetic spins with nonmagnetic ones. Quite recently, we calculated the density of states (DOS) of such systems by the Wang-Landau method. Based on the obtained data, we calculated the dependence of the residual entropy (the entropy at a temperature tending to zero) on the dilution concentration for quite large systems (more than 2000 spins). In the current study, we obtained the same data for small systems (fewer than 20 spins) by a complete enumeration of all possible magnetic configurations and compared the result with the result for large systems. The shape of the curve remains unchanged in both cases, but the specific values of the residual entropy differ because of the finite size effect.
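
A minimal sketch of the complete-enumeration step for a toy antiferromagnet, assuming a hypothetical bond list; the residual entropy per spin is taken as ln(g0)/N, where g0 is the ground-state degeneracy:

```python
import itertools
import numpy as np

# Hypothetical small cluster: two corner-sharing triangles (4 Ising spins),
# all bonds antiferromagnetic with J = 1
bonds = [(0, 1), (1, 2), (0, 2), (1, 3), (2, 3)]
N = 4

energies = []
for config in itertools.product([-1, 1], repeat=N):
    # Antiferromagnetic Ising energy: E = J * sum over bonds of s_i * s_j
    E = sum(config[i] * config[j] for i, j in bonds)
    energies.append(E)

energies = np.array(energies)
g0 = np.sum(energies == energies.min())   # ground-state degeneracy
S_residual = np.log(g0) / N               # residual entropy per spin (k_B = 1)
print(f"ground states: {g0}, residual entropy per spin: {S_residual:.3f}")
```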

Keywords: entropy, pyrochlore, spin ice, Wang-Landau algorithm

Procedia PDF Downloads 258
3244 Assessment of Water Quality Based on Physico-Chemical and Microbiological Parameters in Batllava Lake, Case Study Kosovo

Authors: Albana Kashtanjeva-Bytyçi, Idriz Vehapi, Rifat Morina, Osman Fetoshi

Abstract:

The purpose of this study is to determine the water quality in Batllava Lake, through which a part of the population of the Prishtina region is supplied with drinking water. Batllava Lake is an artificial lake built in the 1970s. The lake is located near the village of Batllava in the municipality of Podujeva, at coordinates 42°49′33″ N, 21°18′25″ E, and has an area of 3.07 km2. Water supply is from the river Brvenica-Batllavë. In order to take preventive measures and improve water quality, we conducted periodic/monthly monitoring of water quality in Lake Batllava through microbiological and physico-chemical indicators. The monitoring was carried out during the period December 2020 – December 2021. Samples were taken at three sampling sites (at the entrance of the lake, in the middle, and at the overflow) at two levels: the water surface and a depth of 30 cm. The microbiological parameters analyzed were: total coliforms, fecal coliforms, fecal streptococci, aerobic mesophilic bacteria and actinomycetes. The physico-chemical parameters included: dissolved oxygen, saturation with O2, water temperature, pH value, electrical conductivity, total soluble matter, total suspended matter, turbidity, chemical oxygen demand, biochemical oxygen demand, total organic carbon, nitrate, total hardness, calcium hardness, calcium, magnesium, ammonium ion, chloride, sulfates, fluorine, M-alkalinity, bicarbonates and heavy metals such as Fe, Pb, Mn, Cu and Cd. The results showed that most of the physico-chemical and microbiological parameters are within the limits allowed by the WHO, except during the rainiest season, when some parameters exceeded the limits.

Keywords: batllava lake, monitoring of water, physico-chemical, microbiological, heavy metals

Procedia PDF Downloads 101
3243 Innovative Practices That Have Significantly Scaled up Depot Medroxy Progesterone Acetate-SC Self-Inject Services

Authors: Oluwaseun Adeleke, Samuel O. Ikani, Fidelis Edet, Anthony Nwala, Mopelola Raji, Simeon Christian Chukwu

Abstract:

Background: The Delivering Innovations in Selfcare (DISC) project promotes universal access to quality selfcare services, beginning with the subcutaneous depot medroxyprogesterone acetate (DMPA-SC) contraceptive self-injection (SI) option. Self-injection offers women a highly effective and convenient option that saves them frequent trips to providers. Its increased use has the potential to improve the efficiency of an overstretched healthcare system by reducing provider workloads. State Social and Behavioral Change Communications (SBCC) Officers lead the project's demand creation and service delivery innovations, which have resulted in significant increases in SI uptake among women who opt for injectables.
Service delivery innovations: The implementation of the "Moment of Truth" (MoT) innovation helped providers overcome biases and address client fear and reluctance to self-inject. Bi-annual program audits and supportive mentoring visits helped providers retain their competence and motivation. Proper documentation, tracking, and replenishment of commodities were ensured through effective engagement with State Logistics Units. The project supported existing state monitoring and evaluation structures to effectively record and report DMPA-SC service utilization.
Demand creation innovations: SBCC Officers provide oversight, routinely evaluate performance, train, and provide feedback for the demand creation activities implemented by community mobilizers (CMs). The scope and intensity of the training given to CMs affect the outcome of their work. The project operates a demand creation model that uses a schedule to inform the conduct of interpersonal and group events. Health education sessions are specifically designed to counter misinformation, address questions and concerns, and educate the target audience in an informed-choice context. The project mapped facilities and their catchment areas and secured the buy-in of identified influencers and gatekeepers prior to entry. Each mobilization event began with pre-mobilization sensitization activities, particularly targeting male groups. Context-specific interventions were informed by the religious, traditional, and cultural peculiarities of target communities. Mobilizers also support clients to engage with and navigate online digital Family Planning (FP) portals such as the DiscoverYourPower website, Facebook page, digital companion (chatbot), interactive voice response (IVR), and radio and television (TV) messaging. This improves compliance and provides linkages to nearby facilities.
Results: The project recorded 136,950 self-injection visits, and the self-injection proportion rate increased from 13 percent before the implementation of interventions in 2021 to 62 percent currently. The project cost-effectively demonstrated catalytic impact by leveraging state and partner resources, institutional platforms, and geographic scope to sustainably scale up these strategies.
Conclusion: Using evidence-informed iterations of service delivery and demand creation models has been useful in significantly driving self-injection uptake. It will be useful to consider this implementation model during program design. Consideration should also be given to the systematic and strategic execution of strategies to optimize impact.

Keywords: family planning, contraception, DMPA-SC, self-care, self-injection, innovation, service delivery, demand creation

Procedia PDF Downloads 71
3242 Endocardial Ultrasound Segmentation Using the Level Set Method

Authors: Daoudi Abdelaziz, Mahmoudi Saïd, Chikh Mohamed Amine

Abstract:

This paper presents a fully automatic segmentation method for the left ventricle at the End Systolic (ES) and End Diastolic (ED) phases in ultrasound images by means of an implicit deformable model (level set) based on the Geodesic Active Contour model. A pre-processing Gaussian smoothing stage is applied to the image, which is essential for a good segmentation. Before the segmentation phase, we automatically locate the area of the left ventricle using a detection approach based on the Hough transform. The result obtained is then used to automate the initialization of the level set model. This initial curve (zero level set) deforms to search for the endocardial border in the image. Finally, a quantitative evaluation was performed on a data set composed of 15 subjects, with a comparison to ground truth (manual segmentation).
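
A minimal sketch of this pipeline using scikit-image, with a placeholder image and hypothetical parameters; function and argument names follow recent scikit-image releases and may differ in older versions, and the sketch illustrates the Gaussian smoothing, Hough-based localization, and geodesic-active-contour steps rather than the authors' exact implementation:

```python
import numpy as np
from skimage import filters, transform, segmentation, draw

# 'frame' stands in for a real 2D grayscale echocardiographic image
frame = np.random.rand(256, 256)

# 1) Gaussian smoothing to reduce speckle noise
smoothed = filters.gaussian(frame, sigma=2)

# 2) Rough localization of the LV cavity with a circular Hough transform
edges = filters.sobel(smoothed)
radii = np.arange(20, 60, 5)
hough = transform.hough_circle(edges > edges.mean(), radii)
_, cx, cy, r = [a[0] for a in transform.hough_circle_peaks(hough, radii, total_num_peaks=1)]

# 3) Initial level set: a disk centred on the Hough detection
init = np.zeros_like(frame, dtype=np.int8)
rr, cc = draw.disk((cy, cx), r, shape=frame.shape)
init[rr, cc] = 1

# 4) Geodesic active contour evolved on an edge-stopping image
gimage = segmentation.inverse_gaussian_gradient(smoothed)
lv_mask = segmentation.morphological_geodesic_active_contour(
    gimage, num_iter=200, init_level_set=init, smoothing=2, balloon=1)
```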

Keywords: level set method, Hough transform, Gaussian smoothing, left ventricle, ultrasound images

Procedia PDF Downloads 460
3241 Democratization, Market Liberalization and the Rise of Vested Interests and Its Impacts on Anti-Corruption Reform in Indonesia

Authors: Ahmad Khoirul Umam

Abstract:

This paper investigates the role of vested interests and their impact on the anti-corruption agenda in Indonesia following the collapse of the authoritarian regime in 1998. Pervasive and rampant corruption has been regarded as the main cause of the state economy’s fragility. Hence, anti-corruption measures were implemented through democratization and market liberalization, since the establishment of a consolidated democracy going hand in hand with a liberal market economy is believed to be an efficacious prescription for effective anti-corruption. The reform movement also mandated the establishment of an independent, neutral and professional special anti-corruption agency, namely the Corruption Eradication Commission (KPK), to intensify the fight against systemic corruption. This paper examines whether these anti-corruption measures have been effective in combating corruption and investigates to what extent the anti-corruption efforts, especially those conducted by the KPK, have been impeded by the emergence of a nexus of vested interests as a side-effect of democratization and market liberalization. Based on interviews with key stakeholders from the KPK, other law enforcement agencies, government, prominent scholars, journalists and NGOs in Indonesia, it is found that since the overthrow of Soeharto, the anti-corruption movement in the country has become more active and serious. After gradually winning the hearts of the people, the KPK successfully touched previously untouchable corruption perpetrators who had been protected by political immunity, legal protection and bureaucratic barriers. However, these changes have not necessarily reduced systemic and structural corruption practices. Ironically, intensive and devastating counterattacks were frequently mounted by an alignment of business actors, elites of political parties, government, and also law enforcement agencies, hijacking the state’s instruments to leave the KPK deflated, powerless, and forced to surrender. This paper concludes that democratization, market liberalization and the establishment of an anti-corruption agency may have helped Indonesia to reduce corruption. However, it is still difficult to conclude that such anti-corruption measures have fostered more effective anti-corruption work in the newly democratized and weakly regulated liberal economic system.

Keywords: vested interests, democratization, market liberalization, anti-corruption, Indonesia

Procedia PDF Downloads 226
3240 A Holistic Study of the Beta Lyrae Systems V0487 Lac, V0566 Hya and V0666 Lac

Authors: Moqbil S. Alenazi, Magdy. M. Elkhateeb

Abstract:

A comprehensive photometric study and evolutionary-state analysis of the newly discovered Beta Lyrae systems V0487 Lac, V0566 Hya, and V0666 Lac were carried out by means of their first photometric observations. New times of minima were estimated from the observed light curves, and first (O-C) curves were established for all systems. A Windows interface version of the Wilson-Devinney (W-D) code based on model atmospheres and a passband prescription has been used for the radiative treatment. The accepted models reveal some absolute parameters for the studied systems, which are used in adopting the spectral types of the systems' components and their evolutionary status. Distances to each system were calculated, and physical properties were estimated. The locations of the systems on the theoretical mass–luminosity and mass–radius relations revealed a good fit for all systems' components except for the secondary component of the system V0487 Lac.
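
A minimal sketch of how an (O-C) curve is built from times of minima, using hypothetical epoch data and a linear ephemeris; it illustrates the standard observed-minus-calculated computation, not the authors' measurements:

```python
import numpy as np

# Hypothetical linear ephemeris: T_min = T0 + P * E
T0 = 2458000.1234   # reference time of primary minimum (HJD)
P = 0.8765432       # orbital period (days)

# Hypothetical observed times of minima (HJD)
T_obs = np.array([2458000.1230, 2458043.2760, 2458087.3065, 2458130.4632])

# Nearest integer cycle count for each observed minimum
E = np.rint((T_obs - T0) / P)

# Observed minus calculated residuals
T_calc = T0 + P * E
o_minus_c = T_obs - T_calc
for e, oc in zip(E.astype(int), o_minus_c):
    print(f"E = {e:5d}  O-C = {oc:+.5f} d")
```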

Keywords: eclipsing binaries, light curve modelling, evolutionary state

Procedia PDF Downloads 70
3239 Definition of Service Angle of Android’s Robot Hand by Method of Small Movements of Gripper’s Axis Synthesis by Speed Vector

Authors: Valeriy Nebritov

Abstract:

The paper presents a generalized method for determining the service solid angle based on the assigned gripper axis orientation with a stationary grip center. Motion synthesis in this work is carried out in the space of velocity vectors. As an example, the solid angle of an android robot arm is determined, this angle being formed by the longitudinal axis of the gripper. The method is based on the study of sets of configuration positions defining the end-point positions of the unit-radius sphere sweep, which specifies the service solid angle. From this, the spherical curve specifying the shape of the desired solid angle was determined. The results of the research can be used in the development of control systems for autonomous android robots.

Keywords: android robot, control systems, motion synthesis, service angle

Procedia PDF Downloads 191
3238 Seismic Assessment of Passive Control Steel Structure with Modified Parameter of Oil Damper

Authors: Ahmad Naqi

Abstract:

Today, passively controlled buildings are becoming increasingly popular due to their excellent lateral load resistance. Typically, these buildings are enhanced with a damping device that is in high market demand. Some manufacturers falsify damping device parameters during production to meet this demand. Therefore, this paper evaluates the seismic performance of buildings equipped with damping devices whose parameters were intentionally modified to simulate falsified devices. For this purpose, three benchmark buildings of 4, 10, and 20 stories were selected from the JSSI (Japan Society of Seismic Isolation) manual. The buildings are special moment-resisting steel frames with oil dampers in the longitudinal direction only. For each benchmark building, two types of structural elements are designed to resist the lateral load, with and without damping devices (hereafter referred to as the Trimmed and Conventional Buildings). The target buildings were modeled using STERA-3D, finite-element-based software coded for research purposes. Using the software, one can develop either a three-dimensional model (3DM) or a lumped mass model (LMM). Firstly, the seismic performance of the 3DM and LMM models was evaluated and found to coincide excellently for the target buildings. The simplified LMM was then used in this study to produce 66 cases for both types of buildings. The device parameters were modified by ±40% and ±20% to cover many possible conditions of falsification. It is verified that buildings designed to sustain the lateral load with the support of damping devices (Trimmed Buildings) are much more threatened by device falsification than those strengthened by damping devices (Conventional Buildings).

Keywords: passive control system, oil damper, seismic assessment, lumped mass model

Procedia PDF Downloads 109
3237 An Explanatory Study Approach Using Artificial Intelligence to Forecast Solar Energy Outcome

Authors: Agada N. Ihuoma, Nagata Yasunori

Abstract:

Artificial intelligence (AI) techniques play a crucial role in predicting the expected energy output and in the performance analysis, modeling, and control of renewable energy. Renewable energy is becoming more popular for economic and environmental reasons. In the face of global energy consumption and the increasing depletion of most fossil fuels, the world is faced with the challenge of meeting ever-increasing energy demands. Therefore, incorporating artificial intelligence to predict solar radiation outcomes from intermittent sunlight is crucial to balance the supply of and demand for energy on loads, predict the performance and output of solar energy systems, enhance production planning and energy management, and ensure the proper sizing of parameters when generating clean energy. However, one of the major problems of forecasting is that the algorithms used to control, model, and predict the performance of energy systems are complicated and involve large computing power, differential equations, and time series. Also, unreliable (poor-quality) solar radiation data over a geographical location, as well as insufficiently long series, can be a bottleneck. To overcome these problems, this study employs Anaconda Navigator (Jupyter Notebook) for machine learning, which can combine large amounts of data with fast, iterative processing and intelligent algorithms, allowing the software to learn automatically from patterns or features to predict the performance and output of solar energy. This in turn enables the balancing of supply and demand on loads as well as enhanced production planning and energy management.
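
A minimal sketch of the regression step, assuming scikit-learn and a hypothetical table of weather features and measured solar irradiance; it illustrates fitting and evaluating a linear model of the kind discussed here, not the authors' notebook:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Hypothetical hourly records: [temperature (C), humidity (%), cloud cover (%)]
X = np.array([[28, 40, 10], [31, 35, 5], [24, 70, 60],
              [22, 80, 85], [30, 45, 20], [26, 55, 40]])
y = np.array([820, 910, 420, 180, 760, 560])  # measured irradiance (W/m^2)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)

model = LinearRegression().fit(X_train, y_train)
pred = model.predict(X_test)
print("MAE:", mean_absolute_error(y_test, pred))
print("coefficients:", model.coef_)  # candidates to drop during backward elimination
```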

Keywords: artificial intelligence, backward elimination, linear regression, solar energy

Procedia PDF Downloads 154
3236 The Nursing Profession in Algeria between Humane Treatment and Work Environment Problems - A Field Study

Authors: Bacha Zakaria

Abstract:

This study aimed to investigate the reality of humane treatment and work environment problems for nurses in public hospitals and their repercussions on the patients arriving there. In this context, our field study was based on a sample of nurses in Algiers hospitals estimated at 100 nurses. The questionnaire prepared by the two researchers was administered face to face with the nurses, and after obtaining and analyzing the data, we reached the following main results: there are many problems in the work environment, such as work pressure, lack of appreciation, verbal and physical violence, risk of infection, poor salary and incentives, working while fatigued, administrative problems, etc. Accordingly, humane treatment of patients requires providing a humane work environment for nurses and treating them humanely so that they display positive behaviors when dealing with patients.

Keywords: nursing, future, family-focused care, health equity

Procedia PDF Downloads 84
3235 Model Order Reduction for Frequency Response and Effect of Order of Method for Matching Condition

Authors: Aref Ghafouri, Mohammad javad Mollakazemi, Farhad Asadi

Abstract:

In this paper, a model order reduction method is used to approximate linear and nonlinear aspects of experimental data. The method can be used to obtain an offline reduced model that approximates the experimental data, reproduces the behaviour and order of the system, and matches the experimental data at certain frequency ratios. In this study, the method is compared on different experimental data sets, and the influence of the chosen reduction order on obtaining the best and sufficient matching condition is investigated in terms of the imaginary and real parts of the frequency response curve. Finally, the effect of the reduction order, an important parameter, on nonlinear experimental data is explained further.
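
A minimal sketch of the matching-condition check, comparing the real and imaginary parts of the frequency responses of a hypothetical full-order model and a hypothetical reduced-order model; both transfer functions are illustrative and not drawn from the paper's data:

```python
import numpy as np
from scipy import signal

# Hypothetical 4th-order "full" model and a 2nd-order reduced approximation
full = signal.TransferFunction([2.0, 10.0], [1.0, 3.2, 12.0, 9.5, 8.0])
reduced = signal.TransferFunction([1.25], [1.0, 0.9, 1.0])

w = np.logspace(-1, 2, 400)               # frequency grid (rad/s)
_, H_full = signal.freqresp(full, w)
_, H_red = signal.freqresp(reduced, w)

# Matching condition assessed separately on real and imaginary parts
err_real = np.sqrt(np.mean((H_full.real - H_red.real) ** 2))
err_imag = np.sqrt(np.mean((H_full.imag - H_red.imag) ** 2))
print(f"RMS error, real part: {err_real:.4f}, imaginary part: {err_imag:.4f}")
```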

Keywords: frequency response, order of model reduction, frequency matching condition, nonlinear experimental data

Procedia PDF Downloads 396
3234 Potential Biosorption of Rhodococcus erythropolis, an Isolated Strain from Sossego Copper Mine, Brazil

Authors: Marcela dos P. G. Baltazar, Louise H. Gracioso, Luciana J. Gimenes, Bruno Karolski, Ingrid Avanzi, Elen A. Perpetuo

Abstract:

In this work, bacterial strains were isolated from environmental samples from a copper mine, and three of them presented potential for the bioremediation of copper. All the strains were identified by mass spectrometry (MALDI-TOF Biotyper) and grown in three different media supplemented with 100 ppm of copper chloride in 500 mL flasks incubated at 28 °C and 180 rpm. Periodically, samples were taken and monitored for cellular growth and copper biosorption by UV-Vis spectrophotometry (600 nm) and Inductively Coupled Plasma Optical Emission Spectrometry (ICP-OES), respectively. At the end of the exponential phase of cellular growth, the biomass was used to construct a correlation curve between absorbance and dry mass of the cells. Among the three isolates with potential for bioremediation, one strain exhibited the greatest capacity for the bioremediation of effluents contaminated by copper and was identified as Rhodococcus erythropolis.

Keywords: bioprocess, bioremediation, biosorption, copper

Procedia PDF Downloads 380
3233 Fall Prevention: Evidence-Based Intervention in Exercise Program Implementation for Keeping Older Adults Safe and Active

Authors: Jennifer Holbein, Maritza Wiedel

Abstract:

Background: Aging is associated with an increased risk of falls in older adults, and as a result, falls have become a public health crisis. However, the incidence of falls can be reduced through healthy aging and the implementation of a regular exercise and strengthening program. Public health and healthcare professionals endorse the use of evidence-based, exercise-focused fall interventions, but there are major obstacles to translating and disseminating research findings into healthcare practices. The purpose of this study was to assess the feasibility of an intervention, A Matter of Balance, in terms of demand, acceptability, and implementation into current exercise programs. Subjects: Seventy-five participants from rural communities, above the age of sixty, were randomized to the intervention or to an attention-control condition based on the standardized senior fitness test. Methods: Subjects complete the intervention, which combines two components: (1) motivation and (2) fall-reducing physical activities, with protocols derived from baseline strength and balance assessments. Participants (n=75) took part in the program after completing baseline functional assessments as well as evaluations of their personal knowledge, health outcomes, demand, and implementation interventions. After 8 weeks of the program, participants were invited to complete follow-up assessments, with results compared to their baseline functional analyses. Of all the participants in the study who complete the initial assessment, approximately 80% are expected to maintain enrollment in the implemented prescription. Furthermore, those who commit to the program should show mitigation of fall risk upon completion of their final assessment.

Keywords: aging population, exercise, falls, functional assessment, healthy aging

Procedia PDF Downloads 98
3232 The Fragility of Sense: The Twofold Temporality of Embodiment and Its Role for Depression

Authors: Laura Bickel

Abstract:

This paper aims to investigate to what extent Merleau-Ponty’s philosophy of body memory serves as a viable resource for the enactive approach to cognitive science and its first-person, experience-based research on ‘recurrent depressive disorder’, coded F33 in ICD-10. In pursuit of this goal, the analysis begins by revisiting the neuroreductive paradigm. This paradigm is used in biological psychiatry to explain the condition of vital contact in terms of underlying neurophysiological mechanisms. It is demonstrated that the neuroreductive model cannot sufficiently account for the depressed person’s episodic withdrawal in causal terms. The analysis of the irregular loss of vital resonance requires integrating the body as the subject of experience and its phenomenological time. It is then shown that the enactive approach to depression as disordered sense-making is a promising alternative. The enactive model of perception implies that living beings do not register pre-existing meaning ‘out there’ but unfold ‘sense’ in their action-oriented response to the world. For the enactive approach, Husserl’s passive synthesis of inner time consciousness is fundamental for what becomes perceptually present for action. It seems intuitive to bring together the enactive approach to depression with the long-standing view in phenomenological psychopathology that explains the loss of vital contact by appealing to the disruption of the temporal structure of consciousness. However, this paper argues that the disruption of the temporal structure is not justified conceptually. Instead, one may integrate Merleau-Ponty’s concept of the past as the unconscious into the enactive approach to depression. From this perspective, the living being’s experiential and biological past inserts itself in the form of habit and bodily skills and ensures action-oriented responses to the environment. Finally, it is concluded that the depressed person’s withdrawal indicates the impairment of this application process. The person suffering from F33 cannot actualize sedimented meaning to respond to the valences and tasks of a given situation.

Keywords: depression, enactivism, neuroreductionism, phenomenology, temporality

Procedia PDF Downloads 125
3231 Evaluation of Short-Term Load Forecasting Techniques Applied for Smart Micro-Grids

Authors: Xiaolei Hu, Enrico Ferrera, Riccardo Tomasi, Claudio Pastrone

Abstract:

Load forecasting plays a key role in making today's and tomorrow's Smart Energy Grids sustainable and reliable. Accurate power consumption prediction allows utilities to organize their resources in advance or to execute Demand Response strategies more effectively, which enables several benefits such as higher sustainability, better quality of service, and affordable electricity tariffs. Load forecasting is commonly applied at larger geographic scales; in Smart Micro Grids, however, the lower available grid flexibility makes accurate prediction even more critical for Demand Response applications. This paper analyses the application of short-term load forecasting in a concrete scenario, proposed within the EU-funded GreenCom project, which collects load data from single loads and households belonging to a Smart Micro Grid. Three short-term load forecasting techniques, i.e. linear regression, artificial neural networks, and radial basis function networks, are considered, compared, and evaluated through absolute forecast errors and training time. The influence of weather conditions on load forecasting is also evaluated. A new definition of Gain is introduced in this paper, which serves as an indicator of the consistency of short-term prediction capability over the forecasting time span. Two models, 24-hour-ahead and 1-hour-ahead forecasting, are built to comprehensively compare the three techniques.
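
A minimal sketch of such a comparison, assuming scikit-learn and a hypothetical hourly load series; kernel ridge regression with an RBF kernel stands in for a radial basis function network, and the metric is mean absolute error rather than the paper's Gain indicator:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.kernel_ridge import KernelRidge
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
# Hypothetical hourly load (kW) with a daily pattern plus noise
hours = np.arange(24 * 60)
load = 5 + 2 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.3, hours.size)

# 1-hour-ahead forecasting: predict load[t] from the previous 24 hours
lags = 24
X = np.array([load[t - lags:t] for t in range(lags, load.size)])
y = load[lags:]
split = int(0.8 * len(y))
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

models = {
    "linear regression": LinearRegression(),
    "neural network": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
    "RBF (kernel ridge)": KernelRidge(kernel="rbf", alpha=1.0),
}
for name, m in models.items():
    m.fit(X_tr, y_tr)
    print(name, "MAE:", round(mean_absolute_error(y_te, m.predict(X_te)), 3))
```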

Keywords: short-term load forecasting, smart micro grid, linear regression, artificial neural networks, radial basis function network, gain

Procedia PDF Downloads 459
3230 Re-Evaluation of Field X Located in Northern Lake Albert Basin to Refine the Structural Interpretation

Authors: Calorine Twebaze, Jesca Balinga

Abstract:

Field X is located on the eastern shores of Lake Albert, Uganda, on the rift flank where the gross sedimentary fill is typically less than 2,000 m. The field was discovered in 2006 and encountered about 20.4 m of net pay across three (3) stratigraphic intervals within the discovery well. The field covers an area of 3 km2, with the structural configuration comprising a 3-way dip-closed hanging wall anticline that seals against the basement to the southeast along the bounding fault. Field X had been mapped on reprocessed 3D seismic data, which was originally acquired in 2007 and reprocessed in 2013. The seismic data quality is good across the field, and reprocessing work reduced the uncertainty in the location of the bounding fault and enhanced the lateral continuity of reservoir reflectors. The current study was a re-evaluation of Field X to refine the fault interpretation and understand the structural uncertainties associated with the field. The seismic data and datasets from three (3) wells were used during the study. The evaluation followed standard workflows using Petrel software and structural attribute analysis. The process spanned seismic-to-well tie, structural interpretation, and structural uncertainty analysis. Analysis of the three (3) well ties generated for the wells provided a geophysical interpretation that was consistent with geological picks. The generated time-depth curves showed a general increase in velocity with burial depth. However, the separation in curve trends observed below 1100 m was mainly attributed to minimal lateral variation in velocity between the wells. In addition to attribute analysis, three velocity modelling approaches were evaluated: the Time-Depth Curve, V0 + kZ, and Average Velocity methods. The generated models were calibrated at well locations using well tops to obtain the best velocity model for Field X. The Time-Depth method resulted in the most reliable depth surfaces, with good structural coherence between the TWT and depth maps and a minimal error of 2 to 5 m at well locations. Both the NNE-SSW rift border fault and the minor faults in the existing interpretation were re-evaluated. The new interpretation, however, delineated an E-W trending fault in the northern part of the field that had not been interpreted before. The fault was interpreted at all stratigraphic levels and thus propagates from the basement to the surface and is an active fault today. It was also noted that the field is only lightly faulted overall, with more faults in the deeper part of the field. The major structural uncertainties defined included: 1) the time horizons, due to reduced data quality, especially in the deeper parts of the structure, for which an error equal to one-third of the reflection time thickness was assumed; 2) checkshot analysis, which showed varying velocities within the wells and thus varying depth values for each well; and 3) the very few average velocity points, due to the limited number of wells, which produced a pessimistic average velocity model.
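
A minimal sketch of the V0 + kZ velocity-model step, assuming hypothetical checkshot-derived interval velocities; it shows the straight-line fit and a depth conversion for one horizon, not the actual Field X data:

```python
import numpy as np

# Hypothetical checkshot-derived interval velocities (m/s) at burial depths (m)
z = np.array([200, 500, 800, 1100, 1400, 1700])
v = np.array([1850, 2010, 2180, 2320, 2490, 2640])

# Fit the linear velocity law V(z) = V0 + k*z
k, V0 = np.polyfit(z, v, 1)
print(f"V0 = {V0:.0f} m/s, k = {k:.3f} 1/s")

# Depth-convert a horizon mapped in two-way time (s) using
# z = (V0/k) * (exp(k * t_owt) - 1), the standard result for V = V0 + k*z
t_twt = 1.250                      # two-way time at a well location (s)
t_owt = t_twt / 2.0
z_horizon = (V0 / k) * (np.exp(k * t_owt) - 1.0)
print(f"converted depth: {z_horizon:.0f} m")
```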

Keywords: 3D seismic data interpretation, structural uncertainties, attribute analysis, velocity modelling approaches

Procedia PDF Downloads 50
3229 Borrower Discouragement in Spain: An Empirical Analysis Using a Survey Data Set

Authors: Ginés Hernández-Cánovas, Mª Camino Ramón-Llorens, Johanna Koëter-Kant

Abstract:

This paper uses a survey data set of 837 Spanish SMEs to analyze the association between borrower discouragement and prior strategic decisions of the firm, while controlling for firm and owner characteristics. Existing literature has neglected factors limiting the demand for resources, relying instead on arguments that attempt to explain the existence of discouraged borrowers solely in terms of lack of access to the supply of credit. The objective of this paper is to show that factors limiting the demand for resources, and therefore reducing the availability of funds, can be traced back to the firm manager's decisions. Our hypothesis is that managers who undertake strategic decisions seeking growth or improvement in their business performance participate more in the banking market than those content with their current business situation. Our results show that SMEs that take an active role in research and development activities and that achieve improvements in the operating performance of their business are less likely to be discouraged from applying for a loan. Who needs credit and who applies for credit is important for firms, prospective lenders, and policymakers interested in the financial health of these firms. Credit-constrained firms are less likely to invest in R&D and to introduce new products, possibly harming long-term economic growth. Knowing how important borrower discouragement is in Europe is important for judging the priority that should be attached to government policies aimed at reducing its effects. For example, policymakers could encourage transparency about credit eligibility and conditions in order to reduce discouragement.

Keywords: discouragement, financial constraints, SMEs financing

Procedia PDF Downloads 348
3228 Decomposition of the Discount Function into Impatience and Uncertainty Aversion: How Neurofinance Can Help to Understand Behavioral Anomalies

Authors: Roberta Martino, Viviana Ventre

Abstract:

Intertemporal choices are choices under conditions of uncertainty in which the consequences are distributed over time. The Discounted Utility Model is the essential reference for describing the individual in the context of intertemporal choice. The model is based on the idea that the individual selects the alternative with the highest utility, which is calculated by multiplying the cardinal utility of the outcome, as if its receipt were instantaneous, by a discount function that reduces the utility value according to how far the actual receipt of the outcome lies from the moment the choice is made. Initially, the discount function was assumed to have an exponential trend, whose rate of decrease over time is constant, in line with the profile of a rational investor described by classical economics. Empirical evidence, however, called for the formulation of alternative, hyperbolic models that better represent the actual actions of the investor. Attitudes that do not comply with the principles of classical rationality are termed anomalous, i.e., difficult to rationalize and describe through normative models. The development of behavioral finance, which describes investor behavior through cognitive psychology, has shown that deviations from rationality are due to the bounded rationality of human beings. This means that when a choice is made in a very difficult and information-rich environment, the brain makes a compromise between the cognitive effort required and the selection of an alternative. Moreover, the evaluation and selection of alternatives and the collection and processing of information are conditioned by systematic distortions of the decision-making process, namely the behavioral biases involving the individual's emotional and cognitive systems. In this paper we present an original decomposition of the discount function to investigate the psychological principles of hyperbolic discounting. It is possible to decompose the curve into two components: the first component is responsible for the smaller decrease in the outcome as time increases and is related to the individual's impatience; the second component relates to the change in the direction of the tangent vector to the curve and indicates how strongly the individual perceives the indeterminacy of the future, i.e., his or her aversion to uncertainty. This decomposition allows interesting conclusions to be drawn with respect to the concept of impatience and the emotional drives involved in decision-making. The contribution that neuroscience can make to decision theory and intertemporal choice theory is vast, as it allows the decision-making process to be described as the interplay between the individual's emotional and cognitive factors. Neurofinance is a discipline that uses a multidisciplinary approach to investigate how the brain influences decision-making. Indeed, considering that the decision-making process is linked to the activity of the prefrontal cortex and amygdala, neurofinance can help determine the extent to which anomalous attitudes respect the principles of rationality.
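
A minimal numeric illustration of the two standard discount-function forms discussed here, assuming an exponential curve D(t) = e^(-kt) and a hyperbolic curve D(t) = 1/(1 + kt); the instantaneous discount rate -D'(t)/D(t) is one common way to quantify impatience and is not necessarily the authors' exact decomposition:

```python
import numpy as np

k = 0.25
t = np.linspace(0.0, 10.0, 6)

D_exp = np.exp(-k * t)          # exponential discounting: constant rate
D_hyp = 1.0 / (1.0 + k * t)     # hyperbolic discounting: declining rate

# Instantaneous discount rate -D'(t)/D(t), a common impatience measure
rate_exp = np.full_like(t, k)   # analytically constant for the exponential curve
rate_hyp = k / (1.0 + k * t)    # decreases with delay for the hyperbolic curve

for ti, re_, rh in zip(t, rate_exp, rate_hyp):
    print(f"t = {ti:4.1f}  exponential rate = {re_:.3f}  hyperbolic rate = {rh:.3f}")
```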

Keywords: impatience, intertemporal choice, neurofinance, rationality, uncertainty

Procedia PDF Downloads 124
3227 Computer Aided Discrimination of Benign and Malignant Thyroid Nodules by Ultrasound Imaging

Authors: Akbar Gharbali, Ali Abbasian Ardekani, Afshin Mohammadi

Abstract:

Introduction: Thyroid nodules have an incidence of 33-68% in the general population, and some 5-15% of these nodules are malignant. Early detection and treatment of thyroid nodules increases the cure rate and enables optimal treatment. Among medical imaging methods, ultrasound is the imaging technique of choice for the assessment of thyroid nodules. Confirming the diagnosis usually demands repeated fine-needle aspiration biopsy (FNAB), so current management carries morbidity and non-zero mortality. Objective: To explore the diagnostic potential of automatic texture analysis (TA) methods in differentiating benign and malignant thyroid nodules on ultrasound imaging, in order to support reliable diagnosis and monitoring of thyroid nodules in their early stages without the need for biopsy. Material and Methods: The thyroid ultrasound image database consists of 70 patients (26 benign and 44 malignant), reported by a radiologist and proven by biopsy. Two slices per patient were loaded in Mazda software version 4.6 for automatic texture analysis. Regions of interest (ROIs) were defined within the abnormal part of the thyroid nodule ultrasound images. Gray levels within an ROI were normalized according to three normalization schemes: N1: default or original gray levels, N2: dynamic intensity limited to µ ± 3σ, and N3: intensity limited to the 1%–99% range. Up to 270 multiscale texture feature parameters per ROI per normalization scheme were computed using well-known statistical methods implemented in the Mazda software. From the statistical point of view, not all calculated texture feature parameters are useful for texture analysis. Therefore, based on the maximum Fisher coefficient and the minimum probability of classification error plus average correlation coefficients (POE+ACC), the features were reduced to the 10 best and most effective features per normalization scheme. These features were analyzed under two standardization states, standard (S) and non-standard (NS), with Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Non-Linear Discriminant Analysis (NDA). A 1-NN classifier was used to distinguish between benign and malignant nodules. The confusion matrix and receiver operating characteristic (ROC) curve analysis were used to formulate more reliable criteria for the performance of the employed texture analysis methods. Results: The results demonstrated the influence of the normalization schemes and reduction methods on the effectiveness of the obtained features as descriptors, on discrimination power, and on classification results. The feature subset selected under 1%–99% normalization, POE+ACC reduction, and NDA analysis yielded a high discrimination performance in distinguishing benign from malignant thyroid nodules, with an area under the ROC curve (Az) of 0.9722, corresponding to a sensitivity of 94.45%, specificity of 100%, and accuracy of 97.14%. Conclusions: Our results indicate that computer-aided diagnosis is a reliable method and can provide useful information to help radiologists in the detection and classification of benign and malignant thyroid nodules.
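
A minimal sketch of the classification and evaluation stage, assuming scikit-learn and a hypothetical texture-feature matrix; it reproduces the standardization, feature reduction (here PCA), 1-NN classification, and confusion-matrix/ROC evaluation steps in outline only:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(0)
# Hypothetical data: 140 ROIs (two per patient) x 270 texture features,
# labels: 0 = benign, 1 = malignant
X = rng.normal(size=(140, 270))
y = rng.integers(0, 2, size=140)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

# Standardize, reduce to 10 features, classify with 1-NN
scaler = StandardScaler().fit(X_tr)
pca = PCA(n_components=10).fit(scaler.transform(X_tr))
knn = KNeighborsClassifier(n_neighbors=1).fit(pca.transform(scaler.transform(X_tr)), y_tr)

Z_te = pca.transform(scaler.transform(X_te))
pred = knn.predict(Z_te)
print("confusion matrix:\n", confusion_matrix(y_te, pred))
print("AUC (coarse for a 1-NN):", roc_auc_score(y_te, knn.predict_proba(Z_te)[:, 1]))
```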

Keywords: ultrasound imaging, thyroid nodules, computer aided diagnosis, texture analysis, PCA, LDA, NDA

Procedia PDF Downloads 274
3226 Modelling for Temperature Non-Isothermal Continuous Stirred Tank Reactor Using Fuzzy Logic

Authors: Nasser Mohamed Ramli, Mohamad Syafiq Mohamad

Abstract:

Many types of controllers have been applied to the continuous stirred tank reactor (CSTR) unit to control the temperature. In this research paper, a Proportional-Integral-Derivative (PID) controller is compared with a fuzzy logic controller for temperature control of a CSTR. The control system for the non-isothermal temperature of a CSTR should produce a stable response curve to its set-point temperature. A mathematical model of a CSTR using the most general operating conditions was developed through a set of differential equations into an S-function using MATLAB. The reactor model and S-function are developed using an m-file. After developing the S-function of the CSTR model, User-Defined Functions are used to link it to a SIMULINK file. The results obtained from the simulation of temperature control were better when using the fuzzy logic controller than the PID controller.
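
A minimal sketch of a fuzzy temperature controller of this kind, using the scikit-fuzzy library as a Python stand-in for the MATLAB fuzzy toolbox; the universes, membership functions, and rules are hypothetical, not those of the paper:

```python
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

# Temperature error (K) between set point and reactor temperature,
# and coolant valve opening (%) as the manipulated variable
error = ctrl.Antecedent(np.arange(-20, 21, 1), 'error')
valve = ctrl.Consequent(np.arange(0, 101, 1), 'valve')

error['negative'] = fuzz.trimf(error.universe, [-20, -20, 0])
error['zero'] = fuzz.trimf(error.universe, [-5, 0, 5])
error['positive'] = fuzz.trimf(error.universe, [0, 20, 20])

valve['closed'] = fuzz.trimf(valve.universe, [0, 0, 50])
valve['half'] = fuzz.trimf(valve.universe, [25, 50, 75])
valve['open'] = fuzz.trimf(valve.universe, [50, 100, 100])

rules = [
    ctrl.Rule(error['negative'], valve['open']),    # reactor above set point: more cooling
    ctrl.Rule(error['zero'], valve['half']),
    ctrl.Rule(error['positive'], valve['closed']),  # reactor below set point: less cooling
]

sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input['error'] = -8.0   # reactor 8 K above the set point
sim.compute()
print("valve opening (%):", round(sim.output['valve'], 1))
```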

Keywords: CSTR, temperature, PID, fuzzy logic

Procedia PDF Downloads 451
3225 Management of Cultural Heritage: Bologna Gates

Authors: Alfonso Ippolito, Cristiana Bartolomei

Abstract:

A growing demand is felt today for realistic 3D models enabling the cognition and popularization of historical-artistic heritage. The evaluation and preservation of Cultural Heritage are inextricably connected with the innovative processes of gaining, managing, and using knowledge. The development and perfecting of techniques for acquiring and elaborating photorealistic 3D models have made them pivotal elements for popularizing information about objects on the scale of architectonic structures.

Keywords: cultural heritage, databases, non-contact survey, 2D-3D models

Procedia PDF Downloads 417
3224 Quick Covering Machine for Grain Drying Pavement

Authors: Fatima S. Rodriguez, Victorino T. Taylan, Manolito C. Bulaong, Helen F. Gavino, Vitaliana U. Malamug

Abstract:

In sun drying, the quality of the grains is greatly reduced when paddy grains are caught by the rain unsacked and unstored, resulting in reduced profit. The objectives of this study were to design and fabricate a quick covering machine for a grain drying pavement, to test and evaluate the operating characteristics of the machine in terms of deployment speed, recovery speed, deployment time, recovery time, power consumption and aesthetics of the laminated sack, and to conduct partial budget and cost curve analyses. The machine was able to cover the grains in a 12.8 m x 22.5 m grain drying pavement in an average time of 17.13 s. It consumed 0.53 W-hr for the deployment and recovery of the cover. The machine entailed an investment cost of $1,344.40 and an annual cost charge of $647.32. Moreover, the annual savings from using the quick covering machine were $101.83.

Keywords: quick covering machine, grain drying pavement

Procedia PDF Downloads 366
3223 ADP Approach to Evaluate the Blood Supply Network of Ontario

Authors: Usama Abdulwahab, Mohammed Wahab

Abstract:

This paper presents the application of uncapacitated facility location problems (UFLP) and 1-median problems to support decision making in blood supply chain networks. A plethora of factors make blood supply chain networks a complex yet vital problem for the regional blood bank. These factors are rapidly increasing demand; criticality of the product; strict storage and handling requirements; and the vastness of the theater of operations. As in the UFLP, facilities can be opened at any of m predefined locations with given fixed costs. Clients have to be allocated to the open facilities. In classical location models, the allocation cost is the distance between a client and an open facility. In this model, the costs are the allocation cost, transportation costs, and inventory costs. In order to address this problem, the median algorithm is used to analyze inventory, evaluate supply chain status, monitor performance metrics at different levels of granularity, and detect potential problems and opportunities for improvement. The Euclidean distance data for some Ontario cities (demand nodes) are used to test the developed algorithm. Sitation software, a Lagrangian relaxation algorithm, and branch-and-bound heuristics are used to solve this model. Computational experiments confirm the efficiency of the proposed approach. Compared to the existing modeling and solution methods, the median algorithm approach not only provides a more general modeling framework but also leads to efficient solution times in general.
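
A minimal sketch of the 1-median step on hypothetical demand-node coordinates and demands; the chosen facility minimizes the total demand-weighted Euclidean distance, which is the core of the allocation-cost term described above:

```python
import numpy as np

# Hypothetical demand nodes (x, y in km) and weekly platelet demand (units)
coords = np.array([[0.0, 0.0], [35.0, 10.0], [60.0, 45.0], [20.0, 55.0], [80.0, 15.0]])
demand = np.array([120, 45, 80, 30, 60])

# Candidate facility sites restricted to the demand nodes themselves (vertex 1-median)
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
weighted_cost = dist @ demand   # total weighted distance if the facility is at each node

best = int(np.argmin(weighted_cost))
print(f"open the facility at node {best}, total weighted distance = {weighted_cost[best]:.1f}")
```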

Keywords: approximate dynamic programming, facility location, perishable product, inventory model, blood platelet, P-median problem

Procedia PDF Downloads 503
3222 Selective Extraction of Lithium from Native Geothermal Brines Using Lithium-ion Sieves

Authors: Misagh Ghobadi, Rich Crane, Karen Hudson-Edwards, Clemens Vinzenz Ullmann

Abstract:

Lithium is recognized as the critical energy metal of the 21st century, comparable in importance to coal in the 19th century and oil in the 20th century, and often termed 'white gold'. Current global demand for lithium, estimated at 0.95-0.98 million metric tons (Mt) of lithium carbonate equivalent (LCE) annually in 2024, is projected to rise to 1.87 Mt by 2027 and 3.06 Mt by 2030. Despite anticipated short-term stability in supply and demand, meeting the forecasted 2030 demand will require the lithium industry to develop an additional capacity of 1.42 Mt of LCE annually, exceeding current planned and ongoing efforts. Brine resources constitute nearly 65% of global lithium reserves, underscoring the importance of exploring lithium recovery from underutilized sources, especially geothermal brines. However, conventional lithium extraction from brine deposits faces challenges due to its time-intensive process, low efficiency (30-50% lithium recovery), unsuitability for low lithium concentrations (<300 mg/l), and notable environmental impacts. Addressing these challenges, direct lithium extraction (DLE) methods have emerged as promising technologies capable of economically extracting lithium even from low-concentration brines (>50 mg/l) with high recovery rates (75-98%). However, most studies (70%) have predominantly focused on synthetic brines instead of native (natural/real) brines, with limited application of these approaches in real-world case studies or industrial settings. This study aims to bridge this gap by investigating a geothermal brine sample collected from a real case study site in the UK. A Mn-based lithium-ion sieve (LIS) adsorbent was synthesized and employed to selectively extract lithium from the sample brine. Adsorbents with a Li:Mn molar ratio of 1:1 demonstrated superior lithium selectivity and adsorption capacity. Furthermore, the pristine Mn-based adsorbent was modified through transition-metal doping, resulting in enhanced lithium selectivity and adsorption capacity. The modified adsorbent exhibited a higher separation factor for lithium over major co-existing cations such as Ca, Mg, Na, and K, with separation factors exceeding 200. The adsorption behaviour was well described by the Langmuir model, indicating monolayer adsorption, and the kinetics followed a pseudo-second-order mechanism, suggesting chemisorption at the solid surface. Thermodynamically, negative ΔG° values and positive ΔH° and ΔS° values were observed, indicating the spontaneity and endothermic nature of the adsorption process.
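
A minimal sketch of fitting the Langmuir isotherm q = qm*K*C/(1 + K*C) to hypothetical equilibrium data with scipy; the numbers are illustrative and not the study's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical equilibrium data: Li concentration in brine C_e (mg/L)
# and adsorbed amount q_e (mg/g) on the lithium-ion sieve
C_e = np.array([5, 20, 50, 100, 200, 400])
q_e = np.array([4.1, 12.8, 21.5, 27.0, 30.8, 32.6])

def langmuir(C, q_m, K):
    # Monolayer adsorption: q = q_m * K * C / (1 + K * C)
    return q_m * K * C / (1.0 + K * C)

(q_m, K), _ = curve_fit(langmuir, C_e, q_e, p0=[30.0, 0.05])
print(f"q_m = {q_m:.1f} mg/g, K = {K:.3f} L/mg")

# Goodness of fit
resid = q_e - langmuir(C_e, q_m, K)
r2 = 1 - np.sum(resid**2) / np.sum((q_e - q_e.mean())**2)
print(f"R^2 = {r2:.3f}")
```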

Keywords: adsorption, critical minerals, DLE, geothermal brines, geochemistry, lithium, lithium-ion sieves

Procedia PDF Downloads 37
3221 Biogas Separation, Alcohol Amine Solutions

Authors: Jingxiao Liang, David Rooneyman

Abstract:

Biogas, which is a valuable renewable energy source, can be produced by anaerobic fermentation of agricultural waste, manure, municipal waste, plant material, sewage, green waste, or food waste. It is composed of methane (CH4) and carbon dioxide (CO2) but also contains significant quantities of undesirable compounds such as hydrogen sulfide (H2S), ammonia (NH3), and siloxanes. Typical raw biogas contains 25–45% CO2, and the requirements for biogas quality depend on its further application. Before biogas can be used more efficiently, CO2 should be removed. One of the existing options for biogas separation technologies is based on chemical absorbents, in particular mono-, di- and tri-alcohol amine solutions. Such amine solutions have been applied as highly efficient CO2 capturing agents. The benchmark in this experiment is N-methyldiethanolamine (MDEA) with piperazine (PZ) as an activator; from the CO2 absorption isotherm curve, optimized conditions such as activator percentage and temperature are obtained. This experiment produces new alcohol amines, using glycidol as one of the reactants, which could have the same CO2 absorbing ability as activated MDEA; the results are quite satisfying.

Keywords: biogas, CO2, MDEA, separation

Procedia PDF Downloads 623
3220 Assessment of DNA Sequence Encoding Techniques for Machine Learning Algorithms Using a Universal Bacterial Marker

Authors: Diego Santibañez Oyarce, Fernanda Bravo Cornejo, Camilo Cerda Sarabia, Belén Díaz Díaz, Esteban Gómez Terán, Hugo Osses Prado, Raúl Caulier-Cisterna, Jorge Vergara-Quezada, Ana Moya-Beltrán

Abstract:

The advent of high-throughput sequencing technologies has revolutionized genomics, generating vast amounts of genetic data that challenge traditional bioinformatics methods. Machine learning addresses these challenges by leveraging computational power to identify patterns and extract information from large datasets. However, biological sequence data, being symbolic and non-numeric, must be converted into numerical formats for machine learning algorithms to process effectively. So far, some encoding methods, such as one-hot encoding or k-mers, have been explored. This work proposes additional approaches for encoding DNA sequences in order to compare them with existing techniques and determine whether they can provide improvements or whether current methods offer superior results. Data from the 16S rRNA gene, a universal marker, were used to analyze eight bacterial groups that are significant in the pulmonary environment and have clinical implications. The bacterial genera included in this analysis are Prevotella, Abiotrophia, Acidovorax, Streptococcus, Neisseria, Veillonella, Mycobacterium, and Megasphaera. These data were downloaded from the NCBI database in GenBank file format, followed by a syntactic analysis to selectively extract relevant information from each file. For data encoding, a sequence normalization process was carried out as the first step. From approximately 22,000 initial data points, a subset was generated for testing purposes. Specifically, 55 sequences from each bacterial group met the length criteria, resulting in an initial sample of approximately 440 sequences. The sequences were encoded using different methods, including one-hot encoding, k-mers, the Fourier transform, and the Wavelet transform. Various machine learning algorithms, such as support vector machines, random forests, and neural networks, were trained to evaluate these encoding methods. The performance of these models was assessed using multiple metrics, including the confusion matrix, ROC curve, and F1 score, providing a comprehensive evaluation of their classification capabilities. The results show that accuracies between encoding methods vary by up to approximately 15%, with the Fourier transform obtaining the best results for the evaluated machine learning algorithms. These findings, supported by the detailed analysis using the confusion matrix, ROC curve, and F1 score, provide valuable insights into the effectiveness of different encoding methods and machine learning algorithms for genomic data analysis, potentially improving the accuracy and efficiency of bacterial classification and related genomic studies.
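
A minimal sketch of two of the encoding schemes compared here (one-hot encoding and a Fourier-transform representation of a numerically mapped sequence), applied to a short hypothetical 16S fragment; it is illustrative only and independent of the study's data:

```python
import numpy as np

seq = "ACGGTACGATCGGCTA"  # hypothetical 16S rRNA fragment

# One-hot encoding: each base becomes a 4-dimensional indicator vector
bases = "ACGT"
one_hot = np.array([[1 if b == base else 0 for base in bases] for b in seq])
print("one-hot shape:", one_hot.shape)   # (sequence length, 4)

# Fourier-transform encoding: map bases to numbers, then take the FFT magnitude
numeric = np.array([bases.index(b) + 1 for b in seq], dtype=float)
spectrum = np.abs(np.fft.rfft(numeric))
print("Fourier feature vector length:", spectrum.size)
```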

Keywords: DNA encoding, machine learning, Fourier transform, Fourier transformation

Procedia PDF Downloads 12