Search results for: contention resolution
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1476

1266 Opportunities for Effective Conflict Management Caused by Global Crises

Authors: Marine Kobalava

Abstract:

The article analyzes current global crises, explains their causes, and substantiates that in most cases the processes accompanying a crisis are conflict situations. The paper argues that crises can become predictable if threats are identified and addressed by a company, organization, corporation, or other entity. Accordingly, mechanisms for the neutralization of conflict potential are proposed, and the need to develop a communication strategy and to create and redistribute information flows is justified. Conflict situations are assessed according to the type of crisis, and it is argued that a conflict can become a precondition for a crisis. The paper substantiates the need to differentiate theories of crises from theories of conflicts. Based on evaluative judgment, conflict management measures are proposed that take into account institutionalization and the norms and rules of conflict resolution. The paper identifies the conflict potential created in the context of global crises and suggests local ways and mechanisms for its effective management. The involvement of the company's public relations (PR) function and relevant communication by qualified staff is considered important. Conclusions are drawn on the problems of effective conflict management caused by global crises, and recommendations for conflict resolution are proposed.

Keywords: global crises, conflict situations, conflict identification, conflict management, conflict potential

Procedia PDF Downloads 107
1265 Gender Difference and Conflict Management Strategy Preference among Managers in Public Organizations in South-Western Nigeria

Authors: D. I. Akintayo, C. O. Aje

Abstract:

This study investigated the moderating influence of gender difference and conflict resolution strategy preference on managers' efficiency in managing industrial conflict in work organizations in South-Western Nigeria. The purpose was to ascertain the relevance of gender difference and conflict resolution strategy preference to managerial efficiency, towards ensuring sustainable industrial peace and harmonious labour-management relations at workplaces in Nigeria. A descriptive ex-post-facto research design was adopted. A total of 185 respondents were selected using a purposive stratified sampling technique. A set of questionnaires titled 'Rahim Organizational Conflict Inventory' (ROCI) and 'Managerial Conflict Efficiency Scale' (MCES) was adopted for the study. The three generated hypotheses were tested using Pearson product-moment correlation and t-test statistical methods. The findings revealed that a significant relationship exists between gender difference and the conflict management preference of the managers (r = 0.644; P < 0.05). It was also found that there was no significant difference between male and female managers' conflict management strategy preference (t (181) = 11.08; P > 0.05). The findings further reveal no significant difference between female and male managers' conflict management efficiency on the basis of their conflict management preference (t (181) = 10.23; P > 0.05). Based on the findings of the study, it is recommended that collective bargaining be encouraged as a conflict resolution strategy in order to guarantee effective management of industrial conflict and harmonious labour-management relations. Also, both male and female managers should be empowered for appointment to managerial positions and should avoid coercion, competition, aggressiveness and pro-task approaches in the course of managing industrial conflict.
Rather, persuasion, compromising, relational, lobbying and participatory approaches should be employed during the collective bargaining process in order to foster effective management of conflict at workplaces.
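The two statistics used above, the Pearson product-moment correlation and the independent-samples t-test, can be computed directly with NumPy; the scores below are invented placeholders, not the study's survey data.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xd, yd = x - x.mean(), y - y.mean()
    return float((xd * yd).sum() / np.sqrt((xd ** 2).sum() * (yd ** 2).sum()))

def pooled_t(a, b):
    """Independent-samples t statistic (pooled variance), df = n1 + n2 - 2."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n1, n2 = len(a), len(b)
    sp2 = (((a - a.mean()) ** 2).sum() + ((b - b.mean()) ** 2).sum()) / (n1 + n2 - 2)
    return (a.mean() - b.mean()) / np.sqrt(sp2 * (1 / n1 + 1 / n2)), n1 + n2 - 2

# toy scores for two groups, not the study's data
male = [14.0, 16.0, 15.0, 17.0, 18.0]
female = [15.0, 17.0, 16.0, 18.0, 14.0]
t, df = pooled_t(male, female)
print(round(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]), 3), round(t, 3), df)
```

With equal group means, as in this toy example, the t statistic is exactly zero.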

Keywords: conflict management, gender difference, managerial studies, public organization and managers, strategy preference

Procedia PDF Downloads 410
1264 Derivation of Bathymetry Data Using Worldview-2 Multispectral Images in Shallow, Turbid and Saline Lake Acıgöl

Authors: Muhittin Karaman, Murat Budakoglu

Abstract:

In this study, derivation of lake bathymetry was evaluated using high-resolution Worldview-2 multispectral images of the very shallow, hypersaline Lake Acıgöl, which does not have a stable water table due to wet-dry seasonal changes and industrial usage. Every year, a great part of the lake's water budget is consumed for industrial salt production in the evaporation ponds, generally located on the southern and northern shores of Lake Acıgöl. Therefore, determination of water level changes through remote sensing-based bathymetry studies is of great importance for the sustainability and monitoring of the lake. While the water table varies by about 1 meter between the dry and wet seasons, dissolved ion concentration, salinity and turbidity also show clear differences between these two distinct seasonal periods. Simultaneously with the satellite data acquisition (June 9, 2013), a field study was conducted to collect salinity values, Secchi disk depths and turbidity levels. Maximum depth, Secchi disk depth and salinity were determined as 1.7 m, 0.9 m and 43.11 ppt, respectively. The eight-band Worldview-2 image was corrected for atmospheric effects with the ATCOR technique. For each sampling point in the image, mean reflectance values in 1x1, 3x3, 5x5, 7x7, 9x9, 11x11, 13x13, 15x15, 17x17, 19x19, 21x21 and 51x51 pixel neighborhoods were calculated separately, and a separate image was derived for each matrix resolution. The relation between spectral values and depth was evaluated for these images. For the 1x1 matrix, correlation coefficients were 0.98, 0.96, 0.95 and 0.90 for the 724 nm, 831 nm, 908 nm and 659 nm bands, respectively. The 15x15 matrix yielded correlation values of 0.98, 0.97 and 0.97 for the 724 nm, 908 nm and 831 nm bands, respectively, while the 51x51 matrix showed 0.98, 0.97 and 0.96 for the 724 nm, 831 nm and 659 nm bands, respectively.
Comparison of all matrix resolutions indicates that the red-edge band (724 nm) of the Worldview-2 satellite image has the best correlation with in-situ depth in the shallow saline Lake Acıgöl.
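The band-versus-depth evaluation reduces to two steps: average the reflectance in an n x n neighborhood around each sampling point, then correlate the windowed reflectance with the in-situ depth. A minimal NumPy sketch with synthetic data (not the Worldview-2 scene; in this toy model deeper water is simply darker, so the correlation comes out negative):

```python
import numpy as np

def neighborhood_mean(band, row, col, n):
    """Mean reflectance in an n x n window centred on (row, col)."""
    h = n // 2
    return band[row - h:row + h + 1, col - h:col + h + 1].mean()

def depth_correlation(band, samples, depths, n):
    """Pearson r between windowed reflectance and in-situ depth."""
    refl = [neighborhood_mean(band, r, c, n) for r, c in samples]
    return float(np.corrcoef(refl, depths)[0, 1])

# synthetic scene: reflectance decays with depth, plus sensor noise
rng = np.random.default_rng(0)
depths = rng.uniform(0.2, 1.7, 30)                     # metres, as in the lake
band = rng.normal(0.10, 0.01, (200, 200))
samples = [(int(r), int(c)) for r, c in rng.uniform(20, 180, (30, 2))]
for (r, c), d in zip(samples, depths):
    band[r - 10:r + 11, c - 10:c + 11] += 0.2 * np.exp(-d)   # brighter when shallow

r21 = depth_correlation(band, samples, depths, 21)
print(round(r21, 2))       # strongly negative for this synthetic scene
```

Running the same loop for each window size reproduces the kind of per-matrix comparison reported in the abstract.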

Keywords: bathymetry, Worldview-2 satellite image, ATCOR technique, Lake Acıgöl, Denizli, Turkey

Procedia PDF Downloads 404
1263 A Phenomenological-Hermeneutic Account of Design Thinking by Way of an Exposition of Four Species of Negatite: 'Not Being', 'Non-Being', 'Absence', 'Non-Existence'

Authors: Soheil Ashrafi

Abstract:

In this paper, it is attempted to chart and exposit the terra incognita of the transcendental intuition of 'non-being', a peculiar species of négatité and a form of consciousness which underpins the phenomenal capacity for design thinking, and which serves as the ground of the 'designing being-relation to the world'. The paper's contention is that the transcendental intuition of non-being indwells the agent's being-relation to the world as a continual tension: neither does the agent relinquish its ontological leverage and submit altogether to the world's curbs and dictates, nor is it able to subdue the world satisfactorily or settle into it once and for all. By way of phenomenological-hermeneutic analysis, it is argued that design thinking occurs by virtue of a phenomenal transition between the a priori 'not-being', the basis of 'that-which-is', and the transcendental intuition of non-being, through which that-which-is-not-yet announces itself. Along with this, the other two species of négatité, 'absence' and 'non-existence', are clarified and contrasted with not-being and non-being, terms which have widely been used interchangeably in the literature. In conclusion, it is argued that design thinking in its unadulterated, originary mode has not only historically preceded scientific thinking but has also served as the foundation of its emergence. In short, scientific thinking is a derivative, reformed application of design thinking; it indeed supervenes upon it.

Keywords: design thinking, designing being-relation to the world, négatité, not-being, non-being

Procedia PDF Downloads 144
1262 Regional Changes under Extreme Meteorological Events

Authors: Renalda El Samra, Elie Bou-Zeid, Hamza Kunhu Bangalath, Georgiy Stenchikov, Mutasem El Fadel

Abstract:

The regional-scale impact of climate change over complex terrain was examined through high-resolution dynamic downscaling conducted using the Weather Research and Forecasting (WRF) model, with initial and boundary conditions from a High-Resolution Atmospheric Model (HiRAM). The analysis was conducted over the eastern Mediterranean, with a focus on Lebanon, whose challenging complex topography magnifies the effect of orographic precipitation. Four year-long WRF simulations, selected based on HiRAM time series, were performed to generate future climate projections of extreme temperature and precipitation over the study area under the Representative Concentration Pathway (RCP) 4.5 scenario. One past WRF simulation year, 2008, was selected as a baseline to capture the dry extremes of the system. The results indicate that the study area might experience an increase of 1.0 to 3.0 °C in summer mean temperatures by 2050 relative to 2008. For extreme years, the decrease in average annual precipitation may exceed 50% at certain locations relative to 2008.

Keywords: HiRAM, regional climate modeling, WRF, Representative Concentration Pathway (RCP)

Procedia PDF Downloads 369
1261 Comparative Study of Accuracy of Land Cover/Land Use Mapping Using Medium Resolution Satellite Imagery: A Case Study

Authors: M. C. Paliwal, A. K. Jain, S. K. Katiyar

Abstract:

Accuracy assessment is an essential part of the classification of satellite imagery. In order to determine the accuracy of a classified image, the assumed-true data are usually derived from ground truth data collected with the Global Positioning System. The classified satellite data and the ground truth data are then compared, error matrices are prepared, and overall and individual class accuracies are calculated using different methods. The study illustrates advanced classification and accuracy assessment of land use/land cover mapping using satellite imagery. IRS-1C LISS-IV data were used for the classification. The satellite image was classified into fourteen classes, including water bodies, agricultural fields, forest land, urban settlement, barren land and unclassified area. Classification and accuracy calculation were carried out with the ERDAS Imagine software to identify the best method. The study is based on data collected within the Bhopal city boundaries in the state of Madhya Pradesh, India.
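The error-matrix bookkeeping described above (assumed-true reference labels versus classified labels, then overall and per-class accuracies) can be sketched as follows; the labels are toy values, not the Bhopal data, and Cohen's kappa is included as one of the commonly used accuracy measures:

```python
import numpy as np

def error_matrix(reference, classified, n_classes):
    """Confusion (error) matrix: rows = classified label, columns = reference label."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    for ref, cls in zip(reference, classified):
        m[cls, ref] += 1
    return m

def accuracies(m):
    """Overall, producer's and user's accuracies, plus Cohen's kappa."""
    total = m.sum()
    overall = np.trace(m) / total
    producers = np.diag(m) / m.sum(axis=0)   # per reference class (omission errors)
    users = np.diag(m) / m.sum(axis=1)       # per mapped class (commission errors)
    chance = (m.sum(axis=0) * m.sum(axis=1)).sum() / total ** 2
    kappa = (overall - chance) / (1 - chance)
    return overall, producers, users, kappa

# toy labels for 3 classes (0 = water, 1 = agriculture, 2 = forest)
ref = [0, 0, 0, 1, 1, 1, 2, 2, 2, 2]    # ground truth (GPS-checked)
cls = [0, 0, 1, 1, 1, 1, 2, 2, 2, 0]    # classifier output
m = error_matrix(ref, cls, 3)
overall, prod, user, kappa = accuracies(m)
print(round(float(overall), 2), round(float(kappa), 2))   # 0.8 0.7
```

Producer's accuracy divides the diagonal by reference-class totals, user's accuracy by mapped-class totals; the two views separate omission from commission errors.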

Keywords: resolution, accuracy assessment, land use mapping, satellite imagery, ground truth data, error matrices

Procedia PDF Downloads 472
1260 Recognition of Objects in a Maritime Environment Using a Combination of Pre- and Post-Processing of the Polynomial Fit Method

Authors: R. R. Hordijk, O. J. G. Somsen

Abstract:

Traditionally, radar systems are the eyes and ears of a ship. However, these systems have their drawbacks, and nowadays they are extended with systems that work with video and photos. Processing the data from these videos and photos is, however, very labour-intensive, and efforts are being made to automate this process. A major problem when trying to recognize objects in water is that the 'background' is not homogeneous, so traditional image recognition techniques do not work well. The main question is whether a method can be developed that automates this recognition process. A large number of parameters are involved in the identification of objects in such images. One is varying the resolution. In this research, the resolution of some images was reduced to the extreme value of 1% of the original to reduce clutter before the polynomial fit (pre-processing). It turned out that the searched object was clearly recognizable, as its grey value was well above the average. Another approach is to take two images of the same scene shortly after each other and compare the results. Because the water (waves) fluctuates much faster than an object floating in it, one can expect the object to be the only stable item in the two images. Both methods (pre-processing and comparing two images of the same scene) delivered useful results. Though it is too early to conclude that these methods solve all image problems, they are certainly worthwhile for further research.
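Both ideas, extreme down-sampling before the polynomial fit and differencing two shots of the same scene, can be sketched with NumPy on a synthetic sea (hypothetical data, not the maritime imagery used in the study):

```python
import numpy as np

def downsample(img, factor):
    """Block-average the image: extreme resolution reduction averages out wave clutter."""
    h, w = img.shape
    h, w = h - h % factor, w - w % factor
    return img[:h, :w].reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def stable_pixels(frame_a, frame_b, tol):
    """Pixels that barely change between two shots: candidate floating objects."""
    return np.abs(frame_a - frame_b) < tol

# synthetic sea: fast-fluctuating waves plus one bright, stationary object
rng = np.random.default_rng(1)
sea_a = rng.normal(0.5, 0.1, (100, 100))
sea_b = rng.normal(0.5, 0.1, (100, 100))
for sea in (sea_a, sea_b):
    sea[40:50, 60:70] = 0.95            # the object keeps its grey value

small = downsample(sea_a, 10)           # 1% of the pixels, clutter averaged out
mask = stable_pixels(sea_a, sea_b, 0.05)
print(small.shape, round(float(small[4, 6]), 2))
```

In the down-sampled image the object's cell stays well above the average grey value, and in the difference mask the object region is uniformly stable while most wave pixels fluctuate.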

Keywords: image processing, image recognition, polynomial fit, water

Procedia PDF Downloads 506
1259 Reconstruction of Signal in Plastic Scintillator of PET Using Tikhonov Regularization

Authors: L. Raczynski, P. Moskal, P. Kowalski, W. Wislicki, T. Bednarski, P. Bialas, E. Czerwinski, A. Gajos, L. Kaplon, A. Kochanowski, G. Korcyl, J. Kowal, T. Kozik, W. Krzemien, E. Kubicz, Sz. Niedzwiecki, M. Palka, Z. Rudy, O. Rundel, P. Salabura, N.G. Sharma, M. Silarski, A. Slomski, J. Smyrski, A. Strzelecki, A. Wieczorek, M. Zielinski, N. Zon

Abstract:

The J-PET scanner, which allows for single-bed imaging of the whole human body, is currently under development at the Jagiellonian University. The J-PET detector improves the TOF resolution due to the use of fast plastic scintillators. Since registration of the waveform of signals with duration times of a few nanoseconds is not feasible, novel front-end electronics allowing for sampling in the voltage domain at four thresholds were developed. To take full advantage of these fast signals, a novel scheme for recovering the signal waveform, based on ideas from Tikhonov regularization (TR) and compressive sensing, is presented. The prior distribution of the sparse representation is evaluated based on a linear transformation of the training set of signal waveforms using Principal Component Analysis (PCA) decomposition. Beside the advantage of including additional information from the training signals, a further benefit of the TR approach is that the signal recovery problem has an optimal solution which can be determined explicitly. Moreover, from Bayesian theory the properties of the regularized solution, especially its covariance matrix, may be easily derived. This step is crucial to introduce and prove the formula for calculating the signal recovery error. It has been proven that the average recovery error is approximately inversely proportional to the number of samples at voltage levels. The method is tested using signals registered by the single detection module of the J-PET detector, built from a 30 cm long BC-420 plastic scintillator strip. It is demonstrated that the experimental and theoretical functions describing the recovery errors in the J-PET scenario are largely consistent. The specificity and limitations of the signal recovery method in this application are discussed.
It is shown that the PCA basis offers a high level of information compression and an accurate recovery with just eight samples, from four voltage levels, for each signal waveform. Moreover, it is demonstrated that using the recovered signal waveforms, instead of the samples at four voltage levels alone, improves the spatial resolution of the hit-position reconstruction. The experiment shows that the spatial resolution evaluated from the four voltage levels alone, without recovery of the signal waveform, is 1.05 cm. After applying the information from the four voltage levels to the recovery of the signal waveform, the spatial resolution improves to 0.94 cm, only slightly worse than the value obtained from the original raw signal, 0.93 cm. This is important because limiting the number of threshold levels in the electronics to four leads to a significant reduction of the overall cost of the scanner. The developed recovery scheme is general and may be incorporated in any other investigation where prior knowledge about the signals of interest can be utilized.
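A toy stand-in for the recovery scheme (not the J-PET code): smooth pulses replace scintillator signals, a PCA basis is learned from a training set, and a Tikhonov-regularized closed-form solve recovers a waveform from eight samples. The pulse shapes, sampling indices and regularization weight are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# training set: smooth pulse-like waveforms standing in for scintillator signals
t = np.linspace(0.0, 1.0, 200)
train = np.array([np.exp(-((t - c) / 0.08) ** 2) for c in rng.uniform(0.3, 0.7, 300)])

# PCA basis of the (centred) training signals
mean = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
basis = Vt[:8]                              # eight principal components

# sampling operator: eight time indices playing the role of threshold crossings
idx = np.array([30, 55, 80, 100, 120, 140, 160, 180])
A = basis[:, idx].T                         # maps PCA coefficients -> samples

def recover(samples, lam=1e-3):
    """Tikhonov-regularized closed-form solve in the PCA basis."""
    y = samples - mean[idx]
    coeff = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)
    return mean + coeff @ basis

true = np.exp(-((t - 0.52) / 0.08) ** 2)    # an unseen pulse
rec = recover(true[idx])
print(round(float(np.abs(rec - true).max()), 2))
```

The regularized normal equations give the explicit optimal solution mentioned in the abstract; the PCA prior is what makes recovery from only eight samples possible.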

Keywords: plastic scintillators, positron emission tomography, statistical analysis, Tikhonov regularization

Procedia PDF Downloads 415
1258 Approach to Formulate Intuitionistic Fuzzy Regression Models

Authors: Liang-Hsuan Chen, Sheng-Shing Nien

Abstract:

This study aims to develop approaches to formulate intuitionistic fuzzy regression (IFR) models for decision-making applications in fuzzy environments using intuitionistic fuzzy observations. Intuitionistic fuzzy numbers (IFNs) are used to characterize the fuzzy input and output variables in the IFR formulation process. A mathematical programming problem (MPP) is built to optimally determine the IFR parameters. Each parameter in the MPP is defined as a pair of alternative numerical variables with opposite signs, and an intuitionistic fuzzy error term is added to the MPP to characterize the uncertainty of the model. The IFR model is formulated based on a distance measure, minimizing the total distance error between estimated and observed intuitionistic fuzzy responses in the MPP resolution process. The proposed approaches are simple and efficient in both formulation and resolution: the sign of each parameter is determined as part of the optimization, so the need to predetermine parameter signs is avoided. Furthermore, the proposed approach has the advantage that the spread of the predicted IFN response is not over-increased, since the parameters in the established IFR model are crisp. The performance of the obtained models is evaluated and compared with existing approaches.
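The sign-splitting device can be sketched in a crisp setting: each regression coefficient is represented as a pair of nonnegative variables of opposite sign, and the total absolute error is minimized as a linear program. This is only a least-absolute-deviation analogue with crisp data, not the full intuitionistic fuzzy formulation, and `scipy.optimize.linprog` is assumed to be available:

```python
import numpy as np
from scipy.optimize import linprog

def lad_fit(X, y):
    """Least-absolute-deviation fit as a linear program.

    Each coefficient is split into a pair of nonnegative variables of
    opposite sign (b = b_pos - b_neg), so no parameter sign has to be
    fixed in advance; the total error sum(e) is minimized with
    |y_i - X_i b| <= e_i expressed as two inequality blocks."""
    n, p = X.shape
    c = np.concatenate([np.zeros(2 * p), np.ones(n)])      # minimise sum of errors
    A_ub = np.block([[X, -X, -np.eye(n)],
                     [-X, X, -np.eye(n)]])
    b_ub = np.concatenate([y, -y])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
    return res.x[:p] - res.x[p:2 * p]

X = np.column_stack([np.ones(5), np.array([1.0, 2.0, 3.0, 4.0, 5.0])])
y = np.array([1.0, 3.0, 5.0, 7.0, 9.0])        # exactly y = -1 + 2x
print(np.round(lad_fit(X, y), 6))
```

Because the split pair enters the objective only through the inequality constraints, the optimizer is free to assign either sign to each coefficient, which is the property the abstract highlights.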

Keywords: fuzzy sets, intuitionistic fuzzy number, intuitionistic fuzzy regression, mathematical programming method

Procedia PDF Downloads 111
1257 About Multi-Resolution Techniques for Large Eddy Simulation of Reactive Multi-Phase Flows

Authors: Giacomo Rossi, Bernardo Favini, Eugenio Giacomazzi, Franca Rita Picchia, Nunzio Maria Salvatore Arcidiacono

Abstract:

A numerical technique for mesh refinement in the HeaRT (Heat Release and Transfer) numerical code is presented. In the CFD framework, the Large Eddy Simulation (LES) approach is gaining importance as a tool for simulating turbulent combustion processes, although it has a high computational cost due to the complexity of turbulence modeling and the large number of grid points necessary to obtain a good numerical solution. In particular, when a numerical simulation of a large domain is performed with a structured grid, the number of grid points can increase so much that the simulation becomes infeasible; this problem can be overcome with a mesh refinement technique. The mesh refinement technique developed for the HeaRT numerical code (a staggered finite-difference code) is based on a high-order reconstruction of the variables at the grid interfaces by means of a least-squares quasi-ENO interpolation. The code is written in modern Fortran (2003 standard or newer) and is parallelized using domain decomposition and the Message Passing Interface (MPI) standard.
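The interface reconstruction can be illustrated in one dimension: fit a low-order polynomial through neighbouring cell-centre values by least squares and evaluate it at the face. This is only a sketch of the least-squares step in Python, not the HeaRT implementation (which is in Fortran and includes the quasi-ENO weighting):

```python
import numpy as np

def interface_value(x_centres, u, x_face, order=2):
    """Least-squares polynomial reconstruction of u at a grid interface.

    Fits a degree-`order` polynomial through neighbouring cell-centre values
    (overdetermined when len(u) > order + 1) and evaluates it at the face."""
    return np.polyval(np.polyfit(x_centres, u, order), x_face)

# coarse cells left of the face at x = 0, refined cells right of it
xc = np.array([-1.5, -0.5, 0.25, 0.75])    # non-uniform cell centres
u = xc ** 2 + 1.0                          # smooth field sampled at the centres
print(round(float(interface_value(xc, u, 0.0)), 6))   # 1.0, exact for a quadratic
```

The same fit works across a coarse-fine boundary because the cell centres need not be uniformly spaced, which is exactly the situation at a refinement interface.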

Keywords: LES, multi-resolution, ENO, Fortran

Procedia PDF Downloads 334
1256 Topographic Coast Monitoring Using UAV Photogrammetry: A Case Study in Port of Veracruz Expansion Project

Authors: Francisco Liaño-Carrera, Jorge Enrique Baños-Illana, Arturo Gómez-Barrero, José Isaac Ramírez-Macías, Erik Omar Paredes-JuáRez, David Salas-Monreal, Mayra Lorena Riveron-Enzastiga

Abstract:

Topographical changes in coastal areas are usually assessed with airborne LiDAR and conventional photogrammetry. In recent times, Unmanned Aerial Vehicles (UAVs) have been used in several photogrammetric applications, including coastline evolution. Their use goes further, however: the associated point clouds can be used to generate beach Digital Elevation Models (DEMs). We present a methodology for monitoring coastal topographic changes along a 50 km coastline in Veracruz, Mexico using high-resolution images (less than 10 cm ground resolution) and dense point clouds captured with a UAV. This monitoring takes place in the context of the port of Veracruz expansion project, whose construction began in 2015, and intends to characterize coast evolution and to prevent and mitigate project impacts on coastal environments. The monitoring began with a historical coastline reconstruction from 1979 to 2015 using aerial photography and Landsat imagery. Some patterns could be defined: the northern part of the study area showed accretion, while the southern part showed erosion. The study area lies off the port of Veracruz, a touristic and economically important Mexican city where coastal development structures have been built continuously since 1979, and the local beaches of the touristic area are refilled constantly. Those areas were not described as accretion, since sand-filled trucks refill the beaches in front of the hotel area every month. The marinas and the commercial port of Veracruz, both the old port and the new expansion, were built in the erosion part of the area. Northward from the city of Veracruz the beaches were described as accretion areas, while southward from the city they were described as erosion areas. One problem is the expansion of new development in the southern area of the city, which uses the beach view as an incentive to buy beachfront houses.
We assessed coastal changes between seasons using high-resolution images and point clouds during 2016, and preliminary results confirm that UAVs can be used in permanent coast monitoring programs with excellent performance and detail.

Keywords: digital elevation model, high-resolution images, topographic coast monitoring, unmanned aerial vehicle

Procedia PDF Downloads 242
1255 Barriers to Competitive Tenders in Building Conservation Works

Authors: Yoke-Mui Lim, Yahaya Ahmad

Abstract:

Conservation works in Malaysia that are procured by public organisations usually follow the traditional approach, where the works are tendered based on Bills of Quantities (BQ). One of the purposes of tendering is to enable the selection of a competent contractor that offers a competitive price. While the competency of contractors is assessed by their technical knowledge, experience and track records, the assessment of pricing depends on the tender amount. However, the issue currently faced by the conservation works sector is the difficulty of assessing the competitiveness and reasonableness of the tender amount due to the high variance between tenders. Thus, this paper discusses the factors that make it difficult for tenderers to price competitively in a bidding exercise for conservation tenders. Data on tendering were collected from interviews with conservation works contractors to gain an in-depth understanding of the barriers faced in pricing tenders for conservation works. Findings from the study lent support to the contention that the variance of tender amounts is very high amongst tenderers. The factors identified are the format of the BQ, hidden works, experience, and labour and material costs.

Keywords: building conservation, Malaysia, bill of quantities, tender

Procedia PDF Downloads 345
1254 Application of Hyperspectral Remote Sensing in Sambhar Salt Lake, A Ramsar Site of Rajasthan, India

Authors: Rajashree Naik, Laxmi Kant Sharma

Abstract:

Sambhar Lake is the largest inland salt lake of India, declared a Ramsar site on 23 March 1990. Owing to its high salinity and alkalinity, its biodiversity richness is contributed by haloalkaliphilic flora and fauna, along with diverse land cover including waterbody, wetland, salt crust, saline soil, vegetation, scrubland and barren land, which welcomes large numbers of flamingos and other migratory birds for winter harbouring. However, with the gradual increase in irrational salt extraction activities, this ecological diversity is at stake, and there is an urgent need to assess the ecosystem. Advanced technology like remote sensing and GIS makes it possible to look into the past and compare with the present for the future planning and management of natural resources in a judicious way. This paper presents vegetation mapping in a typical inland lake environment of the Sambhar wetland using satellite data from NASA's EO-1 Hyperion sensor, launched in November 2000. The sensor, with a spectral range of 0.4 to 2.5 micrometers at approximately 10 nm spectral resolution across 242 bands, 30 m spatial resolution and a 705 km orbit, was used to produce a vegetation map for a portion of the wetland. The vegetation map was tested for classification accuracy against a pre-existing detailed GIS wetland vegetation database. Though the accuracy varied greatly between classes, the algal communities, which are the major food source for flamingos, were successfully identified. The results of this study have practical implications for the use of spaceborne hyperspectral image data that are now becoming available. Practical limitations of using these satellite data for wetland vegetation mapping include inadequate spatial resolution, the complexity of image processing procedures, and the lack of stereo viewing.
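As an illustration of how a hyperspectral pixel can be matched to an endmember, the sketch below uses the spectral angle mapper, a classifier commonly applied to Hyperion-type data; the paper does not state which algorithm was used, and the four-band spectra here are invented:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a reference spectrum."""
    cos = pixel @ reference / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def sam_classify(cube, library):
    """Label each pixel with the library spectrum of smallest spectral angle."""
    h, w, b = cube.shape
    flat = cube.reshape(-1, b)
    angles = np.array([[spectral_angle(p, ref) for ref in library] for p in flat])
    return angles.argmin(axis=1).reshape(h, w)

# two toy endmembers: an algal spectrum (peak in band 1) and a flat, bright salt crust
algae = np.array([0.05, 0.40, 0.10, 0.05])
salt = np.array([0.60, 0.62, 0.61, 0.63])
library = [algae, salt]

# 2 x 2 scene: left column algal, right column salty, at varying brightness
cube = np.stack([[0.9 * algae, 1.1 * salt],
                 [1.2 * algae, 0.8 * salt]])
print(sam_classify(cube, library))   # [[0 1], [0 1]]
```

The angle measure ignores overall brightness, which is why the scaled copies of each endmember are still classified correctly.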

Keywords: algal community, NASA's EO-1 Hyperion, salt-tolerant species, wetland vegetation mapping

Procedia PDF Downloads 103
1253 A Study of ZY3 Satellite Digital Elevation Model Verification and Refinement with Shuttle Radar Topography Mission

Authors: Bo Wang

Abstract:

As China's first high-resolution civilian optical satellite, ZY-3 is able to obtain high-resolution multi-view images with its three linear-array sensors. The images can be used to generate Digital Elevation Models (DEMs) through dense matching of stereo images. However, due to clouds, forest, water and buildings covering the images, the dense matching results suffer from problems such as outliers and areas that fail to be matched (matching holes). This paper introduces an algorithm to verify the accuracy of DEMs generated from ZY-3 imagery against the Shuttle Radar Topography Mission (SRTM). Since the accuracy of SRTM (internal accuracy: 5 m; external accuracy: 15 m) is relatively uniform worldwide, it can be used to improve the accuracy of the ZY-3 DEM. Based on the analysis of large volumes of DEM and SRTM data, the processing can be divided into two aspects. First, the registration of the ZY-3 DEM and SRTM is performed using conjugate line features and area features matched between the two datasets. Then the ZY-3 DEM is refined by eliminating matching outliers and filling matching holes. The matching outliers are eliminated based on statistics from Local Vector Binning (LVB), and the matching holes are filled with elevations interpolated from SRTM. Accuracy statistics of the resulting ZY-3 DEM are also presented.
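The refinement step, eliminating outliers and filling holes with SRTM elevations, can be sketched as below. The LVB statistics are replaced here by a simple robust median/MAD test, and the grids are toy arrays assumed to be already co-registered:

```python
import numpy as np

def refine_dem(dem, srtm, window=3, k=3.0):
    """Flag local outliers with a robust median/MAD test, then replace both
    outliers and matching holes (NaNs) with the co-registered SRTM elevation."""
    out = dem.copy()
    pad = window // 2
    padded = np.pad(dem, pad, mode="edge")
    for i in range(dem.shape[0]):
        for j in range(dem.shape[1]):
            block = padded[i:i + window, j:j + window]
            med = np.nanmedian(block)
            mad = np.nanmedian(np.abs(block - med)) + 1e-6
            if np.isnan(dem[i, j]) or abs(dem[i, j] - med) > k * 1.4826 * mad:
                out[i, j] = srtm[i, j]      # fall back to SRTM
    return out

# toy grids: a smooth slope with one spike (outlier) and one hole (NaN)
y, x = np.mgrid[0:20, 0:20]
srtm = 100.0 + 0.5 * x + 0.2 * y
dem = srtm + 0.1                 # ZY-3-like DEM with a small, harmless bias
dem[5, 5] = 500.0                # matching outlier
dem[10, 12] = np.nan             # matching hole
clean = refine_dem(dem, srtm)
print(round(float(clean[5, 5]), 1), round(float(clean[10, 12]), 1))   # 103.5 108.0
```

The median/MAD test leaves the smooth slope untouched while the spike and the hole both fall back to the SRTM surface.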

Keywords: ZY-3 satellite imagery, DEM, SRTM, refinement

Procedia PDF Downloads 315
1252 Examination of the Influence of the Near-Surface Geology on the Initial Infrastructural Development Using High-Resolution Seismic Method

Authors: Collins Chiemeke, Stephen Ibe, Godwin Onyedim

Abstract:

This research work on the high-resolution seismic tomography method was carried out with the aim of investigating how near-surface geology influences the initial distribution of infrastructural development in an area like Otuoke and its environs. To achieve this objective, the seismic tomography method was employed. The results revealed that the overburden (highly weathered layer) thickness ranges from 27 m to 50 m within the survey area, with an average value of 37 m. The 3D surface analysis of the overburden thickness distribution showed that the overburden is thicker in regions with less infrastructural development and thinnest in built-up areas. The velocity distribution from the surface to a depth of 5 m ranges from about 660 m/s to 1160 m/s, with an average value of 946 m/s. The 3D surface analysis of the velocity distribution also revealed that areas with substantial infrastructural development are characterized by high velocity values, compared with the undeveloped regions, which have low average velocities. Hence, one can conclude that the initial settlement of Otuoke and its environs and the subsequent infrastructural development were influenced by the underlying near-surface geology (rigid earth), among other factors.

Keywords: geology, seismic, infrastructural, near-surface

Procedia PDF Downloads 263
1251 Israel versus Palestine: Politological and Depth-Psychological Aspects

Authors: Harald Haas, Andrea Plaschke

Abstract:

Many of the contemporary major conflicts on earth could not be solved so far; they are either perpetuated or reignited again and again. Efforts of purely political conflict management or resolution aim merely at the symptoms of a conflict, not its roots. These roots are, in almost every case, also psychological ones. Thus, this contribution aims to shed light on the roots of one of the best-known and longest-lasting conflicts: the Palestinian-Israeli one. The methodologies used were a compilation of existing scientific resources, field research in Palestine and Israel, and tests conducted with the Adult Attachment Projective in Palestine and Israel. Findings show that the majority of Palestinian as well as Israeli test participants show a disorganised attachment pattern which, in connection with the assumption of collective traumatization, seems to be a major obstacle to a lasting and peaceful conflict resolution between these two peoples. There appears to be no short-term solution for this conflict, especially not within the range of usual Western legislative periods. Both sides ought to be provided with a kind of 'safe haven' over a long period of time, accompanied by a framework of various arrangements for coping with trauma, building lasting and secure relationships, and raising and educating present and future generations of Palestinians and Israelis for peace and co-operation with each other.

Keywords: conflict-management, trauma, political psychology, attachment theory

Procedia PDF Downloads 175
1250 Quantitative Comparisons of Different Approaches for Rotor Identification

Authors: Elizabeth M. Annoni, Elena G. Tolkacheva

Abstract:

Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia that is a known prognostic marker for stroke, heart failure and death. Reentrant mechanisms of rotor formation, which are stable electrical sources of cardiac excitation, are believed to cause AF. No existing commercial mapping systems have been demonstrated to consistently and accurately predict rotor locations outside of the pulmonary veins in patients with persistent AF. There is a clear need for robust spatio-temporal techniques that can consistently identify rotors using unique characteristics of the electrical recordings at the pivot point that can be applied to clinical intracardiac mapping. Recently, we have developed four new signal analysis approaches – Shannon entropy (SE), Kurtosis (Kt), multi-scale frequency (MSF), and multi-scale entropy (MSE) – to identify the pivot points of rotors. These proposed techniques utilize different cardiac signal characteristics (other than local activation) to uncover the intrinsic complexity of the electrical activity in the rotors, which are not taken into account in current mapping methods. We validated these techniques using high-resolution optical mapping experiments in which direct visualization and identification of rotors in ex-vivo Langendorff-perfused hearts were possible. Episodes of ventricular tachycardia (VT) were induced using burst pacing, and two examples of rotors were used showing 3-sec episodes of a single stationary rotor and figure-8 reentry with one rotor being stationary and one meandering. Movies were captured at a rate of 600 frames per second for 3 sec. with 64x64 pixel resolution. These optical mapping movies were used to evaluate the performance and robustness of SE, Kt, MSF and MSE techniques with respect to the following clinical limitations: different time of recordings, different spatial resolution, and the presence of meandering rotors. 
To quantitatively compare the results, the SE, Kt, MSF and MSE techniques were compared to the “true” rotor(s) identified using the phase map. Accuracy was calculated for each approach as the duration of the time series and the spatial resolution were reduced. The time series duration was decreased from its original length of 3 sec down to 2, 1, and 0.5 sec. The spatial resolution of the original VT episodes was decreased from 64x64 pixels to 32x32, 16x16, and 8x8 pixels by uniformly removing pixels from the optical mapping video. Our results demonstrate that Kt, MSF and MSE were able to accurately identify the pivot point of the rotor under all three clinical limitations. The MSE approach demonstrated the best overall performance, but Kt was the best at identifying the pivot point of the meandering rotor. Artifacts mildly affected the performance of the Kt, MSF and MSE techniques, but had a strong negative impact on the performance of SE. The results of our study motivate further validation of the SE, Kt, MSF and MSE techniques using intra-atrial electrograms from paroxysmal and persistent AF patients to see if these approaches can identify pivot points in a clinical setting. More accurate rotor localization could significantly increase the efficacy of catheter ablation to treat AF, resulting in a higher success rate for single procedures.
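As a rough illustration of how per-pixel characteristics such as Shannon entropy (SE) and kurtosis (Kt) can be mapped over an optical-mapping movie, a minimal sketch follows; the bin count, the absence of detrending, and the omission of the multi-scale variants (MSF, MSE) are simplifying assumptions, not the authors' exact pipeline.

```python
import numpy as np

def shannon_entropy(ts, bins=32):
    # Histogram-based Shannon entropy (bits) of one pixel's time series
    counts, _ = np.histogram(ts, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def excess_kurtosis(ts):
    # Fourth standardized moment minus 3 (zero for a normal distribution)
    z = (ts - ts.mean()) / ts.std()
    return float(np.mean(z ** 4) - 3.0)

def pixelwise_maps(movie):
    # movie: (frames, rows, cols) optical-mapping array; returns SE and Kt maps
    _, rows, cols = movie.shape
    se = np.empty((rows, cols))
    kt = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            ts = movie[:, r, c]
            se[r, c] = shannon_entropy(ts)
            kt[r, c] = excess_kurtosis(ts)
    return se, kt
```

Candidate pivot points would then be located as extrema of these maps, since the pivot exhibits distinct signal complexity compared with the surrounding tissue.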

Keywords: Atrial Fibrillation, Optical Mapping, Signal Processing, Rotors

Procedia PDF Downloads 299
1249 High-Resolution Computed Tomography Imaging Features during Pandemic 'COVID-19'

Authors: Sahar Heidary, Ramin Ghasemi Shayan

Abstract:

Since the emergence of novel coronavirus (2019-nCoV) pneumonia, chest high-resolution computed tomography (HRCT) has been one of the main diagnostic tools. To achieve timely and accurate diagnosis, defining the radiological features of the infection is of great value. The purpose of this review was to describe the imaging manifestations of early-stage coronavirus disease 2019 (COVID-19) and to provide an imaging basis for the early detection of suspected cases and stratified intervention. The positive predictive value of HRCT was 85%, and sensitivity was 73% for all patients; overall accuracy was 68%. There was no significant difference in these values between symptomatic and asymptomatic persons. These results were also independent of the interval between imaging and the onset of symptoms or exposure. Therefore, we suggest that HRCT is an excellent adjunct for the early identification of COVID-19 pneumonia in both symptomatic and asymptomatic individuals, in addition to its role as a prognostic gauge for COVID-19 pneumonia. Patients underwent non-contrast HRCT chest examinations, and images were reconstructed in a thin 1.25 mm lung window. Images were evaluated for the presence of lung lesions, and a CT severity score was assigned to each patient based on the number of lung lobes involved.
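The reported predictive rate, sensitivity, and accuracy all follow from a standard confusion matrix; a minimal sketch with purely illustrative counts (not the study's data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    # Standard confusion-matrix metrics for a binary diagnostic test:
    # tp/fp/tn/fn are true-positive, false-positive, true-negative,
    # and false-negative counts.
    sensitivity = tp / (tp + fn)                 # fraction of disease cases found
    ppv = tp / (tp + fp)                         # positive predictive value
    accuracy = (tp + tn) / (tp + fp + tn + fn)   # overall fraction correct
    return sensitivity, ppv, accuracy
```

For example, with 8 true positives, 2 false positives, 5 true negatives, and 5 false negatives, the sensitivity is 8/13, the PPV 0.80, and the accuracy 0.65.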

Keywords: COVID-19, radiology, respiratory diseases, HRCT

Procedia PDF Downloads 118
1248 Illuminating Regional Identity: An Interdisciplinary Exploration in Saskatchewan

Authors: Anne Gibbons

Abstract:

Both inside and outside of academia, people have sought to understand the “sense of place” of various regions, many times over and for many different reasons. The concept of regional identity is highly complex and surrounded by considerable contention. There are multiple bodies of research on regional identity theory in many different disciplines and even across sub-disciplinary classifications. Each discipline takes a slightly different angle or perspective on regional identity, resulting in a fragmented body of work on this topic overall. There is a need to consolidate this body of increasingly fragmented theory through interdisciplinary integration. For the purpose of this study, the province of Saskatchewan will serve as an exemplar for exploring regional identity in a concrete context. Saskatchewan can be thought of as a ‘functional region,’ with clear boundaries and clear residency, from which regional identity can be studied. This thesis shares the outcomes of a qualitative study grounded in a series of group interviews with Saskatchewan residents, from which it is concluded that the use of interdisciplinary theory is an appropriate approach to the study of regional identity. Regional identity cannot be compartmentalized; it is a web of characteristics, attributes, and feelings that are inextricably linked. The thesis thus concludes by offering lessons learned about how we might better understand regional identity, as illuminated through both interdisciplinary theory and the lived experiences and imaginations of people living in the region of Saskatchewan.

Keywords: interdisciplinary, regional identity, Saskatchewan, tourism studies

Procedia PDF Downloads 503
1247 Developing High-Definition Flood Inundation Maps (HD-FIMs) Using Raster Adjustment with Scenario Profiles (RASP™)

Authors: Robert Jacobsen

Abstract:

Flood inundation maps (FIMs) are an essential tool in communicating flood threat scenarios to the public as well as in floodplain governance. With an increasing demand for online raster FIMs, the FIM State-of-the-Practice (SOP) is rapidly advancing to meet the dual requirements of high resolution and high accuracy, that is, High Definition. Importantly, today's technology also enables the resolution of local, neighborhood-scale bias errors that often occur in FIMs, even with the use of SOP two-dimensional flood modeling. To facilitate the development of HD-FIMs, a new GIS method, Raster Adjustment with Scenario Profiles (RASP™), is described for adjusting kernel raster FIMs to match refined scenario profiles. With RASP™, flood professionals can prepare HD-FIMs for a wide range of scenarios from available kernel rasters, including kernel rasters prepared from vector FIMs. The paper provides detailed procedures for RASP™, along with an example of applying RASP™ to prepare an HD-FIM for the August 2016 flood in Louisiana using both an SOP kernel raster and a kernel raster derived from an older vector-based flood insurance rate map. The accuracy of the HD-FIMs achieved with the application of RASP™ to the two kernel rasters is evaluated.
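The core raster operation, applying a scenario-profile correction to a kernel water-surface-elevation raster and re-deriving inundation depth against the ground surface, might be sketched as follows; the per-cell correction grid is a simplified stand-in for matching refined scenario profiles, not the method's exact procedure.

```python
import numpy as np

def adjust_fim(wse_kernel, profile_delta, ground_dem):
    # wse_kernel: kernel raster of water-surface elevations (m)
    # profile_delta: per-cell correction derived from a refined scenario profile
    # ground_dem: ground elevation raster on the same grid
    wse_adjusted = wse_kernel + profile_delta
    depth = wse_adjusted - ground_dem
    # Cells where the adjusted water surface falls below ground become dry (NaN)
    return np.where(depth > 0.0, depth, np.nan)
```

In practice the correction grid would be interpolated between cross-section profiles along the stream, so that the adjusted raster honors the refined scenario profile everywhere.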

Keywords: hydrology, mapping, high-definition, inundation

Procedia PDF Downloads 31
1246 Narrative Psychology and Its Role in Illuminating the Experience of Suffering

Authors: Maureen Gibney

Abstract:

The examination of narrative in psychology has a long tradition, starting with psychoanalytic theory and embracing over time cognitive, social, and personality psychology, among others. Narrative use has been richly detailed as well in medicine, nursing, and social service. One aspect of narrative that has ready utility in higher education and in clinical work is the exploration of suffering and its meaning. Because it is such a densely examined topic, suffering provides a window into identity, sense of purpose, and views of humanity and of the divine. Storytelling analysis permits an exploration of a host of specific manifestations of suffering such as pain and illness, moral injury, and the impact of prolonged suffering on love and relationships. This presentation will review the origins and current understandings of narrative theory in general, and will draw from psychology, medicine, ethics, nursing, and social service in exploring the topic of suffering in particular. It is suggested that the use of narrative themes such as meaning making, agency and communion, generativity, and loss and redemption allows for a finely grained analysis of common and more atypical sources of suffering, their resolution, and the acceptance of their continuation when resolution is not possible. Such analysis, used in professional work and in higher education, can enrich one’s empathy and one’s sense of both the fragility and strength of everyday life.

Keywords: meaning making, narrative theory, suffering, teaching

Procedia PDF Downloads 243
1245 An Optimal Matching Design Method of Space-Based Optical Payload for Typical Aerial Target Detection

Authors: Yin Zhang, Kai Qiao, Xiyang Zhi, Jinnan Gong, Jianming Hu

Abstract:

In order to effectively detect aerial targets over long distances, an optimal matching design method for a space-based optical payload is proposed. Firstly, the main factors affecting the optical detectability of small targets in complex environments are analyzed based on the full link of a detection system, including band center, band width and spatial resolution. Then a performance characterization model representing the relationship between the image signal-to-clutter ratio (SCR) and the above influencing factors is established to describe a detection system. Finally, an optimal matching design example is demonstrated for a typical aerial target by simulating and analyzing its SCR under different scene clutter with multi-scale characteristics, and the optimized detection band and spatial resolution are presented. The method can provide a theoretical basis and scientific guidance for space-based detection system design, payload specification demonstration and information processing algorithm optimization.
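A common definition of the SCR for small-target detection, offered as a sketch of the kind of metric such a characterization model relates to the design factors (not necessarily the paper's exact formulation):

```python
import numpy as np

def signal_to_clutter_ratio(target_patch, clutter_patch):
    # SCR as the target/background contrast normalized by the local
    # clutter standard deviation; higher SCR means an easier detection.
    contrast = abs(target_patch.mean() - clutter_patch.mean())
    return contrast / clutter_patch.std()
```

In a full-link simulation, the target and clutter patches would be rendered for each candidate band and spatial resolution, and the design maximizing SCR would be selected.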

Keywords: space-based detection, aerial targets, optical system design, detectability characterization

Procedia PDF Downloads 142
1244 Monetary Evaluation of Dispatching Decisions in Consideration of Choice of Transport

Authors: Marcel Schneider, Nils Nießen

Abstract:

Microscopic simulation programs enable the description of the two processes of railway operation and the preceding timetabling. Occupation conflicts are often solved based on defined train priorities on both process levels. These conflict resolutions produce knock-on delays for the involved trains. The sum of knock-on delays is commonly used to evaluate the quality of railway operations; it is either compared to an acceptable level of service, or the delays are evaluated economically by linear monetary functions. It is impossible to properly evaluate dispatching decisions without a well-founded objective function. This paper presents a new approach for the evaluation of dispatching decisions. It uses models of choice of transport and considers the behaviour of the end customers. These models evaluate the knock-on delays in more detail than linear monetary functions and consider other competing modes of transport. The new approach pursues the coupling of a microscopic model of railway operation with a macroscopic model of choice of transport. It will first be implemented for the railway operations process, but it can also be used for timetabling. The evaluation considers the possibility that end customers change over to other transport modes. The new approach first looks at rail-bound and road transport, but it can also be extended to air transport. The split between the modes chosen by the end customers is described by the modal split. The reactions of the end customers have an effect on the revenues of the railway undertakings. Different travel purposes involve different reserves and tolerances towards delays. Longer journey times cause, besides revenue changes, additional costs. The costs depend either on time or on the track and arise from the circulation of workers and vehicles. Only the variable values are summarised in the contribution margin, which is the basis for the monetary evaluation of the delays.
The contribution margin is calculated for different resolution decisions of the same conflict. The conflict resolution is improved until the monetary loss becomes minimised. The iterative process therefore determines an optimum conflict resolution by observing the change of the contribution margin. Furthermore, a monetary value of each dispatching decision can also be determined.
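The iterative selection among candidate resolutions of the same conflict can be sketched as below; the candidate names and margin values are purely hypothetical, and `margin` stands for the full revenue-minus-cost evaluation described above.

```python
def optimise_dispatching(candidates, margin):
    # candidates: iterable of alternative resolutions of one occupation conflict
    # margin: callable mapping a candidate to its contribution margin
    #         (revenue after modal-split reactions minus delay-dependent costs)
    # The resolution with the highest contribution margin minimises the
    # monetary loss of the dispatching decision.
    best = max(candidates, key=margin)
    return best, margin(best)
```

In the full approach the margin of each candidate would itself be computed by re-running the microscopic operation model and the macroscopic choice-of-transport model, and the loop would terminate once no candidate improves the margin further.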

Keywords: choice of transport, knock-on delays, monetary evaluation, railway operations

Procedia PDF Downloads 299
1243 Identifying Reforms Required in Construction Contracts from Resolved Disputed Cases

Authors: K. C. Iyer, Yogita Manan Bindal, Sumit Kumar Bakshi

Abstract:

The construction industry in India is plagued with disputes and litigation, with many stalled projects seeking dispute resolution. This has an adverse effect on performance and overall project delivery and impacts future investments within the industry. While the construction industry is a major driver of growth, there have not been major reforms in government construction contracts. The study is aimed at identifying proactive means of dispute avoidance, focusing on reforms required within construction contracts, by studying 49 arbitration awards of construction disputes. The claims presented in the awards are aggregated to study the causes linked to the contract document and are checked against the prospective recommendations and practices surveyed from a literature review of research papers. Within contract administration, record keeping has been a major concern, as records are required by the parties to substantiate claims or counterclaims and are therefore essential in any dispute redressal process. The study also observes that sound judgment is inhibited when record keeping is improper, and that due to a lack of coherence between documents, the dispute resolution period is prolonged. The findings of the research will be relevant to industry practitioners in contract drafting with a view to avoiding disputes.

Keywords: construction contract, contract administration, contract management, dispute avoidance

Procedia PDF Downloads 235
1242 A Predictive Model for Turbulence Evolution and Mixing Using Machine Learning

Authors: Yuhang Wang, Jorg Schluter, Sergiy Shelyag

Abstract:

The high cost associated with high-resolution computational fluid dynamics (CFD) is one of the main challenges that inhibit the design, development, and optimisation of new combustion systems adapted for renewable fuels. In this study, we propose a physics-guided CNN-based model to predict turbulence evolution and mixing without requiring a traditional CFD solver. The model architecture is built upon U-Net and the inception module, while a physics-guided loss function is designed by introducing two additional physical constraints to allow for the conservation of both mass and pressure over the entire predicted flow fields. Then, the model is trained on the Large Eddy Simulation (LES) results of a natural turbulent mixing layer with two different Reynolds number cases (Re = 3000 and 30000). As a result, the model prediction shows an excellent agreement with the corresponding CFD solutions in terms of both spatial distributions and temporal evolution of turbulent mixing. Such promising model prediction performance opens up the possibilities of doing accurate high-resolution manifold-based combustion simulations at a low computational cost for accelerating the iterative design process of new combustion systems.
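The physics-guided loss can be sketched as a data term plus soft conservation penalties; the exact penalty forms and weights below are assumptions rather than the authors' definitions, and NumPy is used for clarity in place of a deep-learning framework.

```python
import numpy as np

def physics_guided_loss(pred, target, lam_mass=0.1, lam_pres=0.1):
    # pred/target: dicts of predicted and reference flow fields on the same grid.
    # Data term: mean-squared error summed over all fields.
    data = sum(np.mean((pred[k] - target[k]) ** 2) for k in pred)
    # Soft constraints: penalize violation of global mass and pressure
    # conservation over the predicted fields (penalty forms assumed).
    mass = (pred["density"].sum() - target["density"].sum()) ** 2
    pres = (pred["pressure"].sum() - target["pressure"].sum()) ** 2
    return data + lam_mass * mass + lam_pres * pres
```

During training, such penalties steer the CNN towards physically admissible flow fields even where the data term alone is ambiguous.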

Keywords: computational fluid dynamics, turbulence, machine learning, combustion modelling

Procedia PDF Downloads 49
1241 High-Resolution Facial Electromyography in Freely Behaving Humans

Authors: Lilah Inzelberg, David Rand, Stanislav Steinberg, Moshe David Pur, Yael Hanein

Abstract:

Human facial expressions carry important psychological and neurological information. Facial expressions involve the co-activation of diverse muscles. They depend strongly on personal affective interpretation and on social context and vary between spontaneous and voluntary activations. Smiling, as a special case, is among the most complex facial emotional expressions, involving no fewer than 7 different unilateral muscles. Despite their ubiquitous nature, smiles remain an elusive and debated topic. Smiles are associated with happiness and greeting on one hand and anger or disgust-masking on the other. Accordingly, while high-resolution recording of muscle activation patterns in a non-interfering setting offers exciting opportunities, it remains an unmet challenge, as contemporary surface facial electromyography (EMG) methodologies are cumbersome, restricted to laboratory settings, and limited in time and resolution. Here we present a wearable and non-invasive method for objective mapping of facial muscle activation and demonstrate its application in a natural setting. The technology is based on a recently developed dry and soft electrode array, specially designed for surface facial EMG. Eighteen healthy volunteers (31.58 ± 3.41 years, 13 females) participated in the study. Surface EMG arrays were adhered to the participants' left and right cheeks. Participants were instructed to imitate three facial expressions: closing the eyes, wrinkling the nose, and smiling voluntarily, and then to watch a funny video while their EMG signals were recorded. We focused on muscles associated with 'enjoyment', 'social' and 'masked' smiles, three categories with distinct social meanings. We developed a customized independent component analysis algorithm to construct the desired facial musculature mapping. First, identification of the Orbicularis oculi and the Levator labii superioris muscles was demonstrated from voluntary expressions.
Second, recordings of voluntary and spontaneous smiles were used to locate the Zygomaticus major muscle activated in Duchenne and non-Duchenne smiles. Finally, recording with a wireless device in an unmodified natural work setting revealed expressions of neutral, positive and negative emotions in face-to-face interaction. The algorithm outlined here identifies the activation sources in a subject-specific manner, insensitive to electrode placement and anatomical diversity. Our high-resolution and cross-talk free mapping performances, along with excellent user convenience, open new opportunities for affective processing and objective evaluation of facial expressivity, objective psychological and neurological assessment as well as gaming, virtual reality, bio-feedback and brain-machine interface applications.
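A minimal sketch of the ICA step, using the off-the-shelf FastICA from scikit-learn rather than the authors' customized algorithm; the channel count and number of sources are placeholders for the actual array geometry.

```python
import numpy as np
from sklearn.decomposition import FastICA

def map_muscle_sources(emg, n_sources=3):
    # emg: (samples, channels) multi-electrode surface recording.
    # ICA separates co-activated muscles into independent sources; the
    # columns of the mixing matrix act as spatial maps of each source
    # over the electrode array, which is what makes the method
    # insensitive to exact electrode placement.
    ica = FastICA(n_components=n_sources, random_state=0, max_iter=1000)
    sources = ica.fit_transform(emg)   # (samples, n_sources) activations
    spatial_maps = ica.mixing_         # (channels, n_sources)
    return sources, spatial_maps
```

Each recovered source would then be attributed to a muscle (e.g. Zygomaticus major) by inspecting its spatial map and activation timing relative to the imitated expressions.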

Keywords: affective expressions, affective processing, facial EMG, high-resolution electromyography, independent component analysis, wireless electrodes

Procedia PDF Downloads 216
1240 Comparisons of Co-Seismic Gravity Changes between GRACE Observations and the Predictions from the Finite-Fault Models for the 2012 Mw = 8.6 Indian Ocean Earthquake Off-Sumatra

Authors: Armin Rahimi

Abstract:

The Gravity Recovery and Climate Experiment (GRACE) has been a very successful project in determining mass redistribution within the Earth system. Large deformations caused by earthquakes are in the high-frequency band. Unfortunately, GRACE is only capable of providing reliable estimates of gravitational changes in the low-to-medium frequency band. In this study, we computed the gravity changes after the 2012 Mw 8.6 Indian Ocean earthquake off-Sumatra using the GRACE Level-2 monthly spherical harmonic (SH) solutions released by the University of Texas Center for Space Research (UTCSR). Moreover, we calculated gravity changes using different fault models derived from teleseismic data. The model predictions showed non-negligible discrepancies in gravity changes. However, after removing high-frequency signals using Gaussian filtering with a 350 km radius, commensurate with the attainable GRACE spatial resolution, the discrepancies vanished: the spatial patterns of total gravity changes predicted from all slip models became similar at the spatial resolution attainable by GRACE observations, and the predicted gravity changes were consistent with the GRACE-detected gravity changes. Nevertheless, fault models that give different slip amplitudes lead proportionally to different amplitudes in the predicted gravity changes.
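Degree-dependent Gaussian smoothing of SH coefficients is commonly implemented with the Jekeli recursion for the averaging weights; the sketch below is a generic version of that recursion (not necessarily the exact weighting used in this study), with the 350 km radius from the text as the default.

```python
import numpy as np

def gaussian_weights(lmax, radius_km=350.0, a_km=6371.0):
    # Jekeli-style Gaussian averaging weights per SH degree: each degree-l
    # coefficient is multiplied by w[l] before synthesis. radius_km is the
    # smoothing half-width; a_km the mean Earth radius.
    b = np.log(2.0) / (1.0 - np.cos(radius_km / a_km))
    w = np.zeros(lmax + 1)
    w[0] = 1.0
    w[1] = (1.0 + np.exp(-2.0 * b)) / (1.0 - np.exp(-2.0 * b)) - 1.0 / b
    for n in range(1, lmax):
        # Three-term forward recursion (adequate for moderate lmax;
        # it becomes numerically unstable at high degrees)
        w[n + 1] = -(2 * n + 1) / b * w[n] + w[n - 1]
    return w
```

Applying these weights damps the high-degree (short-wavelength) part of both the GRACE solutions and the model-predicted coseismic gravity fields, which is why the filtered model predictions converge towards one another.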

Keywords: undersea earthquake, GRACE observation, gravity change, dislocation model, slip distribution

Procedia PDF Downloads 327
1239 Extraction of Road Edge Lines from High-Resolution Remote Sensing Images Based on Energy Function and Snake Model

Authors: Zuoji Huang, Haiming Qian, Chunlin Wang, Jinyan Sun, Nan Xu

Abstract:

In this paper, a strategy to extract double road edge lines from an acquired road stripe image is explored. The workflow is as follows: the road stripes are first extracted by a probabilistic boosting tree algorithm followed by a morphological algorithm, and the road centerlines are detected by a thinning algorithm, so that initial road edge lines can be acquired along the road centerlines. We then refine the results where the local curvature of the centerlines varies strongly. Specifically, the energy function of an edge line is constructed from gradient features and spectral information, and the Dijkstra algorithm is used to optimize the initial road edge lines. A Snake model is constructed to solve the fracture problem at intersections, and a discrete dynamic programming algorithm is used to solve the model. After that, we obtain the final road network. Experimental results show that the strategy proposed in this paper can be used to extract continuous and smooth road edge lines from high-resolution remote sensing images with an accuracy of 88% in our study area.
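The Dijkstra optimization over a per-pixel energy map can be illustrated with a minimal 4-connected grid search; the cost values and connectivity here are illustrative, not the paper's exact energy function combining gradient and spectral terms.

```python
import heapq

def min_cost_path(cost, start, goal):
    # Dijkstra over a 4-connected grid of per-pixel energies: the returned
    # path is the minimum-energy polyline between two edge-line endpoints.
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    # Walk predecessors back from goal to start
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]
```

In the edge-line setting, low energy would correspond to strong gradients with road-like spectra, so the optimized path hugs the true road boundary.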

Keywords: road edge lines extraction, energy function, intersection fracture, Snake model

Procedia PDF Downloads 315
1238 Comparison of Different Reanalysis Products for Predicting Extreme Precipitation in the Southern Coast of the Caspian Sea

Authors: Parvin Ghafarian, Mohammadreza Mohammadpur Panchah, Mehri Fallahi

Abstract:

Synoptic patterns from the surface up to the tropopause are very important for forecasting the weather and atmospheric conditions. There are many tools to prepare and analyze these maps. Reanalysis data and the outputs of numerical weather prediction models, satellite images, meteorological radar, and weather station data are used in forecasting centers around the world to predict the weather. Forecasting extreme precipitation on the southern coast of the Caspian Sea (CS) is a major challenge due to the complex topography, and there are different types of climate in these areas. In this research, we used two reanalysis datasets, the fifth-generation ECMWF reanalysis (ERA5) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis, for verification of the numerical model. ERA5 is the latest version of the ECMWF reanalysis; its temporal resolution is hourly, while that of NCEP/NCAR is six-hourly. Atmospheric parameters such as mean sea level pressure, geopotential height, relative humidity, wind speed and direction, and sea surface temperature were selected and analyzed, and different types of precipitation (rain and snow) were considered. The results showed that NCEP/NCAR is better able to demonstrate the intensity of the atmospheric systems. ERA5 is suitable for extracting parameter values at specific points and is also appropriate for analyzing snowfall events over the CS (snow cover and snow depth). Sea surface temperature plays the main role in generating instability over the CS, especially when cold air passes over it; the sea surface temperature in the NCEP/NCAR product has low resolution near the coast. However, both datasets were able to detect the meteorological synoptic patterns that led to heavy rainfall over the CS, although, due to the time lag, they are not suitable for forecast centers; their application is in research and in the verification of meteorological models.
Finally, ERA5 has better resolution with respect to the NCEP/NCAR reanalysis data, but NCEP/NCAR data are available from 1948 and are appropriate for long-term research.

Keywords: synoptic patterns, heavy precipitation, reanalysis data, snow

Procedia PDF Downloads 90
1237 Lab Bench for Synthetic Aperture Radar Imaging System

Authors: Karthiyayini Nagarajan, P. V. Ramakrishna

Abstract:

Radar imaging techniques provide extensive applications in the field of remote sensing, most notably Synthetic Aperture Radar (SAR), which provides high-resolution target images. This paper puts forward effective and realizable signal generation and processing for SAR images. The major units in the system include the camera, signal generation unit, signal processing unit and display screen. The real radio channel is replaced by its mathematical model based on an optical image to calculate a reflected signal model in real time. The signal generation unit realizes the algorithm and forms the radar reflection model. The signal processing unit provides range and azimuth resolution through matched filtering and a spectrum analysis procedure to form the radar image on the display screen. The restored image has the same quality as that of the optical image. This SAR imaging system has been designed and implemented using MATLAB and Quartus II tools on a Stratix III device as a lab bench system that works in real time, to study radar imaging rudiments and signal processing schemes for educational and research purposes.
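Range compression by matched filtering, the core of the signal processing unit described above, can be sketched as follows (in Python rather than the MATLAB/FPGA implementation used in the work; the chirp parameters are illustrative):

```python
import numpy as np

def range_compress(echo, chirp):
    # Pulse compression: correlate the received echo with a replica of the
    # transmitted chirp via FFTs (zero-padded to the full correlation length).
    n = len(echo) + len(chirp) - 1
    echo_f = np.fft.fft(echo, n)
    chirp_f = np.fft.fft(chirp, n)
    # Multiplying by the conjugate spectrum implements matched filtering;
    # the output peaks at the delay of each scatterer.
    return np.fft.ifft(echo_f * np.conj(chirp_f))
```

The same matched-filter principle is applied in azimuth using the Doppler history of each scatterer, which together yield the two-dimensional SAR resolution.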

Keywords: synthetic aperture radar, radio reflection model, lab bench, imaging engineering

Procedia PDF Downloads 459