Abnormal UI behaviour when tapping multiple UI components simultaneously (i.e. buttons, views, collection view and table view cells)
I have an iOS application built programmatically with Auto Layout in Swift/UIKit. It has more than 400 screens and 200+ generic UI components. The issue is that when I tap more than one UI component at the same time (for example a button and a view, or a button and a collection view cell), the UI behaves abnormally. I have custom expandable views with runtime constraints, so when such a view and the Continue or Back button are tapped simultaneously, both the expansion and the navigation occur, and when I navigate back to that controller the UI is blank or broken.
I have tried different approaches: timestamps, swizzling, and a custom touch manager, but with no fruitful result.
I want a generic solution that requires minimal changes. What I want to achieve is that if more than one UI component is tapped, either all the events are ignored or only the first one is executed and the rest are ignored. I cannot make changes screen by screen because the application is too large, so please suggest a generic solution.
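One generic direction, sketched below under the assumption that the screens share (or can share) a common base view controller: UIKit's isExclusiveTouch property stops touches from being delivered to other views while one view is handling a touch, so marking every interactive view as exclusive-touch in a single place can cover the whole app. The applyExclusiveTouch helper and BaseViewController name are illustrative, not existing project code.

```swift
import UIKit

// A minimal sketch, assuming a shared base view controller.
// isExclusiveTouch tells UIKit not to deliver touches to other views while
// this view is handling one.
func applyExclusiveTouch(to view: UIView) {
    if view is UIControl || view is UITableViewCell || view is UICollectionViewCell {
        view.isExclusiveTouch = true
    }
    view.subviews.forEach { applyExclusiveTouch(to: $0) }
}

class BaseViewController: UIViewController {
    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        // Re-apply after layout so late-added subviews (expanded views, cells)
        // are covered as well.
        applyExclusiveTouch(to: view)
    }
}
```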
Get high res icon image as UIImage in iOS 18
Before iOS 18, the following code worked well:
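The question's original snippet is not reproduced here; as an illustration only, a commonly used pre-iOS 18 approach reads the icon file names declared under CFBundleIcons in Info.plist and loads the last (largest) one with UIImage(named:):

```swift
import UIKit

// Illustration only, not the question's original code: load the largest
// icon asset listed under CFBundlePrimaryIcon in Info.plist.
func primaryAppIcon() -> UIImage? {
    guard
        let icons = Bundle.main.infoDictionary?["CFBundleIcons"] as? [String: Any],
        let primary = icons["CFBundlePrimaryIcon"] as? [String: Any],
        let files = primary["CFBundleIconFiles"] as? [String],
        let largest = files.last   // the last entry is usually the largest asset
    else { return nil }
    return UIImage(named: largest)
}
```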
Is AppDelegate's continueUserActivity return value even used for deep linking? Why doesn't SceneDelegate have it?
I understand that for AppDelegate's continueUserActivity API, we need to return YES if the activity is handled by the app and NO otherwise, as described in the official API documentation. However, I don't observe any behavioral difference between returning YES and NO.
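For reference, a minimal sketch of the two APIs being compared, with a hypothetical handleDeepLink(_:) helper standing in for the app's own routing; the app-delegate callback returns a Bool while the scene-based counterpart returns nothing:

```swift
import UIKit

// Hypothetical routing helper standing in for the app's own deep-link handling.
func handleDeepLink(_ url: URL) -> Bool {
    return url.host == "example.com"   // whether the app recognised the link
}

class AppDelegate: UIResponder, UIApplicationDelegate {
    // The app-delegate variant returns a Bool (YES if the activity was handled).
    func application(_ application: UIApplication,
                     continue userActivity: NSUserActivity,
                     restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
        guard let url = userActivity.webpageURL else { return false }
        return handleDeepLink(url)
    }
}

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    // The scene-based counterpart returns Void; there is no Bool to report back.
    func scene(_ scene: UIScene, continue userActivity: NSUserActivity) {
        if let url = userActivity.webpageURL {
            _ = handleDeepLink(url)
        }
    }
}
```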
UILabel.attributedText with many line breaks causes scroll lag in UITableView
I am creating a UI similar to Twitter's timeline using UITableView. The UILabel on the UITableViewCell has attributedText for displaying content.
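A sketch of the setup being described, with illustrative names: a cell whose UILabel shows an already-built NSAttributedString. Building and caching the attributed string outside the scroll path (rather than inside cellForRowAt) is one commonly suggested mitigation for this kind of lag.

```swift
import UIKit

// Illustrative cell: the label only receives a pre-built attributed string,
// so no attribute parsing happens while the table view scrolls.
final class TimelineCell: UITableViewCell {
    let contentLabel: UILabel = {
        let label = UILabel()
        label.numberOfLines = 0
        return label
    }()

    override init(style: UITableViewCell.CellStyle, reuseIdentifier: String?) {
        super.init(style: style, reuseIdentifier: reuseIdentifier)
        contentLabel.translatesAutoresizingMaskIntoConstraints = false
        contentView.addSubview(contentLabel)
        NSLayoutConstraint.activate([
            contentLabel.topAnchor.constraint(equalTo: contentView.topAnchor, constant: 12),
            contentLabel.leadingAnchor.constraint(equalTo: contentView.leadingAnchor, constant: 16),
            contentLabel.trailingAnchor.constraint(equalTo: contentView.trailingAnchor, constant: -16),
            contentLabel.bottomAnchor.constraint(equalTo: contentView.bottomAnchor, constant: -12),
        ])
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    // Pass an attributed string that was built (and ideally cached) ahead of time.
    func configure(with text: NSAttributedString) {
        contentLabel.attributedText = text
    }
}
```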
Bottom sheet main view's white background does not take the full height and width
I’m trying to present a custom bottom sheet with a .custom modalPresentationStyle from another VC.
I implemented this custom presentation controller:
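The question's own implementation is not reproduced here; as an illustrative sketch only, a typical bottom-sheet presentation controller sizes the presented view in frameOfPresentedViewInContainerView, which is usually where a background that does not fill the expected area comes from. The class name and fixed height below are assumptions.

```swift
import UIKit

// Illustrative sketch of a bottom-sheet UIPresentationController.
final class BottomSheetPresentationController: UIPresentationController {
    // Hypothetical fixed sheet height, used only for the sketch.
    private let sheetHeight: CGFloat = 320

    override var frameOfPresentedViewInContainerView: CGRect {
        guard let container = containerView else { return .zero }
        // Full width, pinned to the bottom edge of the container.
        return CGRect(x: 0,
                      y: container.bounds.height - sheetHeight,
                      width: container.bounds.width,
                      height: sheetHeight)
    }

    override func containerViewWillLayoutSubviews() {
        super.containerViewWillLayoutSubviews()
        // Re-apply the frame on rotation/resizes so the sheet keeps its size.
        presentedView?.frame = frameOfPresentedViewInContainerView
    }
}
```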
UIKit: How to capture the time it takes to render the first frame
I'm implementing telemetry for our app. One of the things I'm trying to capture is the distinct blocks of time described by Apple engineers in the Optimizing App Launch video from WWDC 2019. I'm having a hard time precisely capturing the time for the First frame and the extended stages.
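A minimal sketch of one common approximation, not Apple's exact metric: read the process start time from the kernel via sysctl and compare it with a timestamp taken on the first main-run-loop pass after launch. The LaunchTelemetry type name is illustrative.

```swift
import Darwin
import Foundation

enum LaunchTelemetry {
    // Process creation time, read via sysctl(CTL_KERN, KERN_PROC, KERN_PROC_PID).
    static func processStartDate() -> Date? {
        var info = kinfo_proc()
        var size = MemoryLayout<kinfo_proc>.stride
        var mib: [Int32] = [CTL_KERN, KERN_PROC, KERN_PROC_PID, getpid()]
        guard sysctl(&mib, UInt32(mib.count), &info, &size, nil, 0) == 0 else { return nil }
        let start = info.kp_proc.p_starttime
        return Date(timeIntervalSince1970: TimeInterval(start.tv_sec)
                                           + TimeInterval(start.tv_usec) / 1_000_000)
    }

    // Call from application(_:didFinishLaunchingWithOptions:). The async block
    // runs on the next main-run-loop pass, a common stand-in for "first frame".
    static func reportFirstFrame() {
        DispatchQueue.main.async {
            guard let start = processStartDate() else { return }
            print("Approximate time to first frame: \(Date().timeIntervalSince(start))s")
        }
    }
}
```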
Is it possible to receive touch events when a subview is outside its superview, without using hitTest methods?
I have a video player view controller (I'll call it PlayerVC) that consists of an AVPlayer and a custom timeline view. In one of my features I can add two PlayerVCs to a super view controller as its child view controllers.
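A sketch of the described setup, with illustrative names: a container adds two PlayerVC children, and each timeline view is laid out partly outside the player view's bounds, which is why its touches are normally dropped by hit-testing.

```swift
import UIKit
import AVFoundation

// Illustrative stand-in for the real PlayerVC: the timeline sits below the
// view's bounds, stays visible because clipsToBounds is false, but is outside
// the area that hit-testing checks.
final class PlayerVC: UIViewController {
    let player = AVPlayer()
    let timelineView = UISlider()   // stand-in for the custom timeline view

    override func viewDidLoad() {
        super.viewDidLoad()
        view.clipsToBounds = false
        timelineView.frame = CGRect(x: 0, y: view.bounds.height + 8,
                                    width: view.bounds.width, height: 30)
        view.addSubview(timelineView)
    }
}

// The parent ("super") view controller embedding two players as children.
final class ContainerViewController: UIViewController {
    private let players = [PlayerVC(), PlayerVC()]

    override func viewDidLoad() {
        super.viewDidLoad()
        for (index, child) in players.enumerated() {
            addChild(child)
            child.view.frame = CGRect(x: 0, y: CGFloat(index) * 320,
                                      width: view.bounds.width, height: 240)
            view.addSubview(child.view)
            child.didMove(toParent: self)
        }
    }
}
```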